https://en.wikipedia.org/wiki/List%20of%20retronyms
List of retronyms
This is a list of retronyms used in the English language. A retronym is a newer name for an existing thing that differentiates the original form or version from a subsequent one. Retronymic adjectives Analog Describes non-digital devices: Analog clock: Before digital clocks, most clocks had faces and hands. See also: Analog watch. Analog drawing: Drawing with conventional tools on a paper or canvas, as opposed to drawing on a computer using a software Analog synthesizer: Before synthesizers contained microchips, every stage of the internal electronic signal flow was analogous to a sound that would eventually be produced at the output stage, and this sound was shaped and altered as it passed through each filter and envelope. Analog watch: Before the advent of the digital watch, all watches had faces and hands. After the advent of the digital watch, watches with faces and hands became known as analog watches. Analog recording Conventional, classic, or traditional Describes devices or methods that have been largely replaced or significantly supplemented by new ones. For example, conventional (non-microwave) oven, or conventional weapon (one which does not incorporate chemical, biological or nuclear payloads). Classic Doctor Who: Used to distinguish the original series of the classic show from the 21st century sequel, New Doctor Who. This retronym is used by the BBC when both of these shows air. Classic Leave It To Beaver: Used to distinguish the original series of the classic sitcom from the 1980s sequel, The New Leave It To Beaver. This retronym was used by TBS when both of these shows aired. Coca-Cola Classic: Originally called Coca-Cola, the name was changed when the original recipe was reintroduced after New Coke failed to catch on. This is an example of a retronym officially coined by a product's manufacturer. Conventional airplane: In the late 1940s and early 1950s, this term was used to distinguish piston-engined aircraft from the new jet types. Conventional landing gear: Term used to distinguish the traditional landing gear arrangement of two main wheels and a tail wheel (also referred to as the "tail-dragger" type) from the newer tricycle landing gear (two main wheels and a nose wheel). Conventional memory: term coined when MS-DOS and other operating systems for the IBM PC and other IBM-like x86 machines went over the 640k memory limit with tricks to access extra memory with different code to address it. Conventional oven: Before the development of the microwave oven, this term was not used. Now it is commonly found in cooking instructions for prepared foods. Conventional war: Before the development of nuclear weapons, this term was not used. (War, Gwynne Dyer) Traditional braces: Used to refer to braces that are metal and crafted by hand, as opposed to SureSmile, Invisalign, and other new technologies. "Traditional Chinese characters": Used to contrast with Simplified Chinese characters. Traditional animation: With the rise of computer animation, hand-drawn, cel-based (or "2D") animation is now referred to as this. Civilian Used to refer to items that are not of military quality or for military use, to differentiate them from the military version. First, I or 1, also part 1, version 1, etc., Senior, the Elder Used when there is a second, third, fourth, etc. version/incarnation of something. This is not a retronym if it is used from the start in the anticipation of subsequent versions. 
When a dynastic ruler has or adopts the same name as a predecessor, the original is often retrospectively given the Roman numeral I if he did not already use one in his lifetime. For example, the Dutch prince William I of Orange was just William during his lifetime. On the other hand, e.g. emperor Franz Joseph I of Austria was so entitled even though there were no subsequent emperors of that name. In the United States, names (typically of males) may also follow this convention, or the father may be given the suffix Senior (Sr.), with Junior (Jr.) for the son; Roman numerals would be used if the name is repeated again. In some cases, such as US President George Bush and Major League Baseball player Ken Griffey, well-known people have become retroactively referred to as "Senior" after namesake sons rose to prominence in their own right. Also sometimes used to refer to the first incarnation of a movie, video game, etc. after sequels have been created, although such works are seldom renamed in this way officially. When Sony released the PlayStation 2, a redesigned version of the original PlayStation was also released under the name PSone. However, the word "One" doesn't always refer to version 1 of a product, such as in Xbox One. Manual Used to distinguish from automatic or electric versions. Manual transmissions in vehicles were just called "transmissions" until the invention of automatic transmissions. Sometimes they are called "standard" transmissions, but that adjective has become a misnomer in the United States since automatic transmissions have become the standard feature for most models today. Manual typewriters were likewise just called "typewriters" until the invention of electric typewriters. Natural Use to distinguish from artificial versions. Natural dyes like woad, indigo, saffron and madder were simply "dyes" until synthetic dyes were developed in the mid-19th century. Natural gums were just "gums" until synthetic gums were invented. Natural languages are those which evolved naturally in humans through use and repetition without conscious planning or premeditation, as opposed to recently developed constructed languages and formal languages. Natural ropes or plant ropes, such as those made from hemp or sisal, were just "ropes" until ropes made of synthetic materials became common. Natural satellites were just called "satellites" until the launch of Sputnik 1. Natural skin care involves the use of topical creams and lotions made of ingredients available in nature; all skin care was natural until synthetic cosmetics were invented. Natural sponge: all sponges were natural (either made from Luffa aegyptiaca or animal sponges) until polyester and polyurethane sponges came on the market in the mid-20th century. Natural rubber or India rubber was simply called "rubber" until synthetic rubber was invented in 1909. Old Naturally used when there is officially a "new" version of anything, to refer to the previous version. For example, when British money was decimalized and the new penny of 1/100 pound was adopted, the previous penny of 1/240 pound became known as the old penny. Old-fashioned refers to any practice which is no longer customary, e.g. in the context of dress sense, hairstyle or wording, as opposed to (the) fashion, which refers to anything which is at present customary. In popular music and the wider popular culture, the term old school (originally only used in hip-hop, but now in many other genres) has developed a similar meaning, and this has spread to other areas as well. 
Offline Computer users will sometimes agree to meet offline, i.e. face to face in the real world, as opposed to online in an Internet-based chat room or other such means of electronic communication. Before the Internet became widely used, this was of course the only way to "meet" someone and the term to meet offline was unheard of. Stephen Colbert, on his 4 February 2016 broadcast of The Late Show with Stephen Colbert, remarked on the strangeness of so-called "offline shopping", regarding Amazon.com's retail bookstore endeavor. Real Often used in a derogatory manner to signify that the original product is the "real" product, as if the new alternative is "fake". For example, "Real instruments" for instruments other than the synth; "Real car" for a fuel-burning car, as opposed to an electric car. Regular or plain Used to refer to an original product after new versions are released. For example, one could formerly just ask for a Pepsi. But with the advent of multiple versions like Diet Pepsi and Pepsi Max, one might ask for a regular Pepsi when one wants the original drink. Similarly, regular Oreo cookies were called that after Double Stuf Oreos and other varieties were released. Another example is that in the United States regular gasoline (petrol or petroleum spirit outside the U.S.) has now come to mean 87 octane-rated unleaded (ratings in other countries vary). In the United States almost all gasoline had tetraethyl lead additive and was sold as either regular gasoline (octane rating of 89) or high test (octane ratings of 91 or higher) until leaded petrol was phased out starting in the late 1970s; all new cars made since 1975 have catalytic converters. Plain M&M's: Plain M&M's candies (now Milk Chocolate) would not have been called that until 1954, when Peanut M&M's were introduced. Plain old telephone service (POTS): The term refers to the telephone service still available after the advent of more advanced forms of telephony, such as ISDN, mobile phones, and VoIP Plain text: Before word processing programs for computers with functions such as support for multiple fonts, underlining, bold/italic and other function came along, text files were simply just known as text. "Plain text" is also used in contrast to ciphered text. Regular cab pickup truck (also called single cab) used when extended and crew/double cabs became widely available. Regular coffee: The development of decaffeinated coffee led to this coinage. Regular / Normal cigarette : A tobacco cigarette. Before electronic cigarettes became popular, all commercially available cigarettes were tobacco cigarettes. Along the same lines, the smoking of traditional cigarettes is sometimes referred to as “traditional smoking” in order to distinguish it from vaping, which could also be considered a form of smoking. Tabletop Used to describe the original version of a board game or role-playing game once a video game version has been released. Tabletop can also refer to non-digital games in general in order to contrast them from video games. Vanilla Used to describe an unaltered, plain version of an item, often in reference to software. For example, in computer games with expansion packs, it is used to distinguish the original version from subsequent versions, especially when the original game does not have a subtitle. For example, World of Warcraft could refer to either the original game or one of the expansion packs, so users may refer to the original as "vanilla" to distinguish it from the subsequent versions. 
Wired Wired or hardwired refer to products such as telephones, headphones, speakers, computer accessories, etc., which are now available in wireless versions. Wireless telegraphy and wireless telephony were some of the first applications of radio technology, in the 1910s and 1920s; "wireless" as a noun today is sometimes simply a synonym for "mobile phone service"/"cell phone service". Nouns Numbers 1994 Level Before the Doom engine had more features added in source ports such as Boom ZDoom and Doom Legacy, all levels for Doom made around 1994 had limitations that constrained the gaming atmosphere. But when more features were added to source ports for better level atmospheres, older-style levels started to be called "1994 levels" to differentiate from the newer kind. 2D With the increasing prevalence of 3-D movies, conventional, non-stereoscopic versions of movies are starting to be called 2D versions. This is also used in reference to animation, to distinguish the older style hand-drawn or more recently vector-based animation from 3D-rendered animation. A–B Acoustic guitar Before the invention of the solid-body electric guitar, all guitars amplified the sound of a plucked string with a resonating hollow body. Similarly: acoustic piano. American Morse Code This was the original signaling alphabet, suggested by Samuel Morse's assistant, Alfred Vail. It has a variety of different units and timings. It was later replaced by the Continental code (also called international Morse code), which has simpler timings and a different alphabet. Also called "railroad code". AM radio Before the introduction of broadcast FM radio, the AM broadcast band radio was known simply as radio, wireless (in the UK) or as medium-wave radio (still the preferred term among radio enthusiasts) to distinguish it from the (also amplitude-modulated) shortwave radio bands. AM(/FM) only radio Before FM radio receivers came to the market, AM receivers were simply just known as radios. However, as AM/FM radios started to include turntables, tape players, CD players, and later on analog AUX inputs, satellite radio and even USB, AM/FM radios without bells and whistles would start to be called AM/FM-only radios on their own. Animal Crossing: Population: Growing! Used to refer to the original GameCube game after the release of its sequels. The name comes from its tagline in English-speaking regions. Apple I Originally released as the Apple Computer, it was renamed after the introduction of the Apple II personal computer. Artistic gymnastics Generally known simply as gymnastics before Rhythmic gymnastics was added to the Olympic program in 1984. At-grade expressway Since freeways are divided highways with 100% grade separations, expressways are at-grade highways with no direct private access. Some jurisdictions have different criteria on the difference of word use, but sometimes they are used interchangeably in areas that don't have many at-grade expressways. Since expressway and freeway are sometimes used interchangeably, the term at-grade expressway has been coined since there was a time when all expressways were at-grade; prior to the 1940s which is when California and Michigan planned out the nation's first freeways. States like Florida sometimes use the term "freeway" in reference to expressways (at-grade or grade-separated) which are free-of-charge to use. Atari 2600 Originally sold as the Atari Video Computer System (or Atari VCS for short). 
When its successor, the Atari 5200, was released, the VCS was rebranded the Atari 2600, after its part number (CX-2600). Bar soap The common cake of soap used in the tub or shower was familiarly called "soap" or "bath soap"; the term "bar soap" arose with the advent of soaps in liquid and gel form. Black Licorice In North America, licorice is often called "black licorice" to distinguish it from similar confectionery varieties that are not flavored with licorice extract, and commonly manufactured in the form of chewy ropes or tubes. Black powder Called "gunpowder" for centuries while it was in common use. The retronym "black powder" was coined in the late 19th century to differentiate it from the newly developed smokeless powder which superseded it. Black-and-white television Once called simply television, now the retronym is used to distinguish it from color television, which is now more commonly referred to by the unadorned term. Along the same lines: broadcast television, free-to-air television, over-the-air television, silent movie. Furthermore, "Standard Definition Television" has become necessary to distinguish sets from HDTV (high definition). Boeing 737 Classic When Boeing introduced the 737 Next Generation (−600, −700, −800, and −900 series) whilst announcing its successor, the 737 MAX, the −300, −400, and −500 variants of the Boeing 737 still in service were called the 737 Classic. Breadbin C64 When Commodore introduced the C64C, which had a redesigned case, the original C64 model was nicknamed the breadbin to differentiate it. Brick-and-mortar store, high street shop As increasing use of the Internet allowed online stores, accessible only through computers, to compete with established retail shops, the latter began to be called "brick-and-mortar stores" or "high street shops" to indicate that customers could (or had to) visit them to examine and purchase their goods. These two terms are also often used to describe the physical storefronts of a retail business that also sells products online. In the U.S. and Canada, "brick-and-mortar" emphasizes the physical construction of these stores, as opposed to the largely electronic nature of online stores. The terms "high street shop" (UK) or "main street store" or "downtown store" (U.S. and Canada) also serve to differentiate the more traditional retail venue from big-chain "box stores" such as K-mart, Wal-Mart, or Zellers, which did not exist prior to the 1960s. (The name "High Street" is commonly used in the UK for a town's primary thoroughfare. In the U.S. and Canada, it is more likely to be called "Main Street".) British English Was simply referred to as "English" until North American English dialects and British English dialects started to diverge. Broadcast television This term was coined in the U.S. to distinguish it from cable and satellite television. Brown rice Prior to the mid nineteenth century, all rice consumed was brown or, whole grain. With the invention of white rice, brown began to refer to the traditional version. C–E Chicago II Refers to the second album by the band Chicago. The album was originally entitled just Chicago but the name was changed after the release of the third album, Chicago III. (Their first album was called Chicago Transit Authority, as that was the name of the band at the time.) 
Classical Hollywood Cinema a term commonly used since the 1970s to refer to the mainstream commercial American cinema of roughly 1930–1960, which at the time was simply referred to as "Hollywood", "the cinema", "the movies" etc. (see 'film noir' below). Classic Apple After Apple bought NeXT in 1997 and later became profitable, people began to refer to the pre-1997 history of the company as Classic Apple to differentiate it from the post-1997 Apple as the company was near bankruptcy when it bought NeXT. Apple nowadays is very successful and popular. Classic Mac OS Originally called System and later Mac OS, Apple retroactively added Classic to versions of the operating system from 1 to 9.2.2 (which were partly based on Lisa OS) to differentiate them from the newer Mac OS X (which was based on NeXTSTEP). Cloth diaper (Terry nappy) Before the second half of the 20th century, all diapers (nappies, in the UK) were made from cloth (terry cloth) and simply called diapers (US) or nappies (UK). The advent of the disposable diaper gave rise to this term. Command & Conquer: Tiberian Dawn This name is sometimes used by fans of the Command & Conquer series to refer to the original game of the series, officially known simply as Command & Conquer. Complex instruction set computer This name was coined after the advent of Reduced instruction set computer. Constitution Act, 1867 Prior to 1982, when the patriation of the constitution occurred, Canada's constitution was known as British North America Act 1867. Corn on the cob Before canned corn was widely available, "corn on the cob" was simply "corn". Bic Cristal Before the 2000s, the Bic Cristal was named "Bic Classic" pen. Prior to the 1990s, "Bic Classic" was referred to simply as the "Bic pen". CSI: Las Vegas Not used before the debut of the spinoff series CSI: Miami in 2002, and CSI: NY in 2004. Curved, curly or smart quotes Straight quotes were made widespread by typewriters. The smart designation came about as word processing software would often change straight quotes into curved quotes. Data-transfer USB port Before "recharge-only" (or powered USB) came along, all USB ports could both transfer data, and "recharge" mobile devices. Day baseball Baseball played during the day, as all games were played before electric lighting in stadiums became common. Dairy milk Used to refer to actual milk from a mammal's mammary glands, as opposed to plant milks like soy milk, rice milk, almond milk, and coconut milk. Disposable battery Before rechargeable batteries became popular in AA, AAA, C, D and PP3 form factors, all batteries in those form factors were disposable. However, rechargeable batteries back then were limited to stationary and vehicular (sometimes semi-portable) applications. Divided expressway/freeway (USA) Early expressways and freeways were divided corridors, but recent concepts of freeways and expressways have included occasional undivided corridors for economic and environmental compromises, as well as an initial phase prior to twinning. But it is unclear whether undivided versions existed first. However, the expressway, parkway and freeway concepts were developed with divided highways in mind during the 1910s (parkways) and 1940s (freeways), the German Autobahn would be conceptualized around the same time with similar qualities to freeways. Dumb Phone A phone with either no or limited internet capabilities. These phones also have no or limited ability to run apps. 
Before smartphones became popular, these were simply considered ‘phones' or ‘cellphones'. They are also sometimes referred to as feature phones or "flip phones". Electric guitar English muffin Originally called a 'muffin' in southern England, the prefix is now used to distinguish them from the American version. F–H Face-to-face conference A conference whose participants meet in the same room, as opposed to using telephones or video cameras (similarly: IRL meeting = in-real-life meeting). Farmall Regular As explained at Farmall tractor, the name Farmall began as a model name but became a sub-brand name as additional models were developed. Fat model In the console collecting scene, a "Fat model" represents consoles released before a model that is more compact and has different hardware specifications. Field hockey (North America) Known simply as "hockey" (as it still is in the UK and Ireland) until ice hockey and roller hockey became popular. (In addition, there is a game called street hockey, which evolved from ice hockey.) Similarly, Field soccer (Football) and Field lacrosse (lacrosse). (Both North America) Film camera As opposed to digital camera. Also, the use of a film camera is often referred to as "film photography", "analogue photography", or "traditional photography" in order to distinguish it from digital photography. Film noir Prior to the 1970s, films with "film noir" style were referred to in English-speaking countries simply as dramas or melodramas (see 'Classical Hollywood' above). The term was coined in the 1950s by French critics who were taking the products of Hollywood more seriously than critics in the English-speaking world tended to at the time. First Gundam A nickname, commonly used by Japanese fans of the franchise and coined shortly after the release of Zeta Gundam. Gundam 0079 is also used in the same fashion. First Anglo-Dutch War Renamed after the Second Anglo-Dutch War in 1664. Fixedsys The monospaced system font in Microsoft Windows 1.x and 2.x, simply called System under those systems. In Windows 3.0, System became a proportional font, and the original font was renamed Fixedsys. Fortnite: Save the World Originally titled Fortnite, it was renamed after the release of Fortnite: Battle Royale. Forward slash Before the introduction of ASCII and electronic keyboards for computers, typewriters had only one type of slash ("/"), normally produced by the unshifted key shared with the question mark. The rise of MS-DOS brought regular use of the backslash ("\") character found on computer keyboards (for specifying directory paths). Before that time the symbol "/" was known simply as a "slash" (US) or "oblique" (UK). (Other typographical names for this character are virgule and solidus. In the UK, the character was traditionally known as an oblique stroke or, more simply, an oblique. To slash means to cut with a scything motion, which is analogous to the motion of the pen as the character is handwritten.) Free-range parenting Traditionally, children had less supervision prior to the 21st century; this allowed for more independence and freedom in a child's decision making. There is a modern term, helicopter parenting, which refers to parents who overly monitor, plan, and get involved with their kids' activities. French franc The currency unit of France before the euro, which was originally the only franc, but had to be distinguished from the Belgian franc, Communauté Financière Africaine franc, and Swiss franc after those countries adopted the term. 
Friction brake Automotive disc brake or drum brake. Coined after the advent of the regenerative brake in electric or hybrid automobiles. Frizzen This component was called the "hammer" while flintlock firearms were in use. On percussion cap firearms, which replaced flintlocks, the striking component was called the hammer, and the term frizzen was applied to the corresponding part of flintlocks. Full service A radio format that consists of a wide range of programming. Coined after the introduction of contemporary hit radio in the 1950s. Full-size van (US) Coined after the introduction of minivans by the Big Three automakers, although box trucks (bigger vehicles that were considered vans) existed prior to the Big Three's use of full-size van. Game Boy Classic Used to distinguish the original from the Game Boy Pocket, the Game Boy Color, and the Game Boy Advance. Game Boy Mono see Game Boy Classic. Refers to the monochrome graphics these models produced. GM "old-look" transit bus The GM old look did not originally have a name, but in 1959, a new design was released and was called the new look. After this, many people started calling the older design the Old Look. Ground warfare The "ground war" term developed some time after the widespread adoption of large-scale use of aircraft as a viable weapon of war. Hand-barrow Originally, "barrows" suspended the load on poles carried by two people, one in front and one behind. "Wheelbarrows" are first cited by the Oxford English Dictionary to the 14th century, and in the 15th century the term hand-barrow arose to refer to the older sort of barrow, but in the British Isles the more common version was the sedan chair (if a person was being carried). Hand grenade All grenades were hand-thrown until the invention of the rifle grenade, and, later, the grenade launcher. Handwritten Crops up in the late 19th century to contrast with "typewritten". Hard cider In Europe and Asia, "cider" refers to fermented (alcoholic) apple juice. In the U.S., "cider" or "apple cider" often refers to unfiltered non-alcoholic apple juice. "Hard cider" specifies the alcoholic version. Hardcover book Prior to the invention of paperbacks, all books were hardcover and simply referred to as "books". Hard disk All disks were hard (i.e. constructed of rigid instead of flexible magnetic material) until the advent of the floppy disk. High-floor All buses and trams were high-floor until the advent of low-floor trams and low-floor and low-entry buses. Horse cavalry Used to distinguish the now mostly obsolete original use of horses in a military mounted combat role, with the advent of tanks and other motorized vehicles (mechanized cavalry or armored cavalry) following World War I, and the use of helicopters (air cavalry) during the Vietnam War era. Horsecar (Horse Tram in English-speaking countries outside North America) Used to describe the horse-pulled predecessor of the modern streetcar / tram. Originally called 'street cars' or just 'cars'. After street railway companies started electrifying their systems around 1900, the term became 'electric street cars' or 'electric trams', to differentiate from the previous horse-drawn vehicles. As time went on the word 'electric' was dropped, and as automobiles began being referred to as cars, the term 'streetcar' (US) or 'tram' (UK) remained to describe a public transit vehicle that ran on rails at street level. Hot chocolate In the days before the invention of sweet solid chocolate for eating, the word "chocolate" was usually used to refer to the drink. 
For a while after the chocolate bar was invented it was referred to as "bar chocolate", but due to its rise in popularity in the latter half of the 19th century it eventually laid claim to the basic word. Human computer Until mechanical computers, and later electronic computers, became commercially available, the term "computer", in use from the mid-17th century, meant "one who computes": a person performing mathematical calculations. Teams of people were frequently used to undertake long and often tedious calculations; the work was sometimes divided so that this could be done in parallel. I–L Indoor volleyball Used to differentiate from beach volleyball after the latter gained prominence. Independent bookstore All bookstores were independent until the advent of bookstore chains. Inground pool A swimming pool where the filled high water level is flush with the ground, as compared to an "above ground pool", where the entire pool is above ground level. id Tech 1 engine A name applied to the Doom engine. Later game engines by id Software used the "id Tech" nomenclature, beginning with id Tech 4. iBook G3 Originally sold as the iBook, these machines were renamed the iBook G3 after the release of the iBook G4. iBook Clamshell : Originally sold as the iBook, the machine was nicknamed the Clamshell after Apple released the iBook G3 Snow. iBook G3 Snow : Just like its predecessor, the machine was originally sold as the iBook before being nicknamed the iBook G3 Snow by Apple so the name could be used on the iBook G4. iMac iMac G3 : Originally sold as the iMac, the machine was renamed the iMac G3 by Apple so the name could be used on the iMac G4. iMac G4 : Just like its predecessor, the machine was originally sold as the iMac before being renamed the iMac G4 by Apple so the name could be used on the iMac G5. iMac G5 : Just like its predecessors, the machine was originally sold as the iMac before being renamed the iMac G5 by Apple so the name could be used on the Intel-based iMac. iPhone 2G Used to differentiate the original 2007 model of the iPhone from its later models. King's Quest: Quest for the Crown The 1983 game was originally titled King's Quest until the fifth rerelease in 1987, when the subtitle was added to the box art, instructions, and all other materials. This was done to prevent confusion with the sequels, which were already on the market. Landline phone service With the advent of cellular or mobile phone services, traditional hard-wired phone service became popularly known as landline phones. Previously, this term was generally only used by military personnel and amateur radio operators. (In the movie The Matrix, a landline phone was also referred to as a "hardline".) Even though a considerable amount of landline phone traffic is transmitted via airwaves, this term comes from the physical cabling that provides the "last mile" connection between the customer premises and local phone distribution centers. Because of the communications industry's love for acronyms, landline phone service has also been called POTS—Plain Old Telephone Service. The logical complement of this acronym, "PANS", became a backronym for "Pretty Amazing New Services". In the telecommunications industry the term wireline is used for landline phone services, to distinguish them from wireless or mobile phone services. Wireline is clearly another retronym. 
Lead-acid car battery Before other battery chemistries such as Ni-MH and Li-ion were employed in hybrid and electric vehicles (although some current hybrid cars use lead-acid and some high-end conventional gasoline vehicles use Li-ion), lead-acid batteries were the only batteries for automobiles on the market; they were also the only rechargeable ones on the market. LED mouse Before laser mice came along, all optical mice employed LEDs. Linear momentum Before the concept of angular momentum was developed, the only type of momentum known was linear. Linear television Before the rise of video on demand, video hosting services, streaming media, and digital video recorders, the only way to consume television was through watching television channels, on broadcast, cable or satellite, which showed a combination of both live and recorded programming at designated times. Lithium primary battery Batteries involving lithium were all primary cells (disposable) before rechargeable lithium-based batteries such as lithium-ion batteries (later the lithium polymer battery) hit the market. Live action A form of film consisting predominantly of actual actors and objects that exist in the real world, as opposed to an animated film, which predominantly consists of artificial static images or objects that take advantage of the persistence of vision principle of film to give an illusion of life. Live poker What casinos call the kind of poker played with cards by people sitting at a table; what many others still just call "poker"; also called a "ring game" or "cash game". The term became necessary to distinguish it from video poker, which is far more common in casinos today. Live music Before the publication of recorded music, all music was live. Live band dance Before the advent of DJs (and then automated playlists), all dances had live music. Low-beam headlights These were simply headlights before high beams were introduced on motor vehicles. Luggable computer The first generation of computers marketed as "portable", such as the Kaypro or the Osborne series, were quite bulky and were heavier than a bowling ball. The weight was mostly because they had a conventional CRT-type monitor built in. When the first laptop computers came out, the earlier, heavier portable machines became referred to as "luggables". M–P Macintosh 128k Originally named the Macintosh, changed to distinguish it from the Macintosh 512k. Madden 89, 90, 91 Respectively known as John Madden Football (1988), John Madden Football (1990), and John Madden Football II in the early days, before year numbers were added to the titles of Madden NFL video games. Mainframe computer When minicomputers (which were the size and shape of a desk or credenza) were introduced in the early 1970s, existing systems that often consisted of multiple large racks of equipment received the name "mainframe", alluding to the vertical cabinets or "frames" in which they were installed. Manual transmission (also standard transmission) Automotive transmissions were all manual before the invention of the automatic transmission. Meatspace or "meat life" or "real life" All of physical reality, as distinguished from cyberspace. Mechanical disk Before the advent of solid-state RAM, and later solid-state flash memory (i.e. no moving parts), all computer disks had moving parts, hence the "mechanical" adjective. These include hard disks, floppy disks, and optical disks (CD-ROMs and DVD-ROMs). 
Mechanical fuel injection The amount of fuel squirted into an internal combustion engine by a fuel injection system was, before integrated circuitry became applied to motor vehicle engines, originally regulated by a calibrated mechanical linkage. What made for the retronym was the more precise electronic fuel injection, which employed more sensors. Mechanical mouse Before the optical mouse was introduced, all computer mice had a mechanical ball. Mechanical watch Prior to the introduction of the first quartz movement watches in the late 1960s, all watches used a mechanical movement. Microsoft Edge Legacy Middle Ages The period in European history from the 5th to the 15th century A.D. The earliest use of the term Middle Ages is recorded in 1604, to differentiate that period from the era of Antiquity and the then-beginning age of Modernity. Minecraft: Java Edition The original release of the game, on Microsoft Windows, was simply known as Minecraft prior to the release of Minecraft: Windows 10 Edition. In addition, other versions of the video game on Microsoft Windows are Minecraft Classic, Minecraft 4k, and Minecraft: Education Edition. Monaural sound, monophonic sound or mono sound Often simplified to "mono". Before stereo sound was introduced, mono sound was simply called sound. Muzzleloader For centuries virtually all firearms were loaded from the muzzle, so there was no need for a term to distinguish this characteristic until the general adoption of breech-loading firearms in the 19th century. Narrow-body aircraft An aircraft arranged along a single aisle, permitting up to 6-abreast seating in a relatively narrow cabin. Before the arrival of wide-body aircraft in the early 1970s, narrow-body aircraft were simply called aircraft. Natural language A language, used by humans, that evolved naturally in its society. Contrast with computer programming languages or constructed languages. Often referred to as human language. Natural person To distinguish humans (the original "persons") from the legal fiction of "juridical persons", non-human entities treated like people in law. Naturally aspirated engines Internal combustion engines that use vacuum and the venturi effect to draw the air and fuel mixture into the cylinders, without a turbocharger or supercharger. Oil lamp Before the invention of kerosene lamps and electric lamps in the 19th century, all lamps were oil lamps. Old Nintendo 3DS Used to refer to the original models of the Nintendo 3DS before the release of the New Nintendo 3DS in 2014. Old Labour Term used in the 1990s and 2000s to refer to the policies the UK Labour Party was perceived to have held before Tony Blair's leadership, policies previously referred to simply as "Labour". Old Look A type of transit bus, which gained this name after the introduction of the New Look bus. Both were made by GM. Old Testament In the Jewish tradition, the Hebrew Bible is known as the Tanakh. Open sewer Before enclosed pipes, or underground corridors for sewers, came along, all sewers were open. For instance, the open sewers of the Middle Ages were largely responsible for the spread of the Black Death. Optical zoom The advent of digital cameras (and accompanying digital zoom) necessitated this retronym, describing the "analog" method of achieving close-up using a zoom lens. Opposite-sex marriage Coined after the advent of same-sex marriage. Organic farming, organic food Farming practiced without the use of artificial fertilizers, pesticides, and so forth; and the food so produced. 
Over-the-board chess (also OTB chess) Chess played in real time using a physical chessboard, as opposed to computer chess or correspondence chess. Overground train Used in the UK to refer to trains that run above ground throughout, as opposed to Underground trains, which only run partly overground. (The key distinction is that "Overground" trains are not fully integrated into the Underground system.) Paid-for sales, pure sales Since the introduction of streaming into chart compilation, with (as in the UK Singles Chart) a certain number of streams often being added together to make a streaming sale, traditional sales of music (whether in physical or digital format) are now often referred to by these terms. Pai Gow tiles Before pai gow poker was created in 1985, the original game with dominoes was simply called pai gow. Pai gow poker is significantly more popular than pai gow played with dominoes, so this qualifier is used. Paleoconservative Before the advent of the neoconservative movement in the 1970s and its breakthrough success in the 1990s, American conservatism was largely defined by what would be referred to in the 2000s (decade) as paleoconservatism. Pararescue jumper The term Pararescue jumper is a retronym of the initials "PJ", which were used on Air Force Form 5 (Aircrew Flight Log) to identify anyone on board in order to jump from the aircraft. Pararescuemen originally had no "in flight" duties, and were listed only as "PJ" on the Form 5. The Pararescue position eventually grew to include duties as an aerial gunner and scanner on rotary wing aircraft, a duty now performed by aerial gunners. Currently, aircrew qualified Pararescuemen are recorded using aircrew position identifier "J" ("Pararescue Member") on AFTO form 781. Paper book With e-books becoming more common by the day, it is now necessary to distinguish books printed on paper from books distributed in digital form. Paper copy, hard copy With the proliferation of exchange of documents in the form of electronic files, physical copies of documents acquired this retronym. Occasionally extended to the copying devices, i.e. paper copiers. The jocular substitute dead-tree copy is sometimes used. Parallel ATA (PATA) The original ATA interface was parallel; the qualification became necessary when Serial ATA was introduced. Peanut butter Prior to the invention of homogenized peanut butter in the 1920s, all peanut butter was old fashioned or natural: the oil separated and the product required stirring before use. In addition, all peanut butter was creamy or smooth prior to the invention of crunchy or chunky peanut butter in the 1920s. Permanent magnet Used for an object that is permanently magnetized rather than an electromagnet. Physical media (data transfer) Refers to the transmission of data over wires, such as copper cables, fibre optic or coaxial cable, as opposed to wireless communication. Physical media (media storage) Refers to the storage of data on physical objects, such as paper, photographs, video tapes, or optical disks, as opposed to cloud storage or streaming media. Physical single After the coming of the legal music download, this term became commonplace to refer to a vinyl, CD or cassette single, which would previously have been referred to simply as a "single". Pickup truck Before SUVs (often referred to as "trucks") were introduced, pickup trucks were those on a sturdy frame with high ground clearance. The term SUV was not coined until the 1990s; prior to then, SUVs were referred to as "trucks" and sometimes "cars". 
Pipe organ Before smaller reed-based organs and harmoniums were invented, every organ used large pipes. PlayStation 1 or PS1 to distinguish from the PlayStation 2 and its subsequent successors (PS3 and PS4). A smaller version of the original PlayStation was named the PS one, released shortly after the PS2. PowerPC G1 Originally called the PowerPC 601, the processor was nicknamed the G1 after Apple used the G3, G4, and the G5 names to refer to the PowerPC 7xx, PowerPC 74xx, and PowerPC 970 respectively. PowerPC G2 Originally called the PowerPC 603, the processor was nicknamed the G2 after Apple used the G3, G4, and the G5 names to refer to the PowerPC 7xx, PowerPC 74xx, and PowerPC 970 respectively. Primordial element and Transient element Elements that are found in nature, as opposed to those that have to be created in the lab using a collider. Post sedan or post coupe In the United States this indicates a car with a full-height B-pillar, as opposed to a pillarless (half-height B-pillar) hardtop. Generally used only in referring to classic cars from the 1950 to 1980 period because fashion and safety regulations dictate nearly all modern cars are post models. Pragmaticism In 1905, in order to differentiate his original version from more recent forms of Pragmatism, Charles Sanders Peirce renamed his version to Pragmaticism, a term "ugly enough to be safe from kidnappers". Pre-dreadnought battleship The revolution in battleship design brought about by the construction of HMS Dreadnought resulted in almost all the battleships built before her completion becoming known as "pre-Dreadnought battleships", whereas before they had simply been "battleships". Primary cell Also, less formally non-rechargeable battery; Before the introduction of rechargeable batteries, all cells were primary, then when rechargeable batteries came along (lead-acid battery being the first), rechargeable batteries would formally be called "secondary cells". Prime lens A camera lens with a fixed focal length (e.g. 28 mm), as opposed to a zoom lens, which can cover a range of focal lengths (e.g. 28–105 mm). Before the invention of zoom lenses, all camera lenses had a fixed focal length, so they were just called "lenses". Procedural programming Before object-oriented programming was invented in the 1980s, there was just programming. Program Files (x86) Before x86-64 versions of Microsoft Windows were released, all Windows applications since Windows 95 were installed in the directory back when it was simply just C:\Program Files. Prop airplane As jet aircraft became the primary people movers of the airways, the older propeller-based technology received this occasional shorthand nickname to distinguish it. Pulse dialing After touch tone dialing on telephones became common, the older dialing standard became known as pulse dialing. R–Z Raw milk also called fresh milk, refers to milk that has not been pasteurized, a process which did not become standard until the 1800s Real numbers coined after the development of the imaginary numbers. Real mode before protected mode had been introduced in the 80286 processor, the term "real mode" was not in use for MS-DOS memory management. Real tennis was once known simply as tennis, but came into use at the end of the 19th century to distinguish it from the game of lawn tennis patented in 1874. The term "real tennis" has become more vague now since video game tennis has come along. Therefore, real tennis is now court tennis. 
Red Book audio CD At first, all audio CDs complied with the Red Book standard. Then came other implementations of the audio CD, such as Super Audio CD, MP3 CDs, and DVD-Audio, and the original is now referred to as Red Book audio to differentiate between the different standards. Reel-to-reel or open reel Tape recorders were originally simply tape recorders, as they all used a pair of open reels to hold the magnetic recording medium. The term reel-to-reel was introduced when various cassette tape formats became popular. Reflective liquid crystal display Before LCDs had backlighting, all LCDs relied on the reflection of room light or sunlight to make the screen visible. Regular Nintendo A colloquial nickname for the original Nintendo Entertainment System (NES), coined when the Super Nintendo Entertainment System (Super NES) was introduced to the market. Roller skates The term applied to all types of skates, though with the popularization of "rollerblades" during the 1990s, the term roller skates started to refer to the older two-axle (quad) design. Rotary telephone or dial telephone The kind of telephone in common use before touch-tone telephones. Rugby union To differentiate it from its descendant, rugby league. Like hockey, the original term rugby is still widespread. Scalar processors As opposed to vector processors. Scripted series Created in the wake of the success of reality television, the term applies to both fiction and non-fiction television with an identified writer or writers. The term can be misleading since reality television is almost never wholly improvised and often includes writing of some kind. Seventy-eight (78) rpm records Before the advent of 33⅓ rpm and 45 rpm vinyl records, these were known simply as records, phonograph records or gramophone records. Short file name (officially referred to as the 8.3 filename) Coined before the advent of long filenames; FAT file systems allowed only 11 characters per filename, three of which form the extension. The ISO 9660 filesystem for CD-ROMs has similar specifications to conform to the FAT specs. Shovel Knight: Shovel of Hope Refers to the 2014 video game originally known as Shovel Knight. For the game's 2017 Nintendo Switch release, the game was given the subtitle to make it more consistent with its included DLC campaigns. The overall package was renamed Shovel Knight: Treasure Trove. Silent film In the earliest days of the film industry, all films were without recorded sound. Once "talkies" became the norm, it became necessary to specify that a particular film was "silent". The term "silent film" is also a misnomer, because silent films were typically presented in theatres with live musical accompaniment. Sit-down restaurant With the rise of fast-food and take-out restaurants, the "standard" restaurant received a new name in the United States. (In the United Kingdom, fast food and takeaway (takeout) outlets are not normally referred to as "restaurants", so the "sit-down" qualifier is not necessary.) Smart Fortwo Originally sold as the Smart City-Coupé, the car was renamed the Fortwo upon the release of the Smart Forfour. Snail mail (also known as land mail, paper mail, p-mail, and postal mail) Non-electronic mail delivered to physical locations, such as one's home or business. Before email and voice mail, all mail was physical, and much slower by comparison – thus, the dysphemistic "snail" appellation. Compare surface mail, below. 
Sneakernet Before the Internet became popular, the so-called "sneakernet" was simply just a regular transfer of computer data on physical, interchangeable media. For instance, punched tape was used for this purpose at first, then floppy disks, then sneakernet was coined when the Internet became popular, now modern sneakernets involve transfer of Secure digital cards, USB flash drives, external hard drives, optical disks (CDs, DVDs, Blu-rays), etc. Snow skiing Water skiing now necessitates this differentiation. This, however, only applies to an area where both "snow" as well as "water" skiing are likely. "Snow skiing" would not be mentioned in the Alpine regions, unless large lakes offered the availability of water skiing. Solo motorcycle So called instead of motorcycle when some were being built with a sidecar. (see disputed retronyms below for more info). Sourdough Before other approaches to leavening bread were used, all bread dough was at least partially "sour". Special relativity Term introduced after Einstein developed general relativity. standard AUX input (standard auxiliary input) The common name for AUX audio inputs that doesn't employ an iPod dock connector, USB, optical/coaxial S/PDIF digital audio or proprietary mechanical standards that employ multiple standards alongside proprietary audio signaling standards. It usually refers to 1/8th inch TRS connectors, but sometimes it can refer to a set of red and white RCA stereo jacks. Star Trek: The Original Series The series' actual title Star Trek is now often used to refer collectively to the original series and its multitude of spin-offs. Star Wars: Episode IV – A New Hope Originally released in 1977 under the title Star Wars. The new title was applied to a 1979 publication of the script and (following the 1980 release of Star Wars: Episode V – The Empire Strikes Back) to a 1981 amended re-release of the original film. Static electricity see triboelectricity, below. Steam train In the 19th century, before the advent of electric and diesel-powered trains, steam trains were just "trains". Studio recording, studio album Before live albums, music for distribution on records was only recorded in a studio. Super Mario Bros.: The Lost Levels In 1986, the first sequel to the hit NES game Super Mario Bros. was released in Japan as Super Mario Bros. 2. Because of its extreme difficulty and similarity to its predecessor, Nintendo of America opted not to release the game in North America. Instead, Nintendo released a remake of Yume Kōjō: Doki Doki Panic as the North American Super Mario Bros. 2 in 1988. The original sequel was eventually rereleased worldwide as part of the Super Mario All-Stars compilation, but under the moniker Super Mario Bros.: The Lost Levels. Outside of Japan, this name persists. Surface mail Traditional mail, delivered by road, rail, and ship, retrospectively named following the development of airmail. Compare snail mail, above. Survivor: Borneo Broadcast as just Survivor. When the show subsequently used other locales, the location of the first season was added to the title to distinguish it. Terrestrial radio As opposed to satellite radio. Terrestrial television As opposed to satellite television and cable television. Textile top convertible Before retractable hardtops became popular, convertibles mostly had textile tops which folded when stowed away for a top-down ride. 
Text-only dialogue Before voice acting became commonplace in video games, text was used to convey dialogue between characters (especially in genres such as RPGs and adventure games). Some games, such as the Yakuza series, still use text-only dialogue in addition to voice acting, depending on the importance of a cutscene. Tie-on pocket Early pockets were pouches, similar to a purse, tied around the waist and worn underneath the wearer's outer garments. Once pockets began to be sewn directly into clothing, these pouch-like pockets needed to be differentiated from those that had been integrated into the garment. Transformers: Generation 1 Refers to the original Transformers toyline, which ran from 1984 to 1992, and the assorted tie-in media. Then known only as "The Transformers"; when the sequel series, Transformers: Generation 2, was launched by Hasbro in 1993, all previous subject matter was dubbed "Generation 1" – many individuals did this independently, as it is a logical progression, and when the online fandom began growing in the 1990s, the term became the definitive one for that era. The term subsequently made it into official use through toy reissues and comic books, most notably on Japanese toy packaging. Triboelectricity Electricity was so named from the Greek word for amber, because of the discovery that if it was rubbed (generating what is now called triboelectricity) it would attract objects (due to a charge of static electricity). Electric currents and other forms of generation were discovered later. Tube amplifier Tube amplifiers for musical instruments were largely replaced by "transistor" (or solid state) amplifiers during the 1960s and 1970s. Tube TV or CRT TV Originally, all televisions used a cathode ray tube (CRT) to produce a TV image. But with the recent popularity of newer television technologies such as LCD, plasma, or DLP, some stores now describe the sets that still use a picture tube as tube TVs or CRT TVs. Two-door coupe Before four-door cars started to have coupe-like styling in recent years, coupe mostly referred to 2-door cars. Examples of 4-door cars that have coupe used as a marketing term are the BMW X6 SUV and the Dodge Charger sedan, which re-uses the name of a 1970s 2-door car. Ultimate Doom Originally released simply as Doom, a mail-order game. When Doom II sold successfully in stores, Doom was re-released as a retail product and dubbed Ultimate Doom to differentiate it from Doom II. It added a new episode called Thy Flesh Consumed. Uncontrolled road (or uncontrolled highway) Before the concept of controlled-access roads (which some call expressways) came along, and even predating automobiles, all roads had direct access to private property or to public-event or government grounds. When controlled-access roads came along, they helped to virtually eliminate direct driveway access to private property or parking lots, with only select crossroads for direct access, so the term uncontrolled road arose to differentiate. However, the introduction of freeways (which other countries refer to as autoroutes, motorways and so on) further complicated matters by necessitating the use of the term at-grade expressway (see above). Recent uncontrolled roads have even adopted qualities of freeways and expressways such as paved shoulders (sometimes with rumble strips), freeway speed limits, and grade-separated ramp junctions (though most are just the at-grade "guest" of diamond junctions). 
Unstyled John Deere tractor After industrial design was applied to the sheet metal styling of John Deere tractors, the distinction unstyled was retronymously applied to earlier models whose model name was the same, for example, styled Model A versus unstyled Model A. Upright bicycle The advent of the recumbent bicycle sometimes requires a speaker to make the distinction between that and the conventional "upright bicycle". Vanilla Doom The advent of source ports for Doom, which alter gameplay behavior, led to the original, unmodified game being referred to as vanilla Doom. Viennese waltz The original waltz, as distinct from other styles of waltz that have since developed. Visible light Before the discovery of invisible wavelengths of electromagnetic radiation, all light was considered visible. Water-activated stamps (gummed stamps) The predominant kind of postage stamp before self-adhesive stamps became popular. Web 1.0 A term used from the mid-2000s onward to refer to the World Wide Web / Internet of the 1990s and early 2000s. At the time, it was referred to simply as "the web" or (less accurately) "the internet" or "the net". Whole milk Milk was formerly available in just one version, with the cream included, and benefited eventually from pasteurization and homogenization. But it was still called simply milk. This variety of milk is now referred to in the U.S. as whole milk (3.25% milkfat) to distinguish it from 2% (reduced fat) milk, 1% (low fat) milk, and skim milk (nearly no fat). In the UK, the terms whole milk (also full-cream milk or full-fat milk) (3.5%), semi-skimmed milk (about 1.5%) and skimmed milk (almost no fat) are commonly used. Whole wheat All flour, bread, pasta, etc. consisted of some combination of endosperm, germ and bran before white flour was created in the mid-19th century and became the dominant variety referred to as flour. Windows 10, version 1507 Win16 The original, 16-bit Windows API, as distinguished from the newer Win32 and Win64. Windows 8.0 Zorro I The original Amiga (Amiga 1000) computer expansion bus was simply "Zorro" before the Amiga Zorro II was developed for the next-generation Amiga, the Amiga 2000. Zune 30 Used to describe the first-generation Zune device; the "30" was added after the release of the Zune 4, 8, and 80. Geographic retronyms Proper names These are proper names for the described regions or corridors. Abandoned Pennsylvania Turnpike A section of the Pennsylvania Turnpike between Breezewood and Hustontown that was bypassed by a new alignment avoiding the tunnels, because it was too costly to blast away more rock to widen the travel lanes. Asia Minor The name Asia was first applied to the mainland east of the Aegean islands, and later extended to the greater landmass of which that is a peninsula. Baja California The name California was first applied to the peninsula (thought to be an island) now known as Baja ("Lower"), and later extended – and then restricted – to Alta ("Upper") California, and finally to the current U.S. state. East Indies After Columbus landed in the West Indies. East Prussia Prussia began as a duchy in what is now Poland. As the highest-ranking dignity of the Hohenzollern dynasty, the name came to be applied to their territories stretching across Germany. The name East Prussia became more significant when it was separated from the rest of Prussia and Germany by the Polish Corridor. 
EUxx "EU" followed by two digits is often used in statistics to indicate the different makeup of the European Union EU12: the twelve-member EU as founded in 1993; most of the Western European nations EU15: the fifteen-member EU after Austria, Finland and Sweden joined in 1995 EU25: the EU from 2004 to 2007 after ten eastern and central European nations joined EU27: the EU from 2007 to 2013, after Romania and Bulgaria were added EU28: the EU from 2013 to 2020, after Croatia joined EU27 is now used to refer to the EU after the United Kingdom left in 2020; it was also used after the 2016 Brexit referendum to refer to "the EU countries less the UK" as they negotiated with the UK government First Chinatown First Chinatown refers to Toronto's original Chinatown at Dundas and Elizabeth Streets in The Ward, and was known as such until the construction of the new city hall and public square in the 1960s. Most stores that occupied the construction project was cleared through expropriation. The resulting development caused the westward relocation of Chinatown to its current location at Dundas Street and Spadina Avenue. Great Britain Britons fleeing the Germanic invasions settled in Armorica which became Brittany or Little Britannia. Lower Saxony The kingdom and duchies of Saxony are outside the original lowland territory of the Saxon people. Manhattan Chinatown For a long time New York City had only one Chinatown. However, there are now large Chinese communities in Flushing, Queens and Sunset Park, Brooklyn therefore a need has developed to differentiate among the city's three Chinatowns. Old Chinatown London's original Chinatown (destroyed in The Blitz) was in Limehouse; the new Chinatown is in Soho. Also used in Houston, TX to the Chinatown district located east of the George R. Brown Convention Center and south of BBVA/Compass Stadium. Old Toronto Old Toronto refers to the old City of Toronto, prior to the amalgamation of Toronto in 1998. In 1998, the Government of Ontario dissolved the regional municipality of Metropolitan Toronto, as well as the region's constituent municipalities (including Old Toronto). The former municipalities that made up Metropolitan Toronto were amalgamated into a single entity, the present-day city of Toronto. Old World After Columbus landed in the Americas ("The New World"). Old Northwest, Old Southwest and Old West Regions formerly at these extreme corners of the United States. General descriptions These are less official descriptions that are commonly used. Contiguous United States or Lower 48 Referred to simply as The United States before Alaska and Hawaii, which are American exclaves, became states. Historiographic retronyms Aztec Empire Term coined by Alexander von Humboldt in the early 19th Century to differentiate between the pre-Hispanic "Mexican empire" and the then new post-Hispanic one (this, in turn, became known as the First Mexican Empire upon the French-backed enthronement of Maximilian I in 1864). Byzantine Empire Term coined in 1557 to name the East Roman Empire, then defunct by over a century, in the historical period following the disintegration of the Western Roman Empire in 476 AD. The entity was commonly known as 'Roman Empire' to its inhabitants and 'Greek Empire' to contemporary Western Europeans. Gran Colombia Historians' term for the first "Republic of Colombia", which included what are now Colombia, Venezuela, Ecuador, and Panama. 
Polish–Lithuanian Commonwealth
Term coined in the 20th century, after the restoration of separate Poland and Lithuania as independent states.
Weimar Republic
Used to refer to the German Reich during the period in which it was a liberal democracy, prior to being taken over by the Nazi Party.
World War I/First World War
Originally this was called "The Great War" and commonly believed to be "the war to end all wars". However, when a second war enveloped Europe, Asia, and much of the Pacific, it became necessary to distinguish them. This convention has been used for many series of wars, going back as far as the First Peloponnesian War or earlier. Most recently, the 1991 war in the Persian Gulf, formerly called "Desert Storm" or just the "Gulf War", is now (since the 2003 invasion of Iraq) often referred to as "The First Gulf War".

Airports
When an airport consists of only one passenger facility, most people just call it "the airport" or "the terminal". But when an airport expands, it is often necessary to give the original building a retronymic adjective to avoid confusion. While some airports just rename older terminals or concourses with letters or numbers (e.g. Terminal 1 or Concourse B), other methods include:
Cardinal directions – when Newark opened Terminals A and B in the early 1970s, the existing passenger terminal was renamed the "North Terminal".
Proper names – Detroit Metro Airport only had one passenger terminal until 1966, at which point the existing facility was identified as the "L.C. Smith Terminal".

Disputed retronyms
Note: These terms imply age-old concepts, but they are usually applied to newer concepts with similar qualities. Because some of these terms are used in different contexts, confusion can arise.
2.5D
2.5D generally refers to computer data that uses 2D plane data to render a 3D view, sometimes with 2D sprites placed in that environment; Doom and Duke Nukem 3D famously used this approach. 3D structures in general existed before computers, but the term 2.5D was coined only after computer games (and CAD) fully went 3D.
Bose Acoustic Wave System
The Bose Acoustic Wave System was the first Bose product to use the term "Wave System" in its name. Newcomers to the Bose audio product lineup might assume it was simply the top-of-the-line model, but it was introduced before Bose's simpler Wave Radio (or Wave System) products, which were cost-reduced versions of the "Wave" lineup. Because the name "Acoustic Wave System" was in use before "Wave" by itself, its status as a retronym is arguable.
Disney's Aladdin
In the early days of the Disney's Aladdin media franchise, when the 1992 Disney film Aladdin first came out, the franchise was known simply as Aladdin. But other movies and media bearing the name Aladdin existed before this franchise, so the name Disney's Aladdin was later adopted to distinguish it.
Expressway
The term expressway often refers to continuous highways with no private driveways but sometimes with at-grade intersections. In some jurisdictions, however, it is synonymous with freeway, which has 100% grade separation. The term expressway was not coined until freeways were built, but the expressway concept itself existed before freeways; because the term is sometimes synonymous with freeway, the term at-grade expressway is sometimes used for expressways with at-grade intersections.
Mechless car stereo
Most car stereos had no moving parts prior to the introduction of interchangeable media such as vinyl records (Highway Hi-Fi by Chrysler), 8-track cartridges, compact cassettes and CDs. The recent omission of CD players (cassette players were omitted earlier) has left systems "mechless" again, as solid-state playback of MP3 and other file formats from media such as Secure Digital, CompactFlash and USB came along; many of these systems also have an analog AUX input. The term mechless usually refers to these more recent systems, which disputes its status as a "retronym". Occasionally, car stereos are AM/FM-only without AUX inputs, in which case FM transmitters can be used with them. Another point against this retronym is that earlier AM/FM tuners had moving parts of their own just for the adjustment of frequencies (i.e. a string-driven variable capacitor and a static frequency display with a moving needle), prior to the introduction of digital readout with endless-loop tuning (and later endless-loop seek tuning) in the early 1980s.
Nintendo 2DS
Before the Nintendo 3DS came out, all Nintendo DS models displayed only flat, 2D images. However, the product officially known as the Nintendo 2DS is a console capable of running 3DS games, just without the stereoscopic parallax effect. "2DS" does not refer to pre-3DS models, which is why this is a "disputed retronym".
Solo motorcycle
So called instead of motorcycle when some were being built with a sidecar. However, this retronym is disputed because the so-called solo motorcycle can itself accommodate two riders.
Push lawnmower
With the introduction of lawnmowers powered by gasoline (petrol or petroleum spirit outside the U.S.) and electricity, the manually propelled lawnmower became known as the push mower. After self-propelled "riding" mowers became common, the term push mower was also applied to non-riding mowers.
Sonic 3 & Knuckles
The video game Sonic the Hedgehog 3 was supposed to have featured more stages, most of which ended up being included in the later-released Sonic & Knuckles. However, time and cost constraints forced the game onto "two interlocking pieces": in late 1993 Sega had to make a big compromise to ensure a fair cost for a stand-alone Mega Drive/Genesis cartridge. A cartridge with enough capacity for both games as "one giant game" could have pushed the cost of an integrated product too high, so Sonic & Knuckles was released as its own cartridge with a "lock-on" cartridge port that allowed the Sonic 3 levels to be included. Sonic 3 itself was not marketed as containing the Sonic & Knuckles levels, even though the original plans included them, so in some contexts the name "Sonic 3 & Knuckles" for the combined, locked-on game is disputed as a "retronym" for Sonic 3.
Standard transmission
In the traditional sense, the term "standard" transmission refers to a manual transmission. However, the transmission fitted as standard to newer vehicles is the automatic.
Wet signature or wet-ink signature
A handwritten signature, as opposed to an electronic signature.
Xbox One
"Xbox One" (alternatively spelled "Xbox 1") used to be a colloquial nickname for the original Xbox video game console following the launch of the Xbox 360. However, this fell into disuse when Microsoft introduced the Xbox One, the third generation of the brand, to the market.
Double retronyms
Double retronyms in general may just be the differentiation of adjectives and nouns that form retronyms in the first place, but there are other scenarios, such as political bodies splitting apart, where new names are coined simultaneously and no side claims the "original" name. The nouns are in alphabetical order:
North Carolina and South Carolina
In general this area is referred to as The Carolinas; the former province was split around 1729.
East Germany and West Germany
The division of Germany from 1945 to 1990 after World War II led to the construction of the infamous Berlin Wall in 1961.
Inverse double retronym
Where the introduction of a double retronym makes the adjective factually incorrect.
Southern Ireland
The most northerly tip of the island of Ireland is in the Republic of Ireland, which is otherwise known as Southern Ireland.

References

Retronyms
Camellia (cipher)
In cryptography, Camellia is a symmetric key block cipher with a block size of 128 bits and key sizes of 128, 192 and 256 bits. It was jointly developed by Mitsubishi Electric and NTT of Japan. The cipher has been approved for use by the ISO/IEC, the European Union's NESSIE project and the Japanese CRYPTREC project. The cipher has security levels and processing abilities comparable to the Advanced Encryption Standard. The cipher was designed to be suitable for both software and hardware implementations, from low-cost smart cards to high-speed network systems. It is part of the Transport Layer Security (TLS) cryptographic protocol designed to provide communications security over a computer network such as the Internet. The cipher was named for the flower Camellia japonica, which is known for being long-lived as well as because the cipher was developed in Japan. Design Camellia is a Feistel cipher with either 18 rounds (when using 128-bit keys) or 24 rounds (when using 192- or 256-bit keys). Every six rounds, a logical transformation layer is applied: the so-called "FL-function" or its inverse. Camellia uses four 8×8-bit S-boxes with input and output affine transformations and logical operations. The cipher also uses input and output key whitening. The diffusion layer uses a linear transformation based on a matrix with a branch number of 5. Security analysis Camellia is considered a modern, safe cipher. Even using the smaller key size option (128 bits), it's considered infeasible to break it by brute-force attack on the keys with current technology. There are no known successful attacks that weaken the cipher considerably. The cipher has been approved for use by the ISO/IEC, the European Union's NESSIE project and the Japanese CRYPTREC project. The Japanese cipher has security levels and processing abilities comparable to the AES/Rijndael cipher. Camellia is a block cipher which can be completely defined by minimal systems of multivariate polynomials: The Camellia (as well as AES) S-boxes can be described by a system of 23 quadratic equations in 80 terms. The key schedule can be described by equations in 768 variables using linear and quadratic terms. The entire block cipher can be described by equations in variables using linear and quadratic terms. In total, equations in variables using linear and quadratic terms are required. The number of free terms is , which is approximately the same number as for AES. Theoretically, such properties might make it possible to break Camellia (and AES) using an algebraic attack, such as extended sparse linearisation, in the future, provided that the attack becomes feasible. Patent status Although Camellia is patented, it is available under a royalty-free license. This has allowed the Camellia cipher to become part of the OpenSSL Project, under an open-source license, since November 2006. It has also allowed it to become part of the Mozilla's NSS (Network Security Services) module. Adoption Support for Camellia was added to the final release of Mozilla Firefox 3 in 2008 (disabled by default as of Firefox 33 in 2014 in spirit of the "Proposal to Change the Default TLS Ciphersuites Offered by Browsers", and has been dropped from version 37 in 2015). Pale Moon, a fork of Mozilla/Firefox, continues to offer Camellia and had extended its support to include Galois/Counter mode (GCM) suites with the cipher, but has removed the GCM modes again with release 27.2.0, citing the apparent lack of interest in them. 
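As a concrete usage illustration (a sketch written for this article, not drawn from the cipher's specification): the snippet below encrypts and decrypts a short message with Camellia-256 in CBC mode using the Python cryptography package, assuming a reasonably recent version of that library (so no explicit backend argument is needed); the key, IV, and message are arbitrary values invented for the example.

```python
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)   # 256-bit key; 128- and 192-bit keys are also accepted
iv = os.urandom(16)    # one 128-bit block, matching Camellia's block size

message = b"Camellia has a 128-bit block and 128/192/256-bit keys."

# CBC operates on whole 128-bit blocks, so apply PKCS#7 padding first.
padder = padding.PKCS7(128).padder()
padded = padder.update(message) + padder.finalize()

cipher = Cipher(algorithms.Camellia(key), modes.CBC(iv))

encryptor = cipher.encryptor()
ciphertext = encryptor.update(padded) + encryptor.finalize()

decryptor = cipher.decryptor()
unpadder = padding.PKCS7(128).unpadder()
recovered = unpadder.update(decryptor.update(ciphertext) + decryptor.finalize()) + unpadder.finalize()
assert recovered == message
```

In a real system the key and IV would come from a key-management scheme rather than os.urandom alone, and an authenticated mode or a separate MAC would normally be layered on top; the sketch only shows the raw block-cipher usage.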
Later in 2008, the FreeBSD Release Engineering Team announced that the cipher had also been included in the FreeBSD 6.4-RELEASE. Also, support for the Camellia cipher was added to the disk encryption storage class geli of FreeBSD by Yoshisato Yanagisawa. In September 2009, GNU Privacy Guard added support for Camellia in version 1.4.10. VeraCrypt (a fork of TrueCrypt) included Camellia as one of its supported encryption algorithms. Moreover, various popular security libraries, such as Crypto++, GnuTLS, mbed TLS and OpenSSL also include support for Camellia. On March 26, 2013, Camellia was announced as having been selected again for adoption in Japan's new e-Government Recommended Ciphers List as the only 128-bit block cipher encryption algorithm developed in Japan. This coincides with the CRYPTREC list being updated for the first time in 10 years. The selection was based on Camellia's high reputation for ease of procurement, and security and performance features comparable to those of the Advanced Encryption Standard (AES). Camellia remains unbroken in its full implementation. An impossible differential attack on 12-round Camellia without FL/FL−1 layers does exist. Performance The S-boxes used by Camellia share a similar structure to AES's S-box. As a result, it is possible to accelerate Camellia software implementations using CPU instruction sets designed for AES, such as x86 AES-NI, by affine isomorphism. Standardization Camellia has been certified as a standard cipher by several standardization organizations: CRYPTREC NESSIE IETF Algorithm : A Description of the Camellia Encryption Algorithm Block cipher mode : Camellia Counter Mode and Camellia Counter with CBC-MAC Mode Algorithms S/MIME : Use of the Camellia Encryption Algorithm in Cryptographic Message Syntax (CMS) XML Encryption : Additional XML Security Uniform Resource Identifiers (URIs) TLS/SSL : Addition of Camellia Cipher Suites to Transport Layer Security (TLS) : Camellia Cipher Suites for TLS : Addition of the Camellia Cipher Suites to Transport Layer Security (TLS) IPsec : The Camellia Cipher Algorithm and Its Use With IPsec : Modes of Operation for Camellia for Use with IPsec Kerberos : Camellia Encryption for Kerberos 5 OpenPGP : The Camellia Cipher in OpenPGP RSA-KEM in CMS : Use of the RSA-KEM Key Transport Algorithm in the Cryptographic Message Syntax (CMS) PSKC : Portable Symmetric Key Container (PSKC) Smart grid : Internet Protocols for the Smart Grid ISO/IEC ISO/IEC 18033-3:2010 Information technology—Security techniques—Encryption algorithms—Part 3: Block ciphers ITU-T Security mechanisms and procedures for NGN (Y.2704) RSA Laboratories Approved cipher in the PKCS#11 TV-Anytime Forum Approved cipher in TV-Anytime Rights Management and Protection Information for Broadcast Applications Approved cipher in Bi-directional Metadata Delivery Protection References General External links Camellia's English home page by NTT 256 bit ciphers – CAMELLIA reference implementation and derived code Use of the Camellia Encryption Algorithm in Cryptographic Message Syntax (CMS) A Description of the Camellia Encryption Algorithm Additional XML Security Uniform Resource Identifiers (URIs) Addition of Camellia Cipher Suites to Transport Layer Security (TLS) The Camellia Cipher Algorithm and Its Use With IPsec Camellia Counter Mode and Camellia Counter with CBC-MAC Mode Algorithms Modes of Operation for Camellia for Use with IPsec Certification of Camellia Cipher as IETF standard for OpenPGP Camellia Cipher Suites for TLS Use of the 
RSA-KEM Key Transport Algorithm in the Cryptographic Message Syntax (CMS) Portable Symmetric Key Container (PSKC) Internet Protocols for the Smart Grid Addition of the Camellia Cipher Suites to Transport Layer Security (TLS) ISO/IEC 18033-3:2010 Information technology—Security techniques—Encryption algorithms—Part 3: Block ciphers Feistel ciphers Mitsubishi Electric products, services and standards 2000 introductions
MicroBlaze
The MicroBlaze is a soft microprocessor core designed for Xilinx field-programmable gate arrays (FPGA). As a soft-core processor, MicroBlaze is implemented entirely in the general-purpose memory and logic fabric of Xilinx FPGAs. MicroBlaze was introduced in 2002. Overview In terms of its instruction set architecture, MicroBlaze is similar to the RISC-based DLX architecture described in a popular computer architecture book by Patterson and Hennessy. With few exceptions, the MicroBlaze can issue a new instruction every cycle, maintaining single-cycle throughput under most circumstances. The MicroBlaze has a versatile interconnect system to support a variety of embedded applications. MicroBlaze's primary I/O bus, the AXI interconnect, is a system-memory mapped transaction bus with master–slave capability. Older versions of the MicroBlaze used the CoreConnect PLB bus. The majority of vendor-supplied and third-party IP interface to AXI directly (or through an AXI interconnect). For access to local-memory (FPGA RAM), MicroBlaze uses a dedicated LMB bus, which provides fast on-chip storage. User-defined coprocessors are supported through dedicated AXI4-Stream connections. The coprocessor(s) interface can accelerate computationally intensive algorithms by offloading parts or the entirety of the computation to a user-designed hardware module. Many aspects of the MicroBlaze can be user configured: cache size, pipeline depth (3-stage, 5-stage, or 8-stage), embedded peripherals, memory management unit, and bus-interfaces can be customized. The area-optimized version of MicroBlaze, which uses a 3-stage pipeline, sacrifices clock frequency for reduced logic area. The performance-optimized version expands the execution pipeline to 5 stages, allowing top speeds of more than 700 MHz (on Virtex UltraScale+ FPGA family). Also, key processor instructions which are rarely used but more expensive to implement in hardware can be selectively added/removed (e.g. multiply, divide, and floating point operations). This customization enables a developer to make the appropriate design trade-offs for a specific set of host hardware and application software requirements. With the memory management unit, MicroBlaze is capable of hosting operating systems requiring hardware-based paging and protection, such as the Linux kernel. Otherwise it is limited to operating systems with a simplified protection and virtual memory model, e.g. FreeRTOS or Linux without MMU support. MicroBlaze's overall throughput is substantially less than a comparable hard CPU core (such as the ARM Cortex-A9 in the Zynq). Vivado Xilinx's Vivado Design Suite is the development environment for building current MicroBlaze (or ARM - see Zynq) embedded processor systems in Xilinx FPGAs. Older versions used Xilinx's EDK (Embedded Development Kit) development package. Designers use the Vivado IP Integrator to configure and build the hardware specification of their embedded system (processor core, memory-controller, I/O peripherals, etc.) The IP Integrator converts the designer's block design into a synthesizeable RTL description (Verilog or VHDL), and automates the implementation of the embedded system (from RTL to the bitstream-file.) For the MicroBlaze core, Vivado generates an encrypted (non human-readable) netlist. The SDK handles the software that will execute on the embedded system. Powered by the GNU toolchain (GNU Compiler Collection, GNU Debugger), the SDK enables programmers to write, compile, and debug C/C++ applications for their embedded system. 
Xilinx's tools provide the ability to run software in simulation, or to download and execute it on an actual system using a suitable FPGA board. Purchasers of Vivado are granted a perpetual license to use MicroBlaze in Xilinx FPGAs with no recurring royalties. The license does not grant the right to use MicroBlaze outside of Xilinx's devices. Alternative compilers and development tools have been made available from Altium, but an EDK installation and license is still required.

Open source
In June 2009, MicroBlaze became the first soft-CPU architecture to be merged into the mainline Linux kernel source tree. This work was performed by Michal Simek and supported by PetaLogix and Xilinx. As of September 2009, MicroBlaze GNU tools support is also being contributed to the Free Software Foundation's mainline repositories. Support for MicroBlaze is included in GCC releases starting with version 4.6. Support was added to LLVM in April 2010, but subsequently removed in July 2013 due to the lack of a maintainer.

Clones
aeMB, implemented in Verilog, LGPL license
OpenFire subset, implemented in Verilog, MIT license
MB-Lite, implemented in VHDL, LGPL license
MB-Lite+, implemented in VHDL, LGPL license
myBlaze, implemented in MyHDL, LGPL license
SecretBlaze, implemented in VHDL, GPL license

Other soft processors
Nios II
TSK3000
Xtensa
LatticeMico32
ARC

See also
OpenCores - a home for many open source soft processor projects
PicoBlaze

References

External links
MicroBlaze on Xilinx website

Soft microprocessors
Geomorphometry
Geomorphometry, or geomorphometrics ( + + ), is the science and practice of measuring the characteristics of terrain, the shape of the surface of the Earth, and the effects of this surface form on human and natural geography. It gathers various mathematical, statistical and image processing techniques that can be used to quantify morphological, hydrological, ecological and other aspects of a land surface. Common synonyms for geomorphometry are geomorphological analysis (after geomorphology), terrain morphometry, terrain analysis, and land surface analysis. Geomorphometrics is the discipline based on the computational measures of the geometry, topography and shape of the Earth's horizons, and their temporal change. This is a major component of geographic information systems (GIS) and other software tools for spatial analysis. In simple terms, geomorphometry aims at extracting (land) surface parameters (morphometric, hydrological, climatic etc.) and objects (watersheds, stream networks, landforms etc.) using input digital land surface model (also known as digital elevation model, DEM) and parameterization software. Extracted surface parameters and objects can then be used, for example, to improve mapping and modelling of soils, vegetation, land use, geomorphological and geological features and similar. With the rapid increase of sources of DEMs today (and especially due to the Shuttle Radar Topography Mission and LIDAR-based projects), extraction of land surface parameters is becoming more and more attractive to numerous fields ranging from precision agriculture, soil-landscape modelling, climatic and hydrological applications to urban planning, education and space research. The topography of almost all Earth has been today sampled or scanned, so that DEMs are available at resolutions of 100 m or better at global scale. Land surface parameters are today successfully used for both stochastic and process-based modelling, the only remaining issue being the level of detail and vertical accuracy of the DEM. History Although geomorphometry started with ideas of Brisson (1808) and Gauss (1827), the field did not evolve much until the development of GIS and DEM datasets in the 1970s. Geomorphology (which focuses on the processes that modify the land surface) has a long history as a concept and area of study, with geomorphometry being one of the oldest related disciplines. Geomatics is a more recently evolved sub-discipline, and even more recent is the concept of geomorphometrics. This has only recently been developed since the availability of more flexible and capable geographic information system (GIS) software, as well as higher resolution Digital Elevation Model (DEM). It is a response to the development of this GIS technology to gather and process DEM data (e.g. remote sensing, the Landsat program and photogrammetry). Recent applications proceed with the integration of geomorphometrics with digital image analysis variables obtained by aerial and satellite remote sensing. As the triangulated irregular network (TIN) arose as an alternative model for representing the terrain surface, corresponding algorithms were developed for deriving measurements from it. Surface gradient Derivatives A variety of basic measurements can be derived from the terrain surface, generally applying the techniques of vector calculus. 
That said, the algorithms typically used in GIS and other software use approximate calculations that produce similar results in much less time with discrete datasets than the pure continuous-function methods. Many strategies and algorithms have been developed, each having advantages and disadvantages.

Surface normal and gradient
The surface normal at any point on the terrain surface is a vector ray that is perpendicular to the surface. The surface gradient is the vector ray that is tangent to the surface, in the direction of steepest downhill slope.

Slope
Slope or grade is a measure of how steep the terrain is at any point on the surface, deviating from a horizontal surface. In principle, it is the angle between the gradient vector and the horizontal plane, given either as an angular measure α (common in scientific applications) or as the ratio of rise over run p, commonly expressed as a percentage, such that p = tan α. The latter is commonly used in engineering applications, such as road and railway construction. Deriving slope from a raster digital elevation model requires calculating a discrete approximation of the surface derivative based on the elevation of a cell and those of its surrounding cells, and several methods have been developed. For example, the Horn method, implemented in ArcGIS, uses the elevation z of a cell and those of its eight immediate neighbors (labelled here by compass direction, e.g. $z_{NE}$ for the neighbor to the northeast), spaced by the cell size or resolution r. The partial derivatives are then approximated as weighted averages of the differences between the opposing sides:

$$\frac{\partial z}{\partial x} \approx \frac{(z_{NE} + 2z_{E} + z_{SE}) - (z_{NW} + 2z_{W} + z_{SW})}{8r}, \qquad \frac{\partial z}{\partial y} \approx \frac{(z_{NW} + 2z_{N} + z_{NE}) - (z_{SW} + 2z_{S} + z_{SE})}{8r}$$

The slope (in percent) is then calculated using the Pythagorean theorem:

$$p = 100\sqrt{\left(\frac{\partial z}{\partial x}\right)^{2} + \left(\frac{\partial z}{\partial y}\right)^{2}}$$

The second derivative of the surface (i.e., curvature) can be derived using similarly analogous calculations.

Aspect
The aspect of the terrain at any point on the surface is the direction the slope is "facing," or the cardinal direction of the steepest downhill slope. In principle, it is the projection of the gradient onto the horizontal plane. In practice using a raster digital elevation model, it is approximated using one of the same partial derivative approximation methods developed for slope. Then the aspect is calculated as:

$$\beta = \operatorname{atan2}\!\left(-\frac{\partial z}{\partial y},\; -\frac{\partial z}{\partial x}\right)$$

This yields a counter-clockwise bearing, with 0° at east.

Other derived products

Illumination/Shaded Relief/Analytical Hillshading
Another useful product that can be derived from the terrain surface is a shaded relief image, which approximates the degree of illumination of the surface from a light source coming from a given direction. In principle, the degree of illumination is proportional to the cosine of the angle between the surface normal vector and the illumination vector; the wider the angle between the vectors, the darker that point on the surface is. In practice, it can be calculated from the slope α and aspect β, compared to a corresponding altitude φ and azimuth θ of the light source (with θ and β expressed in the same rotational convention):

$$\text{illumination} = \sin\varphi\,\cos\alpha + \cos\varphi\,\sin\alpha\,\cos(\theta - \beta)$$

The resultant image is rarely useful for analytical purposes, but is most commonly used as an intuitive visualization of the terrain surface, because it looks like an illuminated three-dimensional model of the surface.

Topographic feature extraction
Natural terrain features, such as mountains and canyons, can often be recognized as patterns in elevation and its derivative properties. The most basic patterns include locations where the terrain changes abruptly, such as peaks (local elevation maxima), pits (local elevation minima), ridges (linear maxima), channels (linear minima), and passes (the intersections of ridges and channels).
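As a rough illustration of extracting the simplest of these patterns from a raster DEM (a minimal sketch only, not the algorithm of any particular package; SciPy is assumed, and the window size and sample values are invented), peaks and pits can be flagged by comparing each cell with a moving-window maximum and minimum:

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def peaks_and_pits(dem, size=3):
    """Flag cells that equal the maximum (peaks) or minimum (pits) of their
    size x size neighbourhood. Flat areas flag every cell, so real detectors
    add tie-breaking and significance thresholds on top of this idea."""
    neighbourhood_max = maximum_filter(dem, size=size, mode="nearest")
    neighbourhood_min = minimum_filter(dem, size=size, mode="nearest")
    peaks = dem >= neighbourhood_max
    pits = dem <= neighbourhood_min
    return peaks, pits

dem = np.array([[3.0, 4.0, 3.5, 2.0],
                [4.0, 6.0, 4.5, 1.5],
                [3.5, 4.5, 3.0, 1.0],
                [2.0, 1.5, 1.0, 0.5]])
peaks, pits = peaks_and_pits(dem)
print(np.argwhere(peaks))   # the 6.0 cell at (1, 1) is a peak
print(np.argwhere(pits))    # the 0.5 cell at (3, 3) is a pit
```

Real feature-extraction tools add multi-scale tests and significance thresholds on top of this basic comparison, for the reasons discussed next.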
Due to limitations of resolution, axis orientation, and object definitions, the derived spatial data may only yield meaning through subjective observation or parameterisation, or may alternatively be processed as fuzzy data to handle the varying contributing errors more quantitatively – for example, as a 70% overall chance of a point representing the peak of a mountain given the available data, rather than an educated guess to deal with the uncertainty.

Local Relief
In many applications, it is useful to know how much the surface varies in each local area. For example, one may need to distinguish between mountainous areas and high plateaus, both of which are high in elevation, but with different degrees of "ruggedness." The local relief of a cell is a measurement of this variability in the surrounding neighborhood (typically the cells within a given radius), for which several measures have been used, including simple summary statistics such as the total range of values in the neighborhood, an interquartile range, or the standard deviation (illustrated in the short code sketch further below). More complex formulas have also been developed to capture more subtle variation.

Applications
Quantitative surface analysis through geomorphometrics provides a variety of tools for scientists and managers interested in land management. Application areas include:

Landscape ecology
Biogeography
In many situations, terrain can have a profound effect on local environments, especially in semi-arid climates and mountainous areas, including well-known effects such as altitudinal zonation and the slope effect. This can make it a significant factor in modeling and mapping microclimates, vegetation distribution, wildlife habitat, and precision agriculture.

Hydrology
Due to the simple fact that water flows downhill, the surface derivatives of the terrain surface can predict surface stream flow. This can be used to construct stream networks, delineate drainage basins, and calculate total flow accumulation.

Visibility
Mountains and other landforms can block the visibility between locations on opposite sides. Predicting this effect is a valuable tool for applications as varied as military tactics and locating cell sites. Common tools in terrain analysis software include computing the line-of-sight visibility between two points, and generating a viewshed, the region of all points that are visible from a single point.

Earthworks
Many construction projects require significant modification of the terrain surface, including both the removal and addition of material. By modeling the current and designed surface, engineers can calculate the volume of cuts and fills, and predict potential issues such as slope stability and erosion potential.

Geomorphometricians
As a relatively new and little-known branch of GIS, the topic of geomorphometrics has few 'famous' pioneer figures of the kind found in other fields such as hydrology (Robert Horton) or geomorphology (G. K. Gilbert). In the past geomorphometrics has been used in a wide range of studies (including some high-profile geomorphology papers by academics such as Evans, Leopold and Wolman), but it is only recently that GIS practitioners have begun to integrate it within their work. Nonetheless it is becoming increasingly used by researchers such as Andy Turner and Joseph Wood.

International organisations
Large institutions are increasingly developing GIS-based geomorphometric applications, one example being the creation of a Java-based software package for geomorphometrics in association with the University of Leeds.
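Picking up the local relief measures described above, the following is a minimal sketch (not the algorithm of any particular GIS package; the window size and sample DEM are invented for illustration) that computes two of those neighbourhood statistics with NumPy and SciPy:

```python
import numpy as np
from scipy.ndimage import generic_filter

def local_relief(dem, window=5):
    """Total elevation range (max - min) within a window x window neighbourhood."""
    return generic_filter(dem, lambda v: v.max() - v.min(), size=window, mode="nearest")

def local_std(dem, window=5):
    """Standard deviation of elevation within the same neighbourhood."""
    return generic_filter(dem, np.std, size=window, mode="nearest")

# A flat plateau next to a steep drop: relief and standard deviation are low on
# the plateau and high along the edge, which is the distinction described above
# between high plateaus and rugged mountainous terrain.
dem = np.array([[200, 200, 200, 120, 100],
                [200, 200, 200, 120, 100],
                [200, 200, 200, 120, 100],
                [200, 200, 200, 120, 100]], dtype=float)
print(local_relief(dem))
print(local_std(dem).round(1))
```

A circular neighbourhood of a given radius, as mentioned above, can be obtained by passing a boolean footprint instead of a square size; the square window is used here only to keep the sketch short.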
Training Academic institutions are increasingly devoting more resources into geomorphometrics training and specific courses although these are still currently limited to a few universities and training centres. The most accessible at present include online geomorphometrics resource library in conjunction with the University of Leeds and lectures and practicals delivered as part of wider GIS modules, the most comprehensive at present offered at the University of British Columbia (overseen by Brian Klinkenberg) and at Dalhousie University. Geomorphometry/geomorphometrics software The following computer software has specialized terrain analysis modules or extensions (listed in alphabetical order): ANUDEM ArcGIS (Spatial Analyst extension) GRASS GIS (r.param.scale, r.slope.aspect, etc.) ILWIS LandSerf SAGA GIS (Terrain analysis modules) Whitebox Geospatial Analysis Tools (Terrain Analysis, LiDAR Analysis, Hydrological Tools, and Stream Network Analysis modules) See also Digital Elevation Model Geography Geomatics Geometry Geographic information system Geomorphology Landforms Landsat program Morphometry Photogrammetry Remote sensing Scientific modelling Topography References Further reading Mark,D.M. (1975) Geomorphometric parameters: a review and evaluation Geographical Annals, 57, (1); pp 165–177 Miller, C.L. and Laflamme, R.A. (1958): The Digital Terrain Model-Theory & Application. MIT Photogrammetry Laboratory. Pike, R. J.. Geomorphometry –- progress, practice, and prospect. Zeitschrift für Geomorphologie Supplementband 101 (1995): 221-238. Pike, R.J., Evans, I., Hengl, T., 2008. Geomorphometry: A Brief Guide. In: Geomorphometry - Concepts, Software, Applications, Hengl, T. and Hannes I. Reuter (eds.), Series Developments in Soil Science vol. 33, Elsevier, pp. 3-33, External links www.geomorphometry.org - a non-commercial association of researchers and experts. An extensive review of bibliography of Geomorphometry literature by Richard J. Pike (report 02-465) - University of Leeds - school of Geography, geomorphometrics home page - example of Leeds University-developed geomorphometrics output with processing- and resolution-based parameters - University of British Columbia - department of Geography - Dalhousie University - geomorphology and landscape evolution module Topography techniques
RavMonE.exe
RavMonE, also known as RJump, is a Trojan that opens a backdoor on computers running Microsoft Windows. Once a computer is infected, the virus allows unauthorized users to gain access to the computer's contents. This poses a security risk for the infected machine's user, as the attacker can steal personal information, and use the computer as an access point into an internal network. RavMonE was made famous in September 2006 when a number of video iPods were shipped with the virus already installed. Because the virus only infects Windows computers, it can be inferred that Apple's contracted manufacturer was not using Macintosh computers. Apple came under some public criticism for releasing the virus with their product.

Description
RavMonE is a worm written in the Python scripting language and was converted into a Windows executable file using the Py2Exe tool. It attempts to spread by copying itself to mapped and removable storage drives. It can be transmitted by opening infected email attachments and downloading infected files from the Internet. It can also be spread through removable media, such as CD-ROMs, flash memory, digital cameras and multimedia players.

Action
Once the virus is executed, it performs the following tasks:
It copies itself to %WINDIR% as RavMonE.exe.
It adds the value "RavAV" = "%WINDIR%\RavMonE.exe" to the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Run.
It opens a random port and accepts remote commands.
It creates a log file, RavMonLog, to store the port number.
It posts an HTTP request to advise the attacker of the infected computer's IP address and the number of the port opened.
When a removable storage device is connected to the infected computer, it copies the following files to that device:
autorun.inf - a script to execute the worm the next time the device is connected to a computer
msvcr71.dll - the Microsoft C Runtime Library module, containing standard functions such as copying memory and printing to the console, included in case the target device lacks it
ravmon.exe - a copy of the worm

Aliases
BackdoorRajump (Symantec)
W32/JisxA.worm (Panda)
W32/RJump-C (Sophos)
W32/RJumpA!worm (Fortinet)
Win32/RJumpA (ESET)
Win32/RJumpA!Worm (CA)
WormRJumpA (BitDefender)
WormWin32RJump.a (Kaspersky)
Worm/RjumpE (Avira)
WORM_SIWEOLB (TrendMicro)
Worm/GenericAMR (AVG)
INF:RJump[Trj] (Avast!)

See also
List of computer viruses (L-R)

References

External links
Alphabetically by publisher:

Computer worms
Trojan horses
Child pornography
Child pornography (also called CP, child sexual abuse material, CSAM, child porn, or kiddie porn) is pornography that unlawfully exploits children for sexual stimulation. It may be produced with the direct involvement or sexual assault of a child (also known as child sexual abuse images) or it may be simulated child pornography. Abuse of the child occurs during the sexual acts or lascivious exhibitions of genitals or pubic areas which are recorded in the production of child pornography. Child pornography may use a variety of mediums, including writings, magazines, photos, sculpture, drawing, painting, animation, sound recording, film, video, and video games. Child pornography may be created for profit or other reasons. Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and computer-generated images that appear to involve them. Most possessors of child pornography who are arrested are found to possess images of prepubescent children; possessors of pornographic images of post-pubescent minors are less likely to be prosecuted, even though those images also fall within the statutes. The prepubescent pornography is viewed and collected by pedophiles for a variety of purposes, ranging from private sexual uses, trading with other pedophiles, preparing children for sexual abuse as part of the process known as "child grooming", or enticement leading to entrapment for sexual exploitation such as production of new child pornography or child prostitution. Children themselves also sometimes produce child pornography on their own initiative or by the coercion of an adult. Child pornography is illegal and censored in most jurisdictions in the world. Ninety-four of 187 Interpol member states had laws specifically addressing child pornography as of 2008, though this does not include nations that ban all pornography. Of those 94 countries, 58 criminalized possession of child pornography regardless of intent to distribute. Both distribution and possession are now criminal offenses in almost all Western countries. A wide movement is working to globalize the criminalization of child pornography, including major international organizations such as the United Nations and the European Commission. Producers of child pornography try to avoid prosecution by distributing their material across national borders, though this issue is increasingly being addressed with regular arrests of suspects from a number of countries occurring over the last few years. Terminology In the 2000s, use of the term child abuse images increased by both scholars and law enforcement personnel because the term "pornography" can carry the inaccurate implication of consent and create distance from the abusive nature of the material. A similar term, child sexual abuse material, is used by some official bodies, and similar terms such as "child abuse material", "documented child sexual abuse", and "depicted child sexual abuse" are also used, as are the acronyms CAM and CAI. The term "child pornography" retains its legal definitions in various jurisdictions, along with related terms such as "indecent photographs of a child" and others. In 2008, the World Congress III against the Sexual Exploitation of Children and Adolescents stated in their formally adopted pact that:Increasingly the term 'child abuse images' is being used to refer to the sexual exploitation of children and adolescents in pornography. 
This is to reflect the seriousness of the phenomenon and to emphasize that pornographic images of children are in fact records of a crime being committed. Interpol and policing institutions of various governments, including among others the United States Department of Justice, enforce child pornography laws internationally. Since 1999, the Interpol Standing Working Group on Offenses Against Minors has used the following definition: Child pornography is the consequence of the exploitation or sexual abuse perpetrated against a child. It can be defined as any means of depicting or promoting sexual abuse of a child, including print and/or audio, centered on sex acts or the genital organs of children.

Child sexual abuse in production and distribution
Abuse of the child occurs during the sexual acts or lascivious exhibitions of genitals or pubic areas which are recorded in the production of child pornography. Children of all ages, including infants, are abused in the production of child pornography. The United States Department of Justice estimates that pornographers have recorded the abuse of more than one million children in the United States alone. There is an increasing trend towards younger victims and greater brutality; according to Flint Waters, an investigator with the federal Internet Crimes Against Children Task Force, "These guys are raping infants and toddlers. You can hear the child crying, pleading for help in the video. It is horrendous." According to the World Congress against Commercial Sexual Exploitation of Children, "While impossible to obtain accurate data, a perusal of the child pornography readily available on the international market indicates that a significant number of children are being sexually exploited through this medium." (Margaret A. Healy, "Child pornography: an international perspective", 1996.) The United Kingdom children's charity NCH has stated that demand for child pornography on the Internet has led to an increase in sex abuse cases, due to an increase in the number of children abused in the production process. In a study analyzing men arrested for child pornography possession in the United States over a one-year period from 2000 to 2001, 83% had pornographic images of prepubescent children and 80% had images graphically depicting sexual penetration. 21% had images depicting violence such as bondage, rape, or torture and most of those involved images of children who were gagged, bound, blindfolded, or otherwise enduring sadistic sex. 39% had child-pornography videos with motion and sound. 79% also had images of nude or semi-nude children, but only 1% possessed such images alone. Law enforcement found that 48% had more than 100 graphic still images, and 14% had 1,000 or more graphic images. 40% were "dual offenders", who sexually victimized children and possessed child pornography. A 2007 study in Ireland, undertaken by the Garda Síochána, revealed the most serious content in a sample of over 100 cases involving indecent images of children.
In 44% of cases, the most serious images depicted nudity or erotic posing, in 7% they depicted sexual activity between children, in 7% they depicted non-penetrative sexual activity between adults and children, in 37% they depicted penetrative sexual activity between adults and children, and in 5% they depicted sadism or bestiality.

Relation to child molestation and abuse
Experts differ over any causal link between child pornography and child sexual abuse, with some experts saying that it increases the risk of child sexual abuse, and others saying that use of child pornography reduces the risk of offending (Milton Diamond, "The Effects of Pornography: An International Perspective", Pacific Center for Sex and Society, University of Hawai'i, 4 October 2009; retrieved 8 June 2014). A 2008 American review of the use of Internet communication to lure children outlines the possible links to actual behaviour regarding the effects of Internet child pornography. According to one paper from the Mayo Clinic based on case reports of those under treatment, 30% to 80% of individuals who viewed child pornography and 76% of individuals who were arrested for Internet child pornography had molested a child. As the total number of those who view such images cannot be ascertained, the ratio of passive viewing to molestation remains unknown. The report also notes that it is difficult to define the progression from computerized child pornography to physical acts against children. Several professors of psychology state that memories of child abuse are maintained as long as visual records exist, are accessed, and are "exploited perversely." A study by Wolak, Finkelhor, and Mitchell states that: "rates of child sexual abuse have declined substantially since the mid-1990s, a time period that corresponds to the spread of CP online. ... The fact that this trend is revealed in multiple sources tends to undermine arguments that it is because of reduced reporting or changes in investigatory or statistical procedures. ... [T]o date, there has not been a spike in the rate of child sexual abuse that corresponds with the apparent expansion of online CP."

Typology
In the late 1990s, the COPINE project ("Combating Paedophile Information Networks in Europe") at University College Cork, in cooperation with the Paedophile Unit of the London Metropolitan Police, developed a typology to categorize child abuse images for use in both research and law enforcement. The ten-level typology was based on analysis of images available on websites and internet newsgroups. Other researchers have adopted similar ten-level scales. In 2002 in the UK, the Sentencing Advisory Panel adapted the COPINE scale to five levels and recommended its adoption for sentencing guidelines, omitting levels 1 to 3 and recommending that levels 4 to 6 combine as sentencing level 1 and that the four levels from 7 to 10 each form an individual severity level, for a total of 5 sentencing stages.

Proliferation

Internet proliferation
Philip Jenkins notes that there is "overwhelming evidence that [child pornography] is all but impossible to obtain through nonelectronic means." The Internet has radically changed how child pornography is reproduced and disseminated, and, according to the United States Department of Justice, resulted in a massive increase in the "availability, accessibility, and volume of child pornography." The production of child pornography has become very profitable and is no longer limited to paedophiles.
Digital cameras and Internet distribution, facilitated by the use of credit cards and the ease of transferring images across national borders, have made it easier than ever before for users of child pornography to obtain the photographs and videos. The NCMEC estimated in 2003 that since 1997 the number of child pornography images available on the Internet had increased by 1500%. In 2007, the British-based Internet Watch Foundation reported that child pornography on the Internet is becoming more brutal and graphic, and the number of images depicting violent abuse has risen fourfold since 2003. The CEO stated: "The worrying issue is the severity and the gravity of the images is increasing. We're talking about prepubescent children being raped." About 80 percent of the children in the abusive images are female, and 91 percent appear to be children under the age of 12. Prosecution is difficult because multiple international servers are used, sometimes to transmit the images in fragments to evade the law. Some child pornographers also circumvent detection by using viruses to illegally gain control of computers on which they remotely store child pornography. In one case, a Massachusetts man was charged with possession of child pornography when hackers used his computer to access pornographic sites and store pornographic pictures without his knowledge. The U.S. Court of Appeals for the Tenth Circuit has ruled that if a user downloads child pornography from a file sharing network and possesses it in his "shared folder" without configuring the software to not share that content, he can be charged with distributing child pornography. Regarding internet proliferation, the U.S. Department of Justice states that "At any one time there are estimated to be more than one million pornographic images of children on the Internet, with 200 new images posted daily." It also notes that a single offender arrested in the U.K. possessed 450,000 child pornography images, and that a single child pornography site received a million hits in a month. Further, it states that much of the trade in child pornography takes place at hidden levels of the Internet, that it has been estimated that there are between 50,000 and 100,000 paedophiles involved in organised pornography rings around the world, and that one third of these operate from the United States. One massive international child pornography ring was centered in the Netherlands. In the largest ever operation of its kind, police in 30 countries arrested 184 suspects and identified 486 others. Dutch authorities arrested 37-year-old Israeli-born Dutch citizen Amir Ish-Hurwitz, founder and owner of the internet forum Boylover.net, the center of the ring. At its peak, the forum had more than 70,000 members around the world. For a brief time between April 2016 and September 2017, a dark web site known as "Childs Play" was active. Investigators later discovered that the site had been run by a group of Australian police. In 2008, Google adapted a software program to track child pornography accessible through its site more quickly. The software is based on a pattern-recognition engine. From 2017 to 2019, the Internet Watch Foundation said it found 118 videos of child sexual abuse (including child rape) on Pornhub. Pornhub quickly removed this content. In 2019, the New York Times reported that child pornography was now a crisis.
Tech companies such as Facebook, Microsoft and Dropbox reported over 45 million cases of child sexual abuse material which was more than double what was found the year before and 44 million more than in 2014. In April 2020, BBC Three published a documentary and found that on a single day, a third of Twitter profiles globally advertising 'nudes4sale' (or similar) appeared to belong to underage individuals on various platforms, and many of those used OnlyFans to share their content. Cybersex trafficking Child victims of cybersex trafficking are forced into live streaming, pornographic exploitation on webcam which can be recorded and later sold. Victims are raped by traffickers or coerced to perform sex acts on themselves or other children while being filmed and broadcast in real time. They are frequently forced to watch the paying consumers on shared screens and follow their orders. It occurs in 'cybersex dens', which are rooms equipped with webcams. Overseas predators and pedophiles seek out and pay to watch the victims. Collector behavior and motives Viewers of child pornography who are pedophiles are particularly obsessive about collecting, organizing, categorizing, and labeling their child pornography collection according to age, gender, sex act and fantasy. According to FBI agent Ken Lanning, "collecting" pornography does not mean that they merely view pornography, but that they save it, and "it comes to define, fuel, and validate their most cherished sexual fantasies." An extensive collection indicates a strong sexual preference for children, and if a collector of child pornography is also a pedophile, the owned collection is the single best indicator of what he or she wants to do. The National Society for the Prevention of Cruelty to Children describes researchers Taylor and Quayle's analysis of pedophile pornography collecting: These offenders are likely to employ elaborate security measures to avoid detection. The US DOJ notes that "there is a core of veteran offenders, some of whom have been active in pedophile newsgroups for more than 20 years, who possess high levels of technological expertise," also noting that pedophile bulletin boards often contain technical advice from child pornography users' old hands to newcomers." A 1986 U.S. Senate report found that motives for people's collecting child pornography include arousal and gratification; validation and justification of pedophile behaviour; to show the images to children to lower their inhibitions to engage in sex; preservation of an image of a child at the age of sexual preference; blackmail of depicted individuals; a medium of exchange and communication with other child pornography consumers; and profit. A 2012 U.S. Sentencing Commission report found that child pornography offenders, while "much more likely to be sexually aroused by children than contact sex offenders or the general population", can also have non-sexual motives for collecting child pornography, including initial curiosity, compulsive collecting behaviors, avoidance of stress and dissatisfaction with life, and an ability to create a new and more socially successful identity (within an online community). Some offenders find collecting child pornography enjoyable regardless of whether the images are sexually exciting to them; their interest is in assembling complete sets and organizing the material as a pastime, analogously to what a stamp collector might do. 
A study was published to the journal Child Abuse and Neglect in January 2021 by researchers at the University of Edinburgh and George Mason University that looked at the collecting and viewing behaviors of individuals previously convicted of child pornography offences in the United States. The researchers sent out a letter through the postal service to 78 previously convicted individuals and compared the results to the behaviors of 524 non-offending individuals in the United States. The study found that 78% of offenders did not organize their collection and 74% had deleted their entire collection at least once. Offenders also displayed a more diverse interest in adult pornography than non-offenders. They were more likely to view bestiality, hentai, teen and nudist material. They also found that none of the offenders viewed child pornography exclusively, with 74% saying they viewed more adult pornography than they did child pornography. Respondents also self-reported their post-conviction pornography viewing habits with 10% of respondents saying they continued to view child pornography post-conviction and at a lesser frequency than they had pre-conviction. The conclusion reached by the researchers is that treatment professionals should consider motivations of offenders beyond primarily pedophilic interests. They suggested that problematic internet usage, general pornography consumption, coping issues, and novelty seeking may be more appropriate motivators for some offenders. Child sex tourism Sex tourists created one source of child pornography that is distributed worldwide. Most of the victims of child sex tourism reside in the developing countries of the world. In 1996, a court in Thailand convicted a German national of child molestation and production of pornography for commercial purposes; he was involved in a child pornography ring which exploited Thai children. A sizable portion of the pornography seized in Sweden and in the Netherlands in the 1990s was produced by sex tourists visiting Southeast Asia. INTERPOL works with its 190 member countries to combat the problem, and launched its first-ever successful global appeal for assistance in 2007 to identify a Canadian man, Christopher Paul Neil, featured in a series of around 200 photographs in which he was shown sexually abusing young Vietnamese and Cambodian children. Organized crime Organized crime is involved in the production and distribution of child pornography, which is found as a common element of organized crime profiles. Organized into groups to produce and distribute pornography, they are often called "sex rings". In 2003, an international police investigation uncovered a Germany-based child pornography ring involving 26,500 suspects who swapped illegal images on the Internet in 166 different countries. In a 2006 case, US and international authorities charged 27 people in nine states and three countries in connection with a child pornography ring that US federal authorities described as "one of the worst" they have discovered. The assistant secretary for Immigration and Customs Enforcement added that the case reflected three larger trends that are becoming more common in child pornography rings. One is the increasing prevalence of "home-grown" pornographic images that are produced by predators themselves, and include live streaming video images of children being abused, not just the circulation of repeated images. 
Another trend is the growing use of sophisticated security measures and of peer-to-peer networking, in which participants can share files with one another on their computers rather than downloading them from a website. The group used encryption and data destruction software to protect the files and screening measures to ensure only authorized participants could enter the chat room. A third trend is the increasingly violent and graphic nature of the images involving the abuse of younger children. According to Jim Gamble, CEO of the Child Exploitation and Online Protection Centre, around 50 per cent of sites showing children being abused are operated on a pay-per-view basis. "The people involved in these sites often aren't doing it because they're deviant by nature. They're doing it because they're business people. It's risk versus profits. We need to reduce the profit motivation." The CEOP Centre was established in 2006, and targets the finances of organised criminal gangs selling images of child abuse. The majority of child pornography seized in the United States is not produced or distributed for profit, and there is little evidence that organized criminals operating with a profit motivation are a major source of child pornography's international dissemination.
Laws
History
In the United States, the first federal law to ban the for-profit production and distribution of child pornography was the Protection of Children Against Sexual Exploitation Act of 1977. In response to New York v. Ferber, a U.S. Supreme Court decision allowing the prohibition of child pornography even if it did not meet the obscenity standard established in Miller v. California, Congress passed the Child Protection Act of 1984, broadening the definition of child pornography and criminalizing nonprofit child pornography trafficking. The 1986 Meese Report found that child pornography was a cause of serious harm; this led to the passage of the Child Sexual Abuse and Pornography Act of 1986, which increased penalties for repeat offenders. The U.S. Supreme Court decision Osborne v. Ohio ruled that the U.S. Constitution allowed the prohibition of child pornography possession. The Court noted that at the time of the decision, 19 U.S. states had laws on their books prohibiting child pornography possession. As of 2015, all 50 U.S. states had such laws. Provisions of the Child Pornography Prevention Act of 1996 that banned virtual child pornography were struck down in Ashcroft v. Free Speech Coalition. Congress passed several laws increasing the penalties for child pornography offenses, so that from 1997 to 2007, the mean sentence of child pornography offenders increased from 20.59 months to 91.30 months of confinement, more than a fourfold increase. In 2003, Congress passed the PROTECT Act, authorizing lifetime terms of federal supervised release for child pornography offenders; since U.S. Sentencing Guidelines recommend imposing the maximum term of supervised release for all sex offenders, this means that a lifetime term of supervised release is recommended for all child pornography offenders. During 2006, 3,661 suspects were referred to U.S. attorneys for child sex exploitation offenses. Child pornography constituted 69% of referrals, followed by sex abuse (16%) and sex transportation (14%). In 2006, the median prison sentence imposed was greatest for sex abuse offenses (70 months), followed by child pornography (63 months) and sex transportation (60 months). The median sentence for sex transportation was 60 months in both 1996 and 2006.
The median sentence increased from 44 to 70 months for sex abuse and from 15 to 63 months for child pornography. In comparison, median prison terms for drug and weapons offenders remained stable, while terms for violent offenses increased. A bill named the Internet Safety Act, intended to stop child pornography and protect children from online predators by requiring Internet service providers to retain records of which users had been assigned temporary network addresses, was introduced in 2009 but was ultimately not enacted. In fiscal year 2010, the average term of supervised release for non-production offenders was approximately 20 years; the average term of supervised release for offenders sentenced under the production guideline was nearly 27 years.
International coordination of law enforcement
Investigations include the 1999 Operation Cathedral, which resulted in multinational arrests and 7 convictions and uncovered 750,000 images, with 1,200 unique identifiable faces, being distributed over the web; Operation Amethyst, which occurred in the Republic of Ireland; Operation Auxin, which occurred in Australia; Operation Avalanche; Operation Ore, based in the United Kingdom; Operation Pin; Operation Predator; the 2004 Ukrainian child pornography raids; and the 2007 international child pornography investigation. A three-year Europol investigation, dubbed Operation Rescue, based on the activities of boylover.net, a popular pedophile chat room, netted over 150 arrests and the rescue of 230 children in 2011. The principal of boylover.net, Israeli-born Dutch citizen Amir Ish-Hurwitz, was jailed on 17 March 2011 in the Netherlands. Hundreds of additional suspects remain at large. Even so, the UK-based NSPCC said that worldwide an estimated 2% of child pornography websites still had not been removed a year after being identified. One of the primary mandates of the international policing organization Interpol is the prevention of crimes against children involving the crossing of international borders, including child pornography and all other forms of exploitation and trafficking of children. The U.S. Department of Justice coordinates programs to track and prosecute child pornography offenders across all jurisdictions, from local police departments to federal investigations, and international cooperation with other governments. Efforts by the Department to combat child pornography include the National Child Victim Identification Program, the world's largest database of child pornography, maintained by the Child Exploitation and Obscenity Section of the United States Department of Justice and the National Center for Missing and Exploited Children (NCMEC) for the purpose of identifying victims of child abuse (CBS News, "Combating Kiddie Porn", 6 April 2003). Police agencies have deployed trained staff to track child pornography files and the computers used to share them as they are distributed on the Internet, and they freely share identifying information for the computers and users internationally. In Europe, the CIRCAMP Law Enforcement project is aimed at reducing the availability of abusive material on the Web, combining traditional police investigative methods and police/Internet industry cooperation by blocking access to domains containing such files. The result is country-specific lists according to national legislation in the participating countries. This police initiative has a worldwide scope in its work but is partly financed by the European Commission.
When child pornography is distributed across international borders, customs agencies also participate in investigations and enforcement, such as in the 2001–2002 cooperative effort between the United States Customs Service and local operational law enforcement agencies in Russia. A search warrant issued in the US by the Customs Service led to the seizure of computers and email records by the Russian authorities, and to the arrest of the pornographers. In spite of international cooperation, less than 1 percent of children who appear in child pornography are located by law enforcement each year, according to Interpol statistics. Google announced in 2008 that it was working with NCMEC to help automate and streamline how child protection workers sift through millions of pornographic images to identify victims of abuse. Google has developed video fingerprinting technology and software to automate the review of some 13 million pornographic images and videos that analysts at the center previously had to review manually. The FBI has begun posting hyperlinks on the Internet that purport to be illegal videos of minors having sex, and then raiding the homes of anyone willing to click on them. In October 2011, the hacking collective Anonymous announced that it had begun taking down child pornography websites on the darknet in a vigilante move, and released alleged usernames via a Pastebin link.
National and international law
Child pornography laws provide severe penalties for producers and distributors in almost all societies, usually including incarceration, with shorter duration of sentences for non-commercial distribution depending on the extent and content of the material distributed. Convictions for possessing child pornography also usually include prison sentences, but those sentences are often converted to probation for first-time offenders. In 2006, the International Centre for Missing & Exploited Children (ICMEC) published a report of findings on the presence of child pornography legislation in the then-184 INTERPOL member countries. It later updated this information, in subsequent editions, to include 196 UN member countries. The report, entitled "Child Pornography: Model Legislation & Global Review", assesses whether national legislation: (1) exists with specific regard to child pornography; (2) provides a definition of child pornography; (3) expressly criminalizes computer-facilitated offenses; (4) criminalizes the knowing possession of child pornography, regardless of intent to distribute; and (5) requires ISPs to report suspected child pornography to law enforcement or to some other mandated agency. ICMEC stated that it found in its initial report that only 27 countries had the legislation needed to deal with child pornography offenses, while 95 countries did not have any legislation that specifically addressed child pornography, making child pornography a global issue worsened by the inadequacies of domestic legislation. The 7th Edition Report found that still only 69 countries had the legislation needed to deal with child pornography offenses, while 53 did not have any legislation specifically addressing the problem. Over seven years of research from 2006 to 2012, ICMEC and its Koons Family Institute on International Law and Policy report that they have worked with 100 countries that have revised or put in place new child pornography laws (Permanent Bureau (February 2004), "Strategic Plan Update, submitted by the Permanent Bureau", Hague Conference on Private International Law, Preliminary Document #14, p. 6; The Koons Family Institute on International Law and Policy (2012), "Child Pornography: Model Legislation & Global Review", 7th Edition).
A 2008 review of child pornography laws in 187 countries by the International Centre for Missing & Exploited Children (ICMEC) showed that 93 had no laws that specifically addressed child pornography. Of the 94 that did, 36 did not criminalize possession of child pornography regardless of intent to distribute. This review, however, did not count legislation outlawing all pornography as being "specific" to child pornography. It also did not count bans on "the worst forms of child labor". Some societies such as Canada and Australia have laws banning cartoon, manga, or written child pornography, and others require ISPs (Internet service providers) to monitor internet traffic to detect it ("Canadian Arrested for Importing Loli-porn Manga", Anime News Network, 4 March 2005; retrieved 23 June 2008). The United Nations Optional Protocol on the Sale of Children, Child Prostitution and Child Pornography requires parties to outlaw the "producing, distributing, disseminating, importing, exporting, offering, selling or possessing for the above purposes" of child pornography. The Council of Europe's Cybercrime Convention and the EU Framework Decision that became active in 2006 require signatory or member states to criminalize all aspects of child pornography. Article 34 of the United Nations Convention on the Rights of the Child (UNCRC) states that all signatories shall take appropriate measures to prevent the exploitative use of children in pornographic performances and materials.
Artificially generated or simulated imagery
Simulated child pornography produced without the direct involvement of children in the production process itself includes modified photographs of real children, non-minor teenagers made to look younger (age regression), fully computer-generated imagery, and adults made to look like children. Drawings or animations that depict sexual acts involving children but are not intended to look like photographs may also be regarded as child pornography.
Sexting and filming among minors
Sexting is the sending, receiving, or forwarding of sexually explicit messages, photographs, or images of oneself to others (such as dating partners or friends), primarily between mobile phones. It may also include the use of a computer or any digital device. Such images may be passed along to others or posted on the Internet. In many jurisdictions, the age of consent is lower than the age of majority, and a minor who is over the age of consent can legally have sex with a person of the same age. Many laws on child pornography were passed before cell phone cameras became common among teenagers close in age to or over the age of consent and before sexting was understood as a phenomenon. Teenagers who are legally able to consent to sex, but under the age of majority, can be charged with production and distribution of child pornography if they send naked images of themselves to friends or sex partners of the same age. The University of New Hampshire's Crimes Against Children Research Center estimates that 7 percent of people arrested on suspicion of child pornography production in 2009 were teenagers who shared images with peers consensually. Such arrests also include teenage couples or friends with a small age disparity, where one is a legal adult and the other is not. In some countries, mandatory sentencing requires anybody convicted of such an offense to be placed on a sex offender registry.
The vast majority of underage sexual images are produced by adults, and adults often solicit underage teenagers to share the images. Legal professionals and academics have criticized the use of child pornography laws with mandatory punishments against teenagers over the age of consent for sex offenses. Florida cyber crimes defense attorney David S. Seltzer wrote of this that "I do not believe that our child pornography laws were designed for these situations ... A conviction for possession of child pornography in Florida draws up to five years in prison for each picture or video, plus a lifelong requirement to register as a sex offender." In a 2013 interview, Amy Adele Hasinoff, an assistant professor of communications at the University of Colorado Denver who studies the repercussions of sexting, stated that the "very harsh" child pornography laws are "designed to address adults exploiting children" and should not replace better sex education and consent training for teens. She went on to say, "Sexting is a sex act, and if it's consensual, that's fine ... Anyone who distributes these pictures without consent is doing something malicious and abusive, but child pornography laws are too harsh to address it." In April 2018, The Daily Telegraph reported that, of the sexually explicit images of children and teenagers (11- to 15-year-olds) found on the Internet, 31% were made by children or teenagers themselves in the period from November 2017 to February 2018, with the proportion reaching 40% in December 2017; there were 349 such cases in January 2017 and 1,717 in January 2018. The images were made by children or teenagers photographing or filming each other, or as selfies, without adults present or coercing them, by unwittingly imitating adult pornographic or nude images or videos (including of celebrities) that they had found on the Internet. The report said that sex offenders trawled for and amassed such images.
Organizations
There are many anti-child pornography organizations, such as the Financial Coalition Against Child Pornography, Association of Sites Advocating Child Protection, ECPAT International, and International Justice Mission.
See also
Child erotica
CIRCAMP
Circles of Support and Accountability
Color Climax Corporation
Commercial sexual exploitation of children
Debate regarding child pornography laws
Depictions of youth
FBI
Immigration and Customs Enforcement (ICE)
Internet Watch Foundation and Wikipedia
Interpol
Kompromat
Legal status of cartoon pornography depicting minors
Lolicon/Shotacon
Mobile Alliance Against Child Sexual Abuse Content
National Crime Agency (NCA)
Operation Blue Orchid
Operation Protect Our Children
Preadolescence
Prevention Project Dunkelfeld
Prostitution of children
Protect (political organization)
United States Postal Inspection Service
Virtuous Pedophiles
Relationship between child pornography and child sexual abuse
References
External links
Oppenheimer, Mark, Part 2, Part 3, Part 4, Part 5
Child Pornography Case Results in Lengthy Prison Sentences, FBI
Crimes Pornography Sex crimes Online child abuse
64799547
https://en.wikipedia.org/wiki/Roger%20Noble%20Burnham
Roger Noble Burnham
Roger Noble Burnham (August 10, 1876 – March 14, 1962) was an American sculptor and teacher. He is best remembered for creating The Trojan (1930), the unofficial mascot of the University of Southern California. Life and career He was the eldest of the four children of Arthur Burnham and Katharine Bray. His father was an 1870 graduate of Harvard University, and a Boston banker. He grew up just outside Boston, and attended the Robert G. Shaw School and The Hale School for Boys. Burnham studied art and architecture at Harvard, graduating in 1899. He studied privately with Caroline Hunt Rimmer, and opened his own sculpture studio in Boston, specializing in portrait works. He moved to New York City in 1903, to work on the sculpture program for the 1904 St. Louis World's Fair, under Karl Bitter. He assisted George Brewster on twenty relief portrait medallions for the exterior of the Fair's Palace of Fine Arts, now the St. Louis Art Museum. Burnham graduated from the American Academy of Dramatic Arts in Los Angeles in 1907, and made an extended 1908 tour of the South and Southwest in a production of Richard Brinsley Sheridan's The Rivals. He worked as an actor in early Hollywood films, and later performed Shakespeare, in Los Angeles. He returned to Boston, and taught modeling at Harvard's School of Architecture from 1913 to 1917. He was chairman of the Cambridge, Massachusetts Chapter of the Boy Scouts of America, and moved to Hawaii in June 1917, to establish Boy Scout troops in the U.S. territory. He taught at the Otis Art Institute in Los Angeles from 1926 to 1932. The Trojan The Trojan (1930)—officially named the Trojan Shrine, but affectionately called "Tommy Trojan"—is a monument on the University of Southern California campus. The university's athletic teams are the USC Trojans. The monument consists of a life size bronze warrior, wearing a championship belt and pleated skirt, a mohawk-plumed helmet, shin guards and sandals. He wields the Sword of Knowledge in one hand, and holds the Shield of Courage in the other. The figure stands upon a concrete pedestal, with inscriptions, bronze relief panels, and a Greek key border. The front of the base features the bronze seal of the university, with "The Trojan" inscribed above it, and words listing the traits of an ideal Trojan inscribed below it: "Faithful," "Scholarly," "Skillful," "Courageous," "Ambitious." The left side of the base features a bronze relief panel representing athletics. The right side of the base features a bronze relief panel representing scholarship. The rear of the base features an incised flaming torch, and a bronze plaque with a quote in Latin and English. Burnham's primary models for the figure were All-American USC football players Russ Saunders (head and upper half) and Erny Pinckert (lower half). Military monuments Burnham was living in Honolulu in April 1918, when the United States entered World War I, and joined the Hawaii National Guard. Soon after the armistice, he designed a war memorial for the entrance to the city's Kapiolani Park. "It consists of three figures, the central one typifying Liberty while beneath are a Hawaiian warrior and a Hawaiian maiden. The warrior offers his spear while the maiden extends in outstretched hands a lei." Instead of a formal monument with sculpture, the Waikiki Natatorium War Memorial (1927) was built at Waikiki Beach. He was commissioned to create a statue of Frank Luke, Jr. (1930), for the grounds of the Arizona State Capitol in Phoenix. An ace U.S. 
Army pilot who died in World War I, Second Lieutenant Luke was posthumously awarded the Medal of Honor. A bronze plaque on the rear of the monument's base lists the names of the other 318 Arizonans who died in the war. The monument was dedicated on Armistice Day, 1930. Luke Air Force Base, in Glendale, Arizona, was named for the pilot in 1941. Burnham took leave from Harvard to serve in the Massachusetts Naval Reserve during the Spanish-American War. More than fifty years later, he created The Spirit of '98 (1950), the United Spanish War Veterans Memorial for the Sawtelle Veterans Home in Los Angeles County. It featured a tall, stepped concrete base, with marble figures of a soldier and sailor flanking a hooded goddess figure holding a torch. Burnham's marble statues were toppled in a 1971 earthquake. The memorial was restored with fiberglass replicas of the statues by sculptor David Wilkins, and relocated to Los Angeles National Cemetery. He created the bronze statue of General Douglas MacArthur for the MacArthur Monument (1955), in Los Angeles's MacArthur Park.
Other works
Architect Edward T. P. Graham designed the 10-story Boston City Hall Annex (1912-1914), and Burnham (Graham's Harvard classmate) was given the commission to model four colossal figures for its 9th-story cornice. Needing some strong vertical projections to carry the lines of the four large Corinthian columns on the front into the attic story, Mr. Edward T. P. Graham, architect of the Annex, decided to use partially attached human figures. Keeping their architectural character in mind, the sculptor designed the figures with a certain stiffness and with emphasis upon the vertical lines of the Greek drapery. The proportions, particularly of the faces, were modified to meet the fact that the figures would be seen from far below and foreshortened. Four standing figures, each 16 feet high and projecting 3 feet 2 inches from the building. Of reinforced concrete to match the limestone of the building. By Roger Noble Burnham. Contracted for by the commission at $8,000, July 29, 1913. The four goddess figures were cast in concrete and hollow (to reduce their weight). Even so, each figure weighed approximately 8 tons (7.26 metric tonnes), and needed to be hoisted up to the cornice. Burnham's figures were deemed unsafe, and were destroyed in the process of removing them from the façade in 1947. Following the 1926 premature death of silent screen star Rudolph Valentino, the Los Angeles City Council initially opposed the creation of a public monument to him. The city eventually relented, and approved the erection of a memorial fountain in Hollywood's De Longpre Park. Burnham's fountain sculpture was an Art Deco male nude, Aspiration. Made of bronze and covered with gold leaf, the streamlined figure looked skyward while standing upon a black marble globe. Its cubical black marble base featured an inscription: "Erected in Memory of / RUDOLPH VALENTINO / 1895 - 1926 / Presented by His Friends and / Admirers from Every Walk of / Life and in All Parts of the World / in Appreciation of the Happi- / ness Brought to Them by His / Cinema Portrayals." The fountain was dedicated May 6, 1930, on what would have been Valentino's 35th birthday. The Astronomers Monument (1934), at Griffith Observatory, Los Angeles, was built by the Public Works of Art Project of the New Deal. Designed by sculptor Archibald Garner, it is a hexagonal cast concrete obelisk, crowned by a bronze armillary sphere.
Above its star-shaped base are six V-shaped recesses, in each of which stands a cast concrete Art Deco statue of an astronomer from history. Each figure was modeled by a different sculptor; Burnham created the John Herschel figure. The monument was dedicated November 25, 1934. Will Rogers, an Oklahoma cowboy who became a nationally known humorist and movie star, died in an August 15, 1935 plane crash. Rogers had been under contract to 20th Century Fox, and the studio decided to name its new sound stage for him, commissioning Burnham to create a memorial plaque. The dedication was November 23, 1935, and the bronze portrait plaque was unveiled by 7-year-old Shirley Temple, Rogers's intended co-star for their next movie. Burnham created a number of medals for the Medallic Art Company in New York City, and figurines of his sculptures were mass-produced by Roxor Studios in Chicago. One early figurine was Speed Demon (1919), which featured a threaded screw hole for mounting as an automobile hood ornament. The Spirit of Rotary (1920) was marketed to members of Rotary International, and Dedication to Service (1921) to members of Kiwanis International. One of his best-selling figurines was The Trojan (1930), marketed to USC alumni. At age 78, Burnham worked on a science exhibit for Disneyland. The centerpiece of Monsanto's Hall of Chemistry was the Chemitron (1955): a giant ring of eight clear plastic test tubes that individually spun as the entire ring rotated. He modeled anthropomorphic figures representing the eight ingredients used to make plastics: Air, Coal, Limestone, Oil, Phosphate, Salt, Sulphur, and Water. His figures were cast in colorful tinted plastic, and one was mounted atop each spinning test tube.
Honors, exhibitions and awards
Burnham was a member of the Boston Architectural Club, the American Numismatic Society, the Honolulu Art Society, the Painters and Sculptors Club of Los Angeles, and the Rotary Club of Los Angeles. He exhibited at the 1911 Esposizione Internazionale d'Arte in Rome. The Copley Gallery in Boston hosted a one-man show of his sculpture in early 1913. He exhibited at the 1913 Salon of the Société des Artistes Français in Paris, and the 1913 Exposition Internationale de Gand in Ghent. The John Herron Art Institute in Indianapolis hosted a one-man show of his sculpture in April/May 1914, and he exhibited at the 1915 Panama-Pacific International Exposition in San Francisco. He exhibited the bust of his wife and a collection of medals at the 1914 exhibition at the Pennsylvania Academy of the Fine Arts, in Philadelphia, and his door panels for the Forsyth Dental Infirmary at PAFA's 1917 exhibition. His work was part of the sculpture competition of the 1932 Summer Olympics in Los Angeles. Burnham created the Henry O. Avery Prize for Sculpture medal (1904) for the Architectural League of New York, and became the first recipient of that medal. He won the 1912 national competition to design the University of California's scholarship medal. He created the Seal for the 1928 Pacific Southwest Exposition in Long Beach, California, and was awarded a Diploma of Honor at the exposition. His portrait bust of poet Alfred Noyes was awarded First Prize at the Los Angeles County Museum of Art's 1944 annual exhibition.
Personal
Burnham married writer Eleanor Howard Waring (1868-1959) on June 18, 1909. Their honeymoon trip in a hot air balloon made the front page of The New York Times. He and his wife moved to Hawaii in June 1917.
She founded an amateur theatre group in Honolulu, the "Lanai Players," and directed the productions. The couple left Hawaii in 1922, and lived in Berkeley, California for a couple years, before settling in Los Angeles. Burnham found the view from Griffith Park inspiring, and built a studio "in a little dead-end place, perched almost on the Observatory grounds." Burnham was a religious man, and in 1951, "outlined a plan before his city's religious leaders in which he proposed to place a 150-ft. statue of the smiling Jesus upon a mountain towering over Hollywood." The project was never built. Eleanor and Roger Burnham were married for 50 years, until her death in 1959. He died three years later, in Los Angeles, following an extended illness. Both were cremated, and their ashes are located at Forest Hills Cemetery in Jamaica Plain, Massachusetts. Selected works The Ideal Type of Morgan (bronze, 1912), equine figure. Burnham's wife donated a cast to the Georgia Morgan Horse Club. Hawaii War Monument (1919). Proposed monument for Honolulu's Kapiolani Park. Unbuilt. Dedication to Service (medium, 1921), a life size male nude with outstretched arms, looking upward. Exhibited at the Art Institute of Chicago, 1921 Luther Burbank (bronze, 1923), Field Museum, Chicago, Illinois. The life size, half-length figure holds a Burbank Plum in one hand and a Shasta Daisy in the other. Burnham's full size clay model is at the Luther Burbank Home and Gardens, Santa Rosa, California. The Hand of God, Holding the Earth to be Moulded by Man (medium, 1924). Proposed colossal sculpture for a Luther Burbank Memorial Park in Santa Rosa California. Unbuilt. The Trojan (bronze, 1930), height: , University of Southern California, Los Angeles, California. The concrete base features bronze relief plaques and inscriptions. Lieutenant Frank Luke, Jr. (bronze, 1930), Arizona State Capitol, Phoenix, Arizona. Aspiration (gold-leafed bronze, 1930), height: , Rudolph Valentino Memorial Fountain, De Longpre Park, Hollywood, California. John Herschel (cast concrete, 1934), Astronomers Monument, Griffith Observatory, Griffith Park, Los Angeles, California. One of six statues of astronomers, each modeled by a different sculptor. United Spanish War Veterans Memorial, original: (marble and concrete, 1950), replica: (fiberglass and concrete, 1973), Los Angeles National Cemetery, Los Angeles, California. Burnham's marble statues were destroyed in a 1971 earthquake; sculptor David Wilkins created fiberglass replicas. The Answer (1951). Proposed colossal statue of Jesus Christ for a mountain overlooking Los Angeles. Unbuilt. General Douglas MacArthur Monument (bronze, 1955), height: , MacArthur Park, Los Angeles, California 8 anthropomorphic figures (tinted plastic, 1955), Hall of Chemistry, Disneyland, Anaheim, California Portrait busts William H. Crane as "David Harcum" (medium, 1901). American comic actor in a famous role. Edward Perry Warren (bronze, 1911), Oxford University, United Kingdom. Exhibited at the 1911 Espoizione Internazionale d'arte in Rome Eleanor Waring Burnham (medium, 1914), portrait of the sculptor's wife. Exhibited at the 1915 Panama-Pacific International Exposition in San Francisco George Henry Forsyth (bronze, 1916), entrance hall, Forsyth Dental Infirmary for Children, Boston, Massachusetts. (See Relief works, below) Frank Tenney Johnson (bronze, 1922), National Academy of Design, New York City Charles Keeler (medium, 1922), Burnham's friend and neighbor in Berkeley, California Frances E. 
Willard (marble, 1932), California State Building, Los Angeles W. I. Travers (medium, 1934), Phineas Banning High School, Los Angeles. Posthumous portrait of the school's principal Frances Whitesell (medium, 1935-1939) Senator Francis G. Newlands (medium, 1937), U.S. senator from Nevada Muriel Pulitzer (medium, 1938), sculptor of religious works Alfred Noyes (medium, 1944), British poet and playwright. Awarded First Prize at LACMA's 1944 exhibition Relief works Bust of Margaret Howard Stockett Berkley (bronze, 1909), Maryland Center for History and Culture, Baltimore, Maryland. Portrait of a 9-year-old girl Bust of Fabian Fall (bronze, 1909), Harvard University Library, Cambridge, Massachusetts Justin Morgan Plaque (medium, 1911), a circular plaque of a Morgan horse nuzzling a young woman. Eleanor Waring Burnham wrote a historical novel about the Morgan horse. Ernst Perabo at the Piano (bronze, 1912), diameter: , Museum of Fine Arts, Boston, Massachusetts Exhibited at the 1911 Espoizione Internazionale d'arte in Rome. Exhibited at the 1915 Panama-Pacific International Exposition in San Francisco. Uncle Remus Memorial Tablet (bronze, 1914), Joel Chandler Harris House, Atlanta, Georgia. Dedicated May 23, 1914. "Bre'r Rabbit making a speech to the animals." Burnham and his wife were both members of the Uncle Remus Memorial Association. Bust of Joel Chandler Harris (bronze, 1914), Joel Chandler Harris House, Atlanta, Georgia. Dedicated May 23, 1914. Panel: Education (medium, 1913), Castle Hall, Punahou School, Honolulu, Hawaii Carrington Mason Memorial Plaque (medium, 1915), Cossitt Library, Memphis, Tennessee Forsyth Dental Infirmary for Children, Boston, Massachusetts. Burnham also modeled the bronze bust of the building's donor, George Henry Forsyth, in the entrance hall. The building is now part of the Museum of Fine Arts, Boston. Panel: The Mother: Giver of Life and Love (bronze, 1916), main entrance doors. Panel: The Commonwealth: Giver of Health and Learning (bronze, 1916), main entrance doors Panel: Alice in Wonderland (bronze, 1916), children's entrance doors Panel: Bre'r Rabbit and Bre'r Fox (bronze, 1916), children's entrance doors Fireplace surround: Jack and the Beanstalk (medium, 1916) Susan Williams Graham Memorial Fountain (marble, 1916, 1920), Coker Arboretum, University of North Carolina, Chapel Hill. Created as a watering trough for horses, Burnham's arched panel of a Greek-gowned woman pouring water from an urn was added in 1920. Relocated from Franklin Street to Coker Arboretum in 1956. Lowrey Memorial Fountain (Tennessee pink marble & bronze, 1919), Hawaiian Mission Children's Society, Honolulu, Hawaii. Beside the circular pool is a low-arched marble stele with an inset bronze relief panel, depicting Cherilla Storrs Lowrey and others "at work on plans for a City Beautiful." Hawaii Glee Club Tablet (bronze, 1924), Hamilton Library, University of Hawaii at Manoa, Honolulu A Greek athlete in skirt and helmet and a Hawaiian athlete in loincloth and helmet jointly holding up a laurel wreath. Trophy for annual singing contest among Hawaiian choirs John Muir Memorial Plaque (bronze, 1924), John Muir Cabin site, Yosemite National Park, California Friendliness - Seal of the Pacific Northwest Exhibition (medium, 1928), Long Beach, California. "Circular seal with figure of Friendship, globe and exposition grounds. Lettering around edge." 
Rudolph Valentino Portrait Plaque (painted plaster, year), diameter: Bust of Albert Beck Wenzell (medium, 1931) Will Rogers Memorial Plaque (bronze, 1935), Will Rogers Sound Stage, 20th Century Fox Studios, Century City, Los Angeles, California. Unveiled by 7-year-old Shirley Temple, November 23, 1935. Alexander Hamilton - Scholar, Soldier, Statesman (bronze, 1935), Alexander Hamilton High School, Los Angeles, California Panel: The Way (medium, 1935), University of Southern California, Los Angeles The Spirit of Hollywood Plaque (medium, 1937), unlocated. Actress Betty Grable posed as the model. Mural: San Pedro Harbor in 1850 (medium, 1949), San Pedro Branch Library (1948), Los Angeles County, California. A large painted relief on the library's façade, depicting a birds-eye view of the harbor a century earlier Architectural sculpture Four colossal cornice figures, Boston City Hall Annex, 26 Court Street, Boston, Massachusetts. Removed and destroyed, 1947. Charity (cast concrete, 1914), height: , holds an infant Industry (cast concrete, 1914), height: , holds a wheel and a spindle of yarn Education (cast concrete, 1914), height: , holds a quill pen and a book Law and Order (cast concrete, 1914), height: , holds fasces and the Book of Laws Augustus Busch Hall, Harvard University, Cambridge, Massachusetts Head of Athena (medium, completed 1917, dedicated 1921), over entrance doors Tritons (medium, completed 1917, dedicated 1921), Chiron (Centaur) (medium, completed 1917, dedicated 1921), cornice figure Miniatures Medal: Henry O. Avery Prize for Sculpture (bronze, 1904), Architectural League of New York. Burnham was the first recipient of the medal he designed. Plaque: Self-Portrait (plaster, 1905), height: , oval bust of Burnham looking directly at the viewer Medal: Decennial Medal for Harvard University Class of 1899 (bronze, 1909), diameter: Medal: University of California Scholarship Medal (bronze, 1912), Medallic Art Company, New York, diameter: . "Roger Noble Burnham of Massachusetts, winner in the recent National competition for a new design for the University Medal." Museum of Fine Arts, Boston, Massachusetts Exhibited at the 1915 Panama-Pacific International Exposition in San Francisco. Medal: Ernst Perabo at the Piano (bronze, 1914), miniature of Burnham's 1910 relief plaque. Figurine: Speed Demon (bronze, 1919), height: , length: , hood ornament Figurine: The Spirit of Rotary (medium, 1920). A hooded goddess holding a shield, wielding a sword, and standing upon the globe. Exhibited at the 1922 Rotary International Convention, held in Los Angeles. Figurine: Dedication to Service (patinated metal, 1921), height: . "Model of a male figure with arms outstretched, standing erect and looking up." Advertized to members of Kiwanis International. Plaque: California Faience (terra cotta, 1922), oval portrait bust of a young woman, height: Medal: Luther Burbank (bronze, 1923), Medallic Art Company, New York. 
"Jubilee Medal," portrait bust celebrating the horticulturalist's 75th birthday Plaque: Mother and Daughter (bronze, 1926), oval double portrait bust, height: Medal: Pacific Northwest Exposition - Long Beach California 1928 (bronze, 1928), Medallic Art Company, New York, diameter: Figurine: The Trojan (bronze, 1930), height: Figurine: General Douglas MacArthur (bronze, 1952), height: References 1876 births 1962 deaths 20th-century American sculptors American male sculptors American medallists Olympic competitors in art competitions People from Hingham, Massachusetts Harvard University alumni Harvard University faculty Artists from Los Angeles Otis College of Art and Design faculty Sculptors from Massachusetts Sculptors from California Scouting pioneers
1655558
https://en.wikipedia.org/wiki/Gawker%20Media
Gawker Media
Gawker Media LLC (formerly Blogwire, Inc. and Gawker Media, Inc.) was an American online media company and blog network. It was founded by Nick Denton in October 2003 as Blogwire, and was based in New York City. Incorporated in the Cayman Islands, Gawker Media was, as of 2012, the parent company of seven different weblogs and many subsites under them: Gawker.com, Deadspin, Lifehacker, Gizmodo, Kotaku, Jalopnik, and Jezebel. All Gawker articles were licensed under a Creative Commons Attribution-NonCommercial license. In 2004, the company was renamed from Blogwire, Inc. to Gawker Media, Inc., and shortly afterwards to Gawker Media LLC. In 2016, the company filed for Chapter 11 bankruptcy protection after damages of $140 million were awarded against the company as a result of the Hulk Hogan sex tape lawsuit. On August 16, 2016, all of the Gawker Media brands and assets except for Gawker.com were acquired at auction by Univision Communications for $135 million. Two days later, on August 18, the company announced that Gawker.com would cease operations the following week, while its other sites would continue to operate. On September 21, 2016, Univision moved all of the Gawker Media properties to its newly created Gizmodo Media Group. Gizmodo was subsequently acquired by Great Hill Partners along with The Onion in 2019 under the G/O Media Inc. umbrella, reportedly for less than $50 million.
Ownership, finances, and traffic
While Denton generally did not go into detail about Gawker Media's finances, he made statements on his personal site in 2005 that downplayed the profit potential of blogs, declaring that "[b]logs are likely to be better for readers than for capitalists. While I love the medium, I've always been skeptical about the value of blogs as businesses". In an article in the February 20, 2006 issue of New York Magazine, Jossip founder David Hauslaib estimated Gawker.com's annual advertising revenue to be at least $1 million, and possibly over $2 million a year. Combined with low operating costs—mostly web hosting fees and writer salaries—Denton was believed to be turning a healthy profit by 2006. In 2015, Gawker Media LLC released its audited revenue for the previous five years. In 2010, its revenue was $20 million, with operating income of $2.6 million. Gawker Media's revenues steadily increased through 2014, and its audited revenue for 2014 was $45 million, with $6.5 million in operating income. Business Insider valued the company at $250 million based upon its 2014 revenue. In early 2015, Denton stated that he planned to raise $15 million in debt from various banks so as not to dilute his equity stake in the company by accepting investments from venture capital firms. In June 2016, Gawker Media revealed its corporate finances in a motion for a stay of judgment pending appeal and accompanying affidavits filed in the Bollea v. Gawker case in Florida state court. In the filings, the company stated that it could not afford to pay the $140.1 million judgment or the $50 million appeal bond. The company's balance sheet at the time reflected total assets of $33.8 million ($5.3 million cash, $11.9 million accounts receivable, $12.5 million fixed assets), total current liabilities of $27.7 million, and total long-term liabilities of $22.8 million. A bond broker stated in an affidavit that the company's book value was $10 million. In June 2016, at the time of the company's filing for bankruptcy, Denton had a 29.52% stake in the Gawker Media Group, and his family had another stake through a trust.
History
Gawker Media was incorporated in Budapest, Hungary, where a small company facility is still maintained. The company was headquartered early on at Nick Denton's personal residence in the New York City neighborhood of SoHo, and it remained there until 2008. That year, he created a new base of operations in Nolita in Manhattan. On April 14, 2008, Gawker.com announced that Gawker Media had sold three sites: Idolator, Gridskipper, and Wonkette. In a fall 2008 memo, Denton announced the layoff of "19 of our 133 editorial positions" at Valleywag, Consumerist, Fleshbot, and other sites, and the hiring of 10 new employees for the most commercially successful sites—Gizmodo, Kotaku, Lifehacker, and Gawker—and others which were deemed to promise similar commercial success (Jezebel, io9, Deadspin, and Jalopnik). Denton also announced the suspension of a bonus payment scheme based on pageviews, by which Gawker had paid an average of $50,000 a month to its staff, citing a need to generate advertising revenue as opposed to increasing traffic. He explained these decisions by referring to the 2008 credit crisis, but stated that the company was still profitable. In September 2008, Gawker reported 274 million pageviews. On November 12, 2008, Gawker announced that Valleywag would fold into Gawker.com. Consumerist was sold to Consumers Union, which took over the site on January 1, 2009. On February 22, 2009, Gawker announced that Defamer.com would fold into Gawker.com. In October 2009, Gawker Media websites were infected with malware in the form of fake Suzuki advertisements. The exploits infected unprotected users with spyware and crashed infected computers' browsers. The network apologized by stating, "Sorry About That. Our ad sales team fell for a malware scam. Sorry if it crashed your computer". Gawker shared its correspondence with the scammers via Business Insider. On February 15, 2010, Gawker announced it had acquired CityFile, an online directory of celebrities and media personalities. Gawker's Editor-in-Chief Gabriel Snyder announced that he was being replaced by CityFile editor Remy Stern.
Source code breach
On December 11, 2010, the Gawker group's 1.3 million commenter accounts and its entire website source code were released by a hacker group named Gnosis. Gawker issued an advisory notice stating: "Our user databases appear to have been compromised. The passwords were encrypted. But simple ones may be vulnerable to a brute-force attack. You should change your Gawker password and on any other sites on which you've used the same passwords". Gawker was found to be using DES-based crypt(3) password hashes with 12 bits of salt (an illustrative sketch of this scheme's weakness follows below). Security researchers found that the password-cracking software John the Ripper was able to quickly crack over 50% of the passwords from the records that had crackable hashes. Followers of Twitter accounts that had been set up with the same email and password combinations were spammed with advertisements. The Gnosis group noted that, with the source code to the Gawker content management system in its possession, it would be easier to develop new exploits.
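The reason 12-bit-salt, DES-based crypt(3) falls so quickly to offline cracking is that the traditional scheme silently truncates passwords to eight characters and uses a salt of only two characters drawn from a 64-symbol alphabet (4,096 possibilities), stored in the clear as the first two characters of each hash, so an attacker holding leaked records can test dictionary guesses cheaply. The following is a minimal, illustrative sketch, not code from Gawker or Gnosis; it assumes a Unix build of Python older than 3.13, where the legacy crypt module (since removed from the standard library) is still available, and the wordlist is a hypothetical stand-in for a real cracking dictionary.

```python
# Minimal sketch of why traditional DES-based crypt(3) with a 12-bit salt
# is weak. Assumes a Unix Python < 3.13 where the legacy "crypt" module
# still exists; the wordlist below is a hypothetical stand-in.
import crypt
import itertools
import string

# The salt alphabet has 64 symbols and the salt is 2 characters long,
# so there are only 64**2 = 4096 (2**12) possible salts in total.
SALT_CHARS = string.ascii_letters + string.digits + "./"
ALL_SALTS = ["".join(pair) for pair in itertools.product(SALT_CHARS, repeat=2)]
print(len(ALL_SALTS))  # 4096

# Traditional crypt(3) silently truncates the password to 8 characters,
# so these two different passwords hash identically under the same salt.
salt = "ab"
print(crypt.crypt("correcthorse", salt) == crypt.crypt("correcthorsebattery", salt))  # True

def crack(leaked_hash, wordlist):
    """Offline dictionary attack: the salt is the first two characters of
    the leaked hash, so each guess costs one DES crypt evaluation."""
    salt = leaked_hash[:2]
    for guess in wordlist:
        if crypt.crypt(guess, salt) == leaked_hash:
            return guess
    return None

leaked = crypt.crypt("letmein", "xy")  # pretend this came from the dump
print(crack(leaked, ["password", "qwerty", "letmein"]))  # -> "letmein"
```

Modern password storage avoids these problems by using long per-user random salts and deliberately slow hash functions such as bcrypt, scrypt, or Argon2, which make offline guessing far more expensive.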
2011 redesign and traffic loss
As part of a planned overhaul of all Gawker Media sites, on 1 February 2011, some Gawker sites underwent a major design change as part of the larger roll-out. Most notable was the absence of the formerly present Twitter and StumbleUpon sharing buttons. Nick Denton explained that Facebook had been by far the biggest contributor to the sites' traffic and that the other buttons cluttered the interface. This decision lasted three weeks, after which the buttons were reinstated, and more were added. On 7 February 2011, the redesign was rolled out to the remainder of the Gawker sites. The launch was troubled due to server issues: Kotaku.com and io9.com failed to load properly, displaying links but no main content, and opening different posts in different tabs did not work either. The new look emphasized images and de-emphasized the reverse-chronological ordering of posts that was typical of blogs. The biggest change was the two-panel layout, consisting of one big story and a list of headlines on the right. This was seen as an effort to increase the engagement of site visitors by making the user experience more like that of television. The site redesign also allowed users to create their own discussion pages on Gawker's Kinja. Many commenters disliked the new design, which was in part attributed to lack of familiarity. Rex Sorgatz, designer of Mediaite and CMO of Vyou, issued a bet that the redesigns would fail to bring in traffic, and Nick Denton took him up on it. The measure was the number of page views recorded on Quantcast by October. Page views after the redesign declined significantly—Gawker's sites had an 80% decrease in overall traffic immediately after the change and a 50% decrease over two weeks—with many users either leaving the site or viewing international versions of the site, which had not switched to the new layout. On 28 February 2011, faced with declining traffic, Gawker sites allowed visitors to choose between the new design and the old design for viewing the sites. Sorgatz was eventually determined to be the winner of the bet, as at the end of September 2011, Gawker had only 500 million monthly views, not the 510 million it had had prior to the redesign. However, on 5 October 2011, site traffic returned to its pre-redesign numbers, and as of February 2012, site traffic had increased by 10 million over the previous year, according to Quantcast. As of March 23, 2012, commenting on any Gawker site required signing in with a Twitter, Facebook, or Google account.
Leaked Quentin Tarantino script
In January 2014, Quentin Tarantino filed a copyright lawsuit against Gawker Media for distribution of his 146-page script for The Hateful Eight. He claimed to have given the script to six trusted colleagues, including Bruce Dern, Tim Roth, and Michael Madsen. Due to the spreading of his script, Tarantino told the media that he would not continue with the movie. "Gawker Media has made a business of predatory journalism, violating people's rights to make a buck," Tarantino said in his lawsuit. "This time they went too far. Rather than merely publishing a news story reporting that Plaintiff's screenplay may have been circulating in Hollywood without his permission, Gawker Media crossed the journalistic line by promoting itself to the public as the first source to read the entire Screenplay illegally."
Collective action
On 22 June 2013, unpaid interns brought a Fair Labor Standards Act action against Gawker Media and founder Nick Denton. As plaintiffs, the interns claimed that their work at the sites io9.com, Kotaku.com, Lifehacker.com, and Gawker.TV was "central to Gawker's business model as an Internet publisher," and that Gawker's failure to pay them minimum wage for their work therefore violated the FLSA and state labor laws. Although some interns had been paid, the court granted conditional certification of the collective action.
In October 2014, a federal judge ruled that notices could be sent to unpaid interns throughout the company who might want to join the lawsuit. A federal judge later found that the claims of interns who joined the suit as plaintiffs were outside the statute of limitations. On March 29, 2016, a federal judge ruled in favor of Gawker, noting that the plaintiff had correctly been deemed an intern instead of an employee and was the primary beneficiary of his relationship with Gawker Media.
Unionization
In June 2015, Gawker editorial staff voted to unionize, joining the Writers Guild of America, East. Approximately three-quarters of employees eligible to vote voted in favor of the decision. Gawker staff had announced the planned vote on May 28, 2015.
Condé Nast executive prostitution claims
In July 2015, Gawker staff writer Jordan Sargent published an article attempting to "out" a married executive at Condé Nast, based on text correspondence allegedly exchanged with a gay porn star. The post sparked heavy criticism, both internally and from outsiders, for outing the executive. Denton removed the story the next day, after Gawker Media's managing partnership voted 4-2 to remove the post—marking the first time the website had "removed a significant news story for any reason other than factual error or legal settlement." Gawker's Executive Editor and Editor-in-Chief resigned after the story was dropped from Gawker's website. According to The Daily Beast, "a source familiar with the situation said Gawker ultimately paid the subject of the offending article a tidy undisclosed sum in order to avoid another lawsuit."
Daily Mail defamation lawsuit
In September 2015, Gawker published a first-person narrative by a former employee of the British tabloid The Daily Mail that was critical of the journalistic standards and aggregation policies of its online operation. The Daily Mail sued for defamation, stating the article contained "blatant, defamatory falsehoods intended to disparage The Mail." In August 2016, it was reported that Gawker was in the final stages of settling the lawsuit.
Hulk Hogan sex tape
On October 4, 2012, AJ Daulerio, a Gawker editor, posted a short clip of Hulk Hogan and Heather Clem, the estranged wife of radio personality Bubba the Love Sponge, having sex. Hogan (who went by his real name, Terry Gene Bollea, during the trial) sent Gawker a cease-and-desist order to take the video down, but Denton refused, citing the First Amendment and arguing that the accompanying commentary had news value. Judge Pamela Campbell issued an injunction ordering Gawker to take down the clip. In April 2013, Gawker wrote, "A judge told us to take down our Hulk Hogan sex tape post. We won't." It also stated that "we are refusing to comply" with the order of the circuit court judge. Hogan filed a lawsuit against Gawker and Denton for violating his privacy, asking for $100 million in damages. In May 2016, billionaire Peter Thiel confirmed in an interview with The New York Times that he had paid $10 million in legal expenses to finance several lawsuits brought by others, including the lawsuit by Terry Bollea (Hogan) against Gawker Media. Thiel referred to his financial support of Bollea's case as "one of my greater philanthropic things that I've done." Thiel was reportedly motivated by anger over a 2007 Gawker article that had outed him as gay. During the Hogan trial, Daulerio told the court that he would consider a celebrity sex tape non-newsworthy if the subject was under the age of four.
Daulerio later told the court he was being flippant in his statements. In January 2016, Gawker Media received its first outside investment by selling a minority stake to Columbus Nova Technology Partners. Denton stated that the deal was reached in part to bolster the company's financial position in response to the Hogan case. On March 18, 2016, the jury awarded Hulk Hogan $115 million in compensatory damages. On March 21, the jury awarded Hogan an additional $25 million in punitive damages, including $10 million from Denton personally. Denton said the company would appeal the verdict. On April 5, Gawker began the appeal process. On November 2, Gawker reached a $31 million settlement with Bollea and dropped the appeal.
Teresa Thomas lawsuit
Following the Hulk Hogan lawsuit, Teresa Thomas, a former employee at Yahoo!, filed a lawsuit against Gawker alleging that, by reporting she was dating her boss, the site had invaded her privacy and defamed her.
2016 Chapter 11 bankruptcy protection
On June 10, 2016, Gawker filed for Chapter 11 bankruptcy protection, and reports suggested that the company might be negotiating with potential buyers, including a stalking horse offer from Ziff Davis for "under $100 million".
Asset seizure
On July 29, 2016, at a court hearing, Denton was chastised for inflating the value of his equity in Gawker. The presiding judge stated that Denton had informed the court that his stock was valued at eighty-one million dollars. This valuation was used to give the court and Hogan the impression that Denton's stock would cover the majority of the money owed by the company. However, the stock was found to be worth thirty million dollars, not the cited eighty-one million. In the wake of this revelation, the court found that Denton had not acted in good faith, and issued an order stating that Hogan could begin seizing assets from Gawker.
Univision Communications acquisition and subsidiary era (2016–present)
On August 16, 2016, Univision Communications paid $135 million at auction to acquire all of Gawker Media and its brands. This ended Gawker Media's fourteen years of operation as an independent company, as it was planned at that time to become a unit of Univision. On August 18, 2016, Gawker.com, Gawker Media's flagship site, announced that it would cease operations the following week. Univision continued to operate Gawker Media's six other websites: Deadspin, Gizmodo, Jalopnik, Jezebel, Kotaku, and Lifehacker. Gawker's article archive remains online, and its employees were transferred to the remaining six websites or elsewhere within Univision. On August 22, 2016, at 22:33 GMT, Denton posted Gawker's final article. On September 10, 2016, Univision removed six controversial posts from various Gawker Media sites, each with the note: "This story is no longer available as it is the subject of pending litigation against the prior owners of this site."
List of blogs previously operated by Gawker Media Sold to Univision, renamed Gizmodo Media Group Deadspin – Sports Gizmodo – Gadget and technology lifestyle Jalopnik – Cars and automotive culture Jezebel – Celebrity, sex, and fashion for women Kotaku – Video games and East Asian pop culture Lifehacker – Productivity tips Sploid – shut down in 2006 but revived and merged into Gizmodo International sites Gizmodo en Español – Hispanic Australia (owned by Allure Media) Gizmodo Australia – Gadgets and technology Kotaku Australia – Games and gaming industry coverage Lifehacker Australia – Tips, tricks, tutorials, hacks, downloads and guides Sold or defunct prior to Univision sale Gawker.com – New York City media, politics and gossip; Shut down August 22, 2016 Screenhead – shut down in 2006 Idolator – music, sold to Buzz Media in 2008 Wonkette – sold to its managing editor Ken Layne in 2008 Gridskipper – sold to Curbed in 2008 Consumerist – consumer affairs, sold to Consumers Union in 2008 Valleywag – Silicon Valley news and gossip, shut down in 2008 Defamer – shut down in 2015 Fleshbot – sex and sex industry coverage, sold in 2012 to Fleshbot's editor Lux Alptraum io9 – science, science fiction, and futurism; merged into Gizmodo in 2015 Cink – Hungarian blog, defunct in 2015 See also Kinja Weblogs, Inc. References External links Tom Zeller, Jr. New York Times.com: "A Blog Revolution? Get a Grip" (May 8, 2005) (registration required) Vanessa Grigoriadis, New York magazine: "Everybody Sucks: Gawker and the rage of the creative underclass" (October 22, 2007) Blog networks Defunct mass media companies of the United States Defunct companies based in New York City Mass media companies based in New York City Entertainment companies established in 2003 Mass media companies established in 2003 Mass media companies disestablished in 2016 2003 establishments in New York City 2016 disestablishments in New York (state) Companies that filed for Chapter 11 bankruptcy in 2016 Fusion Media Group Former Univision Communications subsidiaries Creative Commons-licensed authors
25864167
https://en.wikipedia.org/wiki/List%20of%20crowdsourcing%20projects
List of crowdsourcing projects
Below is a list of projects that rely on crowdsourcing. See also open innovation. A Adaptive Vehicle Make is a project overseen by DARPA to crowdsource the design and manufacture of a new armored vehicle. Air Quality Eggs by WickedDevices are open-source hardware Internet of Things pollution monitors that facilitate citizen crowdsourcing of air quality readings Amara is a website that enables crowdsourced translations of videos from a variety of popular video hosting websites. The subtitles created are used to make online video content accessible to a wider audience, including the deaf and hard of hearing, and those who cannot understand the language of the source. In 2005, Amazon.com launched the Amazon Mechanical Turk, a platform on which crowdsourcing tasks called "HITs" (Human Intelligence Tasks") can be created and publicized and people can execute the tasks and be paid for doing so. Dubbed "Artificial Artificial Intelligence", it was named after The Turk, an 18th-century chess-playing "machine". The first crowdsourced documentary film is the non-profit "The American Revolution". which went into production in 2005, and which examines the role media played in the cultural, social and political changes from 1968 to 1974 through the story of underground, free-form radio station WBCN-FM in Boston. When the project began, by seeking archival contributions from the public, the term "crowdsource" was not in use, and so the film was referred to as the "first open source documentary film". The film is being produced by Lichtenstein Creative Media and the non-profit Filmmakers Collaborative. Article One Partners, founded in 2008, is a community of technology experts who execute crowdsourced prior art search by researching and contributing information related to patents. By submitting research to the online platform, the community members compete for cash rewards, ranging from $5,000 to $50,000. B Berkeley Open System for Skill Aggregation (BOSSA), by analogy with the distributed computing project Berkeley Open Infrastructure for Network Computing (BOINC) Any software project with an open Beta test. Beyond Words was a crowdsourcing project created at the Library of Congress in 2017 using the open source codebase Scribe, created by Zooniverse and New York Public Library. It asked volunteers to identify cartoons and photographs in the Chronicling America historic newspaper collections. The purpose was partly to improve research based on these collections. BlueServo was a free website, which crowdsourced surveillance of the Texas–Mexico border through live camera streaming over the Internet. This evolved from an initiative taken by the State of Texas, which announced it would install 200 mobile cameras along the Texas–Mexico border, that would enable anyone with an internet connection to watch the border and report sightings of alleged illegal immigrants to border patrol agents. It was later shut down due to lack of funding. Britain in a Day is a Ridley Scott film, a successor project to Life in a Day, and part of the BBC's Cultural Olympiad, in which people in Britain filmed themselves on 12 November 2011, and submitted video clips online for inclusion in the film. By the People is a transcription and tagging crowdsourcing project from the Library of Congress. It launched on 24 October 2018. Volunteers can participate anonymously or by making an account. Signed in volunteers can edit other people's transcriptions. Materials are broken into "Campaigns" such as "Letters to Lincoln". 
"Rosa Parks: In her own words". and "Anna E. Dickinson Papers". Transcriptions are published on the Library of Congress main website, and are available for bulk download once a Campaign is complete. C California Digital Newspaper Collection In August 2011, the California Digital Newspaper Collection implemented crowdsourced OCR text correction of its digitized historical newspapers; some published as early as 1846 (California statehood 1850). CDNC is a project of the Center for Bibliographical Studies and Research (CBSR) at the University of California, Riverside. The CDNC is supported in part by the U.S. Institute of Museum and Library Services under the provisions of the Library Services and Technology Act, administered in California by the State Librarian. Historic Cambridge Newspaper Collection. In March 2011, the Cambridge Public Library in Cambridge, Massachusetts, launched a digital collection of historic newspapers that implements crowdsourced OCR text correction. The freely accessible and keyword searchable database contains newspapers dating back to 1846 when Cambridge was established as a city. The Historic Cambridge Newspaper Collection is a project of the Cambridge Room, the Cambridge Public Library's Archives and Special Collections, and is supported by funding from the Community Preservation Act. Chicago History Museum on 14 October 2013, announced a project asking the public to furnish ideas for a future exhibition and reducing the most-often-submitted ideas to one assignment through a series of public votes. According to the American Alliance of Museums, this is the first crowdsourcing project allowing the public to give an exhibition assignment to an American museum. Citizen Archivist is a crowdsourcing transcription project at the National Archives of the United States. Volunteers can transcribe and tag any digitized content in the National Archives' online holdings. Volunteer coordinators curate "Missions" to help volunteers choose materials that interest them. CitySourced is an enterprise civic engagement platform. CitySourced provides a mobile app for citizens to identify and report non-emergency civic issues, such as public works, quality of life, and environmental issues. The service is part of the e-Government or gov 2.0 movement, which aims to connect government and citizens through the use of technology. Cisco Systems Inc. held an I-Prize contest in which teams using collaborative technologies created innovative business plans. The winners in 2008 was a three-person team, Anna Gossen from Munich, her husband Niels Gossen, and her brother Sergey Bessonnitsyn, that created a business plan demonstrating how IP technology could be used to increase energy efficiency. More than 2,500 people from 104 countries entered the competition. The winning team won US$250,000. Clickworkers – experimental NASA site Crowdin is a localization management platform for mobile apps, web, desktop software and related assets. Reddit, Khan Academy, MinecraftMinecraft and other used the platform to crowdsource localization. CrowdFlower was founded in 2007 to manage internet crowdsourcing. It is currently the largest provider of crowdsourcing solutions for enterprise with over 450 million tasks completed and 2 million contributors. CrowdMed is a healthcare crowdsourcing platform based in San Francisco, California. Crowdspring is a marketplaces for crowdsourced creative services. 
D The 2007 DARPA Urban Challenge focused on autonomous vehicles, requiring participating teams to create an autonomous vehicle that was able to successfully navigate traffic as well as complex maneuvers including merging, passing, and parking. To successfully complete the challenge, participating vehicles needed to complete a 60-mile course in less than six hours. Dell IdeaStorm is a website launched by Dell on 16 February 2007 to allow Dell "to gauge which ideas are most important and most relevant to" the public. The Democratic National Committee launched FlipperTV in November 2007 and McCainpedia in May 2008 to crowdsource video gathered by Democratic trackers and research compiled by DNC staff in the hands of the public to do with as they chose. DesignCrowd, a crowdsourcing marketplace for graphic design and creative services, launched in February 2008 and helped run a contest for global footwear company HI-TEC. HI-TEC "estimated that using DesignCrowd.com [and crowdsourcing] for the project saved HI-TEC up to half the costs of going down the usual design route". DesignCrowd purchased Brandstack and formed BrandCrowd On 20 December 2011. LEGO Design byME was a service connected with the construction toy Lego. Launched in 2005 under the name Lego Factory, the service allowed people to design their own Lego models using a computer program, then upload them to the Lego website, design their own box design, and order them for actual delivery. The brand also covers a small selection of products that have been designed by Lego fans, and which were available to purchase as a set. The search for aviator Steve Fossett, whose plane went missing in Nevada in 2007, in which up to 50,000 people examined high-resolution satellite imagery from DigitalGlobe that was made available via Amazon Mechanical Turk. The search was ultimately unsuccessful. Fosset's remains were eventually located by more traditional means. DigitalGlobe satellite imagery had previously been posted to Amazon Mechanical Turk after the disappearance of computer scientist Jim Gray at sea in January 2007, an effort that had attracted much media attention, but not provided any new clues. Distributed Proofreaders (commonly abbreviated as DP or PGDP) is a Web-based project founded in 2000 by Charles Franks that supports the development of e-texts for Project Gutenberg by allowing many people to work together in proofreading drafts of e-texts for errors. As of October 2011, over 21,000 e-texts have been produced by DP. There are also offshoots (sister sites) such as DP-Europe and DP-Canada. The Doe Network is a 100% volunteer organization devoted to assisting investigating agencies in bringing closure to national and international cold cases concerning Missing & Unidentified Persons. Drift bottle experiments are citizen science experiments in which a surveying organization throws into large bodies of water, bottles containing messages requesting finders of the bottles to return the messages to the organization with a statement of the time and place at which the bottles were found, allowing the organization to determine patterns of water circulation in the bodies of water. Duolingo: With Duolingo users learned a language for free while helping to translate the web via their "Immersion" feature. The Immersion feature was retired in early 2017. E Emporis, a provider of building data, has run the Emporis Community (a website where members can submit building information) since May 2000. 
Today, more than 1,000 members contribute building data throughout the world. The ESP Game by Luis von Ahn (later acquired by Google and renamed Google Image Labeler) started in 2003 and gets people to label images as a side-effect of playing a game. The image labels can be used to improve image search on the Web. This game led to the concept of Games with a purpose. EteRNA, a game in which players attempt to design RNA sequences that fold into a given configuration. The widely varied solutions from players, often non-biologists, are evaluated to improve computer models predicting RNA folding. Some designs are actually synthesized to evaluate the actual folding dynamics and directly compare with the computer models. Europeana 1914–1918 is a collaboration led by Europeana with support from The Great War Archive team at the University of Oxford. A website is used by the project to encourage the public from all European Union states to contribute information about World War I, especially their family's stories and digitised photographs of their artefacts. In addition to the website, since 2011 First World War stories have been collected in person from family history roadshow events held in Germany, Luxembourg, Ireland, Slovenia, Denmark, with further events planned in Cyprus, Belgium, Italy and more. The items collected by the project are released on the Internet for use under a Creative Commons licence. EyeWire, a game by Sebastian Seung in which players help an algorithm to segment retinal cells in 3D images of the retina. The aim is to map a mouse retina by extracting individual neurons and their connections to each other. F Facebook has used crowdsourcing since 2008 to create different language versions of its site. The company claims this method offers the advantage of providing site versions that are more compatible with local cultures. FamilySearch Indexing is a volunteer project which aims to create searchable digital indexes for scanned images of historical documents. The documents are drawn primarily from a collection of 2.4 million microfilms made of historical documents from 110 countries and principalities. Volunteers install free software on their home computers, download images from the site, type the data they read from the image into the software, and submit their work back to the site. The data is eventually made publicly and freely available at familysearch.org (the world's largest nonprofit genealogical organization) for use in genealogical research. Over one billion historical records have been transcribed to date. Folding@Home is a distributed computing project for disease research that simulates protein folding, computational drug design, and other types of molecular dynamics. The project uses the idle processing resources of thousands of personal computers owned by volunteers who have installed the software on their systems. Its primary purpose is to determine the mechanisms of protein folding. This is of academic interest with implications for medical research into Alzheimer's disease, Huntington's disease, and many forms of cancer, among other diseases. Foldit invites the general public to play protein folding games to discover folding strategies. Citing Foldit, MSNBC's Alan Boyle reported that "video-game players have solved a molecular puzzle that stumped scientists for years," indicating that they "figure(d) out the detailed molecular structure of a protein-cutting enzyme from an AIDS-like virus found in rhesus monkeys." 
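The ESP Game described above collects labels through an "output agreement" rule: two randomly paired players type labels for the same image, and a label is only accepted once both players have entered it. A minimal Python sketch of that matching rule, for illustration only (the real game adds timing, scoring and other refinements):

def first_agreed_label(guesses_a, guesses_b, taboo=frozenset()):
    # Return the first label both players have entered (ignoring case), skipping taboo words.
    seen_a, seen_b = set(), set()
    for a, b in zip(guesses_a, guesses_b):          # one guess per player per round
        seen_a.add(a.lower())
        seen_b.add(b.lower())
        agreed = (seen_a & seen_b) - set(taboo)
        if agreed:
            return agreed.pop()
    return None                                     # no agreement; the image can be re-queued

print(first_agreed_label(["dog", "puppy", "grass"], ["animal", "puppy", "cute"]))  # puppy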
Freelancer.com started out in Sweden in 2004 as GetAFreelancer.com, and is now owned by Sydney, Australia-based Ignition Networks. Matt Barrie, the CEO, claims the company is the largest outsourcing site in the world, receiving more global traffic than competitor Elance. The site has 1.5 million users in 234 countries; the average job size is under $200, and it projects US$50 million in project turnover over the next 12 months. The site takes a 10 percent cut on work allocated. G Galaxy Zoo is a citizen science project that lets members of the public classify a million galaxies from the Sloan Digital Sky Survey. The project has led to numerous scientific papers and citizen scientist-led discoveries such as Hanny's Voorwerp. "The Gateway to Astronaut Photography of Earth – Image Detective" is an interactive citizen science hunt for the Earth location of images taken from space by astronauts since the 1960s. Reviewing 1.8 million photos, individuals submit what they believe to be the location of a given photo, and thus accumulate "points" and "badges" on part of the NASA website. General Electric organized a "Multi-Million Dollar Challenge" to find new, breakthrough ideas for creating cleaner, more efficient, and economically viable grid technologies. The challenge also aimed to accelerate the adoption of smart grid technologies. Genius, according to their website, is "the world's biggest collection of song lyrics and crowdsourced musical knowledge." It was previously known as Rap Exegesis and later Rap Genius, and focused on decoding complex rap lyrics. Now their mission is to "annotate the world". Google Image Labeler was a game in which users were asked to label pictures to improve image search results. Gooseberry Patch has been using crowdsourcing to create its community-style cookbooks since 1992. Friends, buyers, fans, and salespeople are all encouraged to submit a recipe. Each contributor whose recipe is selected is recognized in the book and receives a free copy. The Great War Archive was a 2008 project, led by the University of Oxford. It asked members of the general public to digitize any artifacts they held relating to the First World War and upload them to a purpose-built website. The project successfully released over 6500 items and stories online, which can be freely downloaded and used for education and research. The project was funded by the Joint Information Systems Committee. In 2011, the team at the University of Oxford received further funding from Europeana to run a similar crowdsourcing initiative in Germany. From 2012, Europeana extended the project to become a project called "Europeana 1914–1918", a collaboration led by Europeana, with support from the team at the University of Oxford. There is a website where the project encourages the public from all European Union states to contribute information about World War I, especially their family's stories and digitized photographs of their artifacts. The Guardian's investigation into the MP expenses scandal in the United Kingdom relied on crowdsourcing: the newspaper created a system to allow the public to search methodically through 700,000 expense-claim documents. Over 20,000 people participated in finding erroneous and remarkable expense claims by members of parliament. H The Historical Marker Database is an online database that documents locations of numerous historical markers in the United States as well as other countries. 
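The Guardian's MP-expenses system described above parcelled individual expense-claim pages out to volunteers and recorded their verdicts. The sketch below shows one hypothetical way such a review queue could be organized in Python; it is illustrative only and not the newspaper's actual implementation.

import random
from collections import Counter

class ReviewQueue:
    def __init__(self, document_ids):
        self.pending = set(document_ids)
        self.verdicts = {}                          # document id -> volunteer verdict

    def next_document(self):
        # Hand a volunteer a random page that nobody has reviewed yet.
        return random.choice(tuple(self.pending)) if self.pending else None

    def record(self, doc_id, verdict):
        # verdict might be "not interesting", "interesting" or "investigate".
        self.verdicts[doc_id] = verdict
        self.pending.discard(doc_id)

queue = ReviewQueue([f"claim-{n}" for n in range(1000)])   # the real corpus had about 700,000 pages
doc = queue.next_document()
queue.record(doc, "investigate")
print(Counter(queue.verdicts.values()))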
The Vancouver Police Department has put up a website entitled Hockey Riot 2011, informing people about the VPD's investigations into the 2011 Stanley Cup Riot. It also asks people to contribute any pictures or video that they may have taken during the riot, with the goal of identifying people who may have participated in the rioting. The site also reminds people not to use social media to take justice into their own hands, instead leaving it to the police. As of 1 July 2011, 101 arrests had been made. I IBM collected over 37,000 ideas for potential areas for innovation from brainstorming sessions with its customers, employees and their family members in 2006. iNaturalist is a citizen science website which allows users to contribute observations of organisms with images, start data-collecting projects, and crowdsources taxonomic identification of observations. The Indian rupee sign was developed in 2010 by using crowdsourcing to select its design through an open competition among Indian residents. The Infinity: The Quest for Earth project is a space MMOG that accepts contributions of concept art, 3D models, textures, sound effects, musical compositions and programming of standalone prototypes which could help development of the game. By the end of 2009 the project had received contributions of more than 150 modeled ships, buildings and space stations ("Fleet renders – 2009", forum thread at Infinity Forums, 25 July 2009), and about 500 musical compositions, of which 20% are considered for inclusion in the game. InfoArmy is a crowdsourcing platform for business data. Users research online for competitive intelligence information on public and private companies to create iPad and web reports. Current researchers come from a variety of backgrounds and from six continents. InnoCentive, started in 2001, crowdsources research and development for biomedical and pharmaceutical companies, among other companies in other industries. InnoCentive provides connection and relationship management services between "Seekers" and "Solvers". Seekers are the companies searching for solutions to critical challenges. Solvers are the 185,000 registered members of the InnoCentive crowd who volunteer their solutions to the Seekers. Anyone with interest and Internet access can become an InnoCentive Solver. Solvers whose solutions are selected by the Seekers are compensated for their ideas by InnoCentive, which acts as broker of the process. InnoCentive recently partnered with the Rockefeller Foundation to target solutions from InnoCentive's Solver crowd for orphan diseases and other philanthropic social initiatives. Innovation Exchange is an open innovation vendor which emphasizes community diversity; it sources solutions to business problems from both experts and novices. Companies sponsor challenges which are responded to by individuals, people working in ad hoc teams, or by small and mid-size businesses. In contrast to sites focused primarily on innovation in the physical sciences, Innovation Exchange fosters product, service, process, and business model innovation. J Jade Magnet is Asia's largest creative crowdsourcing platform for design solutions such as logos, brochures, websites, flyers and animations, with a focus on SMEs. It is a technology platform that helps clients gather multiple options for creative solutions before making a selection. Additionally, as a value-add, clients can make use of a Delivery Assurance service to manage requirements. K Kaggle is a platform for data prediction competitions. 
Kaggle facilitates better predictions by providing a platform for machine learning, data prediction and bioinformatics competitions. The platform allows organizations to have their data scrutinized by the world's best statisticians. Katawa Shoujo is an open source visual novel created by Four Leaf Studios, a volunteer development team assembled from 4chan and other internet communities. The Katrina PeopleFinder Project used crowdsourcing to collect data for lost persons. Over 4,000 people donated their time after Hurricane Katrina. It included 90,000 entries. Khan Academy is a non-profit organization founded by educational entrepreneur Salman Khan which has as its mission to provide a world-class education to anyone for free. It is relying on volunteers to subtitle into the widely spoken languages of the world Khan Academy's substantial collection of educational videos on subjects ranging from math to art history. L At LibriVox.org, volunteers record chapters of books in the public domain, and then release the audio files back onto the net for free. All the audio is donated back into the public domain. Life in a Day is Kevin Macdonald's 95-minute documentary film comprising an arranged series of video clips selected from 80,000 clips (4500 hours) submitted to the YouTube video sharing website, the clips showing respective occurrences from around the world on a single day.Watercutter, Angela, "Life in a Day Distills 4,500 Hours of Intimate Video Into Urgent Documentary" (WebCite archive), Wired magazine, 29 July 2011. Yumi Goto of TIME LightBox remarked that "the most striking aspect of this documentary is that it's the first crowdsourced, user-generated content to hit the big screen." This was followed ten years later in 2021 with a new version, Life in a Day 2020, this time featuring over 300,000 submissions. The Living New Deal is a research project and online public archive documenting the scope and impact of the New Deal on Americans' lives and landscape. The Living New Deal relies on a network of Research Associates and other volunteers, including historians, teachers, students, artists, history buffs, librarians, journalists, and photographers to document New Deal sites throughout the U.S. Anyone can sign up to volunteer. L'Oreal used viewer-created advertising messages of Current TV to pool new and fresh advertising ideas. M Mapillary is a service for sharing geotagged photos developed by Mapillary AB with the aim to represent not only streets but the whole world. All photos and data are shared using a Creative Commons license, most of the internal software is available under the MIT license and the full API is public. MateCat is a web-based computer-assisted translation (CAT) tool released as open source software under the Lesser General Public License (LGPL) from the Free Software Foundation. McMaster Postcard Project is a platform created by the William Ready Division of Archives and Research Collections at the McMaster University Library for crowdsourcing information about an archival collection of historical postcards, providing information on the card's country, province and city of origin, date, and other pertinent information in a "notes" section. Cards were previously categorized only by province, or by country for international cards. Microtask is a company that has developed a software platform for global distribution of short-duration tasks to online workers. 
The system supports automated quality assurance and provides service-level agreements for task quality and turnaround times. The Milky Way Project is a project that aims to identify bubbles in the Milky Way with users analyzing infrared images from the Spitzer Space Telescope. Mindpixel was an online artificial intelligence project to build a knowledgebase of true/false statements, and ran from 2000 to 2005. Mob4Hire is a mobile testing and market research community. They list over 1,100 developers in 86 countries and more than 45,000 testers on 350 carriers in 150 countries. The company won a Meffy award from the Mobile Entertainment Forum for "Most Innovative Business Model". Moovit is a transit app and platform which makes use of the crowd in two ways: first by letting community editors add and edit transit data (in locales where official data is not openly available), and second by letting app users report bus tardiness, crowdedness, etc. to other riders down the line in a similar fashion to Waze. Moral Machine is an online platform that generates moral dilemmas and collects information on the decisions that people think an autonomous vehicle should make between two death outcomes. Mozilla Common Voice is a project to help make voice recognition open to everyone. Volunteers can add and verify voice recordings. The database with recordings is available as opensource. N Netflix Prize was an open competition for the best collaborative filtering algorithm that predicts user ratings for films, based on previous ratings. The competition was held by Netflix, an online DVD-rental service, and was completed in September 2009. The grand prize of $1,000,000 was reserved for the entry which bettered Netflix's own algorithm for predicting ratings by 10%. Netflix provided a training data set of over 100 million ratings that more than 480,000 users gave to nearly 18,000 movies, which is one of the largest real-life data sets available for research. The related forum maintained by Netflix has seen lively discussions and contributed a lot to the success of this competition. A very relevant fact to the power of crowdsourcing is that among the top teams are not only academic researchers, but laymen with no prior exposure to collaborative filtering (virtually learning the problem space from scratch). NotchUp is a company founded in 2008 offering crowdsourced job recruiting to find those who may passively be interested in new opportunities. Numbeo – worldwide studies based on reported consumer prices, perceived crime rates, quality of healthcare and other statistics. As of January 2020, 5,863,289 prices in 9,300 cities entered by 500,170 contributors. O Open Food Facts gathers information and data on food products from around the world. Old Weather is a web-based effort to transcribe weather observations made by Royal Navy ships around the time of World War I. These transcriptions will contribute to climate model projections and improve a database of weather extremes and will be of use to historians in tracking past ship movements and the stories of the people on board. OpenSeaMap is a free nautical chart covering seas, lakes, inland waterways and rivers for the needs of sailors, divers, fishermen and canoeists. The data is collected by crowdsourcing. In a new project OpenSeaMap collect shallow water depths worldwide for making bathimetric charts. OpenSignal is a project to independently map cell phone carrier coverage and performance. 
All data is collected from a smartphone application that has been downloaded over 3.5m times worldwide. OpenStreetMap is a free editable map of the world, which had over 100,000 signed-up contributors as of mid-2009. Creation and maintenance of geospatial data is a labor-intensive task which is expensive using traditional approaches, and crowdsourcing is also being used by commercial companies in this area, including Google and TomTom. Oxfam Novib (Netherlands) launched a crowdsourcing initiative named Doeners.net in mid-2008, meant for people to support the organization's campaigning activities. Online volunteering service (United Nations) is a free service that connects grassroots organizations, international NGOs, local governments, educational institutions and United Nations agencies with thousands of individuals ready to volunteer via the Internet to help address development challenges. The service was launched in 2000 and it quickly attracted thousands of people ready to volunteer online. In 2014 alone, UN Online Volunteers undertook 16,134 assignments. P Path launched a project to crowdsource translations for its mobile platform in April 2012. Pepsi launched a marketing campaign in early 2007 which allowed consumers to design the look of a Pepsi can. The winners would receive a $10,000 prize, and their artwork would be featured on 500 million Pepsi cans around the United States. PhraseApp is a translation management platform that can be used to translate digital content, software, games and apps. It offers an in-context editor and professional human translation. The Phylo video game invites players to give in to their addictive gaming impulses while contributing to the greater good by solving puzzles based on aligning DNA sequences associated with genetic diseases. Planet Hunters is a citizen science project where users can try to find extrasolar planets by identifying patterns in the brightness data of stars retrieved by the Kepler Space Mission. Prova (Swedish for "to try") launched in December 2008 as a crowdsourcing marketplace that connects businesses with professional ad designers to create print designs, audio ads, video content, and digital designs. Ad designers from all over the world compete for ad creation projects listed on the site. Q Quantum Moves is a game developed under the ScienceAtHome umbrella project, designed by the Center for Community Driven Research (CODER) at Aarhus University, which aims to merge theoretical and experimental quantum research with online community efforts to explore the potential for online citizen science in this otherwise highly specialized field. Queen Silvia Nursing Award is a crowdsourcing campaign to find and develop national elderly and dementia care services. The campaign has run annually in Sweden since 2013 and in Finland since 2014. It is hosted by Swedish Care International and uses the Innopinion gamified crowdsourcing platform. R reCAPTCHA uses CAPTCHA to help digitize the text of books while protecting websites from bots attempting to access restricted areas. Humans are presented with images from the book and asked to provide the corresponding text. Twenty years of The New York Times have already been digitized. RootMetrics (a.k.a. Root Wireless) uses a mobile client application on various kinds of smartphones to collect data about carrier signal quality and data speeds, then transmits that data to its servers. 
Consumers can view the crowdsourced data online in the form of color-coded maps that aid purchasing decisions by showing unbiased data from different carriers side by side. The Royal British Columbia Museum transcription project allows volunteers to transcribe documents in its collections, helping to make those papers more accessible. S SciStarter is a citizen science focused site that supports and promotes hundreds of projects that require crowdsourced help. Micro-tasks for volunteer supporters include things such as climatic observations, astronomical observations, image / video / audio capture, image tagging, audio description, data point assessment, etc. Secret London is composed mostly of Londoners who use the site to share suggestions and photos of London. Originally started as a Facebook Group in 2010 in response to a competition to win an internship at Saatchi & Saatchi, Secret London gained 150,000 members within two weeks. This early popularity prompted its founder, Tiffany Philippou, to appeal to the community to help build the group a website, which was launched 10 days later ("Tiffany turns her Facebook challenge into instant success", Sunday Times, retrieved 19 February 2010; "How secretlondon switched a Facebook Group to a start-up", Washington Post, retrieved 20 February 2010). SeeClickFix is a web tool that allows citizens to report non-emergency neighborhood issues, which are communicated to local government, as a form of community activism. It has an associated free mobile phone application and is similar to FixMyStreet. SESH (social entrepreneurship for sexual health) is a project that uses crowdsourcing to improve public health messaging, tools, and policies. The group has been recognized by the WHO-TDR as one of the leaders in social innovation for health. Individual crowdsourcing projects have created videos promoting HIV testing, videos promoting condom use, images promoting sexual health, and related topics. SETILive is an online project of Zooniverse. Its goal is to use the human brain's ability to recognize patterns to find extraterrestrial intelligences (ETIs). Show us a better way is a British crowdsourcing initiative that asked the public for ideas on how to improve the communication of public data. The winning idea was awarded a £20,000 prize. Sightsmap is a sightseeing popularity heatmap overlaid on Google Maps, based on crowdsourcing: the number of Panoramio photos at each place in the world. Smartling is an enterprise translation management platform that can be used to crowdsource translations for digital content. IMVU, Cloudflare, and Path used the platform to crowdsource website translations. Smartsheet is an online software service and consultancy that enables businesses to track and manage work through online sharing and crowdsourcing methods. The company's Smartsourcing service enables people to anonymously submit and manage all phases of crowdsourced work processing. Amazon's Mechanical Turk is one of the work exchange platforms with which Smartsheet is integrated. The Smithsonian Transcription Center is a crowdsourcing transcription project that invites volunteers to transcribe a wide variety of content in the Smithsonian Institution collections, including from the National Museum of African American History and Culture, the Department of Paleobiology, and the Folklife Archives. Materials available for transcription include handwritten documents, audio and video materials. 
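Transcription projects in this list, such as By the People, Citizen Archivist and the Smithsonian Transcription Center, rely on volunteers reviewing and revising one another's work. One common quality-control pattern, sketched here purely for illustration rather than taken from any of these projects, is to gather two independent transcriptions of the same page and flag the lines on which they disagree for a reviewer.

import difflib

def lines_needing_review(transcript_a: str, transcript_b: str):
    # Return (line number, version A, version B, similarity) for each line the volunteers disagree on.
    disagreements = []
    pairs = zip(transcript_a.splitlines(), transcript_b.splitlines())
    for i, (a, b) in enumerate(pairs, start=1):
        if a.strip() != b.strip():
            similarity = difflib.SequenceMatcher(None, a, b).ratio()
            disagreements.append((i, a, b, round(similarity, 2)))
    return disagreements

print(lines_needing_review("My dear Sir,\nI write in haste", "My dear Sir,\nI write in hope"))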
Snapwire is a platform that connects photographers with brands, publishers, small businesses, and creatives around the world looking for specific images that they cannot find through traditional stock photo services. Photo buyers post a request for authentic photography, set their own price and photographers compete for the posted amount, earn points, level up and receive up to 70% on their photo sales. Buyers get unique images that match their vision, and the winning photographers get paid. Stardust@Home is an ongoing citizen science project, begun in 2006, using internet volunteer "clickworkers" to find interstellar dust samples by inspecting 3D images from the Stardust spacecraft. Student of Fortune is an online service that allows students to submit homework problems for tutors to answer through a tutorial service for a fee. Started by a high school dropout. SunShot Catalyst, run by the US Department of Energy, is a crowdsourced open innovation program based on a series of prize challenges with the goal of rapid creation and development of products and solutions for the U.S. solar marketplace. T TopCoder is a crowdsourcing company with a global community of designers, developers, data scientists, and competitive programmers who compete to develop the best solutions for Topcoder customers. Organizations like IBM, Honeywell, and NASA work with Topcoder to accelerate innovation, increase bandwidth, and tap into hard-to-find expertise. On 8 May 2016, Topcoder Announced that its Topcoder Community has grown to more than 1,000,000 registered members. Tomnod crowdsourced the identification of objects and places in satellite images using online map interfaces that engaged many people to each view and tag a small section of a large area on the planet. Projects included searching for the tomb of Genghis Khan, mapping earthquake damage after the 2011 Christchurch earthquake, Malaysia Flight 370, counting refugee camps in Somalia, and the search of the Tunante II. Tomnod has since been retired and is no longer active. Transcribe Bentham is a crowdsourced manuscript transcription project launched in 2010. It is run by University College London's Bentham Project, in association with other UCL partners and the University of London Computer Centre. The project makes available, via a specially-designed transcription interface, digital images of UCL's Bentham Papers collection – the unpublished writings of the philosopher Jeremy Bentham, which run to some 60,000 manuscript folios – which volunteers are encouraged to transcribe. The transcripts are intended to contribute to the Bentham Project's production of the new edition of The Collected Works of Jeremy Bentham, and are uploaded to UCL's digital Bentham Papers repository, widening access to the collection. Media coverage has included a feature article in The New York Times, and a broadcast on Deutsche Welle radio. The project was shortlisted for the 2011 Digital Heritage Award, and received an Award of Distinction in the Digital Communities category of the 2011 Prix Ars Electronica. The open-source code for the project's transcription tool is available for reuse and customisation. U Unilever used the crowdsourcing platform IdeaBounty to find creative ideas for its next TV campaign for their snack food brand Peperami. 
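Tomnod and the Steve Fossett search described in this list both worked by cutting a very large satellite image into small tiles so that each volunteer only had to inspect a manageable patch. A minimal Python sketch of that tiling step, with hypothetical coordinates and tile size:

def make_tiles(lat_min, lat_max, lon_min, lon_max, step_deg=0.01):
    # Yield (lat_min, lat_max, lon_min, lon_max) tiles covering the bounding box.
    n_lat = round((lat_max - lat_min) / step_deg)
    n_lon = round((lon_max - lon_min) / step_deg)
    for i in range(n_lat):
        for j in range(n_lon):
            yield (lat_min + i * step_deg, lat_min + (i + 1) * step_deg,
                   lon_min + j * step_deg, lon_min + (j + 1) * step_deg)

tiles = list(make_tiles(38.0, 39.0, -119.0, -118.0))     # a one-degree square, e.g. part of Nevada
print(len(tiles), "tiles, each roughly 1 km on a side")  # 100 x 100 = 10,000 tiles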
Ushahidi (Swahili for "testimony" or "witness") is a website created in the aftermath of Kenya's disputed 2007 presidential election (see 2007–2008 Kenyan crisis) that collected eyewitness reports of violence sent in by email and text message and placed them on a Google map. It is also the name of the open source software developed for that site, which has since been improved, released freely, and used for a number of similar projects. W Waze is a free turn-by-turn GPS application for mobile phones that uses crowdsourcing to provide routing and real-time traffic updates. "We Are The World 25 for Haiti (YouTube Edition)" is a massively collaborative charity song and music video produced by Canadian singer-songwriter Lisa Lavie and posted to the YouTube video sharing website to raise money for victims of the 12 January 2010 Haiti earthquake. The video was the creation of a collaboration of 57 unsigned or independent YouTube musicians geographically distributed around the world. The Tokyo Times referred to J Rice's subsequently produced "We Pray for You" video, involving largely the same participants as were in Lavie's video, as an example of a trend to use crowdsourcing for charitable purposes. Wikipedia is often cited as a successful example of crowdsourcing, despite objections by co-founder Jimmy Wales to the term. Worth1000 is a community focused on creative contests, occasionally with financial incentives. Original contests invited members to submit manipulated images (typically using Photoshop) for specific themes, often of a comic nature. New contests are now run regularly for photo effects (i.e. manipulated images), photography without effects, illustrations, writing and multimedia. While most contests are run by the website, anyone can apply to post a contest, and people seeking professional creative work such as logo design are encouraged to add financial incentives to their requests for less playful creativity. X X-Prize is an innovation incentive prize that uses crowdsourcing mechanisms to tackle grand challenges regarded as failures of free markets: pressing needs for which humanity seeks solutions that have not previously been served by entrepreneurial action. Z Zooniverse is an online citizen science platform that uses the active participation of human volunteers to complete projects requiring more subtle reasoning or perception than electronic computer networks can provide. It began as a single astronomy project called Galaxy Zoo, which launched in 2007 and invited volunteers to classify images of galaxies from the Sloan Digital Sky Survey, in order to better understand galaxy morphology. Zooniverse grew from this effort and has diversified into projects in the humanities and other science domains. Humanities projects include Anti-Slavery Manuscripts with Boston Public Library, and Shakespeare's World with the Folger Shakespeare Library and Oxford English Dictionary. Ecology projects include Snapshot Serengeti, Wildcam Gorongosa and Penguin Watch. Anyone can build their own project on the free Project Builder. Zooniverse, its researchers and volunteers frequently publish research papers using the data created by volunteers. See also Comparison of crowdfunding services List of citizen science projects Crowdmapping#Examples References Crowdsourcing projects
55440889
https://en.wikipedia.org/wiki/Pixel%202
Pixel 2
The Pixel 2 and Pixel 2 XL are a pair of Android smartphones designed, developed, and marketed by Google as part of the Google Pixel product line. They collectively serve as the successors to the Pixel and Pixel XL. They were officially announced on October 4, 2017 at the Made by Google event and released in the United States on October 19. On October 9, 2018, they were succeeded by the Pixel 3 and Pixel 3 XL. History In early March 2017, Google's Rick Osterloh confirmed that they would bring a "next-gen" Pixel smartphone later that year. He stated it would "stay premium" and that there would be no "cheap Pixel". Google originally intended to use HTC to manufacture both their 2017 flagships, but later shifted to LG to manufacture the bigger Pixel 2 XL. The unreleased device that was supposed to be the Pixel 2 XL under the codename "Muskie", was later re-developed by HTC into the HTC U11+. The Google Pixel 2 and Pixel 2 XL were carried in the United States by Verizon and Project Fi. On October 4, 2018, Verizon Wireless stopped selling the Pixel 2. Specifications Design The back of the Pixel 2 and Pixel 2 XL are made from aluminum with a thin "premium coating" of plastic and has a top section made from glass to provide wireless transmissivity. Unlike the original Pixel XL, which was simply an enlarged version of the Pixel design with no other changes, the Pixel 2 XL's external design differs from its smaller sibling, employing a taller 2:1 P-OLED display (marketed as 18:9) instead of the Pixel 2's 16:9 AMOLED. Hardware The Pixel 2 and Pixel 2 XL are both powered by the Qualcomm Snapdragon 835, coupled with 4 GB LPDDR4X RAM. They both come in storage options of 64 or 128GB. The Pixel 2 has a 16:9 1080p () AMOLED display panel with a pixel density of 441ppi, while the Pixel 2 XL comes with a 2:1 1440p () P-OLED display panel with a pixel density of 538ppi. Both phones have a 12.2megapixel rear camera capable of recording 4K video at 30FPS, 1080p video at 120FPS, and 720p video at 240FPS. The camera also contains phase-detection autofocus, laser autofocus, and HDR+ processing. The Pixel 2 and Pixel 2 XL also include the Pixel Visual Core (PVC) image processor for faster and lower power image processing, though it was not enabled until Android 8.1 was released in January 2018. The PVC was custom design by Google's consumer hardware team with collaboration from Intel. The Pixels do not have support for 4K video at 60FPS, as the processor is not powerful enough. The Pixel 2 includes optical image stabilization which the Pixel lacked. Google uses Fused Video Stabilization which reduces issues with camera shake, motion blur, rolling shutter distortion, and focus breathing as found in other image stabilization methods. The Pixel 2 and Pixel 2 XL support USB Power Delivery quick charging, have a fingerprint sensor on the rear, IP67 dust and water resistance and are Daydream-ready. The Pixel 2s has a nano-SIM and an eSIM, Android Q Beta 2 enabled dual SIM support however Android Q Beta 3 disabled it. Software The phones ship with stock Android 8.0 "Oreo" on launch. Google has promised three years of software and security updates, making it closer to the average four years of support that Apple provides for its iPhones. The Sony Xperia XZ1 was the first phone to ship with Android 8.0 ("Oreo"). The new Pixels also include a feature called "Active Edge". With this, the Google Assistant can be launched by squeezing the phone's sides, similar to the HTC U11's "Edge Sense" feature. 
This phone was also released with the new Google Lens app, which is designed to bring up relevant information using visual analysis by the camera. The "Now Playing" feature allows the device to automatically detect music through its microphone, and identify the song on the lock screen. On December 5, 2017, Android 8.1 Oreo was released for the Pixel 2 and Pixel 2 XL. On October 4, 2017, the Pixel 2 and Pixel 2 XL were granted an extended warranty period which guarantees Android version updates for them until October 2020, 3 years from when they were first available on the Google Store, and free unlimited storage for all photos and videos taken on the phone in original quality through the end of 2020, with unlimited high-quality storage continuing afterwards. Android 9.0 "Pie" was made available upon its launch on August 6, 2018. Following the release of the Pixel 3, some of its features were backported to the Pixel 2, including updates to the camera software, augmented reality stickers, and a history log for "Now Playing", among others. Android 10 was made available upon its launch on September 3, 2019. Similarly, Android 11 was made available to download upon its launch on September 8, 2020. In October 2020, the Pixel 2 and Pixel 2 XL reached their planned end-of-life date. Their final security update was released in December 2020. Cellular networks Reception The Pixel 2 camera initially received a score of 98 (currently updated to 99) from DxOMark, making it the highest performing mobile device camera at the end of 2017, and was overtaken in March 2018 by Samsung's Galaxy S9+. The Pixel 2 and the Pixel 2 XL received mixed reviews. The phone was praised for the camera quality and water resistance, but was criticized for the removal of the headphone jack, particularly after Google mocked Apple for doing the same with its iPhone 7 phone at the launch of the first generation Pixel phone just 12 months prior. Google was also criticized for the price of the USB-C to 3.5mm headphone adapter it sells, which costs US$20 while Apple's Lightning to 3.5mm adapter costs US$9, as well as for not including headphones with the phone. However, news outlets noted that because USB-C is a standard interface, unlike Lightning, there are a variety of third-party adapters that retail for less than Google's official one. Google later dropped the price of the adapter to US$9, keeping the price in-line with Apple's offering. YouTuber JerryRigEverything, who performs durability tests on various smartphones, criticized Google for their design choice with the antenna lines on the sides of the handset. When he bent the Pixel 2, it cracked at the antenna line near the middle of the phone, voiding its water resistance and warranty, while most other phones from competitors pass his bend test. This does not apply to the Pixel 2 XL. The design of the smaller Pixel 2 was regarded as plain, and its big chunky bezels were not well received, considering that earlier 2017 phones like the Samsung Galaxy S8 and LG G6 had moved to nearly bezel-less screens. The Pixel 2 XL screen became infamous for quality control issues, a flaw shared with the LG V30 which also has the same manufacturer and P-OLED screen type. The Pixel 2 XL has a blue tint visible on the screen when the phone is viewed at an angle. Many were dissatisfied and it was speculated that Google had installed the polarizer incorrectly. 
However, when Google addressed the tint, they stated that it was a design choice to have the blue tint to go along with the cooler color temperature used by the screen (it is calibrated to a D67 white point, which is 6700K). The Pixel 2 was updated with the Pixel 3's Night Sight feature which dramatically improves low light performance with no flash or tripod. Using Night Sight the Pixel 2 takes superior low light photos than newer 2018 flagships such as the iPhone XS, Huawei Mate 20 Pro and Samsung Galaxy Note 9. Issues The Pixel 2 XL's display suffers from a "black smear" problem, which occurs when a group of black pixels transition to colored ones, and tend to linger for a while, before changing to their expected state. The Pixel 2 XL has an issue where the screen may flash randomly. It occurs when the phone is locked or unlocked. The issue was fixed in the June 2018 system software update, but returned with the July 2018 update. Hundreds of Pixel 2 and Pixel 2 XL owners have complained about buzzing and clicking sounds coming from the phone. Google investigated the issue and recommended turning off NFC to temporarily fix the problem until a software update comes out. The November 2017 security patch fixed the clicking sound, but Google has not fixed the buzzing and rattling sounds heard during calls on some devices. The Pixel 2 XL has a volume flaw that reduces the sound level of audio clips sent through messaging apps such as Google Allo, Instagram, and Telegram among others. Google is aware of the problem and is looking into it. Some of the USB-C to 3.5mm adapters for the headphones do not work. In some cases, rebooting the phone can fix the issue temporarily. Google is offering replacements for the faulty adapters. Multiple users reported that the microphone can randomly stop working. A possible solution is to blow into the microphone. Some units suffer from Bluetooth connectivity issues. Fixed issues Vlad Savov of The Verge has complained of under-saturated and distorted OLED displays, and there have also been reports of screen burn-in on some Pixel 2 XL units. Google conducted an investigation and released a software update in November that added a new mode for more saturated colors and faded out the navigation bar after a period of inactivity. Google also extended device warranty to two years for both devices worldwide. There was an issue where Google Assistant does not work with some Bluetooth headphones. This was an issue with the Google app and was patched in version 7.17. The Pixel 2 XL had an issue where heavy noise reduction filters distorted audio recordings. This was fixed in the November 2017 security patch. Sales In the United States, Verizon and Project Fi are the exclusive carriers for the Pixel 2 and Pixel 2 XL. They are available direct-to-consumer for use on any wireless network through Google's online store, or from Best Buy and Target. According to IDC Senior Research Director Francisco Jeronimo, Google shipped 3.9 million units in 2017, twice as much as Pixels sold in 2016. On March 21, 2018, Google did a temporary offer where, if a consumer financed a Pixel for 2 years, they would receive a $200 cashback. This aimed to compete with the Samsung Galaxy S9 and iPhone X. The promotion ended March 31, 2018. On July 9, 2018, Google reduced the price of the Pixel 2 XL by US$100. On October 12, 2018, The Pixel 2 XL was discounted by $300 for Verizon Wireless customers. On April 1, 2019, Google stopped selling both the Pixel 2 and Pixel 2XL on the Google Play Store. 
References External links Google hardware Android (operating system) devices Mobile phones introduced in 2017 Discontinued smartphones HTC mobile phones LG Electronics mobile phones Google Pixel Mobile phones with 4K video recording
31722495
https://en.wikipedia.org/wiki/Woo%E2%80%93Lam
Woo–Lam
In cryptography, Woo–Lam refers to various computer network authentication protocols designed by Simon S. Lam and Thomas Woo. The protocols enable two communicating parties to authenticate each other's identity and to exchange session keys, and involve the use of a trusted key distribution center (KDC) to negotiate between the parties. Both symmetric-key and public-key variants have been described. However, the protocols suffer from various security flaws, and have in part been described as inefficient compared to alternative authentication protocols. Public-key protocol Notation The following notation is used to describe the algorithm: A, B - network nodes. K_PA - public key of node A. K_SA - private key of A. N_A - nonce chosen by A. ID_A - unique identifier of A. E_K[...] - public-key encryption using key K. S_K[...] - digital signature using key K. k - random session key chosen by the KDC. || - concatenation. It is assumed that all parties know the KDC's public key. Message exchange The original version of the protocol had the identifier ID_A omitted from lines 5 and 6, which did not account for the fact that N_A is unique only among nonces generated by A and not by other parties. The protocol was revised after the authors themselves spotted a flaw in the algorithm. See also Kerberos Needham–Schroeder protocol Otway–Rees protocol References Computer network security Authentication methods
41969381
https://en.wikipedia.org/wiki/Enable%20Software
Enable Software
Enable Software, Inc. was a privately held software development company located in Ballston Lake, New York. Enable was founded in 1984 by Ron Quake and Bob Hamilton. The company's flagship product, also called Enable, was an integrated office suite for IBM PC compatibles. The suite included a word processor, a 3D spreadsheet, a relational database, and integrated communications. In 1992 Enable Office added electronic mail and calendaring software. At that time the company estimated it had more than one million users of its products. Enable 4.5 was released in April 1992, with support for OS/2 2.0. In 1993 it introduced JetForm for Enable, "a mail-enabled forms designer for Microsoft Windows." Another Enable product, PowerLine, released in 1993, was a multi-protocol terminal client for Windows similar to HyperTerminal. Enable ceased operations in 1997. At that time there were an estimated 190,000 active users of Enable products. References External links Review of Enable O/A 4.0 Defunct software companies of the United States Software companies based in New York (state) Software companies established in 1984 1984 establishments in New York (state) Defunct companies based in New York (state)
3416350
https://en.wikipedia.org/wiki/Core-based%20trees
Core-based trees
Core-based trees (CBT) is a proposal for making IP Multicast scalable by constructing a tree of routers. It was first proposed in a paper by Ballardie, Francis, and Crowcroft. What differentiates it from other schemes for multicasting is that the routing tree comprises multiple "cores" (also known as "centres"). The locations of the core routers are statically configured. Other routers are added by growing "branches" of a tree, comprising a chain of routers, from the core routers out towards the routers directly adjacent to the multicast group members. References RFC 2189 Network architecture
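The tree construction described above can be modelled in a few lines: a router serving a new group member sends a join along its unicast route toward a core, and the branch is grafted at the first router that is already on the tree. The Python sketch below (assuming the networkx library, and a single statically configured core for simplicity) is an illustration of that behaviour, not the actual CBT protocol state machine.

import networkx as nx

def join_group(graph, core, on_tree, member_router):
    # Add member_router to the shared tree rooted at core; return the newly grafted branch.
    path = nx.shortest_path(graph, member_router, core)   # unicast route toward the core
    branch = []
    for router in path:
        branch.append(router)
        if router in on_tree:                             # reached the existing tree: graft here
            break
    on_tree.update(branch)
    return branch

g = nx.Graph([("r1", "r2"), ("r2", "core"), ("r3", "r2"), ("r4", "r3")])
tree = {"core"}
print(join_group(g, "core", tree, "r1"))   # ['r1', 'r2', 'core'] - first branch reaches the core
print(join_group(g, "core", tree, "r4"))   # ['r4', 'r3', 'r2'] - grafts onto the existing branch at r2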
7867727
https://en.wikipedia.org/wiki/Mapopolis
Mapopolis
Mapopolis was the creator of the PDA/smartphone GPS navigation software Mapopolis Navigator. Mapopolis used data from Navteq. Starting in 1999, Mapopolis released software for the Palm OS and later added software for Pocket PC handhelds and Windows smartphones. Mapopolis created the first real-time traffic service (Mapopolis ClearRoute), which provided real-time route updates based on traffic conditions. On April 1, 2007, Mapopolis discontinued sales of its consumer software. Map downloads remained available for at least one year past that date for registered users who had purchased the product and had not yet used up their full one-year allowance. Mapopolis Navigator files use a proprietary format and make it impossible for users to export their custom POIs. Sources Satellite navigation software Personal digital assistant software Navigation system companies
945330
https://en.wikipedia.org/wiki/Donald%20McKay
Donald McKay
Donald McKay (September 4, 1810 – September 20, 1880) was a Canadian-born American designer and builder of sailing ships, famed for his record-setting clippers. Early life He was born in Jordan Falls, Shelburne County, on Nova Scotia's South Shore. He was the oldest son and one of eighteen children of Hugh McKay, a fisherman and a farmer, and Ann McPherson McKay. Both of his parents were of Scottish descent. He was named after his grandfather, Captain Donald McKay, a British officer who, after the Revolutionary War, moved to Nova Scotia from the Scottish Highlands. Early years as a shipbuilder In 1826 McKay moved to New York, working for shipbuilders Brown & Bell, and was an apprentice of Isaac Webb from 1827 to 1831. After 1832 he did some freelance jobs for Webb and Smith & Dimon. McKay also freelanced for Brown & Bell at their Wescasset shipyard. In 1840 at Newburyport, he was contracted to finish Delia Walker, 427 tons, for John Currier, Jr. Currier was very impressed with McKay and offered him a five-year contract, which McKay refused, driven by a desire to own his own business. In 1841, William Currier offered McKay a partnership in what would become the Currier & McKay shipyard in Newburyport. The partnership did not last long and soon McKay found himself in McKay & Pickett, building the packet St. George. The partnership with William Pickett was "pleasant and profitable", but after the success of the Joshua Bates the shipyard became too small for McKay's ambitions and he was convinced by Enoch Train to move to East Boston and open his own business. Ships built before 1845 1840 Delia Walker, 427 tons, McKay finished her for John Currier, Jr. 1841 Mary Broughton, 323 tons, barque, built by Currier & McKay. 1842 Ashburton, 449 tons, ship, built by Currier & McKay. 1842 Courier, a Rio trader and early clipper trading ship, 380 tons OM, was the first ship fully designed and built by Donald McKay himself, as a partner in the firm of Currier & McKay, on a commission from Andrew Foster & Son, New York. She was built at Newburyport, Massachusetts. At the time it was rather unusual for such an advanced vessel to be built outside of New York or Baltimore. She was employed in the Rio coffee trade and made a great deal of money for her owners, but most importantly brought much-needed fame to McKay. 1843 St. George, 845 tons, pioneer packet of Red Cross Line, built by McKay & Pickett. 1844 John R. Skiddy, 930 tons, packet, built by McKay & Pickett. 1844 Joshua Bates, 620 tons, pioneer packet of Enoch Train's White Diamond Line. The White Diamond Line was one of the most important Atlantic emigrant routes from Europe to North America at the time. Built by McKay & Pickett. East Boston shipyard In 1845 McKay, as sole owner, established his own shipyard on Border Street, East Boston, where he built some of the finest American ships for almost 25 years. One of his first large orders was for five large packet ships for Enoch Train's White Diamond line, built between 1845 and 1850: Washington Irving, Anglo Saxon, Anglo American, Daniel Webster, and Ocean Monarch. The Ocean Monarch was lost to fire on August 28, 1848, soon after leaving Liverpool and within sight of Wales; over 170 of the passengers and crew perished. The Washington Irving carried Patrick Kennedy, grandfather of Kennedy family patriarch Joseph P. Kennedy, Sr., to Boston in 1849. 
In the summer of 1851, McKay visited Liverpool and secured a contract to build four large ships for James Baines & Co.'s Australian trade: Lightning (1854), Champion of the Seas (1854), James Baines (1854), and Donald McKay (1855). Ships built after 1845 1845 Washington Irving, 751 tons, Boston-Liverpool packet ship, built for Enoch Train's White Diamond Line. Launched 15 September 1845. Sold to England in 1852. 1846 Anglo-Saxon, 894 tons, 147 ft long, built for Enoch Train, Launched 5 September 1846. 1846 New World, 1404 tons, packet ship, sold in 1882 to Austrians and renamed Rudolph Kaiser. Her painting is available at Royal Museums Greenwich. 1847 Ocean Monarch, 1301 tons OM, built for Enoch Train. 1847 A.Z., 700 tons, packet for Zerega&Co of New York. 1847 Anglo-American, 704 tons, packet ship built for Enoch Train. 1848 Jenny Lind, 533 tons, packet ship. 1848 L.Z., 897 tons, packet for Zerega&Co of New York. 1849 Plymouth Rock, 960 tons, packet ship. 1849 Helicon, extreme clipper barque, 400 tons OM 1849 Reindeer, extreme clipper trading ship, 800 tons OM, built in East Boston 1849 Parliament, 998 tons, packet ship. 1850 Moses Wheeler, extreme clipper trading ship, 900 tons OM, built for Wheeler & King, Boston. 1850 Sultana, extreme clipper barque, 400 tons OM 1850 Cornelius Grinell, 118 tons, packet ship 1850 Antarctic, 1116 tons, packet for Zerega&Co of New York 1850 Daniel Webster, 1187 tons, built for Enoch Train. 1850 Stag Hound, extreme clipper, 1534 tons OM – first large clipper ship built by Donald McKay 1851 Flying Cloud, extreme clipper, 1782 tons OM 1851 Staffordshire, extreme clipper, 1817 tons OM. She was launched at East Boston, Massachusetts, for Enoch Train & Co. She wrecked off Cape Sable, Florida in 1853. 1851 North America, extreme clipper, 1464 tons OM 1851 Flying Fish, extreme clipper, 1505 tons OM. She was launched at East Boston, Massachusetts, for Messrs. Sampson & Tappan, Boston. She wrecked on the 23rd of November 1958 off Fuzhou, China en route to New York with a cargo of tea. The wreck was sold to a Manilla merchant. After she was rebuilt at Whampoa, China she was renamed the El Bueno Suceso. 1852 Sovereign of the Seas, extreme clipper, 2421 tons OM. At the time she was fastest sailing ship ever built. She was wrecked in the Malacca Straits in 1859. 1852 Westward Ho!, extreme clipper, 1650 tons OM, burned in Callao in 1864. 1852 Bald Eagle, extreme clipper, 1704 tons OM 1853 Empress of the Seas, extreme clipper, 2200 tons OM, burned in Australia in 1881. 1853 Star of Empire, extreme clipper, 2050 tons OM, built for the Boston and Liverpool packet line of Enoch Train & Co. In 1857, laden with guano, she broke to pieces on Currituck Beach, N. C. 1853 Chariot of Fame, extreme clipper, 2050 tons OM, 220 ft. She was launched at East Boston, Massachusetts, for Enoch Train & Co. Per Richard McKay sources, sold in 1862 and came to her end in January, 1876, being abandoned or lost at sea en route from Chincha Islands to Cork. 1853 Great Republic, extreme clipper barque, 4555 tons OM – largest clipper ship ever built 1853 Romance of the Sea, extreme clipper, 1782 tons OM. She was launched at East Boston, Massachusetts, for George B. Upton and employed in the California Trade. She disappeared en route to San Francisco after having left Hong Kong 31st of December 1862. 1854 Lightning, extreme clipper, 2083 tons OM, built for Messrs, Baines & Co. She burned while loading wool at Geelong, Australia on the 31st of October 1869. 
1854 Champion of the Seas, extreme clipper, 2447 tons OM, built for Messrs, Baines & Co. 1854 James Baines, extreme clipper, 2525 tons OM, built for Messrs, Baines & Co. 1854 Blanche Moore, extreme clipper, 1787 tons OM 1854 Santa Claus, medium clipper, 1256 tons OM 1854 Benin, barque, 692 tons. 1854 Commodore Perry, medium clipper, 1964 tons OM, built for Black Ball Line, burned near Bombay on 27 August 1869. 1854 Japan, medium clipper, 1964 tons OM, built for Messrs, Baines & Co. 1855 Donald McKay, extreme clipper, 2594 tons OM, 266 ft, built for Messrs, Baines & Co., last extreme clipper ship built by Donald McKay, burned and broken up in 1888. 1855 Zephyr, medium clipper, 1184 tons OM 1855 Defender, medium clipper, 1413 tons OM 1856 Henry Hill, medium clipper barque, 568 tons OM 1856 Mastiff, medium clipper, 1030 tons OM. She was launched at East Boston, Massachusetts, for George B. Upton for the California and China trade. She was lost to a fire en route for the Sandwich Islands in the South Pacific on the 15th of September 1859. The entire crew and all passengers were rescued by the British ship HMS Achilles and brought to Honolulu. 1856 Minnehaha, medium clipper, 1695 tons OM 1856 Amos Lawrence, medium clipper, 1396 tons OM 1856 Abbott Lawrence, medium clipper, 1497 tons OM 1856 Baltic, medium clipper, 1372 tons OM, 188 feet, built for Zerega&Co of New York. 1856 Adriatic, medium clipper, 1327 tons OM, built for Zerega&Co of New York. She ran aground, off Whale Cove, on Digby Neck Peninsula, Nova Scotia, Canada on the 24th December 1859. 1858 Alhambra, medium clipper, 1097 tons OM 1859 Benj. S. Wright, 107 tons. 1860 Mary B. Dyer, schooner. 1860 H. & R. Atwood, schooner. 1861–1862 General Putnam, ship. 1864–1865 Trefoil, wooden screw propeller ship, 370 tons. 1864–1865 Yucca, wooden screw propeller ship, 373 tons. 1864–1865 Nausett, iron clad monitor. 1864–1865 Ashuelot, iron side-wheel double ended ship, 1030 tons. 1866 Geo. B. Upton, wooden screw propeller ship, 604 tons. 1866 Theodore D. Wagner, wooden screw propeller ship, 607 tons. 1867 North Star, brig, 410 tons. 1867 Helen Morris, medium clipper, 1285 tons OM 1868 Sovereign of the Seas, 1502 tons 1868 R.R. Higgins, schooner, 96 tons. 1869 Glory of the Seas, medium clipper, 2102 tons OM, scrapped for her metal at Brace Point, West Seattle on the 13th of May 1923. Her figurehead is preserved at the India House, New York. 1869 Frank Atwood, schooner, 107 tons. 1874–1875 Adams, sloop of war, 615 tons. 1874–1875 Essex, sloop of war. 1875 America, originally built by William H. Brown in 1851 this famous schooner yacht, was rebuilt by McKay in 1875. Records set Lightning set multiple records 436 miles in a 24-hour period in 1854 430 miles in 24 hours while bound for Australia 63 days and 3 hours from Melbourne, Australia, to Liverpool, England Sovereign of the Seas posted the fastest speed ever by a sailing ship – 22 kts. in 1854. Champion of the Seas set the record of 465 miles in 24 hours in December 1854; this record stood until 1984. James Baines logged a speed of 21 knots (June 18, 1856) Flying Cloud made two 89-day passages New York to San Francisco Bald Eagle set the record of 78 days 22 hours for a fully laden ship from San Francisco to New York. Late life In 1869, under financial pressure from previous losses, McKay sold his shipyard and worked for some time in other shipyards. He retired to his farm near Hamilton, Massachusetts, spending the rest of his life there. 
He died in 1880 in relative poverty and was buried in Newburyport. Design practices McKay's designs were characterized by a long, fine bow with increasingly hollow waterlines. He was perhaps influenced by the writings of John W. Griffiths, designer of the China clipper Rainbow in 1845. The long hollow bow helped to penetrate rather than ride over the wave produced by the hull at high speeds, reducing resistance as hull speed is approached. Hull speed is the natural speed of a wave the same length as the ship, in knots, approximately 1.34 × √LWL, where LWL = Length of Water Line in feet. His hulls had a shorter afterbody, putting the center of buoyancy farther aft than was typical of the period, as well as a full midsection with a rather flat bottom. These characteristics led to lower drag at high speed compared to other ships of similar length, as well as great stability, which translated into the ability to carry sail in high winds (more power in extreme conditions). His fishing schooner design was even more radical than his clippers, being a huge flat-bottomed dinghy similar in form to 20th-century planing boats. These design changes were not favorable for light wind conditions such as were expected on the China trade, but were profitable in the California and Australian trades. Legacy and honors Pan Am named one of their Boeing 747s Clipper Donald McKay in his honor. There is a monument to McKay in South Boston, near Fort Independence, overlooking the channel, that lists all his ships, more than thirty in all. His house in East Boston was designated a Boston Landmark in 1977 and is also on the National Register of Historic Places. A memorial pavilion to McKay, including a painting of his famous "Flying Cloud," can be found at Piers Park in East Boston. McKay was inducted into the National Sailing Hall of Fame on November 9, 2019. See also List of clipper ships Bibliography of early American naval history References Further reading Judson, Clara Ingram (1943). Donald McKay: Designer of Clipper Ships. Charles Scribner's Sons, New York, p. 136. External links Images of Donald McKay's Shipyard – Museum of Science, Boston, MA Donald MacKay Memorial, Jordan Falls, NS Model of Flying Cloud Clipper Ship, Smithsonian Figurehead from clipper ship Donald McKay, Mystic Seaport Museum List of ships built by Donald McKay 1850 McKay and the Clipper Age – .pdf case study in innovation, bostoninnovation.org Scientific American, "Donald McKay", 09 October 1880, p. 228 1810 births 1880 deaths Boat and ship designers Pre-Confederation Canadian emigrants to the United States Canadian people of Scottish descent People from Shelburne County, Nova Scotia People from East Boston, Boston 19th-century American people Pre-Confederation Nova Scotia people Canadian shipbuilders American shipbuilders American shipwrights Persons of National Historic Significance (Canada)
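As a rough illustration of the hull-speed approximation mentioned in the design-practices discussion above, a short Python sketch follows; the waterline length used is only an assumed example value, not a figure from the article.

from math import sqrt

def hull_speed_knots(lwl_feet):
    # Hull speed in knots for a displacement hull, using the
    # conventional approximation 1.34 * sqrt(LWL), with LWL in feet.
    return 1.34 * sqrt(lwl_feet)

# Example with an assumed (hypothetical) waterline length of 200 feet:
print(round(hull_speed_knots(200.0), 1))  # about 19.0 knots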
42521398
https://en.wikipedia.org/wiki/Meanings%20of%20minor%20planet%20names%3A%20385001%E2%80%93386000
Meanings of minor planet names: 385001–386000
385001–385100 (no named minor planets listed in this range)
385101–385200 (no named minor planets listed in this range)
385201–385300: 385205 Michelvancamp. Michel Van Camp (born 1969), Belgian physicist and head of the Seismology-Gravimetry service at the Royal Observatory of Belgium in Brussels. His research includes gravimetry, intraplate deformations and hydrological effects on gravity (Src).
385301–385400 (no named minor planets listed in this range)
385401–385500: 385446 Manwë. Manwë is foremost among the deities who rule the world in the mythology created by Tolkien.
385501–385600: 385571 Otrera. Otrera, the first queen of the Amazons. She was involved with Ares and was the mother of the Amazon queen Penthesilea, who led the Amazons in the Trojan War.
385601–385700: 385695 Clete. Clete was an Amazon and the attendant of the Amazon queen Penthesilea, who led the Amazons in the Trojan War. Clete went looking for Penthesilea after she went missing after the Trojan War.
385701–385800 (no named minor planets listed in this range)
385801–385900 (no named minor planets listed in this range)
385901–386000: 385980 Emiliosegrè. Emilio Segrè (1905–1989) was an Italian-American physicist and 1959 Nobel Prize laureate for the discovery of the antiproton. He also discovered the elements technetium and astatine.
References 385001-386000
2770032
https://en.wikipedia.org/wiki/MayaVi
MayaVi
MayaVi is a scientific data visualizer written in Python, which uses VTK and provides a GUI via Tkinter. MayaVi was developed by Prabhu Ramachandran; it is free software distributed under the BSD License. It is cross-platform and runs on any platform where both Python and VTK are available (almost any Unix, Mac OS X, or Windows). MayaVi is pronounced as a single name, "Ma-ya-vee", meaning "magical" in Sanskrit. The code of MayaVi has nothing in common with that of Autodesk Maya or the Vi text editor. The latest version of MayaVi, called Mayavi2, is a component of the Enthought suite of scientific Python programs. It differs from the original MayaVi by its strong focus on making not only an interactive program, but also a reusable component for 3D plotting in Python. Although it exposes a slightly different interface and API than the original MayaVi, it now has more features.
Major features
visualizes computational grids and scalar, vector, and tensor data
an easy-to-use GUI
can be imported as a Python module from other Python programs or can be scripted from the Python interpreter
supports volume visualization of data via texture and ray cast mappers
support for any VTK dataset using the VTK data format
support for PLOT3D data
multiple datasets can be used simultaneously
provides a pipeline browser, with which objects in the VTK pipeline can be browsed and edited
imports simple VRML and 3D Studio scenes
custom modules and data filters can be added
exporting to PostScript files, PPM/BMP/TIFF/JPEG/PNG images, Open Inventor, Geomview OOGL, VRML files, Wavefront .obj files, or RenderMan RIB files
Examples
Spherical harmonics
from numpy import linspace, meshgrid, array, sin, cos, pi, abs
from scipy.special import sph_harm
from mayavi import mlab

theta_1d = linspace(0, pi, 91)
phi_1d = linspace(0, 2*pi, 181)
theta_2d, phi_2d = meshgrid(theta_1d, phi_1d)
xyz_2d = array([sin(theta_2d) * sin(phi_2d),
                sin(theta_2d) * cos(phi_2d),
                cos(theta_2d)])
l = 3
m = 0

Y_lm = sph_harm(m, l, phi_2d, theta_2d)
r = abs(Y_lm.real) * xyz_2d

mlab.figure(size=(700, 830))
mlab.mesh(r[0], r[1], r[2], scalars=Y_lm.real, colormap="cool")
mlab.view(azimuth=0, elevation=75, distance=2.4, roll=-50)
mlab.savefig("Y_%i_%i.jpg" % (l, m))
mlab.show()
References External links Free data visualization software Free plotting software Free software programmed in Python Plotting software Software that uses Tk (software) Software that uses VTK
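As a complement to the spherical-harmonics example above, here is a minimal sketch of scripting Mayavi2's mlab interface from the Python interpreter; the surface being plotted is just an arbitrary illustrative function, not something taken from the article.

import numpy as np
from mayavi import mlab

# Build a simple 2D grid and a scalar field to visualize.
x, y = np.mgrid[-3:3:100j, -3:3:100j]
z = np.sin(x * y)

# Render the field as a 3D surface and open the interactive window.
mlab.surf(x, y, z, colormap="cool")
mlab.show()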
7638093
https://en.wikipedia.org/wiki/Velocity%20Micro
Velocity Micro
Velocity Micro is a privately held boutique computer manufacturer located in Richmond, Virginia (USA), specializing in custom high-performance gaming computers, professional workstations, and high-performance computer solutions. Its extended product line includes gaming PCs, notebooks, CAD workstations, digital media creation workstations, home and home office PCs, home entertainment media centers, Tesla-based supercomputers, and business solutions. All products are custom assembled by hand and supported at the company's headquarters. History Velocity Micro traces its origins to 1992 when founder Randy Copeland began designing and producing high-performance computer systems to run CAD software and other demanding applications. These computer systems were custom-built to facilitate the design process and tailored to the extreme needs of each client. Velocity Micro was officially founded in 1997 as an extension of this highly individualized, high-performance computing philosophy. In 2001, Copeland accepted the opportunity to appear in Maximum PC'''s boutique roundup article entitled "Minor League, Major Performance". The quote which appeared in that February 2002 issue – "put together with the kind of care and craftsmanship the behemoth manufacturers can't offer" – propelled Velocity Micro forward and is still used by the company today. In May 2007, Velocity Micro acquired former competing boutique builder, Overdrive PC, known for their extreme overclocking capabilities they term "HyperClocking." Since the acquisition, Velocity Micro has incorporated HyperClocking into many of its extreme gaming systems. Overdrive PC remains a separate brand under Velocity Micro ownership. In 2010, Velocity Micro entered the eReader and tablet computer markets with the release of the first Cruz products: the Cruz Reader and the Cruz Tablet (T100). These Android-based devices featured 7" full-color screens. The Cruz Reader utilized a Resistive touchscreen, whereas the Cruz Tablet made use of the more advanced and responsive capacitive touch screen. Five product generations of Cruz tablets were produced and sold in 7", 8", and 10" screen models with close to a million units in the market by 2012. As of 2013, Velocity Micro no longer supports or offers these or any other Android-based devices for sale. In 2011, Copeland was named a "Tech Icon" by the PC Magazine staff in an article celebrating 30 years of the PC for his contributions to the industry. He continues to have an active role at Velocity Micro as president and CEO. In October 2019, Velocity Micro announced a partnership with Ansys that would provide access to resources, licenses, and benchmarks allowing Velocity Micro to build custom computers that are tailored to be integrated with Ansys applications. Velocity Micro has also partnered with Nvidia, AMD, and Intel to provide retail-grade hardware in custom computer builds. Retail In August 2005, Velocity Micro began offering pre-configured, high-performance desktops in select Best Buy stores across the country, followed by BestBuy.com that September. In July 2007, Velocity Micro began offering notebooks and desktops in Circuit City Stores across the country. In November 2008, Velocity Micro announced they were also moving into the online retail outlets Amazon.com, Newegg, TigerDirect, and Staples. In January 2009, Velocity Micro announced they were moving into Fry's stores nationwide. 
In 2010, the Cruz Reader and Cruz Tablet went on sale at Borders Books as well as the company's numerous other retail partners. As of Jan. 2016, Velocity Micro continued to sell desktops and laptops through Newegg.com and Amazon.com. Awards Velocity Micro has won over 60 industry media awards for performance and craftsmanship, including 19 Editors' Choice awards from PC Magazine. CNET, Maximum PC, PC World, HardOCP, Computer Gaming World, and Computer Shopper have all awarded Velocity Micro machines high marks. In Sept. 2007, Velocity Micro won PC Magazine's "Reader's Choice for Service and Reliability" Award. In a July 2008 review from PC Magazine, the company received an Editors' Choice award for its Vector Campus Edition model. In November 2008, Core i7-based systems from Velocity Micro won Editors' Choice awards from Maximum PC and CNET. More recently, in April 2013 the Velocity Micro Vector Z25 won an Editors' Choice award from PC Magazine, which stated that "the Z25 is a midtower PC with all the goods". PCMag.com later went on to name the Velocity Micro Vector Z25 as "Best Mainstream Desktop of 2013" after considering all mainstream desktop computers covered during 2013. In August 2014, Velocity Micro followed up on that award with another Editors' Choice from PCMag.com, this one for its Edge Z55 gaming PC. Said the editors, "The Velocity Micro Edge Z55 blows away the competition ...and costs $3,000 less. We wholeheartedly award the Edge Z55 our Editors' Choice for high-end gaming desktop PCs". Velocity Micro participated in the 2018 Intel Extreme Rig Challenge and won the award for Best Performance, providing a 40% higher score than the previous year's entry. References Computer hardware companies Home computer hardware companies Computer companies of the United States Manufacturing companies based in Richmond, Virginia Companies established in 1997
46573277
https://en.wikipedia.org/wiki/Jos%C3%A9%20Luis%20Encarna%C3%A7%C3%A3o
José Luis Encarnação
José Luis Moreira da Encarnação is a Portuguese computer scientist, Professor Emeritus at the Department of Computer Science of the Technische Universität Darmstadt in Germany and a senior technology and innovation advisor to governments, multinational companies, research institutions and organizations, and foundations. He is involved in the development of research agendas and innovation strategies for socio-economic development with a focus on emerging economies. He is also a member of the Topical Network Information and Communication Technology (ICT) and ICT-related activities of the German National Academy of Science and Engineering (acatech) and the German Berlin-Brandenburg Academy of Sciences and Humanities (BBAW). He is an elected member of the ACM SIGGRAPH Academy (USA). Biography Professor Encarnação was born in Portugal and has lived in the Federal Republic of Germany (now Germany) since 1959. In Portugal, he graduated from the Escola Salesiana do Estoril. He holds a Diploma (Dipl.-Ing.) and a doctorate (Dr.-Ing.) in electrical engineering from the Technische Universität Berlin (TUB), where he conducted his Ph.D. studies with a scholarship of the Gulbenkian Foundation. Academic career In 1967, Professor Encarnação started his academic career in Computer Graphics at the Technische Universität Berlin (TUB) and at the Heinrich-Hertz-Institute in Berlin. Subsequently, he held research and academic positions at the Heinrich-Hertz-Institute in Berlin (1968-1972) and at the Universität des Saarlandes (1972 – 1975). From 1975 to 2009 he was a full professor of Computer Science at the Technische Universität Darmstadt (TU Darmstadt), Germany, and the head of its Interactive Graphics Research Group (TUD–GRIS). Since October 2009 he is Professor Emeritus of the TU Darmstadt. In 1977, he and his research group introduced the Graphical Kernel System (GKS) as the first ISO standard for low-level computer graphics. Professor Encarnação is author or co-author of more than 500 publications in reviewed international journals and conferences and was the responsible advisor or co-advisor of more than 200 doctoral theses in Computer Graphics and related areas. Professional career From 1987 to 2006, Encarnação was the founding director of the Fraunhofer Institute for Computer Graphics (IGD) in Darmstadt, Germany. From 1995 to 2001 he was an elected member of the Senate of the Fraunhofer Society (FhG) in Munich and from 2002 to 2006 he was a member of the Advisory Board (Präsidium) of the Gesellschaft für Informatik (GI). From July 2001 to October 2006 he was the chairman of the Fraunhofer ICT (Information and Communication Technology) Group. From 2001 to 2007 Professor Encarnação was a member of the EU-Advisory Board (ISTAG) for the EU 6th and 7th Framework Programme for the ICT area. He was chairman of this board from 2002 to 2004 and its vice-chairman from 2005 to 2007. Since March 2017 he is the chairman of the International Assessment and Evaluation Board nominated by the Portuguese Foundation for Science and Technology (FCT) to build up the Collaborative Laboratories (Co-Labs) in Portugal. Founding and start-up achievements In 1980, Professor Encarnação was one of the founders of Eurographics, the European Association for Computer Graphics, from 1980 to 1984 its first chairman and from 1985 to 1991 the chairman of its Professional Board. Professor Encarnação founded the Fraunhofer IGD in 1986 and several Start-Ups between 1975 and 2009. 
In 1999 he founded the INI-GraphicsNet, an International Network of Institutions for R&D and Applications in Computer Graphics, which is today a network of legally independent but closely cooperating research entities in Germany, Italy, Panama, Portugal and Spain. Since 2010 this network has operated under the new name GraphicsMedia.net GmbH. Notable awards and recognitions Professor Encarnação has been a Fellow of the Association for Computing Machinery since 1996 and a Fellow of Eurographics, as well as an Honorary Fellow of Eurographics since 2006. In 2001 he was elected a full academy member of the Berlin-Brandenburg Academy of Sciences and Humanities (BBAW), and in 2002 he became a full academy member of the German Academy of Science and Technology (acatech). In 2017 he was awarded an honorary membership of the German "Gesellschaft für Forschungstransfer" (GFFT). In August 2018 he was elected a member of the ACM SIGGRAPH Academy. National and Federal awards Professor Encarnação received several Order of Merit of the Federal Republic of Germany decorations: the Cross in 1983, the Officer's Cross in 1995, and the Grand Cross in 2006. The German federal state of Hesse awarded him the Hessian Culture Prize in 2000, and Portugal decorated him with the Order of Saint James of the Sword in 2001. In honor of Professor Encarnação's achievements in the area of Computer Graphics, the Eurographics Portuguese Chapter established in 2010 the annual Professor José Luís Encarnação Award for student achievement in academic publication in the area of Computer Graphics. Honorary appointments In recognition of his technical and scientific achievements he received several honorary doctorates (Dr.h.c. and Dr.E.h.) from the Universidade Técnica de Lisboa in Lisbon, Portugal, in 1991, from the Universität Rostock, Germany, in 1996, from the Universidade do Minho, Portugal, in 2002, from the Nanyang Technological University in Singapore, in 2008, and from the Technische Universität Berlin, Germany, in 2014, as well as honorary professorships from Instituto Superior Técnico in Lisbon, Portugal, in 1990, from Zhejiang University in Hangzhou, China, in 1991, and from the State University of Campinas (UNICAMP) in São Paulo, Brazil, in 2001, and an honorary senatorship from the University of Maribor, Slovenia, in 2002. Professional, academic, and cultural awards For his professional, technical and scientific achievements and his impact in science and industry he received internationally recognized professional awards, including the Karl-Heinz-Beckurts Award in 1989, the Steven A. Coons Award from ACM SIGGRAPH (USA) in 1995, the Konrad Zuse Medal of the German Computer Society (Gesellschaft für Informatik) in 1997, the Fraunhofer Medal from the Fraunhofer Society in 2001, the Technology Award of the Eduard Rhein Foundation in 2001, the Convergators Award for Lifetime Achievement by BiTKOM and FOCUS during CeBIT, the Golden Honorary Medal of the Universität Rostock in 2012, the first Eurographics Gold Medal of the Eurographics Association in 2016, and an inaugural ACM SIGGRAPH Academy award in 2018, which made him an elected member of the ACM SIGGRAPH Academy.
References External links Homepage at TU Darmstadt Electrical engineers Computer scientists Living people 1941 births People from Cascais Computer graphics researchers Human–computer interaction researchers Fellows of the Association for Computing Machinery Commanders Crosses of the Order of Merit of the Federal Republic of Germany Technical University of Berlin alumni Technische Universität Darmstadt faculty Saarland University faculty Portuguese emigrants to Germany
58035459
https://en.wikipedia.org/wiki/KaOS
KaOS
KaOS is a Linux distribution that is built from scratch with a very specific focus on Qt and KDE. Although KaOS is currently based on the Linux kernel, the developers are "constantly evaluating" the illumos kernel, and say that "a future switch is a wish". History The first version of KaOS was released as "KdeOS" in 2013. To prevent confusion between the distribution's name and the desktop environment KDE, the name was changed to "KaOS" in September 2013. Features KaOS is distributed via an ISO image, and exclusively supports 64-bit x86 processors. The idea behind KaOS is to create a tightly integrated, rolling and transparent distribution for the modern desktop, built from scratch with a very specific focus: one desktop environment (KDE Plasma), one toolkit (Qt), and one architecture (x86_64), plus a focus on evaluating and selecting the most suitable tools and applications. Reception Phoronix wrote in 2016, "Overall, I was quite pleased with it for being a niche distribution. KaOS was easy to install and was quickly running on a bleeding-edge KDE Plasma 5 stack. Overall, it was a fun and pleasant few hours spent with KaOS." FossMint stated in 2017 that KaOS "is a modern, open-source, beautifully designed, QT and KDE-focused Linux distro. It is a rolling release that ships with KDE Plasma as its default Desktop Environment, uses Pacman as its package manager, and has a 3-group structure repository on GitHub." and "The fact that it is a rolling release means that you will never need to worry about future updates the moment you have a version installed like in the case of Ubuntu and the like where you would need to consider whether to perform a clean installation of another “major version” or not." Hectic Geek reviewed KaOS in 2014, and wrote that the distribution was not very fast, but included all necessary applications. Jesse Smith from DistroWatch Weekly wrote a review of KaOS 2014.04 and said the features of KaOS worked well. Robert Rijkhoff reviewed KaOS 2017.09 for DistroWatch Weekly and said that "KaOS seems to be trying a little bit hard to be different". ZDNet published a hands-on review of KaOS 2014.06, and Dedoimedo reviewed KaOS 2014.12. Jack Wallen from Linux.com gave his opinion of KaOS in 2016, saying that the distribution is beautiful. References External links KaOS Forum KaOS in OpenSourceFeed gallery Pacman-based Linux distributions Rolling Release Linux distributions x86-64 Linux distributions Linux distributions
383849
https://en.wikipedia.org/wiki/MOBIDIC
MOBIDIC
Sylvania's MOBIDIC, short for "MOBIle DIgital Computer", was a transistorized computer intended to store, sort and route information as one part of the United States Army's Fieldata concept. Fieldata aimed to automate the distribution of battlefield data in any form, ensuring the delivery of reports to the proper recipients regardless of the physical form they were sent or received. MOBIDIC was mounted in the trailer of a semi-trailer truck, while a second supplied power, allowing it to be moved about the battlefield. The Army referred to the system as the AN/MYK-1, or AN/MYK-2 for the dual-CPU version, Sylvania later offered a commercial version of the S 9400. History In early 1956 the Army Signal Corps at Fort Monmouth released a contract tender for the development of a van-mounted mobile computer as part of their Fieldata efforts. Fieldata envisioned a system where any sort of reports would be converted into text format and then sent electronically around an extended battlefield. At the recipient's end, it would be converted into an appropriate output, often on a line printer or similar device. By automating the process of routing the messages in the middle of the information flow, the Signal Corps was hoping to guarantee delivery and improve responsiveness. Fieldata can be thought of as a general purpose version of the system the US Air Force was developing in their SAGE system, which did the same task but limited to the field of information about aircraft locations and status. The heart of Fieldata would be computer systems that would receive, store, prioritize and send the messages. The machines would have to be built using transistors in order to meet the size and power requirements, so in effect, the Army was paying to develop transistorized computers. In spite of this, most established players ignored the Army's calls for the small machine. Sylvania's director of development speculated that the Army's terminology in the contract may have hidden the apparent wonderful opportunity. In the end, RCA and Sylvania entered bids, along with a number of smaller companies with unproven track records. Sylvania's bid was the lower of the "big two", and they won the contract in September 1956. The first experimental machine, retroactively known as MOBIDIC A, was delivered to Fort Monmouth in December 1959. By this time the Army had expressed increasing interest in the concept and had ordered four additional machines and associated software, including a COBOL compiler. The original contract for the experimental machine was for $1.6 million, but the new developments increased the total to between $20 and $30 million. MOBIDIC B was supplied to the Army's Tactical Operations Center and featured dual CPUs for increased reliability. MOBIDIC A/B weighed about . MOBIDIC C was sent to Fort Huachuca as a software testing system. MOBIDIC D was ordered for the Army Security Agency in Europe, and MOBIDIC 7A was shipped to the 7th Army Stock Control Center in Zweibrücken, Germany. 7A's service entry was delayed due to the failure of the Army-supplied tape drives, but Sylvania replaced these with off-the-shelf commercial units and the system went operational in January 1962, the first off-shore deployment. MOBIDIC C/D/7A weighed about . The 7A unit was extremely successful in operation, cutting the time needed to order and deliver spare parts dramatically. 
Although Fieldata was developed for battlefield information, MOBIDIC was just as useful for other sorts of information as well, as the 7A machine demonstrated. It was so successful that the MOBIDIC D was diverted to the Army's 3922nd Ordnance Supply Control Agency in Orléans, France (Maison Fort) to replace the existing RAMAC 305 card system. By 1962, however, the Army had lost interest in Fieldata and canceled the project. The B machine was no longer needed for Fieldata software development, and in 1965 it was purchased by the National Bureau of Standards for software development and research. The C, D, and 7A machines were later all moved to Karlsruhe, Germany, where they operated in the supply role for years. MOBIDIC's success, independent of Fieldata's failure, led to additional Army contracts for the smaller AN/APQ-32 computers, which processed artillery radar data. The basic layout of the MOBIDIC system was also used for the AN/ASD-1 computer used on the Boeing RC-135 ELINT aircraft, the PARADE and TIDEWATER projects, and its basic circuitry was used extensively in the development of the IBM 7090 for the BMEWS systems. As Sylvania had hoped, commercial interest in a small, low-cost, robust computer system seemed widespread. MOBIDIC was adapted into the Sylvania 9400 that was marketed towards factory automation systems. Two systems were ordered, one by the Office of the Assistant Chief of Staff for Intelligence in the Pentagon, and another by General Telephone in California. However, as the costs of trying to compete in the commercial computer market became clear, Sylvania decided to withdraw from the market, and General Telephone canceled their order. Both 9400's were built; General Telephone's intended delivery was used by Sylvania internally. Description MOBIDIC's design goal was the real time operation of its input/output system. A typical use for MOBIDIC would be to collate all the messages flowing through an input to different tape outputs based on a field in the data. The tapes could then be removed and the messages printed on an offline printer. For instance, a large supply depot might have numerous warehouses for different sorts of materials; MOBIDIC could route incoming requests by examining the part number and then sending that message to a particular tape. All of the output on that tape would then be printed and sent to the associated warehouse. MOBIDIC replaced many manual steps; it performed the collation lookup, sorting the data, and collecting all the printed messages for delivery. MOBIDIC was a 36-bit binary machine, a common word size for early computers. The system used 36-bit data throughout, but stored it as 40-bit values to add additional sign and parity bits, and two spares. This allowed it to store the full range from -(1 - 2−36) to +(1 - 2−36). Machines were normally equipped with two parallel banks of core memory with 4096 words each, but was expandable to seven banks maximum. It could support up to 63 tape drives, punch tape input and output, as well as a Flexowriter. One connection could also be dedicated to sending data to another MOBIDIC system. The tape drives used one of the spare bits in the 40-bit word as a STOP indicator. Most of the 52 instructions were in the one-address format, collecting into an accumulator, but a small number (load, move, etc.) were in two-address format. There were 15 arithmetic, eight transfer (memory), 17 logic, three sense and nine input-output instructions. 
An add required 16 microseconds, and a multiply or divide 86; these slow times were a side effect of its serial operation. MOBIDIC's CPU and I/O systems were housed in a 30-foot (10 meters) van. The machine required 29.76 kW of power, which was supplied from a second, smaller van containing a generator set. Two other vans contained auxiliary EAM equipment and a repair shop. All four vans were backed up, two to a side, to a raised wooden platform with steps on one end. One of the underlying concepts was that, since this was the Cold War era, in case of enemy attack everything could be moved instead of having to be abandoned and destroyed. The dual-CPU MOBIDIC B (only one was produced) included three additional general instructions, as well as nine new instructions for supporting subroutines. The CPUs were independent but shared a single main memory consisting of 8,192 words of core. In a sample use, one of the CPUs would be used to import data, handing off data via shared memory to the second for output. Although the machine's speed was slower overall (adds were 42 μs), throughput could be greatly improved. If one of the machines failed, the program could be restarted on the remaining CPU, running both sides of the I/O task with reduced throughput. References Notes Bibliography George Sokol, "MOBIDIC History", Sylvania, 4 September 1967 M.D. Abrams and R. Rosenthal, "On The Passing Of MOBIDIC-B", IEEE Computer, Volume 6 Issue 3 (March 1973), pg. 10–18. doi:10.1109/C-M.1973.217033 Martin Weik, A Third Survey of Domestic Electronic Digital Computing Systems, Ballistic Research Laboratories, Report No. 1115, March 1961, pg. 0650-0657 Further reading Watts Humphrey, "MOBIDIC and Fieldata", IEEE Annals of the History of Computing, Volume 9 Issue 2 (April–June 1987), pg. 137-182 Military computers Transistorized computers
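As a loose illustration of the 36-bit fixed-point word described above (a sketch based only on the stated range of ±(1 − 2⁻³⁶) and the extra sign and parity bits; the exact bit layout of MOBIDIC's 40-bit stored word is an assumption here, not something documented in this article):

def encode_mobidic_word(value):
    # Illustrative only: pack a fraction in (-1, 1) as a sign bit plus a
    # 36-bit magnitude, and compute an even-parity bit over those 37 bits.
    # The real MOBIDIC bit layout may have differed.
    if not -1.0 < value < 1.0:
        raise ValueError("MOBIDIC words held fractions strictly between -1 and 1")
    sign = 1 if value < 0 else 0
    magnitude = int(round(abs(value) * (2**36 - 1)))  # 36-bit magnitude
    bits = (sign << 36) | magnitude
    parity = bin(bits).count("1") % 2                 # even parity over 37 bits
    return sign, magnitude, parity

print(encode_mobidic_word(0.5))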
33731003
https://en.wikipedia.org/wiki/IBM%20Personal%20Computer%20XT
IBM Personal Computer XT
The IBM Personal Computer XT (model 5160, often shortened to PC/XT) is the second computer in the IBM Personal Computer line, released on March 8, 1983. Except for the addition of a built-in hard drive and extra expansion slots, it is very similar to the original IBM PC model 5150 from 1981. Name IBM did not specify an expanded form of "XT" on the machine, press releases, brochures or documentation, but some publications expanded the term as "eXtended Technology" or just "eXTended". Features The XT was regarded as an incremental improvement over the PC and a disappointment compared to the next-generation successor that some had anticipated. Compared to the original IBM PC, the XT has the following major differences: The number of expansion slots is increased from five to eight Base RAM is increased to at least 128 KB A 10 MB hard drive is included as standard equipment PC DOS 2.0 is included Otherwise the specifications are identical to the original PC. Expansion slots The number of expansion slots in the original IBM PC was a limiting factor for the product, since essential components (such as the video controller, disk controller and printer interface) each came as separate expansion cards and could quickly fill up all five available slots, requiring the user to swap cards in and out as tasks demanded. Some PC clones addressed this problem by integrating components into the motherboard to free up slots, while peripheral manufacturers produced products which integrated multiple functions into one card. The XT addressed the problem by adding three extra expansion slots for a total of eight. While the slots themselves are identical to those in the original PC, the amount of physical space in the chassis differs, so two of the new slots (located behind the hard drive) cannot accept full-length cards. In addition, the spacing of the slots is narrower than in the original PC, making it impossible to install some multi-board cards. Expansion unit The 5161 is an expansion chassis using an identical case and power supply to the XT, but instead of a system board, provides a backplane with eight card slots. It connects to the main system unit using an Extender Card in the system unit and a Receiver Card in the Expansion Unit, connected by a custom cable. The 5161 shipped with a 10 MB hard drive, and had room for a second one. The Expansion Unit can also contain extra memory, but the Extender card inserts wait states for memory in the Expansion Unit, so it may be preferable to install memory into the main system unit. The 5161 can be connected to either an XT or to the earlier 5150 (the original IBM PC). Other features PC DOS 2.0 offers a 9-sector floppy disk format, providing 180K/360K (single- vs. dual-sided) capacity per disk, compared to the 160K/320K provided by the 8-sector format of previous releases. The XT was not offered in a floppy-only model for its first two years on the market, although the standard ribbon cable with two floppy connectors was still included. At that time, in order to get a second floppy drive, the user had to purchase the 5161 expansion chassis. Like the original PC, the XT comes with IBM BASIC in ROM. The XT BIOS also displays a memory count during the POST, unlike the original PC. The XT has a desktop case similar to that of the IBM PC. It weighs and is approximately wide by deep by high. The power supply is 130 watts, an upgrade from the original PC. Those sold in the US are configured for 120 V AC only and could not be used with 240 V mains supplies. 
XTs with 240 V-compatible power supplies were later sold in international markets. Both were rated at 130 watts. Revisions and variants IBM made several submodels of the XT. The 3270 PC, a variant of the XT featuring 3270 terminal emulation, was released in October 1983. Submodels 068 and 078, released in 1985, offered dual-floppy configurations without a hard drive as well, and the new Enhanced Graphics Adapter and Professional Graphics Adapter became available as video card options. In 1986, the 256–640 KB motherboard models were launched, which switched to half-height drives. Submodels 268, 278 and 089 came with 101-key keyboards (essentially the IBM Model M, but in a modified variant that used the XT's keyboard protocol and lacked LEDs). Submodels 267, 277 and 088 had the original keyboard, but 3.5" floppy drives became available and 20 MB Seagate ST-225 hard disks in 5.25" half-height size replaced the full-height 10 MB drives. Submodel 788 was the only XT sold with the Color Graphics Adapter as a standard feature. Submodels 568, 588, and 589 were used as the basis for the XT/370; they had an additional (co-)processor board that could execute System/370 instructions. An XT-based machine with a Series/1 co-processor board existed as well, but it had its own System Unit number, the IBM 4950. IBM XT 286 In 1986 the XT 286 (model 5162) was released with a 6 MHz Intel 80286 processor. Despite being marketed as a lower-tier model than the IBM AT, this system runs many applications faster than the ATs of the time with 6 MHz 286 processors, since it has zero-wait-state RAM. It shipped with 640 KB RAM standard, an AT-style 1.2 MB high-density diskette drive and a 20 MB hard disk. Despite these features, reviews rated it as a poor market value. The XT 286 uses a 157-watt power supply, which can internally switch between 115 or 230 V AC operation. Reception The XT was well received, although PC DOS 2.0 was regarded as a greater improvement than any of the hardware changes, and by the end of 1983 IBM was selling every unit it made. The Compaq Portable also came out in March 1983, and would prove a popular competitor. Although sometimes called the "first PC clone" and the first "legal clone", that distinction may go to Columbia Data Products' MPC 1600 "Multi Personal Computer", released in June 1982. Other "clones" included the Seequa Chameleon, the Hyperion, Eagle Computer's Eagle 1600 that September, and the Corona PC. The latter two companies were sued by IBM and settled out of court, agreeing to re-implement their BIOS in a way that did not violate IBM's copyrights. The XT was discontinued in the spring of 1987. See also Amiga Sidecar PC-based IBM-compatible mainframes#Personal Computer XT/370 References Notes IBM (1983). Personal Computer Hardware Reference Library: Guide to Operations, Personal Computer XT. IBM Part Number 6936831. External links IBM 5160 information at www.minuszerodegrees.net Photo galleries: XT with 256 KB on system board XT 286 Personal Computer XT Computer-related introductions in 1983 16-bit computers
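To make the floppy-capacity figures mentioned earlier concrete (the 8-sector format of early PC DOS versus the 9-sector format introduced with PC DOS 2.0), here is a quick back-of-the-envelope calculation in Python; the 40-track, 512-byte-sector geometry used is the standard layout for these 5.25-inch double-density diskettes rather than a figure quoted in this article.

def floppy_kb(sides, sectors_per_track, tracks=40, bytes_per_sector=512):
    # Capacity in binary kilobytes for a 5.25" double-density diskette.
    return sides * tracks * sectors_per_track * bytes_per_sector // 1024

print(floppy_kb(1, 8), floppy_kb(2, 8))  # 160 320  (8-sector format)
print(floppy_kb(1, 9), floppy_kb(2, 9))  # 180 360  (9-sector, PC DOS 2.0 format)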
41797374
https://en.wikipedia.org/wiki/Port%20Control%20Protocol
Port Control Protocol
Port Control Protocol (PCP) is a computer networking protocol that allows hosts on IPv4 or IPv6 networks to control how the incoming IPv4 or IPv6 packets are translated and forwarded by an upstream router that performs network address translation (NAT) or packet filtering. By allowing hosts to create explicit port forwarding rules, handling of the network traffic can be easily configured to make hosts placed behind NATs or firewalls reachable from the rest of the Internet (so they can also act as network servers), which is a requirement for many applications. Additionally, explicit port forwarding rules available through PCP allow hosts to reduce the amount of generated traffic by eliminating workarounds in form of outgoing NAT keepalive messages, which are required for maintaining connections to servers and for various NAT traversal techniques such as TCP hole punching. At the same time, less generated traffic reduces the power consumption, directly improving the battery runtime for mobile devices. PCP was standardized in 2013 as a successor to the NAT Port Mapping Protocol (NAT-PMP), with which it shares similar protocol concepts and packet formats. In environments where a Universal Plug and Play Internet Gateway Device (UPnP IGD) is used in the local network, an interworking function between the UPnP IGD and PCP is required to be embedded in the IGD. The UPnP IGD-PCP IWF is specified in RFC6970. DHCP (IPv4 and IPv6) options to configure hosts with Port Control Protocol (PCP) server IP addresses are specified in RFC7291. The procedure to follow for selecting a server among a list of PCP servers is discussed in RFC7488. In environments where NAT64 is deployed, PCP allows to learn the IPv6 prefix(es) used by a PCP-controlled NAT64 device to build IPv4-converted IPv6 addresses by the NAT64 (RFC7225). Overview Many applications and network equipment deployments require their network locations to be reachable from outside their local networks, following the originally envisioned model of IP end-to-end connectivity across the Internet, so they can operate as network servers and accept connections from remote clients. An example of such equipment is an IP camera, which includes a network server that provides remote surveillance over IP networks. Usually, network equipment deployments place the devices behind routers or firewalls that perform NAT (to enable sharing of an IPv4 address, for example) or packet filtering (for improved network security and protection), ending up with breaking the end-to-end connectivity and rendering the equipment and applications inaccessible from the rest of the Internet. The problem Making the deployed equipment accessible, by extending its server role beyond the local network, requires either manual configuration of port forwarding at the network gateway (which is usually a CPE), or application-level workarounds that initiate connections from the deployed equipment to additional intermediate servers used for "merging" those "firewall punching" connections and connections from the actual clients. Both approaches have their downsides manual CPE configuration is usually either inconvenient or not possible, while using additional intermediate servers increases complexity and cost. For example, an online computer game (which acts as a client) requires communication with a game server for exchanging gameplay data. In order to make it possible for a game server to provide data to its clients, those clients must be made accessible to the server. 
Usually, clients initiate connections to the game server to open communication channels. However, such open connections can become idle and can subsequently be closed by network gateways, leading to the necessity of maintaining them by using a form of keepalive messages. Keepalive messages are small messages that are sent between client and server that create traffic over a communication channel and therefore prevent gateway servers from closing it. Thus, keeping a connection alive requires a constant exchange of empty messages between client and server. This increases network chatter, wastes network bandwidth and CPU cycles, and decreases the autonomy of battery-powered devices. Additionally, some network applications (for example, FTP) require dynamic opening of multiple connections, which involves application-level gateways (ALGs) and additionally increases complexity. PCP as a solution PCP allows equipment and applications to create explicit mappings between an external IP address, protocol and port, and an internal IP address, protocol and port. With such explicit mappings in place, inbound communication can reach the hosts behind a NAT or firewall, which either expands their server roles beyond boundaries of local networks, or makes use of various services simplified and less resource-consuming. Created mappings are permanent to the extent of having a known lifetime that can be extended, which is similar to the way Dynamic Host Configuration Protocol (DHCP) implements its leases. At the same time, PCP allows applications to create additional mappings dynamically as required, which reduces or eliminates the need for having ALG-enabled NAT devices and firewalls. Created explicit mappings have a known lifetime, commonly several hours, with no need for application-level keepalive messages to be exchanged between hosts and servers for the purpose of preserving the mapping. As a result, network usage and power consumption are reduced, and application-level keepalive logic no longer needs to be implemented at client and server sides. The PCP mapping response provides the application with associated externally visible parameters (IP address, protocol and port) that can then be announced to other clients in application-specific ways so incoming connections can be established. Additionally, PCP can inform applications when the external IP address is changed while a mapping is already established. Various types of NAT can be handled by PCP, providing support for NAT64, NAT66, and NAT44; inclusion of PCP into IPv4 and IPv6 firewall devices is also supported. PCP is designed to be used on both large-scale aggregation points (for example, as part of carrier-grade NATs), and inside less expensive consumer-grade devices. Both long-term (for an IP camera or a temperature sensor acting as a server, for example) and short-term mappings (while playing an online computer game, for example) are supported. PCP supports transport layer protocols that use 16-bit port numbers (for example, TCP, UDP, Stream Control Transmission Protocol (SCTP) or Datagram Congestion Control Protocol (DCCP). Protocols that do not use port numbers (for example, Resource Reservation Protocol (RSVP), Encapsulating Security Payload (ESP), ICMP or ICMPv6) are supported for IPv4 firewall, IPv6 firewall and NPTv6 (IPv6 prefix translation) functions, but cannot be supported by more than one client per external IP address in the case of NAT. 
The PCP specification does not define a mechanism for dealing with multi-homed networks (which have multiple network gateways or default routes). It is nonetheless possible to implement PCP in such networks using a coordination mechanism such as conntrackd. However, if the different networks each have their own external IP address(es), a given PCP mapping can only use one or the other because the protocol requires one specific external IP address to be provided to the client. If that network should then become unavailable the PCP mapping would have to be updated to use an external IP address from the other network. History PCP was standardized in 2013 as a successor to the NAT Port Mapping Protocol (NAT-PMP), sharing similar protocol concepts and packet formats with it. As one of the design differences, NAT-PMP is pretty much limited to the deployment on consumer-grade devices, while PCP is designed to also support carrier-grade equipment. Since 2005, NAT-PMP has been implemented in various Apple products. PCP relates to the Internet Gateway Device Protocol (IGDP), which was standardized in 2001 as part of the Universal Plug and Play (UPnP) specification. While the IGDP is complex and tailored toward manual configuration, PCP is designed for simplicity and automated use within software applications. The NAT-PMP specification contains a list of the problems with IGDP that prompted the creation of NAT-PMP, and subsequently, its successor PCP. Security Excluding the attackers capable of altering network packets exchanged while an explicit PCP mapping is created (packets that contain negotiation required for establishing an explicit mapping, which is exchanged between hosts and PCP-enabled NAT devices or firewalls), PCP is considered to be secure as long as created explicit mappings do not exceed the domain of implicit mappings. In other words, implicit mappings are created as a result of the way NAT devices and firewalls are handling regular outbound client connections, meaning that PCP is safe as long as no new mapping possibilities are introduced through the explicit mapping mechanism. From the security standpoint, an important PCP feature is the mapping request option. When used, this option signifies that the IP address specified additionally as part of the mapping request should be used as the internal address for the created explicit mapping, rather than following the default behavior of using source IP address of the actual mapping request packet for that purpose. Such mapping requests can end up with a PCP-enabled NAT device or firewall granting explicit mapping privileges higher than allowed by implicit mappings due to unknown rules imposed elsewhere for the specified IP address, allowing that way an attacker to steal some traffic, or to conduct a denial-of-service (DoS) attack. Additionally, explicit PCP security mechanisms are available as extensions to the PCP protocol, providing authentication and access control mechanisms by using an authenticated and integrity-protected in-band signalling channel, which relies on Extensible Authentication Protocol (EAP) to perform the authentication between devices involved in a PCP negotiation session. Such PCP-enabled NAT devices or firewalls may still accept unauthenticated mapping requests; at the same time, all previously described explicit mapping constraints still apply. 
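To make the explicit-mapping mechanism described above more concrete, here is a rough Python sketch of building and sending a PCP MAP request. The field layout follows the RFC 6887 request header and MAP payload as understood by the author of this sketch, the gateway address and ports are placeholder values, and a real client would also parse the response and handle retransmission.

import os
import socket
import struct

def build_map_request(client_ip, internal_port, lifetime=7200, protocol=17):
    # 24-byte common request header: version=2, opcode MAP (1), reserved,
    # requested lifetime, and the client's IP as a 16-byte (IPv4-mapped) address.
    client_addr = socket.inet_pton(socket.AF_INET6, "::ffff:" + client_ip)
    header = struct.pack("!BBHI16s", 2, 1, 0, lifetime, client_addr)
    # MAP opcode payload: 12-byte nonce, protocol, 3 reserved bytes,
    # internal port, suggested external port, suggested external address (zeros).
    payload = struct.pack("!12sB3sHH16s", os.urandom(12), protocol, b"\x00" * 3,
                          internal_port, 0, b"\x00" * 16)
    return header + payload

# Placeholder gateway address; PCP servers listen on UDP port 5351.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(build_map_request("192.168.1.10", 8080), ("192.168.1.1", 5351))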
Internals Internally, PCP works by exchanging control messages between hosts and PCP-enabled NAT devices or firewalls (referred to as servers), using User Datagram Protocol (UDP) as the underlying protocol. This communication consists of port mapping requests created by the hosts that result in responses once submitted to and processed by the servers. Following UDP's nature of unreliability, which means that UDP datagrams can be lost, duplicated or reordered, after submitting a request there is no guarantee for a response of any kind, thus host requests are also referred to as "hints". In addition to direct responses, servers also generate gratuitous notifications for example, unicast notifications to inform hosts of changes in the external IP address. Exchanged messages contain no means for determining either the transaction they belong to, or which stage of a "session" they represent. Such a simplified design is based on having all messages self-describing and complete, with no additional context required for each message to be successfully processed. Servers may decide to silently ignore host requests, in case they are unable to process them at the moment; in such cases, hosts need to retransmit the request. Also, hosts may safely decide to silently ignore any unwanted mapping responses. For the purpose of creating PCP requests, IP address of the server is either manually configured on the host, found as part of the host's DHCP lease, or set to the host's configured default gateway. Host request messages are sent from any source UDP port on a client to the server's UDP port 5351 that it listens to; unsolicited multicast server notifications (such as server restart announcements) are sent from the server's UDP port 5351 to the UDP port 5350 on hosts which they listen to. Maximum UDP payload length for all PCP messages is 1100 octets. Each PCP message consists of a request or response header containing an opcode that determines the associated operation, any relevant opcode-specific information (such as which ports are to be mapped), and zero or more options (such as the option described above). Result codes are returned as part of server responses; each result code has an associated lifetime, which tells the hosts when certain operations may be retried or should be repeated. For example, result lifetimes can specify how long a failure condition is expected to persist, or how long the created mapping will last. Implementations The list of implementations of both PCP and NAT-PMP protocols. Most NAT-PMP libraries have an intention to add PCP support too in near future. In C language MiniUPnPd is originally a UPnP IGD daemon but then it began to support NAT-PMP and PCP. The daemon is installed on many routers by various manufactures. libnatpmp a C client library and command line tool natpmpc. Available in many Linux repositories and ported to Windows. Used by Transmission. Part of MiniUPnPc project. Doesn't supports IPv6. BSD-3-Clause License. libpcp is a most complete implementation of both PCP and NATPMP client. Is also has a demo PCP server. It was developed by Cisco engineers. It supports IPv6 but it's not available in Linux distros. BSD-2-Clause License. stallone a NAT-PMP gateway daemon & client. Written in C. LGPL-2.1 License natpmpd a NAT-PMP daemon. Written in C. GPL-2.0 License. TCMPortMapper for Objective-C but internally uses libnatpmp. MIT License Rust Rust natpmp crate re-implementation of the C library libnatpmp in a Rust. MIT License. Go go-nat-pmp NAT-PMP client. 
Apache 2.0 License. go-pcp a PCP-only client. MIT License. Java portmapper a Java client library for PCP, NAT-PMP and UPnP-IGD. Apache 2.0 License. jNAT-PMPlib a Java client library for NAT-PMP. LGPLv3 License. .NET Mono.Nat and its Open.NAT fork, a UPnP and NAT-PMP client. Written in C#. MIT License. Python nat-pmp NAT-PMP client library. MIT License. pypcpc PCP client library and command line tool. MIT License. NodeJS freedom-port-control client for PCP, NAT-PMP, and UPnP. Dozens of NAT-PMP libraries are available on npm. See also DMZ (computing) a subnetwork that contains and exposes one's external-facing services to a larger and untrusted network Hole punching (networking) establishing direct connections between two networked parties residing behind firewalls or NAT-enabled routers Universal Plug and Play Internet Gateway Device Protocol References External links Port Control Protocol (PCP): Related documents (IETF) Port Control Protocol (PCP): Charter for Working Group (IETF) Internet architecture Internet protocols Network address translation Network protocols
326413
https://en.wikipedia.org/wiki/List%20of%20exceptional%20asteroids
List of exceptional asteroids
The following is a collection of lists of asteroids of the Solar System that are exceptional in some way, such as their size or orbit. For the purposes of this article, "asteroid" refers to minor planets out to the orbit of Neptune, and includes the dwarf planet 1 Ceres, the Jupiter trojans and the centaurs, but not trans-Neptunian objects (objects in the Kuiper belt, scattered disc or inner Oort cloud). For a complete list of minor planets in numerical order, see List of minor planets. Background Asteroids are given minor planet numbers, but not all minor planets are asteroids. Minor planet numbers are also given to objects of the Kuiper belt, which is similar to the asteroid belt but farther out (around 30–60 AU), whereas asteroids are mostly between 2–3 AU from the Sun and at the orbit of Jupiter 5 AU from the Sun. Also, comets are not typically included under minor planet numbers, and have their own naming conventions. Asteroids are given a unique sequential identifying number once their orbit is precisely determined. Prior to this, they are known only by their systematic name or provisional designation, such as . Physical characteristics Largest by diameter Estimating the sizes of asteroids from observations is difficult due to their irregular shapes, varying albedo, and small angular diameter. Observations by the Very Large Telescope of most large asteroids were published 2019–2021. The number of bodies grows rapidly as the size decreases. Based on IRAS data there are about 140 main-belt asteroids with a diameter greater than 120 km. For a more complete list, see List of Solar System objects by size. The inner asteroid belt (defined as the region interior to the 3:1 Kirkwood gap at 2.50 AU) has few large asteroids. Of those in the above list, only 4 Vesta, 19 Fortuna, 6 Hebe, 7 Iris and 9 Metis orbit there. (Sort table by mean distance.) Most massive Below are the sixteen most-massive measured asteroids. Ceres, at a third the estimated mass of the asteroid belt, is half again as massive as the next fifteen put together. The masses of asteroids are estimated from perturbations they induce on the orbits of other asteroids, except for asteroids that have been visited by spacecraft or have an observable moon, where a direct mass calculation is possible. Different sets of astrometric observations lead to different mass determinations; the biggest problem is accounting for the aggregate perturbations caused by all of the smaller asteroids. The proportions assume that the total mass of the asteroid belt is , or . Outside the top four, the ranking of all the asteroids is uncertain, as there is a great deal of overlap among the estimates. The largest asteroids with an accurately measured mass, because they have been studied by the probe Dawn, are 1 Ceres with a mass of , and 4 Vesta at . The third-largest asteroid with an accurately measured mass, because it has moons, is 87 Sylvia at . For a more complete list, see List of Solar System objects by size. Other large asteroids such as 423 Diotima currently only have estimated masses. Brightest from Earth Only Vesta is regularly bright enough to be seen with the naked eye. Under ideal viewing conditions with very dark skies, a keen eye might be able to also see Ceres, as well as Pallas and Iris at their rare perihelic oppositions. 
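The apparent magnitudes discussed in the next paragraph follow from the standard relation between an asteroid's absolute magnitude and its distances from the Sun and Earth. The rough Python sketch below estimates Vesta's brightness at a typical opposition; the absolute magnitude (H about 3.2) and distances are approximate illustrative values, and the phase-angle correction is ignored, so the result is only a ballpark figure rather than a precise ephemeris value.

```python
import math

def apparent_magnitude(h, r_sun_au, r_earth_au):
    """Approximate apparent magnitude, ignoring the phase-angle correction."""
    return h + 5 * math.log10(r_sun_au * r_earth_au)

# Illustrative values for 4 Vesta near a typical opposition:
# absolute magnitude H ~ 3.2, about 2.4 AU from the Sun and 1.4 AU from Earth.
m = apparent_magnitude(3.2, 2.4, 1.4)
print(f"Vesta at opposition: magnitude {m:.1f}")  # roughly 5.8, naked-eye under dark skies
```

The same relation explains why more distant, darker outer-belt bodies stay much fainter: both distance factors grow and the higher absolute magnitude (lower albedo) adds to the total.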
The following asteroids can all reach an apparent magnitude brighter than or equal to the +8.3 attained by Saturn's moon Titan at its brightest, which was discovered 145 years before the first asteroid was found owing to its closeness to the easily observed Saturn. None of the asteroids in the outer part of the asteroid belt can ever attain this brightness. Even Hygiea and Interamnia rarely reach magnitudes of above 10.0. This is due to the different distributions of spectral types within different sections of the asteroid belt: the highest-albedo asteroids are all concentrated closer to the orbit of Mars, and much lower albedo C and D types are common in the outer belt. Those asteroids with very high eccentricities will only reach their maximum magnitude rarely, when their perihelion is very close to a heliocentric conjunction with Earth, or (in the case of 99942 Apophis, , , and 367943 Duende) when the asteroid passes very close to Earth. * Apophis will only achieve that brightness on April 13, 2029. It typically has an apparent magnitude of 20–22. Slowest rotators This list contains the slowest-rotating known minor planets with a period of at least 1000 hours, or 41 days, while most bodies have rotation periods between 2 and 20 hours. Also see Potentially slow rotators for minor planets with an insufficiently accurate period (). Fastest rotators This list contains the fastest-rotating minor planets with a period of less than 100 seconds, or 0.027 hours. Bodies with a highly uncertain period, having a quality of less than 2, are highlighted in dark-grey. The fastest rotating bodies are all unnumbered near-Earth objects (NEOs) with a diameter of less than 100 meters (see table). Among the numbered minor planets with an unambiguous period solution are , a 60-meter sized stony NEO with a period of 352 seconds, as well as and , two main-belt asteroids, with a diameter of 0.86 and 2.25 kilometers and a period of 1.29 and 1.95 hours, respectively (see full list). Orbital characteristics Retrograde Minor planets with orbital inclinations greater than 90° (the greatest possible is 180°) orbit in a retrograde direction. , of the near-800,000 minor planets known, there are only 99 known retrograde minor planets (0.01% of total minor planets known). In comparison, there are over 2,000 comets with retrograde orbits. This makes retrograde minor planets the rarest group of all. High-inclination asteroids are either Mars-crossers (possibly in the process of being ejected from the Solar System) or damocloids. Some of these are temporarily captured in retrograde resonance with the gas giants. the value given when the number of observations is multiplied by the observation arc; larger values are generally better than smaller values depending on residuals. Highly inclined Trojans Earth trojans: . Mars trojans: , 5261 Eureka, , , , , , and the candidate . Jupiter trojans: the first one was discovered in 1906, 588 Achilles, and the current total is over 6,000. 
Record-setting close approaches to Earth Viewed in detail Spacecraft targets Surface resolved by telescope or lightcurve 1 Ceres 2 Pallas 3 Juno 4 Vesta 5 Astraea 6 Hebe 7 Iris 8 Flora 9 Metis 10 Hygiea Koronis family 12 Victoria 13 Egeria 14 Irene 15 Eunomia 16 Psyche 18 Melpomene 26 Proserpina 29 Amphitrite 35 Leukothea 37 Fides 51 Nemausa 52 Europa 65 Cybele 87 Sylvia 89 Julia 121 Hermione 130 Elektra 201 Penelope 216 Kleopatra 324 Bamberga 511 Davida 925 Alphonsina 1140 Crimea 9969 Braille (33342) 1998 WT24 66391 Moshup (136617) 1994 CC (285263) 1998 QE2 (357439) 2004 BL86 Multiple systems resolved by telescope 90 Antiope Comet-like activity 2006 VW139 P/2013 P5 Disintegration 6478 Gault P/2013 R3 Timeline Landmark asteroids Numbered minor planets that are also comets The above table lists only numbered asteroids that are also comets. Note there are several cases where a non-numbered minor planets turned out to be a comet, e.g. C/2001 OG108 (LONEOS), which was provisionally designated . Minor planets that were misnamed and renamed In earlier times, before the modern numbering and naming rules were in effect, asteroids were sometimes given numbers and names before their orbits were precisely known. And in a few cases duplicate names were given to the same object (with modern use of computers to calculate and compare orbits with old recorded positions, this type of error no longer occurs). This led to a few cases where asteroids had to be renamed. Landmark names Asteroids were originally named after female mythological figures. Over time the rules loosened. First asteroid with non-Classical and non-Latinized name: 64 Angelina (in honor of a research station) First asteroid with a non-feminine name: 139 Juewa (ambiguous) or 141 Lumen First asteroid with a non-feminized man's name: 903 Nealley Lowest-numbered unnamed asteroid (): Landmark numbers Many landmark numbers had specially chosen names for asteroids, and there was some debate about whether Pluto should have received number 10000, for example. This list includes some non-asteroids. See also Asteroid mining Asteroid Redirect Mission (proposed NASA mission) Centaur (minor planet) List of numbered Aten asteroids List of Amor asteroids List of Apollo asteroids List of asteroids named after people List of asteroids named after places List of instrument-resolved minor planets List of meteor air bursts List of minor planet moons List of Venus-crossing minor planets List of Earth-crossing minor planets List of Jupiter-crossing minor planets List of Mars-crossing minor planets List of Mercury-crossing minor planets List of Neptune-crossing minor planets List of Saturn-crossing minor planets List of Solar System objects by size List of Uranus-crossing minor planets Lists of astronomical objects Scattered disc object Small Solar System body ʻOumuamua Books Dictionary of Minor Planet Names, 5th ed.: Prepared on Behalf of Commission 20 Under the Auspices of the International Astronomical Union, Lutz D. Schmadel, References External links Lists and plots: Minor Planets PDS Asteroid Data Archive SBN Small Bodies Data Archive NASA Near Earth Object Program Major News About Minor Objects Latest News About Asteroids & Meteorites Exceptional asteroids
3342298
https://en.wikipedia.org/wiki/Perpetual%20beta
Perpetual beta
Perpetual beta is the keeping of software or a system at the beta development stage for an extended or indefinite period of time. It is often used by developers when they continue to release new features that might not be fully tested. Perpetual beta software is not recommended for mission critical machines. However, many operational systems find this to be a much more rapid and agile approach to development, staging, and deployment. Definition Perpetual beta has come to be associated with the development and release of a service in which constant updates are the foundation for the habitability or usability of a service. According to publisher and open source advocate Tim O'Reilly: "Users must be treated as co-developers, in a reflection of open source development practices (even if the software in question is unlikely to be released under an open source license.) The open source dictum, 'release early and release often', in fact has morphed into an even more radical position, 'the perpetual beta', in which the product is developed in the open, with new features slipstreamed in on a monthly, weekly, or even daily basis. It's no accident that services such as Gmail, Google Maps, Flickr, del.icio.us, and the like may be expected to bear a 'Beta' logo for years at a time." Used in the larger conversation of what defines Web 2.0, O'Reilly described the concept of perpetual beta as part of a customized Internet environment with these applications as distinguishing characteristics: Services, not packaged software, with cost-effective scalability Control over unique, hard-to-recreate data sources that get richer as more people use them Trusting users as co-developers Harnessing collective intelligence Leveraging the long tail through customer self-service Software above the level of a single device Lightweight user interfaces, development models, and business models. See also Continuous improvement References Web 2.0 neologisms Software release
8503981
https://en.wikipedia.org/wiki/Software%20modernization
Software modernization
Legacy modernization, also known as software modernization or platform modernization, refers to the conversion, rewriting or porting of a legacy system to modern computer programming languages, architectures (e.g. microservices), software libraries, protocols or hardware platforms. Legacy transformation aims to retain and extend the value of the legacy investment through migration to new platforms, to benefit from the advantages of the new technologies. Strategies Making software modernization decisions is a process that takes place within some organizational context. "Real world" decision making in business organizations often has to be based on "bounded rationality". Besides that, there exist multiple (and possibly conflicting) decision criteria; the certainty, completeness, and availability of useful information (as a basis for the decision) is often limited. Legacy system modernization is often a large, multi-year project. Because these legacy systems are often critical in the operations of most enterprises, deploying the modernized system all at once introduces an unacceptable level of operational risk. As a result, legacy systems are typically modernized incrementally. Initially, the system consists completely of legacy code. As each increment is completed, the percentage of legacy code decreases. Eventually, the system is completely modernized. A migration strategy must ensure that the system remains fully functional during the modernization effort. Modernization strategies There are different drivers and strategies for software modernization: Architecture Driven Modernization (ADM) is the initiative to standardize views of the existing systems in order to enable common modernization activities like code analysis and comprehension, and software transformation. Business-Focus Approach: The modernization strategy is tied to the business value added by the modernization. It implies defining the intersection of the criticality to the business of an application with its technical quality. This approach, pushed by Gartner, puts Application Portfolio Analysis (APA) as a prerequisite of modernization decisions for an application portfolio: it measures software health, risks, complexity and cost, providing insight into application strengths and weaknesses. Model Driven Engineering (MDE) is being investigated as an approach for reverse engineering and then forward engineering software code. Renaissance Method for iteratively evaluating legacy systems, from technical, business, and organizational perspectives. WMU (Warrants, Maintenance, Upgrade) is a model for choosing appropriate maintenance strategies based on the desired customer satisfaction level and their effects on it. Modernization risk management Software modernization is a risky, difficult, long, and highly intellectual process involving multiple stakeholders. The software modernization tasks are supported by various tools related to Model-driven architecture from the Object Management Group and processes such as ISO/IEC 14764:2006 or the Service-Oriented Migration and Reuse Technique (SMART). Software modernization implies various manual and automated tasks performed by specialized knowledge workers. Tools support project participants' tasks and help organize the collaboration and sequencing of the work. A general software modernization management approach that takes risks (both technological and business objectives) explicitly into account consists of: Analyse the existing portfolio: measuring the technical quality and business value.
Confronting the technical quality with business goals to define the right strategy: replace, no go, low priority, good candidate. Identify stakeholders: all persons involved in the software modernization: developers, testers, customers, end-users, architects, … Understand the requirements: requirements are divided into 4 categories: user, system, constraints and nonfunctional. Create the Business Case: the business case supports the decision process in considering different approaches when decision makers need it. Understand the system to be modernized: this is a critical step, as software documentation is rarely up-to-date and projects are made by numerous teams, both internal and external, and usually out of sight for a long time. Extracting the content of the application and its architecture design helps reason about the system. Understand and evaluate the target technology: this allows comparing and contrasting technologies and capabilities against the requirements and the existing system. Define the modernization strategy: the strategy defines the transformation process. This strategy must accommodate changes happening during the modernization process (technology changes, additional knowledge, requirement evolution). Reconcile the strategy with stakeholder needs: the stakeholders involved may have varying opinions on what is important and what is the best way to proceed. It is important to reach a consensus between stakeholders. Estimate resources: once the previous steps are defined, costs can be evaluated. This enables management to determine whether the modernization strategy is feasible given the available resources and constraints. Modernization costs Softcalc (Sneed, 1995a) is a model and tool for estimating the costs of incoming maintenance requests, developed based on COCOMO and FPA. EMEE (Early Maintenance Effort Estimation) is a new approach for quick maintenance effort estimation before starting the actual maintenance. RENAISSANCE is a method to support system evolution by first recovering a stable basis using reengineering, and subsequently continuously improving the system by a stream of incremental changes. The approach integrates successfully with different project management processes. Challenges in legacy modernization Primary issues with a legacy system include very old systems with a lack of documentation, a lack of SMEs/knowledge of the legacy systems, and a dearth of skills in the technologies in which the legacy systems have been implemented. Typical legacy systems have been in existence for more than two decades. Migrating is fraught with challenges: Lack of visibility across large application portfolios – Large IT organizations have hundreds, if not thousands, of software systems. Technology and functional knowledge are by nature distributed, diluted, and opaque. No central point of visibility for senior management and Enterprise Architects is a top issue – it is challenging to make modernization decisions about software systems without having the necessary quantitative and qualitative data about these systems across the enterprise. Organizational change management – Users must be re-trained and equipped to use and understand the new applications and platforms effectively. Coexistence of legacy and new systems – Organizations with a large footprint of legacy systems cannot migrate at once. A phased modernization approach needs to be adopted.
However, this brings its own set of challenges, such as providing complete business coverage with well-understood and well-implemented overlapping functionality, data duplication, and the throw-away systems needed to bridge the legacy and new systems during the interim phases. Poor management of structural quality (see software quality), resulting in a modernized application that carries more security, reliability, performance and maintainability issues than the original system. Significant modernization costs and duration - Modernization of a complex mission-critical legacy system may need large investments, and the duration of having a fully running modernized system could run into years, not to mention unforeseen uncertainties in the process. Stakeholder commitment - Main organization stakeholders must be convinced of the investment being made for modernization, since the benefits and an immediate ROI may not be visible as compared to the modernization costs being invested. Software Composition – It is extremely rare that developers create 100% original code in anything built after 2010. They are often using 3rd-party and open source frameworks and software components to gain efficiency, speed, and reusability. This introduces two risks: 1.) vulnerabilities within the 3rd-party code, and 2.) open source licensing risk. Last but not least, there is no single one-size-fits-all option in modernization. With a multitude of commercial and bespoke options available for modernization, it is critical for the customers, the sellers and the executors to understand the intricacies of the various modernization techniques, their best applicable implementations, their suitability in a particular context, and the best practices to follow before selecting the right modernization approach.
Business logic is preserved as the application and data are migrated into the open environment. This option only needs the replacement of middleware, hardware, operating system, and database. This is often used as an intermediate step to eliminate legacy and expensive hardware. The most common examples include mainframe applications being rehosted on a UNIX or Wintel platform. Package implementation: Replacement of legacy applications, in whole or part, with off-the-shelf software (COTS) such as ERP, CRM, SCM, billing software etc. Legacy code is any application based on older technologies and hardware, such as mainframes, that continues to provide core services to an organization. Legacy applications are frequently large and difficult to modify, and scrapping or replacing them often means re-engineering an organization's business processes as well. However, more and more applications that were written in so-called modern languages like Java are becoming legacy. Whereas 'legacy' languages such as COBOL top the list of what would be considered legacy, software written in newer languages can be just as monolithic, hard to modify, and thus be a candidate for modernization projects. Re-implementing applications on new platforms in this way can reduce operational costs, and the additional capabilities of new technologies can provide access to functions such as web services and integrated development environments. Once transformation is complete and functional equivalence has been reached, the applications can be aligned more closely to current and future business needs through the addition of new functionality to the transformed application. The recent development of new technologies such as program transformation by software modernization enterprises has made the legacy transformation process a cost-effective and accurate way to preserve legacy investments and thereby avoid the costs and business impact of migration to entirely new software. The goal of legacy transformation is to retain the value of the legacy asset on the new platform. In practice this transformation can take several forms. For example, it might involve translation of the source code, or some level of re-use of existing code plus a Web-to-host capability to provide the customer access required by the business. If a rewrite is necessary, then the existing business rules can be extracted to form part of the statement of requirements for a rewrite. Software migration Software migration is the process of moving from the use of one operating environment to another operating environment that is, in most cases, thought to be a better one. For example, moving from Windows NT Server to Windows 2000 Server would usually be considered a migration because it involves making sure that new features are exploited, old settings do not require changing, and taking steps to ensure that current applications continue to work in the new environment. Migration could also mean moving from Windows NT to a UNIX-based operating system (or the reverse). Migration can involve moving to new hardware, new software, or both. Migration can be small-scale, such as migrating a single system, or large-scale, involving many systems, new applications, or a redesigned network. One can migrate data from one kind of database to another kind of database. This usually requires converting the data into some common format that can be output from the old database and input into the new database.
Since the new database may be organized differently, it may be necessary to write a program that can process the migrating files. When a software migration reaches functional equivalence, the migrated application can be aligned more closely to current and future business needs through the addition of new functionality to the transformed application. The migration of installed software from an old PC to a new PC can be done with a software migration tool. Migration is also used to refer simply to the process of moving data from one storage device to another. Articles, papers and books Creating reusable software Due to the evolution of technology today some companies or groups of people don’t know the importance of legacy systems. Some of their functions are too important to be left unused, and too expensive to reproduce again. The software industry and researchers have recently paid more attention towards component-based software development to enhance productivity and accelerate time to market. Risk-managed modernization In general, three classes of information system technology are of interest in legacy system modernization: Technologies used to construct the legacy systems, including the languages and database systems. Modern technologies, which often represent nirvana to those mired in decades-old technology and which hold (the often unfulfilled) promise of powerful, effective, easily maintained enterprise information systems. Technologies offered by the legacy system vendors – These technologies provide an upgrade path for those too timid or wise to jump head-first into the latest wave of IT offerings. Legacy system vendors offer these technologies for one simple reason: to provide an upgrade path for system modernization that does not necessitate leaving the comfort of the “mainframe womb.” Although these technologies can provide a smoother road toward a modern system, they often result in an acceptable solution that falls short of the ideal. See also System migration Data migration References Software maintenance
290752
https://en.wikipedia.org/wiki/JumpStart%20Games
JumpStart Games
JumpStart Games, Inc., formerly Knowledge Adventure, Inc., is an American edutainment video game company based in Torrance, California. Founded in 1991, it owns the Neopets virtual pet website, and is itself owned by Chinese holding company NetDragon Websoft. History Until 1994, Knowledge Adventure had created DOS games, including Knowledge Adventure The Game, Isaac Asimov's Science Adventure, Space Adventure, Mario Teaches Typing, Mario Is Missing!, San Diego Zoo Presents: The Animals!, Dinosaur Adventure, The Tale of Peter Rabbit, Mario's Time Machine, Mario's Early Years! Fun with Letters, Mario's Early Years! Fun with Numbers, Imax's Speed, Undersea Adventure, 3D Dinosaur Adventure, Isaac Asimov's Science Adventure II, Kid's Zoo: A Baby Animal Adventure, 3D Body Adventure, Space Adventure II, Aviation Adventure, America Adventure, Bug Adventure, Imax's The Discoverers, Mario's Early Years! Preschool Fun, Magic Theater, My First Encyclopedia, Zurk's Learning Safari, Zurk's Rainforest Lab, Zurk's Alaskan Trek, Mario's Fundamentals, Mario's Early Years! Kindergarten Fun, Pyramid: Challenge of the Pharaoh's Dream, Chess Mates, Bricks the Ultimate Construction Toy!, Flipper, Drawing Discoveries, Mario Teaches Typing 2, Kid Keys: The Magical Typing Tutor, Kid Pilots, Dinosaur Adventure 3-D, Lionel Trains Presents: Trans-Con!. On November 5, 1996, CUC International announced that it would acquire Knowledge Adventure; the acquisition was completed on February 3, 1997. Davidson & Associates, a subsidiary that CUC had acquired in February 1996, would later merge with Knowledge Adventure in October 1998. On May 28, 1997, CUC International announced plans to merge with Hospitality Franchise Systems to create a single, "one-stop" entity. The merger was finalized in December that year and created Cendant. As a result of the merger, CUC Software was renamed Cendant Software. On November 20, 1998, French media company Havas (later acquired by water utility Vivendi) announced that it would acquire Cendant Software for in cash and up to contingent on the performance of Cendant Software. Subsequently, the division was renamed Havas Interactive. During that time, Knowledge Adventure released many branded games such as JumpStart, Dr. Brain, Fisher-Price, Barbie, Bear in the Big Blue House, Blaster, Teletubbies, Noddy, Jurassic Park III, Captain Kangaroo, Curious George and American Idol. In October 2004, Vivendi sold Knowledge Adventure to a group of investors interested in taking a more active management strategy, and in developing new educational software. The company has since released new products under both the JumpStart and Math Blaster brands. In October 2012, Knowledge Adventure changed its name to JumpStart Games. On March 17, 2014, JumpStart Games purchased Neopets from Viacom. On July 7, 2017, JumpStart Games was acquired by Chinese online game publisher NetDragon Websoft. Back-catalog digital re-releases On November 25, 2014, five Knowledge Adventure titles were re-released digitally as DRM-free exclusives on ZOOM-Platform.com through a partnership between JumpStart Games and the Jordan Freeman Group. The five titles included 3D Body Adventure, 3D Dinosaur Adventure, Dinosaur Adventure (Original), Space Adventure, and Undersea Adventure. On March 6, 2015, another Knowledge Adventure title, Bug Adventure, was re-released digitally as a DRM-free exclusive on ZOOM-Platform.com. This title was also released through the partnership between JumpStart and the Jordan Freeman Group.
ZOOM-Platform.com indicated the game was released due to the "incredible reaction" they got to the first batch of Knowledge Adventure titles. References Educational software companies Software companies based in California Video game companies of the United States Video game companies established in 1991 Video game development companies Companies based in Torrance, California Former Vivendi subsidiaries
198856
https://en.wikipedia.org/wiki/Lego%20Mindstorms
Lego Mindstorms
Lego Mindstorms is a hardware and software structure which is produced by Lego for the development of programmable robots based on Lego building blocks. Each version of the system includes a computer Lego brick that controls the system, a set of modular sensors and motors, and Lego parts from the Technic line to create the mechanical systems. While originally conceptualized and launched as a tool for supporting educational constructivism, Mindstorms would go on to become the first home robotics kit available to a wide audience and quickly developed a community of adult hobbyists and hackers following the product's launch in 1998. There have been five generations of the Mindstorms platform: the original Robotics Invention System, NXT, NXT 2.0, EV3, and the Robot Inventor kit. With each platform release, the motor and sensor capabilities were expanded. The next-to-last system, Lego Mindstorms EV3, was released on 1 September 2013. Some robot competitions used this set, such as the FIRST Lego League (until 2021) and the World Robot Olympiad. Pre-Mindstorms Background In 1985 Seymour Papert, Mitchel Resnick and Stephen Ocko created a company called Microworlds with the intent of developing a construction kit that could be animated by computers for educational purposes. Papert had previously created the Logo programming language as a tool to "support the development of new ways of thinking and learning", and employed "Turtle" robots to physically act out the programs in the real world. As the types of programs created were limited by the shape of the Turtle, the idea arose to make a construction kit that could use Logo commands to animate a creation of the learner's own design. Similar to the "floor turtle" robots used to demonstrate Logo commands in the real world, a construction system that ran Logo commands would also demonstrate them in the real world, but allowing the child to construct their own creations benefitted the learning experience by putting them in control. In considering which construction system to partner with, they wanted a "low floor, high ceiling" approach: something that was easy to pick up but very powerful. To this end, they decided to use LEGO bricks due to the system and diversity of pieces, and the Logo language due to the group's familiarity with the software and its ease of use. LEGO was receptive to collaboration, particularly because its educational division had founding goals very similar to those of the Microworlds company. The collaboration very quickly moved to the newly minted MIT Media Lab, where there was an open sharing of ideas. As a sponsor of the entire lab, LEGO was allowed royalty-free rights to mass-produce any technology produced by Papert, Resnick and Ocko's group, and was also allowed to send an employee over to assist with research, so they sent engineer Alan Tofte (also spelled Toft), who helped with the design of the programmable brick. As another part of the MIT Media Lab's mission was community outreach, the bricks would be used in work with children in schools for both research and educational purposes. LEGO/Logo, lego tc Logo (1985) The first experiments in combining LEGO and the Logo programming language were called LEGO/Logo, and they started in 1985. Similar to the "floor turtles" used to demonstrate Logo commands in the real world, LEGO/Logo used Logo commands to animate Lego creations.
It was important that children could build their own machines to program, as they would then care more about their projects and be more willing to explore the mathematical concepts involved in making them move. The LEGO/Logo system allowed children to create their own designs and experiments, offered multiple paths for learning and encouraged a sense of community. First, machines are built out of LEGO. The machines are then connected to a computer and programmed in a modified version of Logo. The LEGO/Logo system introduced new types of parts for making creations, such as motors, sensors and lights. The motors and sensors are connected to an interface box which communicates with a computer. LEGO/Logo would later be commercialized by the LEGO Group as LEGO tc Logo. It was observed that, using the LEGO/Logo system, children developed a form of knowledge about the physical world that allowed even those without mathematics or verbal skills to solve problems effectively using the system. Logo Brick 1st Generation, "Grey Brick" (1986) While LEGO/Logo was powerful, it was restricted somewhat by the requirement to have the creations attached to a computer. The group began working on further iterations of the LEGO/Logo environment to produce a robot that could interact not only with the environment but with other robots programmed in the same system. The experiments with an untethered brick (called the Logo Brick or "Grey Brick") began in the fall of 1986. To speed up the design process, the Logo Brick contained the processor chip from an Apple II computer. It ran an adapted version of LEGO/Logo written for the Apple II computer. The LEGO/Logo interface box, the group's previous development, had only two sensor ports available, which the design team observed were not always enough. To address this, they gave the Logo Brick four sensor ports. The Logo Brick was made out of a modified LEGO battery box and was about the size of a deck of cards. The Logo Brick was tested in schools. LEGO Mindstorms and RCX (1996) Development While LEGO had been interested in mass-producing the programmable brick concept for a while, they had to wait until enough people owned personal computers and the components required to produce the intelligent brick went down in price. Development of what would later be known as LEGO Mindstorms started in 1996 as the first product of the newly created home-learning division of LEGO Education (LEGO Dacta). The product's name, "Mindstorms", was intended to express the user experience of the product; it is named after Papert's book Mindstorms, as the user experience was similar to the educational constructivism concepts described in his book. The LEGO home education team used the insights that MIT researchers discovered from testing the 3rd Generation Logo Brick ("Red Brick") in schools as the basis for the development of the mass-produced programmable brick. The physical programmable brick was re-engineered from the ground up, as the experimental programmable bricks were not designed for robustness or cost-effective manufacturing. The programming language of the product was developed with help from members of the MIT Media Lab. LEGO decided to use a visual programming language for Mindstorms, inspired by the LOGOBlocks language previously used with the programmable brick experiments, in order to make the product accessible to children who might be unfamiliar with programming.
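For readers unfamiliar with the Logo "turtle" style of commands that this history keeps returning to, Python's standard turtle module, itself a descendant of Logo turtle graphics, gives a feel for it: short imperative commands move a turtle that draws as it goes, much as the floor turtles and LEGO/Logo machines acted commands out physically. This is only an analogy offered for illustration, not the language used by any Mindstorms product.

```python
import turtle

t = turtle.Turtle()

# Logo-style commands: drive forward, turn, repeat - here tracing a square.
for _ in range(4):
    t.forward(100)   # move 100 units in the current heading
    t.left(90)       # rotate 90 degrees counter-clockwise

turtle.done()        # keep the drawing window open
```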
While the technology that Mindstorms was based on was aimed towards "all children", the chosen target demographic of LEGO Mindstorms was intentionally narrow, in order to garner positive press by outselling expectations. The decision was made to aim the product towards 10-to-14-year-old boys, partially because it was LEGO's bread-and-butter demographic, and partially based on market research (not substantiated by the findings of the MIT Media Lab) which concluded that this demographic would be most attracted to computerized toys. This choice of target demographic directly informed the color of the RCX brick (which was made yellow and black to resemble construction equipment) and the sample uses for the Mindstorms kit (such as making autonomous robots). The project's initially low profile allowed the Mindstorms team the freedom to develop the product using operating procedures that were then unorthodox for the LEGO Group. Unlike traditional LEGO sets, the Mindstorms Robotics Invention System did not have a main model, nor was the play driven by storytelling. To bridge the gap between this new play experience and pre-existing LEGO ones, the Mindstorms team created many opportunities for people interested in the product to engage with each other, such as the creation of Mindstorms.com, Mindstorms Discovery Centers, and the FIRST Lego League. The creation of these experiences was done through partnerships with a relatively large number of external groups that the Mindstorms team interacted with as equal partners, something that was uncommon for the LEGO Group at the time. To ease tensions between Mindstorms and more conventional products, the project team was given autonomy from LEGO's product development process and instead reported directly to the company's senior management. Promotion of the LEGO Mindstorms Robotics Invention System began 6 months before the product was planned to launch. The product was first soft-launched with the opening of the Mindstorms Discovery Center at the Museum of Science and Industry, where children could interact with the Mindstorms Robotics Invention System to complete set tasks, familiarizing them with the product. The Mindstorms product was launched concurrently with LEGO Cybermaster, another LEGO product spun off from the MIT programmable brick technology that was more in line with the traditional product philosophies of the LEGO Group. Instead of being sold at toy stores, the product was sold at electronics stores like Best Buy and CompUSA, due to the relatively high cost of the set. Launch LEGO Mindstorms was released in 1998 at a retail price of $199. The entire production run (between 60,000 and 100,000 units) sold out within 3 months. Despite being aimed towards children, the kit quickly found an audience with adults and hackers of all ages; Lego company surveys had determined that seventy percent of Lego Mindstorms hobbyists were adults. Shortly following the product's launch, hobbyists began sharing reverse-engineered versions of the RCX brick's microcode and firmware on the internet, leading to the development of alternative programming languages for the RCX such as "Not Quite C" (NQC) and alternative operating systems for the brick like legOS. The Lego Group was surprised by the product's embrace by adult hobbyists, and was not sure how to respond to the sharing of proprietary code.
The Mindstorms team would determine that the embrace of the product by the hacking community proved that the product was worth developing. In order to foster this burgeoning community, an official forum was established on the Lego website, and a "right to hack" clause was added to the end-user license agreement of the Lego Mindstorms software. Robotics Discovery Set and Droid/Darkside Developer Kit The Robotics Discovery Set was a more affordable and simpler package than the Robotics Invention System. Instead of being based on the RCX, it had its own programmable brick called the Scout. An even simpler version of the Scout would be featured in two Star-Wars-themed Mindstorms sets as well. Scout Lego also released a blue computer called the Scout, which has 2 sensor ports, 2 motor ports (plus one extra if linked with a Micro Scout using a fiber optic cable), and a built-in light sensor, but no PC interface. It comes with the Robotics Discovery Set. The Scout can be programmed from a collection of built-in program combinations. In order to program the Scout, a user must enable "power mode" on it. The Scout can store one program. The Scout is based on a Toshiba microcontroller with 32 KB of ROM and 1 KB of RAM, of which about 400 bytes are available for user programs. Due to the extremely limited amount of RAM, many predefined subroutines were provided in ROM. The Scout only supports passive external sensors, which means that only touch, temperature and other unpowered sensors can be used. The analog-to-digital converters used in the Scout only have a resolution of 8 bits, in contrast to the 10-bit converters of the RCX. There was a plan for Lego to create a booster set that would allow users to program the Scout from a computer with software such as RCX Code. However, due to the complexity of this project, it was abandoned. The RCX can control the Scout brick using the "Send IR Message" program block. The RCX does all of the controlling, and therefore can be programmed with the PC, while the Scout accepts commands. The Scout brick must have all of its options set to "off" during this process. Micro Scout The Micro Scout was added as an entry-level option for Lego robotics. It is a very limited Pbrick with a single built-in light sensor and a single built-in motor. It has seven built-in programs and can be controlled by a Scout, Spybotics or RCX unit using VLL. Like the Scout, the Micro Scout is also based on a microcontroller from Toshiba. The unit was sold as part of the Droid Developer Kit (featuring R2-D2) and later the Darkside Developer Kit (featuring an AT-AT Imperial Walker). Robotics Invention System The main core of the first generation of Mindstorms sets was the Robotics Invention System sets. These were based around the RCX (Robotic Command eXplorers) brick and the 9 V LEGO Technic peripherals available at the time. The set also includes three touch sensors and an optical sensor, using the technology from the earlier 9 V sensors from the pre-Mindstorms sets. RCX The RCX is based on the 8-bit Renesas H8/300 microcontroller, including 32 KB of ROM for low-level IO functions, along with 32 KB of RAM to store high-level firmware and user programs. The RCX is programmed by uploading a program using a dedicated infrared interface. After the user uploads a program, the RCX can run it on its own without the need for computer access. Programs may make use of three sensor input ports and three 9 V output ports, in addition to the IR interface, enabling several RCX bricks to communicate.
A built-in LCD can display the battery level, the status of the input/output ports, which program is selected or running, and other information. Version 1.0 RCX bricks feature a power adapter jack in addition to batteries. In version 2.0 (as well as later 1.0s included in the RIS 1.5), the power adapter jack was removed. Power adapter-equipped RCX bricks were popular for stationary robotics projects (such as robot arms) or for controlling Lego model trains. In the latter context, the RCX might be programmed with Digital Command Control (DCC) software to operate multiple wired trains. The IR interface on the RCX is able to communicate with Spybots, Scout bricks, Lego Trains, and the NXT (using a third-party infrared link sensor). The RCX 1.0 IR receiver carrier frequency is 38.5 kHz, while the RCX 2.0 IR carrier frequency is 76 kHz. Both versions can transmit on either frequency. The RCX communicates with a computer using a serial or USB IR tower. As the RCX has been discontinued, support for the interface is limited on operating systems more recent than Windows XP. All RCX versions have a unique number printed on them, which could be registered on the now-defunct Lego Mindstorms RCX website. This was necessary to obtain technical support. The first RCX produced is marked "000001," and was on display at the Mindstorms 10th Anniversary event. The Lego RCX was available in new sets from 1998 (Lego Set 9719: Robotics Invention System 1.0) through 2003 (Lego Set 9786: Robo Technology Set, with USB cable). The original RCX 1.0 worked with existing Lego power supply products from the Lego Train theme, Lego Product 70931: Electric Train Speed Regulator 9V Power Adaptor for 120v 60Hz - US version (Years: 1991 through 2004), Lego Product 70938: Electric Train Speed Regulator 9V Power Adaptor for 230v 50Hz - European version (Years: 1991 through 1996). Both of these products converted wall power to 12VAC, through a coaxial power connector (also called a "barrel connector"), 5.5 mm outside, 2.1 mm inside. These were sometimes sold alone and sometimes available as part of other sets such as Lego Set 4563: Load N' Haul Railroad (Year: 1991) and Lego Set 10132: Motorized Hogwarts Express (Year: 2004). Lego Mindstorms NXT Lego Mindstorms NXT was a programmable robotics kit released by Lego in July 2006, replacing the first-generation LEGO Mindstorms kit. The kit consists of 577 pieces, including: 3 servo motors, 4 sensors (ultrasonic, sound, touch, and light), 7 connection cables, a USB interface cable, and the NXT Intelligent Brick. The Intelligent Brick is the "brain" of a Mindstorms machine. It lets the robot autonomously perform different operations. The kit also includes NXT-G, a graphical programming environment that enables the creation and downloading of programs to the NXT. The software also has instructions for 4 robots: Alpha-Rex (a humanoid), Tri-Bot (a car), Robo-Arm T-56 (a robotic arm), and Spike (a scorpion). Lego Mindstorms NXT 2.0 The Lego Mindstorms NXT 2.0 was launched on 5 August 2009. It contains 619 pieces (including sensors and motors), two Touch Sensors, an Ultrasonic Sensor, and introduced a new Color Sensor. The NXT 2.0 uses floating-point operations, whereas earlier versions use integer operations. The kit costs around US$280. Lego Mindstorms EV3 The Lego Mindstorms EV3 is the third-generation Lego Mindstorms product. EV3 is a further development of the NXT. The system was released on 1 September 2013.
The LEGO MINDSTORMS EV3 set includes motors (2 large servo motors and 1 medium servo motor), sensors (2 touch sensors, ultrasonic sensor, color sensor, infrared sensor, and the new gyro sensor), the EV3 programmable brick, 550+ LEGO Technic elements and a remote control (the Infrared Beacon, which is only included in the Home/Retail version). The EV3 can be controlled by smart devices. It can boot an alternative operating system from a microSD card, which makes it possible to run ev3dev, a Debian-based operating system. Lego Education Spike Prime Spike Prime was announced in April 2019. While not part of the Mindstorms product line, the basic set includes three motors (1 large and 2 medium), sensors for distance, force and color, a controller brick based on an STM32F413 microcontroller, and 520+ LEGO Technic elements. Lego Mindstorms Robot Inventor Lego Mindstorms Robot Inventor was announced in June 2020 and released later in autumn. It has four medium motors from Spike Prime, two sensors (distance sensor and color/light sensor) also from Spike Prime, a Spike Prime hub with a six-axis gyroscope, an accelerometer, and support for controllers and phone control. It also has 902+ LEGO Technic elements. Programming languages Use in education Mindstorms kits are also sold and used as an educational tool, originally through a partnership between Lego and the MIT Media Laboratory. The educational version of the products is called Mindstorms for Schools or Mindstorms Education, and later versions come with the ROBOLAB GUI-based programming software, developed at Tufts University using the National Instruments LabVIEW as an engine. See also FIRST Lego League WRO (World Robot Olympiad) Robofest FIRST Tech Challenge RoboCup Junior WeDo 2.0 Big Trak iRobot Create Robotis Bioloid The Robotic Workshop Robotics suite C-STEM Studio Botball References Further reading Bagnall, Brian. Maximum LEGO NXT: Building Robots with Java Brains. Variant Press. 2007. . Bagnall, Brian. Core LEGO Mindstorms. Prentice-Hall PTR. 2002. . Baum, Dave. Definitive Guide to LEGO MINDSTORMS, 2nd ed. Apress. 2002. . Erwin, Benjamin. Creative Projects with LEGO Mindstorms (book and CD-ROM). Addison-Wesley. 2001. . Ferrari et al. Building Robots with LEGO Mindstorms: The Ultimate Tool for Mindstorms Maniacs. Syngress. 2001. . Gindling, J., A. Ioannidou, J. Loh, O. Lokkebo, and A. Repenning, "LEGOsheets: A Rule-Based Programming, Simulation and Manipulation Environment for the LEGO Programmable Brick", Proceedings of Visual Languages, Darmstadt, Germany, IEEE Computer Society Press, 1995, pp. 172–179. Breña Moral, Juan Antonio. Develop LeJOS programs Step by Step. External links Official LEGO Mindstorms 1998 in robotics Educational toys Electronic toys Embedded systems Products introduced in 1998 Robot kits
506063
https://en.wikipedia.org/wiki/Centrum%20Wiskunde%20%26%20Informatica
Centrum Wiskunde & Informatica
The Centrum Wiskunde & Informatica (abbr. CWI; English: "National Research Institute for Mathematics and Computer Science") is a research centre in the field of mathematics and theoretical computer science. It is part of the Netherlands Organisation for Scientific Research (NWO) and is located at the Amsterdam Science Park. This institute is famous as the creation site of the programming language Python. It was a founding member of the European Research Consortium for Informatics and Mathematics (ERCIM). Early history The institute was founded in 1946 by Johannes van der Corput, David van Dantzig, Jurjen Koksma, Hendrik Anthony Kramers, Marcel Minnaert and Jan Arnoldus Schouten. It was originally called Mathematical Centre (in Dutch: Mathematisch Centrum). One early mission was to develop mathematical prediction models to assist large Dutch engineering projects, such as the Delta Works. During this early period, the Mathematics Institute also helped with designing the wings of the Fokker F27 Friendship airplane, voted in 2006 as the most beautiful Dutch design of the 20th century. The computer science component developed soon after. Adriaan van Wijngaarden, considered the founder of computer science (or informatica) in the Netherlands, was the director of the institute for almost 20 years. Edsger Dijkstra did most of his early influential work on algorithms and formal methods at CWI. The first Dutch computers, the Electrologica X1 and Electrologica X8, were both designed at the centre, and Electrologica was created as a spinoff to manufacture the machines. In 1983, the name of the institute was changed to Centrum Wiskunde & Informatica (CWI) to reflect a governmental push for emphasizing computer science research in the Netherlands. Recent research The institute is known for its work in fields such as operations research, software engineering, information processing, and mathematical applications in life sciences and logistics. More recent examples of research results from CWI include the development of scheduling algorithms for the Dutch railway system (the Nederlandse Spoorwegen, one of the busiest rail networks in the world) and the development of the Python programming language by Guido van Rossum. Python has played an important role in the development of the Google search platform from the beginning, and it continues to do so as the system grows and evolves. Many information retrieval techniques used by packages such as SPSS were initially developed by Data Distilleries, a CWI spinoff. Work at the institute has been recognized by national and international research awards, such as the Lanchester Prize (awarded yearly by INFORMS), the Gödel Prize (awarded by ACM SIGACT) and the Spinoza Prize. Most of its senior researchers hold part-time professorships at other Dutch universities, with the institute producing over 170 full professors during the course of its history. Several CWI researchers have been recognized as members of the Royal Netherlands Academy of Arts and Sciences, the Academia Europaea, or as knights in the Order of the Netherlands Lion. In February 2017, CWI, in association with Google, announced a successful collision attack on the SHA-1 cryptographic hash function. European Internet CWI was an early user of the Internet in Europe, in the form of a TCP/IP connection to NSFNET. Piet Beertema at CWI established one of the first two connections outside the United States to the NSFNET (shortly after France's INRIA) for EUnet on 17 November 1988. The first Dutch country code top-level domain issued was cwi.nl.
The Amsterdam Internet Exchange (one of the largest Internet Exchanges in the world, in terms of both members and throughput traffic) is located at the neighbouring SARA (an early CWI spin-off) and NIKHEF institutes. The World Wide Web Consortium (W3C) office for the Benelux countries is located at CWI. Spin-off companies CWI has demonstrated a continuing effort to put the work of its researchers at the disposal of society, mainly by collaborating with commercial companies and creating spin-off businesses. In 2000 CWI established "CWI Incubator BV", a dedicated company with the aim to generate high tech spin-off companies. Some of the CWI spinoffs include: 1956: Electrologica, a pioneering Dutch computer manufacturer. 1971: SARA, founded as a center for data processing activities for Vrije Universiteit Amsterdam, Universiteit van Amsterdam, and the CWI. 1990: DigiCash, an electronic money corporation founded by David Chaum. 1994: NLnet, an Internet Service Provider. 1994: General Design / Satama Amsterdam, a design company, acquired by LBi (then Lost Boys international). 1995: Data Distilleries, developer of analytical database software aimed at information retrieval, eventually becoming part of SPSS and acquired by IBM. 1996: Stichting Internet Domeinregistratie Nederland (SIDN), the .nl top-level domain registrar. 2000: Software Improvement Group (SIG), a software improvement and legacy code analysis company. 2008: MonetDB, a high-tech database technology company, developer of the MonetDB column-store. 2008: Vectorwise, an analytical database technology company, founded in cooperation with the Ingres Corporation (now Actian) and eventually acquired by it. 2010: Spinque, a company providing search technology for information retrieval specialists. 2013: MonetDB Solutions, a database services company. 2016: Seita, a technology company providing demand response services for the energy sector. Software and languages ABC programming language Algol 60 Algol 68 Alma-0, a multi-paradigm computer programming language ASF+SDF Meta Environment, programming language specification and prototyping system, IDE generator Cascading Style Sheets MonetDB NetHack Python programming language RascalMPL, general purpose meta programming language RDFa SMIL van Wijngaarden grammar XForms XHTML XML Events Notable people Adrian Baddeley Theo Bemelmans Piet Beertema Jan Bergstra Gerrit Blaauw Peter Boncz Hugo Brandt Corstius Stefan Brands Andries Brouwer Harry Buhrman Dick Bulterman David Chaum Ronald Cramer Theodorus Dekker Edsger Dijkstra Constance van Eeden Peter van Emde Boas Richard D. Gill Jan Friso Groote Dick Grune Michiel Hazewinkel Jan Hemelrijk Martin L. Kersten Willem Klein Jurjen Ferdinand Koksma Kees Koster Monique Laurent Gerrit Lekkerkerker Arjen Lenstra Jan Karel Lenstra Gijsbert de Leve Barry Mailloux Massimo Marchiori Lambert Meertens Rob Mokken Albert Nijenhuis Steven Pemberton Herman te Riele Guido van Rossum Alexander Schrijver Jan H. van Schuppen Marc Stevens John Tromp John V. Tucker Paul Vitányi Hans van Vliet Marc Voorhoeve Adriaan van Wijngaarden Ronald de Wolf Peter Wynn References External links Amsterdam-Oost Computer science institutes in the Netherlands Edsger W. Dijkstra Mathematical institutes Members of the European Research Consortium for Informatics and Mathematics Organisations based in Amsterdam 1946 establishments in the Netherlands Research institutes in the Netherlands Science and technology in the Netherlands
21685506
https://en.wikipedia.org/wiki/The%20Linux%20Schools%20Project
The Linux Schools Project
The Linux Schools Project (formerly Karoshi, a Japanese word that literally means "death from overwork") is an operating system designed for schools. It is a Linux distribution based on Ubuntu. The project maintains two custom distributions, one designed for use on servers and the other for use with the server version on client machines. The server distribution is the official Karoshi, while the client is known as Karoshi Client. TLSP uses prepackaged GUI scripts to simplify the installation and configuration process for inexperienced users. History TLSP was originally developed using Red Hat in the early 2000s, with the aim of making Linux adoption easier for schools in the UK. Linux, at the time, was considered difficult to use in educational environments where computing expertise mainly came from teachers who were not dedicated IT staff. With version 5.1.x, TLSP moved to the PCLinuxOS platform, but has since adopted Ubuntu in its place. The current production version of TLSP is 12.1. Features TLSP is downloadable from its homepage. Installation requires an initial install of Ubuntu, which the live CD prompts the user to initiate. After the machine reboots following the Ubuntu installation, installation of the TLSP system starts automatically. Educational TLSP is primarily aimed at educational environments, but is also suitable for use in a small to medium enterprise (SME) business environment. The included systems are suitable for use as file and print, email, web and e-learning servers. Using these technologies, it is possible to administer a complete network with the integrated web tools and some form of remote desktop technology. Server Distribution Primary Domain Controller Capability The TLSP system is a scalable single- or multi-server system comprising many features. Chief among these is the ability to act as a Primary Domain Controller in a Windows network. TLSP uses built-in Samba and LDAP servers to store user, group and computer information, and emulates a Microsoft Windows NT 4.0 server system using these technologies, providing computer and user authentication, along with file and print services on the local network. TLSP creates a standard Windows domain for the local network, and names it linuxgrid. KiXtart TLSP uses KiXtart scripts to set up Windows XP clients on the domain, providing mandatory profiles to most users on the system. Roaming profiles can be used, but are not recommended, due to the heavy network overhead involved. Using mandatory profiles and folder redirection to mapped file shares on the server allows every user to store their own files in their "My Documents" folder. Servers TLSP includes the Moodle e-learning package, and several website content management systems, including Joomla! and Website Baker. eGroupWare and SquirrelMail are built into the system, allowing for full calendar and email facilities. These can be installed on a standalone machine in the DMZ, providing increased security for systems that are directly exposed to the internet. WPKG Particularly notable is the inclusion of WPKG, which enables the remote installation of software on Windows clients. By using a machine profile stored on the server, it is possible to install software packages, hotfixes, and security updates in the background.
WPKG is also very helpful for creating machine profiles, allowing a 'blank' Windows XP machine to be updated automatically to a particular WPKG profile once the machine is added to the domain. This type of technology can be compared to the group policy mechanism in Windows Server 2003, particularly from a machine administration perspective. It is by no means a replacement for group policy, but is a step in the right direction. Client Distribution The first version of Karoshi Client was based on PCLinuxOS. Further upgrades to the system as a whole led to the client using a modified version of Ubuntu 10.04 LTS with a GUI similar to the Microsoft Windows interface. The interface was designed to be fast, so as to run well on older hardware. In June 2012 work was started on Karoshi Client version 2, which would have an interface closer to Gnome 2 than Windows. Development of the client release was given to Robin McCorkell, a student at Dover Grammar School for Boys. On 21 July 2012 Karoshi Client 2 was uploaded to SourceForge.net. Technical Karoshi Client contains many applications which were deemed necessary for school work. Media production software (including music production, image manipulation, and video editing software) is included, along with programming tools and visualization software. Many IDEs are installed by default, mainly set up for use with Java, but also supporting C/C++ programming and other languages. The C++ compiler and standard libraries are installed by default, along with the Boost libraries, ncurses, and the Mesa libraries for OpenGL programming. The Java Development Kit is installed and integrated with the installed IDEs. Xfce is used as the desktop environment, with a customized theme and panel layout. The developer ported the Clearlooks GTK2 theme to GTK3 so that Gnome 3 applications like gEdit would display correctly. The panel layout is similar to the Gnome 2 environment. Compositing effects are enabled by default. The interface settings are locked down in the Xfce configuration files for suitability in a school environment, where children may try to change the settings. The KDE greeter for LightDM is used for the login screen, due to problems with KDM and Ubiquity. This version of Karoshi Client is more integrated with the server distribution than the previous client releases, with most of the custom configuration files pulled down from a primary domain controller at boot. A server patch that added the correct files for the client was released on 23 July 2012. Limitations It used to be difficult to integrate TLSP into an existing Windows network without changing the address space to the standard one used by the TLSP system; this was only a limitation in early versions and no longer applies. Future Plans Kerberos support is planned for the Karoshi server and client system, providing single sign-on to all services provided by the Karoshi distribution. This will be unfeasible until Samba 4 is released, due to the complexities surrounding integration of user resolution and file access across multiple operating systems, such as those that do not support the Active Directory protocols. Some integration has occurred already, with a working client system that authenticates using Kerberos and then authenticates successfully with Moodle, Samba and Squid using Kerberos credentials. References External links WPKG Homepage Debian-based distributions Linux distributions
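As a rough illustration of the directory-backed account store described in the Server Distribution section above, the following Python sketch queries user entries from an LDAP server using the third-party ldap3 library. The host name, bind DN, password, and base DN are hypothetical placeholders; only the domain name linuxgrid comes from the article, and this is an illustration rather than the project's own tooling.

from ldap3 import Server, Connection, ALL   # requires the third-party ldap3 package

# Hypothetical connection details for a TLSP-style directory serving the linuxgrid domain.
server = Server("ldap://server.linuxgrid.lan", get_info=ALL)
conn = Connection(server, "cn=admin,dc=linuxgrid,dc=lan", "secret", auto_bind=True)

# List the POSIX user accounts that Samba would expose to the Windows domain.
conn.search("dc=linuxgrid,dc=lan", "(objectClass=posixAccount)", attributes=["uid", "cn"])
for entry in conn.entries:
    print(entry.uid, entry.cn)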
15656278
https://en.wikipedia.org/wiki/HOB%20GmbH%20%26%20Co%20KG
HOB GmbH & Co KG
HOB GmbH & Co. KG is part of the Brandstätter Group, which also owns Playmobil and other companies. HOB GmbH & Co. KG was founded in 1964 as an electronics concern by Horst Brandstätter; the name HOB is an acronym formed from his name. HOB began developing software and terminals for IBM mainframe computers in 1981. In 1983, HOB brought the world's first multi-session terminal for mainframes onto the market, the HOB 78E terminal. Among the company's major customers were MAN in Munich, the OFD Koblenz (Oberfinanzdirektion, or Superior Finance Directorate, the central authority for all financial matters of the German state of Rhineland-Palatinate), the automakers BMW and Audi, and the German mail-order giant Quelle. Until 2001, the company produced hardware and software mainly for IBM mainframes. With the advent of PCs, terminal hardware sales dropped and, in 2001, HOB discontinued production of the multi-session terminals. Since then, HOB has produced software providing remote connectivity for a range of computer operating systems. HOB also provides network infrastructure consulting services. In October 2018, HOB GmbH & Co. KG went into bankruptcy and self-administration, shortly after closing down its overseas development branch in Malta. Company milestones 1964 - HOB Electronic GmbH & Co. KG is founded. 1981 - Maintaining its focus on hardware, the company also begins to develop software and terminals for IBM. 1983 - At CeBIT, HOB launches a simultaneous, multi-session terminal for 3270 access. 1990 - Expanding its focus on the software market, the company develops its first Windows-based 3270 emulation. 1996 - HOB begins to develop Java-based connectivity technology. 2000 - HOB Inc. is founded, expanding HOB GmbH & Co KG into the US marketplace with connectivity applications for Windows Terminal Server, IBM mainframes, AS/400 and UNIX. Areas of operation HOB is an internationally active company that delivers software to a variety of customers in the financial, aerospace, health care, education, government and other sectors. HOB software is available in a variety of languages. Customer care (support, maintenance, update service) is provided both directly from HOB's central office in Cadolzburg, Germany, and by HOB distributors throughout the world. At its headquarters in Cadolzburg and other branch offices in Germany, HOB employs about 100 people. In addition, HOB has branch offices in France, Austria, the U.S.A. and Malta, as well as a network of distributors and resellers all over the world. HOB applications are deployed in more than 3,000 enterprises in the banking, insurance, public administration, government agency, hospital and industrial sectors. Organizational structure and ownership HOB Inc. and HOB GmbH & Co. KG are both affiliated with HOB electronic GmbH as general partners. These entities belong to the Brandstätter Group. All stock is privately held; outside capital is not involved. References Computer hardware companies of Germany Computer companies established in 1964 1964 establishments in West Germany
5764036
https://en.wikipedia.org/wiki/The%20Unix%20System
The Unix System
The Unix System () is a book by Stephen R. Bourne. Published in 1982, it was the first widely available general introduction to the Unix operating system. It included some historical material on Unix, as well as material on using the system, editing, the software tools concept, C programming using the Unix API, data management with the shell and awk, and typesetting with troff. 1982 non-fiction books Addison-Wesley books Computer books Unix books
18268330
https://en.wikipedia.org/wiki/American%20Medical%20Informatics%20Association
American Medical Informatics Association
The American Medical Informatics Association (AMIA) is an American non-profit organization dedicated to the development and application of biomedical and health informatics in the support of patient care, teaching, research, and health care administration. History AMIA is the official United States representative to the International Medical Informatics Association. It has grown to more than 5,000 members from 42 countries worldwide. Together, these members represent all basic, applied, and clinical interests in health care information technology. It publishes the Journal of the American Medical Informatics Association. AMIA is a professional scientific association that was formed by the merger of three organizations in 1988: the American Association for Medical Systems and Informatics (AAMSI); the American College of Medical Informatics (ACMI); and the Symposium on Computer Applications in Medical Care (SCAMC). Founding AMIA was founded in 1989 by the merger of three organizations: American Association for Medical Systems and Informatics American College of Medical Informatics Symposium on Computer Applications in Medical Care Leadership As a professional society, AMIA leadership includes member-leaders who are elected annually from the membership. Until 2004, AMIA was led by an elected President and Chair of the Board of Directors, who worked closely with the organization's professional executive director. In 2004, AMIA created the role of a full-time, staff President and CEO, and changed the elected leadership role to Chair of the Board. The first President and CEO of AMIA was Don E. Detmer. He was succeeded in July 2009 by Edward H. Shortliffe. In March 2012, he was succeeded by Kevin Fickenscher. In 2013, he was succeeded by Douglas B. Fridsma, who served until December 2019. Beginning in 2020, AMIA updated the leadership structure, changing the staff "President and CEO" role to "CEO," and the elected "Chair of the Board" back to the previous title of "President and Chair of the Board." The current President and Chair of the Board is Dr. Patricia Dykes. The current CEO, Tanya Tolpegin, was appointed in 2021. Membership AMIA membership is open to individuals, institutions, and corporations. Members include physicians, nurses, dentists, pharmacists, clinicians, health information technology professionals, computer and information scientists, biomedical engineers, consultants and industry representatives, medical librarians, academic researchers and educators, and advanced students pursuing a career in clinical informatics or health information technology. Meetings and education AMIA annually holds the following meetings: AMIA Annual Symposium The AMIA Joint Summit on Translational Science comprising: AMIA Summit on Translational Bioinformatics AMIA Summit on Clinical Research Informatics AMIA Clinical Informatics Working and special interest groups The association includes a number of special interest and working groups on a variety of issues important to its membership. See also eHealth References External links Health informatics and eHealth associations Information technology organizations based in North America Nursing informatics Organizations established in 1989
34154
https://en.wikipedia.org/wiki/PARC%20%28company%29
PARC (company)
PARC (Palo Alto Research Center; formerly Xerox PARC) is a research and development company in Palo Alto, California. Founded in 1969 by Jacob E. "Jack" Goldman, Xerox Corporation's chief scientist, the company was originally a division of Xerox, tasked with creating computer technology-related products and hardware systems. Xerox PARC has been at the heart of numerous revolutionary computer developments such as laser printing, Ethernet, the modern personal computer, the graphical user interface (GUI) and desktop paradigm, object-oriented programming, ubiquitous computing, electronic paper, amorphous silicon (a-Si) applications, the computer mouse, and advancing very-large-scale integration (VLSI) for semiconductors. Goldman's "Advanced Scientific & Systems Laboratory" aimed to develop future technologies; it was not intended to duplicate Xerox's existing research laboratory in Rochester, New York, which focused on refining and expanding the company's copier business. Instead, Xerox PARC was a site for pioneering work in advanced physics, materials science, and computer science applications. Xerox formed Palo Alto Research Center Incorporated as a wholly owned subsidiary in 2002. History In 1969, Jack Goldman, Xerox's Chief Scientist, spoke to George Pake, a physicist specializing in nuclear magnetic resonance and provost of Washington University in St. Louis, about starting a second research center for the company. On July 1, 1970, the Xerox Palo Alto Research Center opened. While the 3,000-mile buffer between it and Xerox headquarters in Rochester, New York, afforded scientists at the new lab great freedom to undertake their work, the distance also served as an impediment in persuading management of the promise of some of their greatest achievements. PARC's West Coast location proved to be advantageous in the mid-1970s, when the lab was able to hire many employees of the nearby SRI Augmentation Research Center (ARC) as that facility's funding from the Defense Advanced Research Projects Agency (DARPA), the National Aeronautics and Space Administration (NASA) and the U.S. Air Force (USAF) began to fall. Being situated on Stanford Research Park land leased from Stanford University encouraged Stanford graduate students to be involved in PARC research projects and PARC scientists to collaborate on academic seminars and projects. Much of PARC's early success in the computer field was under the leadership of its Computer Science Laboratory manager Bob Taylor, who guided the lab as associate manager from 1970 to 1977 and as manager from 1977 to 1983. Today After three decades as a division of Xerox, PARC was transformed in 2002 into an independent, wholly owned subsidiary company dedicated to developing and maturing advances in science and business concepts. PARC's research areas encompass a range of disciplines in hardware, software, social sciences, and design. Areas include ubiquitous sensing, electrochemical energy systems, material deposition systems, polymeric and composite materials, semiconductor materials, printing for manufacturing, optical sensors, optical and mechanical microsystems, printed and hybrid electronics, large-area thin-film electronics, optoelectronic devices, user experience design, systems security, system prognosis and health management, modeling and simulation of cyber-physical systems, interactive machine learning, human-machine collaboration, geometric and spatial reasoning, data science, conversational agents, and computer vision and image synthesis.
Accomplishments Xerox PARC has been the inventor and incubator of many elements of modern computing in the contemporary office workplace: Laser printers Computer-generated bitmap graphics The graphical user interface, featuring skeuomorphic windows and icons, operated with a mouse The WYSIWYG text editor Interpress, a resolution-independent graphical page-description language and the precursor to PostScript Ethernet as a local-area computer network Fully formed object-oriented programming (OOP) (with class-based inheritance, the most popular OOP model to this day) in the Smalltalk programming language and integrated development environment Prototype-based programming (the second most popular inheritance model in OOP) in the Self programming language Model–view–controller software architecture AspectJ, an aspect-oriented programming (AOP) extension for the Java programming language The Alto Most of these developments were included in the Alto, which added the now-familiar mouse developed at the Stanford Research Institute (SRI), unifying into a single model most aspects of now-standard personal computer use. The integration of Ethernet prompted the development of the PARC Universal Packet architecture, much like today's Internet. The GUI Xerox has been heavily criticized (particularly by business historians) for failing to properly commercialize and profitably exploit PARC's innovations. A favorite example is the graphical user interface (GUI), initially developed at PARC for the Alto and then commercialized as the Xerox Star by the Xerox Systems Development Department. Although very significant in terms of its influence on future system design, it is deemed a failure because it only sold approximately 25,000 units. A small group from PARC led by David Liddle and Charles Irby formed Metaphor Computer Systems. They extended the Star desktop concept into an animated graphic and communicating office-automation model and sold the company to IBM. Bill Gates, the founder of Microsoft, later stated that the Xerox graphical interface influenced both Microsoft and Apple, and Steve Jobs of Apple said that "Xerox could have owned the entire computer industry, could have been the IBM of the nineties, could have been the Microsoft of the nineties." Distinguished researchers Among PARC's distinguished researchers were three Turing Award winners: Butler W. Lampson (1992), Alan Kay (2003), and Charles P. Thacker (2009). The Association for Computing Machinery (ACM) Software System Award recognized the Alto system in 1984, Smalltalk in 1987, InterLisp in 1992, and the remote procedure call in 1994. Lampson, Kay, Bob Taylor, and Charles P. Thacker received the National Academy of Engineering's prestigious Charles Stark Draper Prize in 2004 for their work on the Alto. Legacy PARC's developments in information technology served for a long time as standards for much of the computing industry. Many advances were not equalled or surpassed for two decades, enormous timespans in the fast-paced high-tech world. While there is some truth to the claim that Xerox management failed to see the potential of many of PARC's inventions, this was mostly a problem with its computing research, a relatively small part of PARC's operations. A number of GUI engineers left to join Apple Computer. Technologies pioneered by its materials scientists, such as liquid-crystal displays (LCD), optical disc innovations, and laser printing, were actively and successfully introduced by Xerox to the business and consumer markets.
Work at PARC since the early 1980s includes advances in ubiquitous computing, aspect-oriented programming, and IPv6. See also GlobalView List of people associated with PARC Xerox Daybreak (a.k.a. Xerox Windows 6085) References Further reading Michael A. Hiltzik, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (HarperCollins, New York, 1999) Douglas K. Smith, Robert C. Alexander, Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer (William Morrow and Company, New York, 1988) M. Mitchell Waldrop, The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal (Viking Penguin, New York, 2001) Howard Rheingold, Tools for Thought (MIT Press, 2000) Todd R. Weiss, "Xerox PARC turns 40: Making four decades of tech innovation" Computerworld, 2010 External links PARC official website Xerox Star Historical Documents MacKiDo article Oral history interview with Terry Allen Winograd Charles Babbage Institute, University of Minnesota, Minneapolis Oral history interview with Paul A. Strassmann Charles Babbage Institute, University of Minnesota, Minneapolis Oral history interview with William Crowther Charles Babbage Institute, University of Minnesota, Minneapolis PARC History of human–computer interaction Software companies based in the San Francisco Bay Area Technology transfer Research organizations in the United States Companies based in Palo Alto, California Technology companies established in 1970 Software companies of the United States Science and technology in the San Francisco Bay Area Research and development in the United States Corporate spin-offs 1970 establishments in California
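The contrast drawn in the Accomplishments section above, between Smalltalk's class-based inheritance and Self's prototype-based model, can be sketched in a few lines of Python. Python is used here purely as a neutral illustration (neither Smalltalk nor Self is involved): behaviour defined on a class and inherited by subclasses, versus behaviour borrowed by delegating lookups to another object.

# Class-based inheritance (the Smalltalk lineage): behaviour lives in classes.
class Vehicle:
    def describe(self):
        return "a vehicle"

class Car(Vehicle):          # Car inherits describe() from its superclass
    pass

# Prototype-style delegation (the Self lineage): an object borrows behaviour
# directly from another object, with no class hierarchy in between.
class Proto:
    def __init__(self, prototype=None, **slots):
        self.prototype = prototype
        self.__dict__.update(slots)
    def __getattr__(self, name):
        if self.prototype is not None:
            return getattr(self.prototype, name)   # delegate unknown lookups
        raise AttributeError(name)

vehicle = Proto(describe=lambda: "a vehicle")
car = Proto(prototype=vehicle)        # car delegates unknown lookups to vehicle

print(Car().describe())               # "a vehicle"  (via class inheritance)
print(car.describe())                 # "a vehicle"  (via prototype delegation)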
158837
https://en.wikipedia.org/wiki/Cd%20%28command%29
Cd (command)
The cd command, also known as chdir (change directory), is a command-line shell command used to change the current working directory in various operating systems. It can be used in shell scripts and batch files. Implementations The command has been implemented in operating systems such as Unix, DOS, IBM OS/2, MetaComCo TRIPOS, AmigaOS (where if a bare path is given, cd is implied), Microsoft Windows, ReactOS, and Linux. On MS-DOS, it is available in versions 2 and later. DR DOS 6.0 also includes an implementation of the cd and chdir commands. The command is also available in the open source MS-DOS emulator DOSBox and in the EFI shell. An equivalent command is also provided in HP MPE/iX, and Stratus OpenVOS includes an analogous command. cd is frequently built directly into a command-line interpreter. This is the case in most of the Unix shells (Bourne shell, tcsh, bash, etc.), cmd.exe on Microsoft Windows NT/2000+, Windows PowerShell on Windows 7+, and COMMAND.COM on DOS/Microsoft Windows 3.x-9x/ME. The system call that effects the command in most operating systems is chdir(), which is defined by POSIX. Command-line shells on Windows usually use the Windows API to change the current working directory, whereas on Unix systems cd calls the chdir() POSIX C function. This means that when the command is executed, no new process is created to migrate to the other directory, as is the case with other commands such as ls. Instead, the shell itself executes this command. This is because, when a new process is created, the child process inherits the directory in which the parent process was created; if cd were executed in a child process, the change would not affect the parent shell, and the objective of the command would never be achieved. Windows PowerShell, Microsoft's object-oriented command-line shell and scripting language, executes the cd command within the shell's process. However, since PowerShell is based on the .NET Framework and has a different architecture than previous shells, all of PowerShell's cmdlets run in the shell's process. This is not true for legacy commands, which still run in a separate process. Usage A directory is a logical section of a file system used to hold files. Directories may also contain other directories. The cd command can be used to change into a subdirectory, move back into the parent directory, move all the way back to the root directory or move to any given directory. Consider the following subsection of a Unix filesystem, which shows a user's home directory (represented as ~) with a file, text.txt, and three subdirectories. If the user's current working directory is the home directory (~), then entering the command ls followed by cd games might produce the following transcript:
user@wikipedia:~$ ls
workreports games encyclopedia text.txt
user@wikipedia:~$ cd games
user@wikipedia:~/games$
The user is now in the "games" directory. A similar session in DOS (though the concept of a "home directory" may not apply, depending on the specific version) would look like this:
C:\> dir
workreports    <DIR>   Wed Oct  9th   9:01
games          <DIR>   Tue Oct  8th  14:32
encyclopedia   <DIR>   Mon Oct  1st  10:05
text      txt   1903   Thu Oct 10th  12:43
C:\> cd games
C:\games>
DOS maintains separate working directories for each lettered drive, and also has the concept of a current working drive. The cd command can be used to change the working directory of the working drive or another lettered drive. Typing a drive letter on its own as a command changes the working drive;
alternatively, cd with the /D switch may be used to change the working drive and that drive's working directory in one step. Modern versions of Windows simulate this behaviour for backwards compatibility under CMD.EXE. Note that executing cd from the command line with no arguments has different effects in different operating systems. For example, if cd is executed without arguments in DOS, OS/2, or Windows, the current working directory is displayed (equivalent to Unix pwd). If cd is executed without arguments in Unix, the user is returned to the home directory. Executing the cd command within a script or batch file also has different effects in different operating systems. In DOS, the caller's current directory can be directly altered by the batch file's use of this command. In Unix, the caller's current directory is not altered by the script's invocation of the command. This is because in Unix, the script is usually executed within a subshell. Options Unix, Unix-like cd by itself or cd ~ will always put the user in their home directory. cd . will leave the user in the same directory they are currently in (i.e. the current directory won't change). This can be useful if the user's shell's internal code can't deal with the directory they are in being recreated; running cd . will place their shell in the recreated directory. cd ~username will put the user in the username's home directory. cd followed by a name without a leading / will put the user in a subdirectory of the current directory, whereas a path with a leading / is taken relative to the root directory. cd .. will move the user up one directory; for example, from ~/games, cd .. moves the user to ~, while cd ../.. moves them up two levels. The user can use this indirection to access sibling directories too; for example, from ~/games, cd ../workreports moves to ~/workreports. cd - will switch the user to the previous working directory; for example, after moving from ~/games to ~/workreports, typing cd - returns the user to ~/games, so the command can be used to toggle back and forth between two directories. DOS, OS/2, Windows, ReactOS cd with no arguments prints the full path of the current directory. Options exist to print the final directory stack, just like dirs: in one form, entries are wrapped before they reach the edge of the screen; in another, entries are printed one per line, preceded by their stack positions. cd \ (DOS and Windows only) returns to the root directory. Consequently, a cd command given a path beginning with \ always takes the user to the named subdirectory of the root directory, regardless of where they are located when the command is issued. Interpreters other than an operating system's shell In the File Transfer Protocol, the respective command is spelled CWD in the control stream, but is available as cd in most client command-line programs. Some clients also have lcd for changing the working directory locally. The numerical computing environments MATLAB and GNU Octave include a cd function with similar functionality. The command also pertains to command-line interpreters of various other application software. See also Directory structure pushd and popd chroot List of command-line interpreters References Further reading External links Windows XP > Command-line reference A-Z > Chdir (Cd) from Microsoft TechNet Internal DOS commands File system directories Inferno (operating system) commands MSX-DOS commands OS/2 commands ReactOS commands Windows administration Standard Unix programs Unix SUS2008 utilities
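The behaviour described in the Implementations section above, that cd changes state belonging to the current process through the POSIX chdir() call and therefore must be a shell builtin, can be demonstrated with a short Python sketch (os.chdir stands in for the shell's builtin):

import os
import subprocess
import sys
import tempfile

print("parent process directory:", os.getcwd())

# A child process inherits the working directory but cannot change the parent's;
# this is why cd has to be built into the shell rather than run as an external program.
subprocess.run([sys.executable, "-c",
                "import os; os.chdir('/'); print('child changed to:', os.getcwd())"])
print("parent is still in:      ", os.getcwd())

# Within a single process, os.chdir() wraps the chdir() system call that POSIX defines.
os.chdir(tempfile.gettempdir())
print("after os.chdir():        ", os.getcwd())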
40929538
https://en.wikipedia.org/wiki/Ping%20Zhang%20%28information%20scientist%29
Ping Zhang (information scientist)
Ping Zhang is an American scholar in information systems and human–computer interaction. She is notable for her work on establishing the human–computer interaction community inside the information systems field, bridging various camps of human–computer interaction research, and exploring the intellectual characteristics of the information field. With Dov Te’eni and Jane Carey, she co-authored the first HCI textbook for non-computer-science students. Ping Zhang is the co-founding editor-in-chief of AIS Transactions on Human–Computer Interaction. She was a senior editor of the Journal of the Association for Information Systems, where she is also the author of the inaugural article. In 2015, Ping Zhang was named as a fellow of the American Council on Education (ACE) for the 2015–2016 academic year. Ping Zhang received her PhD in Information Systems from the McCombs School of Business at the University of Texas at Austin, and her M.Sc. and B.Sc. in Computer Science from Peking University, Beijing, China. Selected works Zhang, Ping (2013), The Affective Response Model: A Theoretical Framework of Affective Concepts and Their Relationships in the ICT Context, Management Information Systems Quarterly (MISQ), Vol. 37, Issue 1, 247–274. Wang Chingning & Ping Zhang (2012), The Evolution of Social Commerce: An Examination from the People, Business, Technology, and Information Perspective, Communications of the AIS (CAIS), Vol. 31, Article 5, 105–127. Zhang, Ping & Heshan Sun (2009), The complexity of different types of attitude in initial and continued ICT use, Journal of American Society for Information Science and Technology (JASIST). Zhang, Ping (2008), Motivational affordances: Fundamental reasons for ICT design and use, Communications of the ACM, 51(11). Zhang, Ping and Na Li (2005), The Importance of Affective Quality, Communications of the ACM, Vol. 48, No. 9, September, pp. 105–108. Zhang, Ping & Dennis Galletta (eds.), Human–Computer Interaction and Management Information Systems – Foundations, Series of Advances in Management Information Systems (AMIS), Armonk, NY: M.E. Sharpe, 2006. Galletta, Dennis and Ping Zhang (eds.), Human–Computer Interaction and Management Information Systems – Applications, Series of Advances in Management Information Systems (AMIS), Armonk, NY: M.E. Sharpe, 2006. References External links Ping Zhang website at Syracuse University AIS Transactions on Human–Computer Interaction journal website JAIS journal website Chinese emigrants to the United States Information systems researchers Human–computer interaction researchers Living people Year of birth missing (living people) American women computer scientists American computer scientists Peking University alumni University of Texas at Austin alumni Syracuse University faculty Scientists from Inner Mongolia People from Hohhot American women academics 21st-century American women scientists
40345057
https://en.wikipedia.org/wiki/OLinuXino
OLinuXino
OLinuXino is an open hardware single-board computer capable of running Android or Linux, designed by OLIMEX Ltd in Bulgaria. The project's goal was to design a DIY-friendly, industrial-grade Linux board that anyone can reproduce at home. It uses widely available, hand-solderable components in TQFP packages that are reasonable to purchase in low quantities. The project's CAD files are hosted on GitHub, allowing everyone to study and customize them according to their needs. Initially OLinuXino was designed with EAGLE. In March 2016 the first boards designed with KiCad became available, as OLIMEX Ltd announced plans to switch development to open source CAD tools. iMX233 A13 In April 2012, the Chinese company Allwinner released a Cortex-A8 SoC in a TQFP package. This was spotted immediately by the OLinuXino developers, who started working on an OLinuXino board based on the A13. Three OLinuXino boards with the A13 processor were released. A10S In November 2012, Allwinner released the new A10S processor, with HDMI and Ethernet, and the dual-core A20 processor. The A13 has no native Ethernet capability, so the A10S processor was chosen for new OLinuXino boards. A20 A64 Operating systems Officially supported: Debian Android Third party: Armbian Arch Linux ARM See also List of open-source hardware projects References External links OLinuXino web site SUNXI: OlimexA64-OLinuXino OLinuXino looks to take on the Raspberry Pi Meet the iMX233 OLinuXino Nano Element14 Olimex A10S/A20-OLinuXino boards quite BBB-like Linux Sunxi Community Open Source Hardware Hackaday article - OLinuXino booting Android CNX-Software OLinuXino unboxing and review Slashdot Fully open OLinuXino computer PC Magazine - PC like the Raspberry Pi but faster and fully open Dangerous Prototypes - OLinuXino single board computer Single-board computers
51237011
https://en.wikipedia.org/wiki/ROSA%20Linux
ROSA Linux
ROSA Linux is a Linux operating system distribution, developed by the Russian company LLC NTC IT ROSA. It is available in three different editions: ROSA Desktop Fresh, ROSA Enterprise Desktop, and ROSA Enterprise Linux Server, with the latter two aimed at commercial users. Its desktop editions come bundled with closed-source software such as Adobe Flash Player, multimedia codecs, and Steam. ROSA Desktop Fresh R11.1, the latest desktop release as of 23 April 2020, is available with four different desktop environments: KDE Plasma 4, KDE Plasma 5, Xfce, and LXQt. It also contains open source software developed in-house by ROSA, such as ROSA Image Writer and ROSA Media Player. ROSA Linux has been certified by the Ministry of Defence of Russia. ROSA originated as a fork of the now-defunct French distribution Mandriva Linux and has since been developed independently. The ROSA company was founded in early 2010 and released the first version of its operating system in December 2010. It initially targeted enterprise users only, but in late 2012, ROSA started its end-user oriented distribution, Desktop Fresh. Before its bankruptcy, Mandriva developed its last releases jointly with ROSA; Mandriva 2011 was also based on ROSA, as is MagOS Linux. Although it is most popular in the Russian-language market, ROSA Desktop has also received favorable reviews from several non-Russian online publications. The German technology website Golem.de praised ROSA for its stability and hardware support, while LinuxInsider.com called ROSA "a real Powerhouse". Version history Reception LinuxBSDos.com reviewed ROSA Desktop Fresh R2 with GNOME and with KDE, and also reviewed the earlier Fresh and Marathon 2012 releases. In October 2012, Dedoimedo reviewed ROSA Marathon 2012, and later also reviewed ROSA Desktop Fresh R7. Jesse Smith reviewed ROSA Desktop Fresh 2012 R1 and the Fresh R9 release for DistroWatch Weekly. References External links ROSA Linux Wiki ROSA Linux Bugzilla ROSA Linux Forum ROSA Linux in DistroWatch ROSA Linux in the OpenSourceFeed Gallery Computer-related introductions in 2010 KDE Mandriva Linux RPM-based Linux distributions x86-64 Linux distributions Linux distributions Russian-language Linux distributions
66097941
https://en.wikipedia.org/wiki/2020%20United%20States%20federal%20government%20data%20breach
2020 United States federal government data breach
In 2020, a major cyberattack suspected to have been committed by a group backed by the Russian government penetrated thousands of organizations globally including multiple parts of the United States federal government, leading to a series of data breaches. The cyberattack and data breach were reported to be among the worst cyber-espionage incidents ever suffered by the U.S., due to the sensitivity and high profile of the targets and the long duration (eight to nine months) in which the hackers had access. Within days of its discovery, at least 200 organizations around the world had been reported to be affected by the attack, and some of these may also have suffered data breaches. Affected organizations worldwide included NATO, the U.K. government, the European Parliament, Microsoft and others. The attack, which had gone undetected for months, was first publicly reported on December 13, 2020, and was initially only known to have affected the U.S. Treasury Department and the National Telecommunications and Information Administration (NTIA), part of the U.S. Department of Commerce. In the following days, more departments and private organizations reported breaches. The cyberattack that led to the breaches began no later than March 2020. The attackers exploited software or credentials from at least three U.S. firms: Microsoft, SolarWinds, and VMware. A supply chain attack on Microsoft cloud services provided one way for the attackers to breach their victims, depending upon whether the victims had bought those services through a reseller. A supply chain attack on SolarWinds's Orion software, widely used in government and industry, provided another avenue, if the victim used that software. Flaws in Microsoft and VMware products allowed the attackers to access emails and other documents, and to perform federated authentication across victim resources via single sign-on infrastructure. In addition to the theft of data, the attack caused costly inconvenience to tens of thousands of SolarWinds customers, who had to check whether they had been breached, and had to take systems offline and begin months-long decontamination procedures as a precaution. U.S. Senator Richard J. Durbin described the cyberattack as tantamount to a declaration of war. President Donald Trump was silent for days after the attack, before suggesting that China, not Russia, might have been responsible for it, and that "everything is well under control". Background The global data breach occurred over the course of at least 8 or 9 months during the final year of the presidency of Donald Trump. Throughout this time, the White House lacked a cybersecurity coordinator, Trump having eliminated the post itself in 2018. When the breach was discovered, the U.S. also lacked a Senate-confirmed Director of the Cybersecurity and Infrastructure Security Agency (CISA), the nation's top cybersecurity official, responsible for coordinating incident response. The incumbent, Chris Krebs, had been fired on November 18, 2020. Also at that time, the Department of Homeland Security (DHS), which manages CISA, lacked a Senate-confirmed Secretary, Deputy Secretary, General Counsel, Undersecretary for Intelligence and Analysis, and Undersecretary for Management; and Trump had recently forced out the deputy director of CISA. Numerous federal cybersecurity recommendations made by the Government Accountability Office and others had not been implemented. SolarWinds, a Texas-based provider of network monitoring software to the U.S. 
federal government, had shown several security shortcomings prior to the attack. SolarWinds did not employ a chief information security officer or senior director of cybersecurity. Cybercriminals had been selling access to SolarWinds's infrastructure since at least as early as 2017. SolarWinds had been advising customers to disable antivirus tools before installing SolarWinds software. In November 2019, a security researcher had warned SolarWinds that their FTP server was not secure, warning that "any hacker could upload malicious [files]" that would then be distributed to SolarWinds customers. Furthermore, SolarWinds's Microsoft Office 365 account had been compromised, with the attackers able to access emails and possibly other documents. On December 7, 2020, a few days before trojaned SolarWinds software was publicly confirmed to have been used to attack other organizations, longstanding SolarWinds CEO Kevin Thompson retired. That same day, two private equity firms with ties to SolarWinds's board sold substantial amounts of stock in SolarWinds. The firms denied insider trading. Methodology Multiple attack vectors were used in the course of breaching the various victims of the incident. Microsoft exploits The attackers exploited flaws in Microsoft products, services, and software distribution infrastructure. At least one reseller of Microsoft cloud services was compromised by the attackers, constituting a supply chain attack that allowed the attackers to access Microsoft cloud services used by the reseller's customers. Alongside this, "Zerologon", a vulnerability in the Microsoft authentication protocol NetLogon, allowed attackers to access all valid usernames and passwords in each Microsoft network that they breached. This allowed them to access additional credentials necessary to assume the privileges of any legitimate user of the network, which in turn allowed them to compromise Microsoft Office 365 email accounts. Additionally, a flaw in Microsoft's Outlook Web App may have allowed attackers to bypass multi-factor authentication. Attackers were found to have broken into Microsoft Office 365 in a way that allowed them to monitor NTIA and Treasury staff emails for several months. This attack apparently used counterfeit identity tokens of some kind, allowing the attackers to trick Microsoft's authentication systems. The presence of single sign-on infrastructure increased the viability of the attack. SolarWinds exploit Here, too, the attackers used a supply chain attack. The attackers accessed the build system belonging to the software company SolarWinds, possibly via SolarWinds's Microsoft Office 365 account, which had also been compromised at some point. The attackers established a foothold in SolarWinds's software publishing infrastructure no later than September 2019. In the build system, the attackers surreptitiously modified software updates provided by SolarWinds to users of its network monitoring software Orion. The first known modification, in October 2019, was merely a proof of concept. Once the proof had been established, the attackers spent December 2019 to February 2020 setting up a command-and-control infrastructure. In March 2020, the attackers began to plant remote access tool malware into Orion updates, thereby trojaning them. These users included U.S. government customers in the executive branch, the military, and the intelligence services (see Impact section, below). 
If a user installed the update, this would execute the malware payload, which would stay dormant for 12–14 days before attempting to communicate with one or more of several command-and-control servers. The communications were designed to mimic legitimate SolarWinds traffic. If able to contact one of those servers, this would alert the attackers of a successful malware deployment and offer the attackers a back door that the attackers could choose to utilise if they wished to exploit the system further. The malware started to contact command-and-control servers in April 2020, initially from North America and Europe and subsequently from other continents too. The attackers appear to have utilized only a small fraction of the successful malware deployments: ones located within computer networks belonging to high-value targets. Once inside the target networks, the attackers pivoted, installing exploitation tools such as Cobalt strike components, and seeking additional access. Because Orion was connected to customers' Office 365 accounts as a trusted 3rd-party application, the attackers were able to access emails and other confidential documents. This access apparently helped them to hunt for certificates that would let them sign SAML tokens, allowing them to masquerade as legitimate users to additional on-premises services and to cloud services like Microsoft Azure Active Directory. Once these additional footholds had been obtained, disabling the compromised Orion software would no longer be sufficient to sever the attackers' access to the target network. Having accessed data of interest, they encrypted and exfiltrated it. The attackers hosted their command-and-control servers on commercial cloud services from Amazon, Microsoft, GoDaddy and others. By using command-and-control IP addresses based in the U.S., and because much of the malware involved was new, the attackers were able to evade detection by Einstein, a national cybersecurity system operated by the Department of Homeland Security (DHS). FBI investigators recently found that a separate flaw in software made by SolarWinds Corp was used by hackers tied to another foreign government to help break into U.S. government computers. VMware exploits Vulnerabilities in VMware Access and VMware Identity Manager, allowing existing network intruders to pivot and gain persistence, were utilized in 2020 by Russian state-sponsored attackers. As of December 18, 2020, while it was definitively known that the SUNBURST trojan would have provided suitable access to exploit the VMware bugs, it was not yet definitively known whether attackers had in fact chained those two exploits in the wild. Discovery Microsoft exploits During 2019 and 2020, cybersecurity firm Volexity discovered an attacker making suspicious usage of Microsoft products within the network of a think tank whose identity has not publicly been revealed. The attacker exploited a vulnerability in the organization's Microsoft Exchange Control Panel, and used a novel method to bypass multi-factor authentication. Later, in June and July 2020, Volexity observed the attacker utilising the SolarWinds Orion trojan; i.e. the attacker used Microsoft vulnerabilities (initially) and SolarWinds supply chain attacks (later on) to achieve their goals. Volexity said it was not able to identify the attacker. Also in 2020, Microsoft detected attackers using Microsoft Azure infrastructure in an attempt to access emails belonging to CrowdStrike. 
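The role of the stolen SAML-signing material described above can be illustrated with a deliberately simplified sketch. Real SAML assertions are XML documents signed with asymmetric keys; the toy Python example below substitutes a symmetric HMAC, but the point is the same: a relying service accepts any token whose signature verifies, so whoever holds the signing secret can mint credentials for any identity. All names and values here are hypothetical.

import base64, hashlib, hmac, json

SIGNING_KEY = b"hypothetical-signing-secret"   # stands in for the stolen certificate's key

def mint_token(claims: dict) -> str:
    # Encode the claims and sign them; a SAML identity provider does the equivalent with XML signatures.
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig

def verify_token(token: str) -> bool:
    # A relying service only checks that the signature matches the body.
    body, sig = token.split(".")
    expected = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

# Whoever holds the key can issue a token for any identity, and it will verify as genuine.
forged = mint_token({"user": "any.victim@example.gov", "role": "admin"})
print(verify_token(forged))   # True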
That attack failed because - for security reasons - CrowdStrike does not use Office 365 for email. Separately, in or shortly before October 2020, Microsoft Threat Intelligence Center reported that an apparently state-sponsored attacker had been observed exploiting zerologon, a vulnerability in Microsoft's NetLogon protocol. This was reported to CISA, who issued an alert on October 22, 2020, specifically warning state, local, territorial and tribal governments to search for indicators of compromise, and instructing them to rebuild their networks from scratch if compromised. Using VirusTotal, The Intercept discovered continued indicators of compromise in December 2020, suggesting that the attacker might still be active in the network of the city government of Austin, Texas. SolarWinds exploit On December 8, 2020, the cybersecurity firm FireEye announced that red team tools had been stolen from it by what it believed to be a state-sponsored attacker. FireEye was believed to be a target of the SVR, Russia's Foreign Intelligence Service. FireEye says that it discovered the SolarWinds supply chain attack in the course of investigating FireEye's own breach and tool theft. After discovering that attack, FireEye reported it to the U.S. National Security Agency (NSA), a federal agency responsible for helping to defend the U.S. from cyberattacks. The NSA is not known to have been aware of the attack before being notified by FireEye. The NSA uses SolarWinds software itself. Some days later, on December 13, when breaches at the Treasury and Department of Commerce were publicly confirmed to exist, sources said that the FireEye breach was related. On December 15, FireEye confirmed that the vector used to attack the Treasury and other government departments was the same one that had been used to attack FireEye: a trojaned software update for SolarWinds Orion. The security community shifted its attention to Orion. The infected versions were found to be 2019.4 through 2020.2.1 HF1, released between March 2020 and June 2020. FireEye named the malware SUNBURST. Microsoft called it Solorigate. The tool that the attackers used to insert SUNBURST into Orion updates was later isolated by cybersecurity firm CrowdStrike, who called it SUNSPOT. Subsequent analysis of the SolarWinds compromise using DNS data and reverse engineering of Orion binaries, by DomainTools and ReversingLabs respectively, revealed additional details about the attacker's timeline. July 2021 analysis published by the Google Threat Analysis Group found that a "likely Russian government-backed actor" exploited a zero-day vulnerability in fully-updated iPhones to steal authentication credentials by sending messages to government officials on LinkedIn. VMware exploits Some time before December 3, 2020, the NSA discovered and notified VMware of vulnerabilities in VMware Access and VMware Identity Manager. VMware released patches on December 3, 2020. On December 7, 2020, the NSA published an advisory warning customers to apply the patches because the vulnerabilities were being actively exploited by Russian state-sponsored attackers. Responsibility Conclusions by investigators SolarWinds said it believed the malware insertion into Orion was performed by a foreign nation. Russian-sponsored hackers were suspected to be responsible. U.S. officials stated that the specific groups responsible were probably the SVR or Cozy Bear (also known as APT29). 
FireEye gave the suspects the placeholder name "UNC2452"; incident response firm Volexity called them "Dark Halo". On December 23, 2020, the CEO of FireEye said Russia was the most likely culprit and the attacks were "very consistent" with the SVR. One security researcher offers the likely operational date, February 27, 2020, with a significant change of aspect on October 30, 2020. In January 2021, cybersecurity firm Kaspersky said SUNBURST resembles the malware Kazuar, which is believed to have been created by Turla, a group known from 2008 that Estonian intelligence previously linked to the Russian federal security service, FSB. Statements by U.S. government officials On October 22, 2020, CISA and the FBI identified the Microsoft zerologon attacker as Berserk Bear, a state-sponsored group believed to be part of Russia's FSB. On December 18, U.S. Secretary of State Mike Pompeo said Russia was "pretty clearly" responsible for the cyber attack. On December 19, U.S. president Donald Trump publicly addressed the attacks for the first time, downplaying its severity and suggesting without evidence that China, rather than Russia, might be responsible. The same day, Republican senator Marco Rubio, acting chair of the Senate Intelligence Committee, said it was "increasingly clear that Russian intelligence conducted the gravest cyber intrusion in our history." On December 20, Democratic senator Mark Warner, briefed on the incident by intelligence officials, said "all indications point to Russia." On December 21, 2020, former Attorney General William Barr said that he agreed with Pompeo's assessment of the origin of the cyberhack and that it "certainly appears to be the Russians," contradicting Trump. On January 5, 2021, CISA, the FBI, the NSA, and the Office of the Director of National Intelligence, all confirmed that they believe Russia was the most likely culprit. On June 10, 2021, FBI Director Christopher Wray attributed the attack to Russia's SVR specifically. Denial of involvement The Russian government said that it was not involved. The Chinese foreign ministry said in a statement, "China resolutely opposes and combats any form of cyberattacks and cyber theft." Impact SolarWinds said that of its 300,000 customers, 33,000 use Orion. Of these, around 18,000 government and private users downloaded compromised versions. Discovery of the breaches at the U.S. Treasury and Commerce Departments immediately raised concerns that the attackers would attempt to breach other departments, or had already done so. Further investigation proved these concerns to be well-founded. Within days, additional federal departments were found to have been breached. Reuters quoted an anonymous U.S. government source as saying: “This is a much bigger story than one single agency. This is a huge cyber espionage campaign targeting the U.S. government and its interests.” Compromised versions were known to have been downloaded by the Centers for Disease Control and Prevention, the Justice Department, and some utility companies. Other prominent U.S. organisations known to use SolarWinds products, though not necessarily Orion, were the Los Alamos National Laboratory, Boeing, and most Fortune 500 companies. Outside the U.S., reported SolarWinds clients included parts of the British government, including the Home Office, National Health Service, and signals intelligence agencies; the North Atlantic Treaty Organization (NATO); the European Parliament; and likely AstraZeneca. 
FireEye said that additional government, consulting, technology, telecom and extractive entities in North America, Europe, Asia and the Middle East may also have been affected. Through a manipulation of software keys, the hackers were able to access the email systems used by the Treasury Department's highest-ranking officials. This system, although unclassified, is highly sensitive because of the Treasury Department's role in making decisions that move the market, as well as decisions on economic sanctions and interactions with the Federal Reserve. Simply downloading a compromised version of Orion was not necessarily sufficient to result in a data breach; further investigation was required in each case to establish whether a breach resulted. These investigations were complicated by: the fact that the attackers had in some cases removed evidence; the need to maintain separate secure networks as organizations' main networks were assumed to be compromised; and the fact that Orion was itself a network monitoring tool, without which users had less visibility of their networks. As of mid-December 2020, those investigations were ongoing. As of mid-December 2020, U.S. officials were still investigating what was stolen in the cases where breaches had occurred, and trying to determine how it could be used. Commentators said that the information stolen in the attack would increase the perpetrator's influence for years to come. Possible future uses could include attacks on hard targets like the CIA and NSA, or using blackmail to recruit spies. Cyberconflict professor Thomas Rid said the stolen data would have myriad uses. He added that the amount of data taken was likely to be many times greater than during Moonlight Maze, and if printed would form a stack far taller than the Washington Monument. Even where data was not exfiltrated, the impact was significant. The Cybersecurity and Infrastructure Security Agency (CISA) advised that affected devices be rebuilt from trusted sources, and that all credentials exposed to SolarWinds software should be considered compromised and should therefore be reset. Anti-malware companies additionally advised searching log files for specific indicators of compromise. However, it appeared that the attackers had deleted or altered records, and may have modified network or system settings in ways that could require manual review. Former Homeland Security Advisor Thomas P. Bossert warned that it could take years to evict the attackers from US networks, leaving them able to continue to monitor, destroy or tamper with data in the meantime. Harvard's Bruce Schneier, and NYU's Pano Yannakogeorgos, founding dean of the Air Force Cyber College, said that affected networks may need to be replaced completely. The Justice Department disclosed in July 2021 that 27 of its federal prosecutors' offices around the country had been affected, including 80% of Microsoft email accounts breached in four New York offices. Two of the offices, in Manhattan and Brooklyn, handle many prominent investigations of white-collar crime, as well as of people close to former president Trump. List of confirmed connected data breaches U.S. federal government U.S. state and local governments Private sector Investigations and responses Technology companies and business On December 8, 2020, before other organizations were known to have been breached, FireEye published countermeasures against the red team tools that had been stolen from FireEye. 
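The vendor advice noted in the Impact section above, to search log files for published indicators of compromise, amounts in its simplest form to scanning logs for known-bad strings. A minimal Python sketch follows; the indicator values and log path are placeholders, not real SUNBURST indicators.

import re
from pathlib import Path

# Placeholder indicators; real lists were published by FireEye, Microsoft and CISA.
INDICATORS = [
    "c2.example-not-real.com",   # hypothetical command-and-control domain
    "203.0.113.7",               # hypothetical C2 address from a documentation range
]
pattern = re.compile("|".join(re.escape(ioc) for ioc in INDICATORS))

def scan(log_path: str) -> None:
    path = Path(log_path)
    if not path.exists():        # the hypothetical log location may not exist on this system
        print(f"no such log file: {log_path}")
        return
    for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), start=1):
        if pattern.search(line):
            print(f"{log_path}:{lineno}: possible indicator of compromise")

scan("/var/log/squid/access.log")   # hypothetical proxy log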
On December 15, 2020, Microsoft announced that SUNBURST, which only affects Windows platforms, had been added to Microsoft's malware database and would, from December 16 onwards, be detected and quarantined by Microsoft Defender. GoDaddy handed ownership to Microsoft of a command-and-control domain used in the attack, allowing Microsoft to activate a killswitch in the SUNBURST malware, and to discover which SolarWinds customers were infected. On December 14, 2020, the CEOs of several American utility companies convened to discuss the risks posed to the power grid by the attacks. On December 22, 2020, the North American Electric Reliability Corporation asked electricity companies to report their level of exposure to Solarwinds software. SolarWinds unpublished its featured customer list after the hack, although as of December 15, cybersecurity firm GreyNoise Intelligence said SolarWinds had not removed the infected software updates from its distribution server. Around January 5, 2021, SolarWinds investors filed a class action lawsuit against the company in relation to its security failures and subsequent fall in share price. Soon after, SolarWinds hired a new cybersecurity firm co-founded by Krebs. The Linux Foundation pointed out that if Orion had been open source, users would have been able to audit it, including via reproducible builds, making it much more likely that the malware payload would have been spotted. U.S. government On December 18, 2020, U.S. Secretary of State Mike Pompeo said that some details of the event would likely be classified so as not to become public. Security agencies On December 12, 2020, a National Security Council (NSC) meeting was held at the White House to discuss the breach of federal organizations. On December 13, 2020, CISA issued an emergency directive asking federal agencies to disable the SolarWinds software, to reduce the risk of additional intrusions, even though doing so would reduce those agencies' ability to monitor their computer networks. The Russian government said that it was not involved in the attacks. On December 14, 2020, the Department of Commerce confirmed that it had asked the CISA and the FBI to investigate. The NSC activated Presidential Policy Directive 41, an Obama-era emergency plan, and convened its Cyber Response Group. The U.S. Cyber Command threatened swift retaliation against the attackers, pending the outcome of investigations. The DOE helped to compensate for a staffing shortfall at CISA by allocating resources to help the Federal Energy Regulatory Commission (FERC) recover from the cyberattack. The FBI, CISA, and the Office of the Director of National Intelligence (ODNI) formed a Cyber Unified Coordination Group (UCG) to coordinate their efforts. On December 24, 2020, CISA said state and local government networks, in addition to federal ones, and other organizations, had been impacted by the attack, but did not provide further details. Congress The Senate Armed Services Committee's cybersecurity subcommittee was briefed by Defense Department officials. The House Committee on Homeland Security and House Committee on Oversight and Reform announced an investigation. Marco Rubio, acting chair of the Senate Intelligence Committee, said the U.S. must retaliate, but only once the perpetrator is certain. The committee's vice-chairman, Mark Warner, criticized President Trump for failing to acknowledge or react to the hack. Senator Ron Wyden called for mandatory security reviews of software used by federal agencies. 
On December 22, 2020, after U.S. Treasury Secretary Steven Mnuchin told reporters that he was "completely on top of this", the Senate Finance Committee was briefed by Microsoft that dozens of Treasury email accounts had been breached, and the attackers had accessed systems of the Treasury's Departmental Offices division, home to top Treasury officials. Senator Wyden said that the briefing showed that the Treasury "still does not know all of the actions taken by hackers, or precisely what information was stolen". On December 23, 2020, Senator Bob Menendez asked the State Department to end its silence about the extent of its breach, and Senator Richard Blumenthal asked the same of the Veterans Administration. The judiciary The Administrative Office of the United States Courts initiated an audit, with DHS, of the U.S. Judiciary's Case Management/Electronic Case Files (CM/ECF) system. It stopped accepting highly sensitive court documents to the CM/ECF, requiring those instead to be accepted only in paper form or on airgapped devices. President Trump President Donald Trump made no comment on the hack for days after it was reported, leading Senator Mitt Romney to decry his "silence and inaction". On December 19, Trump publicly addressed the attacks for the first time; he downplayed the hack, contended that the media had overblown the severity of the incident, said that "everything is well under control"; and proposed, without evidence, that China, rather than Russia, might be responsible for the attack. Trump then pivoted to insisting that he had won the 2020 presidential election. He speculated, without evidence, that the attack might also have involved a "hit" on voting machines, part of a long-running campaign by Trump to falsely assert that he won the 2020 election. Trump's claim was rebutted by former CISA director Chris Krebs, who pointed out that Trump's claim was not possible. Adam Schiff, chair of the House Intelligence Committee, described Trump's statements as dishonest, calling the comment a "scandalous betrayal of our national security" that "sounds like it could have been written in the Kremlin." Former Homeland Security Advisor Thomas P. Bossert said, "President Trump is on the verge of leaving behind a federal government, and perhaps a large number of major industries, compromised by the Russian government," and noted that congressional action, including via the National Defense Authorization Act would be required to mitigate the damage caused by the attacks. President Biden Then president-elect Joe Biden said he would identify and penalize the attackers. Biden's incoming chief of staff, Ron Klain, said the Biden administration's response to the hack would extend beyond sanctions. On December 22, 2020, Biden reported that his transition team was still being denied access to some briefings about the attack by Trump administration officials. In January 2021, Biden named appointees for two relevant White House positions: Elizabeth Sherwood-Randall as homeland security adviser, and Anne Neuberger as deputy national security adviser for cyber and emerging technology. In March 2021, the Biden administration expressed growing concerns over the hack, and White House Press Secretary Jen Psaki called it “an active threat”. Meanwhile The New York Times reported that the US government was planning economic sanctions as well as "a series of clandestine actions across Russian networks" in retaliation. 
On April 15, 2021, the United States expelled 10 Russian diplomats and issued sanctions against 6 Russian companies that support its cyber operations, as well as 32 individuals and entities for their role in the hack and in Russian interference in the 2020 United States elections. Rest of the world NATO said that it was "currently assessing the situation, with a view to identifying and mitigating any potential risks to our networks." On December 18, the United Kingdom National Cyber Security Centre said that it was still establishing the attacks' impact on the UK. The UK and Irish cybersecurity agencies published alerts targeting SolarWinds customers. On December 23, 2020, the UK Information Commissioner's Office – a national privacy authority – told UK organizations to check immediately whether they were impacted. On December 24, 2020, the Canadian Centre for Cyber Security asked SolarWinds Orion users in Canada to check for system compromises. Cyber espionage or cyberattack? The attack prompted a debate on whether the hack should be treated as cyber espionage, or as a cyberattack constituting an act of war. Most current and former U.S. officials considered the 2020 Russian hack to be a "stunning and distressing feat of espionage" but not a cyberattack because the Russians did not appear to destroy or manipulate data or cause physical damage (for example, to the electrical grid). Erica Borghard of the Atlantic Council and Columbia's Saltzman Institute and Jacquelyn Schneider of the Hoover Institution and Naval War College argued that the breach was an act of espionage that could be responded to with "arrests, diplomacy, or counterintelligence" and had not yet been shown to be a cyberattack, a classification that would legally allow the U.S. to respond with force. Law professor Jack Goldsmith wrote that the hack was a damaging act of cyber-espionage but "does not violate international law or norms" and wrote that "because of its own practices, the U.S. government has traditionally accepted the legitimacy of foreign governmental electronic spying in U.S. government networks." Law professor Michael Schmitt concurred, citing the Tallinn Manual. By contrast, Microsoft president Brad Smith termed the hack a cyberattack, stating that it was "not 'espionage as usual,' even in the digital age" because it was "not just an attack on specific targets, but on the trust and reliability of the world's critical infrastructure." U.S. Senator Richard J. Durbin (D-IL) described the attack as tantamount to a declaration of war. Debate on possible U.S. responses Writing for Wired, Borghard and Schneider opined that the U.S. "should continue to build and rely on strategic deterrence to convince states not to weaponize the cyber intelligence they collect". They also stated that because deterrence may not effectively discourage cyber-espionage attempts by threat actors, the U.S. should also focus on making cyber-espionage less successful through methods such as enhanced cyber-defenses, better information-sharing, and "defending forward" (reducing Russian and Chinese offensive cyber-capabilities). Writing for The Dispatch, Goldsmith wrote that the failure of defense and deterrence strategies against cyber-intrusion should prompt consideration of a "mutual restraint" strategy, "whereby the United States agrees to curb certain activities in foreign networks in exchange for forbearance by our adversaries in our networks." 
Cybersecurity author Bruce Schneier advocated against retaliation or increases in offensive capabilities, proposing instead the adoption of a defense-dominant strategy and ratification of the Paris Call for Trust and Security in Cyberspace or the Global Commission on the Stability of Cyberspace. In the New York Times, Paul Kolbe, former CIA agent and director of the Intelligence Project at Harvard's Belfer Center for Science and International Affairs, echoed Schneier's call for improvements in the U.S.'s cyberdefenses and international agreements. He also noted that the US is engaged in similar operations against other countries in what he described as an ambient cyber-conflict. See also Cyberwarfare in the United States Cyberwarfare by Russia EternalBlue Global surveillance disclosures (2013–present) List of data breaches Moonlight Maze Office of Personnel Management data breach Security dilemma The Shadow Brokers 2008 cyberattack on United States 2021 Microsoft Exchange Server data breach References External links SolarWinds Security Advisory FireEye Research Report GuidePoint Security Analysis Russian SVR Targets U.S. and Allied Networks (pdf file) A 'Worst Nightmare' Cyberattack: The Untold Story Of The SolarWinds Hack by Dina Temple-Raston, Friday, April 16, 2021 (NPR text only version) 2020 in the United States Cyberattacks Data breaches in the United States 2020 in computing Hacking in the 2020s
38073880
https://en.wikipedia.org/wiki/The%20Wollongong%20Group
The Wollongong Group
The Wollongong Group (TWG) was one of the first companies to sell commercial software products based on the Unix operating system. It was founded to market a port of Unix Version 6 developed by researchers at the University of Wollongong, Australia (thus the name "Wollongong Group"). The company was active in Palo Alto, California from 1980 to 1995. It later achieved name recognition as a pioneer in developing and selling commercial versions of the TCP/IP protocols. The Wollongong Group had annual sales of $40 million and employed 165 people when it was acquired by former competitor Attachmate in 1995. Commercializing TCP/IP and the Internet Virtually all of Wollongong's products were initially based on versions of software that had been developed at universities and released into the public domain. Wollongong products included Eunice, a UNIX emulator for the VAX VMS operating system (based on software written by David Kashtan at SRI), and individual TCP/IP packages for the VAX VMS operating system (based on Berkeley TCP/IP), for AT&T UNIX Version 7 (also based on Berkeley TCP/IP), and for the IBM PC (based on MIT PC-IP by John Romkey). Individual licensing arrangements were made with brand-name vendors such as Philips, for its P9000 Unix offerings, and Cray Research. These products, which the company advertised in publications such as Computerworld and Hardcopy, were among the first commercially supported systems of their type, allowing people other than software developers access to the Internet. The PC product in particular made it possible for a non-technical user to access the Internet with equipment costing less than $3,000 – about one tenth the cost of any other available system at the time. Original Internet work PATHway was the name used for a specialized TCP/IP product. By the mid-1980s many Wollongong employees were active in developing new Internet technologies. Wollongong employees produced the first Internet tunneling specification (RFC 1088) and the first SNMP MIB (RFC 1066). Notable Wollongong technical staff who worked on these projects included David H. Crocker (email), Dr. Marshall Rose (SNMP), Karl Auerbach (NetBIOS, SNMP), Narayan Mohanram (TCP/IP on UNIX), Jerry Scott (TCP/IP on VMS), Leo McLaughlin III and John Bartas (TCP/IP on the IBM PC). Internet technology companies founded by ex-Wollongong employees include Epilogue Technologies, Taos Mountain Software, Interniche Technologies and iPass Inc. References Internet technology companies of the United States Unix history Unix software
65732625
https://en.wikipedia.org/wiki/Owl%20Scientific%20Computing
Owl Scientific Computing
Owl Scientific Computing is a software system for scientific and engineering computing developed in the Department of Computer Science and Technology, University of Cambridge. The Systems Research Group (SRG) in the department recognises Owl as one of the representative systems developed in SRG in the 2010s. The source code is licensed under the MIT License and can be accessed from the GitHub repository. The library is mostly designed and developed in the functional programming language OCaml, which offers runtime efficiency, a flexible module system, static type checking, an intelligent garbage collector, and powerful type inference. Owl inherits these features directly from OCaml. With Owl, users can write succinct, type-safe numerical applications in a concise functional language without sacrificing performance. It speeds up the development life-cycle and reduces the cost from prototype to production use. The system serves as the de facto tool for computation-intensive tasks in OCaml. History Owl was developed while Dr. Liang Wang was working as a post-doc in the OCaml Labs. Owl originated in July 2016 from a research project which studied the design of synchronous parallel machines for large-scale distributed computing. At the time, the libraries for numerical computing in the OCaml ecosystem were very limited and the tooling was fragmented. In order to test various analytical applications, many numerical functions had to be implemented, from low-level algebra and random number generators to high-level functionality such as algorithmic differentiation and deep neural networks. These code snippets accumulated, and the functions were later taken out and wrapped into a standalone library named Owl. Owl's architecture underwent at least a dozen iterations in the beginning, and some of the architectural changes were quite drastic. After a year of intensive development, Owl was capable of many complicated numerical tasks, e.g. image classification. Dr. Liang Wang held a tutorial at CUFP 2017 to demonstrate data science in OCaml. In 2018, Prof. Richard Mortier gave a talk about Owl at the Alan Turing Institute. To further promote OCaml and functional programming in data science, Owl provides abundant learning materials in the form of a detailed manual. Design and Features Owl has implemented many advanced numerical functions atop its implementation of n-dimensional arrays. Compared to other numerical libraries, Owl is unique in several respects; for example, algorithmic differentiation and distributed computing are included as integral components of the core system to maximise developers' productivity. Owl's system architecture consists of two subsystems. The first is Owl's numerical subsystem. The modules contained in this subsystem fall into three categories. The first category is the core modules, which contain the basic data structures, i.e. the N-dimensional array (Ndarray) in both dense and sparse forms. The Ndarray module supports various number types: float32, float64, complex32, complex64, int16, int32, etc. The core modules also provide foreign function interfaces to other low-level numerical libraries, such as CBLAS and LAPACK; these libraries are fully interfaced to the Linear Algebra module. The second category is the classic analytics modules. This part contains basic mathematical and statistical functions, linear algebra, regression, optimisation, plotting, etc. 
Advanced math and statistics functions such as statistical hypothesis testing and Markov chain Monte Carlo are also included. As core functionality, Owl provides algorithmic differentiation (or automatic differentiation) and dynamic computation graph modules. The highest level in the Owl architecture includes modules for more advanced numerical applications such as neural networks, natural language processing and data processing. The Zoo system is used for efficient scripting and code sharing. The modules in the second category, especially the algorithmic differentiation, make the code at this level quite concise. The second subsystem is the Actor subsystem, which extends Owl's capability to parallel and distributed computing. The core idea is to transform a user application from sequential execution mode into parallel mode (using various computation engines) with minimal effort. The method is to compose the two subsystems together with functors to generate the parallel version of a module defined in the numerical subsystem. Beyond these, there are several other features in Owl, for example the JavaScript and unikernel backends, integration with other frameworks such as TensorFlow and PyTorch, and the use of GPUs and other accelerator frameworks via a symbolic graph. Research The Owl project is research oriented, and supports research on numerical computing across multiple related topics, some of which are listed below. Synchronous parallel distributed machine learning design: Owl was the first to propose using sampling to synchronise nodes in iterative algorithms. The work, published on arXiv, comes with a mathematical proof, and similar ideas were later proposed at top machine-learning conferences. One of the factors that contributes to Owl's small code base is that it builds advanced analytical functions around algorithmic differentiation. This idea also proved popular and developed into the paradigm of differentiable programming; it is now used in popular numerical packages such as JuliaDiff. Using the computation graph offers another dimension of optimisation to computation in Owl; the computation graph also bridges Owl applications and hardware accelerators such as GPUs and TPUs. The computation graph later became a de facto intermediate representation, and standards such as the Open Neural Network Exchange and the Neural Network Exchange Format are now widely supported by various deep learning frameworks such as TensorFlow and PyTorch. The idea of service-level composition and serving was investigated in the Zoo subsystem of Owl. The prototype demonstrates the streamlining of various stages in code development, including composition, testing, distribution, validation, and deployment; it is very similar to the later MLOps concepts, a topic that has recently attracted attention at top systems conferences such as OSDI. As a result of research along these directions, Owl has produced several publications. In 2018, a paper titled Data Analytics Service Composition and Deployment on Edge Devices was accepted at the ACM SIGCOMM 2018 Workshop on Big Data Analytics and Machine Learning for Data Communication Networks. Two talks were also accepted at the OCaml Workshop of the International Conference on Functional Programming 2019, on the topics of numerical ordinary differential equation solving and executing Owl computations on GPUs. 
An internship in the OCaml Labs investigated the topic of image segmentation and related memory optimisation in Owl. In 2022, the book OCaml Scientific Computing was published by Springer. See also Array programming List of numerical-analysis software References Free mathematics software Numerical analysis software for Linux Numerical programming languages Array programming languages Free science software Numerical analysis software for MacOS Software using the MIT license
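The algorithmic differentiation at the core of Owl's design, described under Design and Features above, can be illustrated independently of Owl. The following is a minimal sketch of forward-mode differentiation with dual numbers, written in Python for brevity rather than OCaml; it does not use Owl's actual API, and every name in it is illustrative only.

# Forward-mode algorithmic differentiation with dual numbers.
# A Dual carries a value together with the derivative of that value
# with respect to one chosen input; arithmetic propagates both at once.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u' * v + u * v'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def grad(f, x):
    # Seed the input with derivative 1, then read the derivative off the result.
    return f(Dual(x, 1.0)).deriv

# f(x) = 3x^2 + 2x, so f'(4) = 6*4 + 2 = 26
print(grad(lambda x: 3 * x * x + 2 * x, 4.0))

Owl builds its higher-level analytics (optimisation, regression, neural networks) on the same chain-rule bookkeeping, in both forward and reverse mode, which is what keeps the code at those levels concise.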
981057
https://en.wikipedia.org/wiki/MINIX%20file%20system
MINIX file system
The Minix file system is the native file system of the Minix operating system. It was written from scratch by Andrew S. Tanenbaum in the 1980s and aimed to replicate the structure of the Unix File System while omitting complex features; it was intended as a teaching aid. It largely fell out of favour among Linux users by 1994 due to the popularity of other filesystems – most notably ext2 – and its lack of features, including limited partition sizes and filename length limits. History MINIX was written from scratch by Andrew S. Tanenbaum in the 1980s, as a Unix-like operating system whose source code could be used freely in education. The MINIX file system was designed for use with MINIX; it copies the basic structure of the Unix File System but avoids any complex features in the interest of keeping the source code clean, clear and simple, to meet the overall goal of MINIX to be a useful teaching aid. When Linus Torvalds first started writing his Linux operating system kernel (1991), he was working on a machine running MINIX, and adopted its file system layout. This soon proved problematic, since MINIX restricted filename lengths to 14 characters (30 in later versions), limited partitions to 64 megabytes, and the file system was designed for teaching purposes, not performance. The Extended file system (ext; April 1992) was developed to replace MINIX's, but it was only with the second version of this, ext2, that Linux obtained a commercial-grade file system. As of 1994, the MINIX file system was "scarcely in use" among Linux users. Design and implementation A MINIX file system has six components:
The boot block, which is always stored in the first block. It contains the boot loader that loads and runs an operating system at system startup.
The superblock, stored in the second block, which holds data about the file system that allows the operating system to locate and understand the other file system structures – for example, the number of inodes and zones, the size of the two bitmaps and the starting block of the data area.
The inode bitmap, a simple map of the inodes that tracks which ones are in use and which ones are free by representing them as either a one (in use) or a zero (free).
The zone bitmap, which works in the same way as the inode bitmap, except that it tracks the zones.
The inodes area. Each file or directory is represented as an inode, which records metadata including its type (file, directory, block, char, pipe), user and group IDs, and three timestamps that record the date and time of last access, last modification and last status change. An inode also contains a list of addresses that point to the zones in the data area where the file or directory data is actually stored.
The data area, the largest component of the file system, using the majority of the space. It is where the actual file and directory data are stored.
See also List of file systems MINIX 3 Minix-vmd References External links File, file system, and memory size limits in Minix Minix Filesystem Tool Introduction to the minix file system 1987 software Unix file system technology Disk file systems File systems supported by the Linux kernel MINIX
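The on-disk layout described under Design and implementation above can be illustrated by reading the superblock, which sits in the second 1 KiB block of the device. The following Python sketch does only that; the field order, 1 KiB block size, little-endian byte order and magic numbers follow the classic MINIX version 1 superblock as commonly documented (for example in the Linux kernel's minix driver), but they are reproduced here from memory and should be checked against the real headers before use.

import struct

# MINIX v1 superblock fields, packed little-endian, starting at byte offset 1024:
# ninodes, nzones, imap_blocks, zmap_blocks, firstdatazone, log_zone_size (u16 each),
# max_size (u32), magic (u16).
SUPERBLOCK_FORMAT = "<6HIH"
MAGIC_14 = 0x137F   # file names up to 14 characters
MAGIC_30 = 0x138F   # file names up to 30 characters

def read_superblock(path):
    with open(path, "rb") as device:
        device.seek(1024)                       # skip the boot block
        raw = device.read(struct.calcsize(SUPERBLOCK_FORMAT))
    (ninodes, nzones, imap_blocks, zmap_blocks,
     firstdatazone, log_zone_size, max_size, magic) = struct.unpack(SUPERBLOCK_FORMAT, raw)
    if magic not in (MAGIC_14, MAGIC_30):
        raise ValueError("not a MINIX v1 file system (magic 0x%04x)" % magic)
    return {
        "inodes": ninodes,
        "zones": nzones,
        "inode bitmap blocks": imap_blocks,
        "zone bitmap blocks": zmap_blocks,
        "first data zone": firstdatazone,
        "log zone size": log_zone_size,
        "maximum file size": max_size,
        "filename length": 14 if magic == MAGIC_14 else 30,
    }

Calling read_superblock("minix.img") – with "minix.img" standing in for any MINIX v1 disk image – then shows how many inodes and zones the two bitmaps that follow the superblock have to cover.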
2843564
https://en.wikipedia.org/wiki/Timeline%20of%20cryptography
Timeline of cryptography
Below is a timeline of notable events related to cryptography. B.C. 36th century The Sumerians develop cuneiform writing and the Egyptians develop hieroglyphic writing. 16th century The Phoenicians develop an alphabet 600-500 Hebrew scholars make use of simple monoalphabetic substitution ciphers (such as the Atbash cipher) c. 400 Spartan use of scytale (alleged) c. 400 Herodotus reports use of steganography in reports to Greece from Persia (tattoo on shaved head) 100–1 B.C. Notable Roman ciphers such as the Caesar cipher. 1–1799 A.D. 801–873 A.D. Cryptanalysis and frequency analysis leading to techniques for breaking monoalphabetic substitution ciphers are developed in A Manuscript on Deciphering Cryptographic Messages by the Muslim mathematician Al-Kindi (Alkindus), who may have been inspired by textual analysis of the Qur'an. He also covers methods of encipherments, cryptanalysis of certain encipherments, and statistical analysis of letters and letter combinations in Arabic. 1355-1418 Ahmad al-Qalqashandi writes Subh al-a'sha, a 14-volume encyclopedia including a section on cryptology, attributed to Ibn al-Durayhim (1312–1361). The list of ciphers in this work includes both substitution and transposition, and for the first time, a cipher with multiple substitutions for each plaintext letter. It also included an exposition on and worked example of cryptanalysis, including the use of tables of letter frequencies and sets of letters which cannot occur together in one word. 1450 The Chinese develop wooden block movable type printing. 1450–1520 The Voynich manuscript, an example of a possibly encoded illustrated book, is written. 1466 Leon Battista Alberti invents polyalphabetic cipher, also first known mechanical cipher machine 1518 Johannes Trithemius' book on cryptology 1553 Bellaso invents Vigenère cipher 1585 Vigenère's book on ciphers 1586 Cryptanalysis used by spymaster Sir Francis Walsingham to implicate Mary, Queen of Scots, in the Babington Plot to murder Elizabeth I of England. Queen Mary was eventually executed. 1641 Wilkins' Mercury (English book on cryptology) 1793 Claude Chappe establishes the first long-distance semaphore telegraph line 1795 Thomas Jefferson invents the Jefferson disk cipher, reinvented over 100 years later by Etienne Bazeries 1800–1899 1809–14 George Scovell's work on Napoleonic ciphers during the Peninsular War 1831 Joseph Henry proposes and builds an electric telegraph 1835 Samuel Morse develops the Morse code 1854 Charles Wheatstone invents Playfair cipher c. 1854 Babbage's method for breaking polyalphabetic ciphers (pub 1863 by Kasiski) 1855 For the English side in the Crimean War, Charles Babbage broke Vigenère's autokey cipher (the 'unbreakable cipher' of the time) as well as the much weaker cipher that is called Vigenère cipher today. Due to secrecy it was also discovered and attributed somewhat later to the Prussian Friedrich Kasiski. 1883 Auguste Kerckhoffs' La Cryptographie militaire published, containing his celebrated laws of cryptography 1885 Beale ciphers published 1894 The Dreyfus Affair in France involves the use of cryptography, and its misuse, in regard to false documents. 
1900–1949 1916-1922 William Friedman and Elizebeth Smith Friedman apply statistics to cryptanalysis (coincidence counting, etc.), write Riverbank Publications 1917 Gilbert Vernam develops first practical implementation of a teletype cipher, now known as a stream cipher and, later, with Joseph Mauborgne the one-time pad 1917 Zimmermann telegram intercepted and decrypted, advancing U.S. entry into World War I 1919 Weimar Germany Foreign Office adopts (a manual) one-time pad for some traffic 1919 Edward Hebern invents/patents first rotor machine design—Damm, Scherbius and Koch follow with patents the same year 1921 Washington Naval Conference U.S. negotiating team aided by decryption of Japanese diplomatic telegrams c. 1924 MI8 (Herbert Yardley, et al.) provide breaks of assorted traffic in support of US position at Washington Naval Conference 1929 United States Secretary of State Henry L. Stimson shuts down State Department cryptanalysis "Black Chamber", saying "Gentlemen do not read each other's mail." 1931 The American Black Chamber by Herbert O. Yardley is published, revealing much about American cryptography c. 1932 First break of German Army Enigma by Marian Rejewski in Poland 1940 Break of Japan's PURPLE machine cipher by SIS team December 7, 1941 attack on Pearl Harbor; U.S. Navy base at Pearl Harbor in Oahu is surprised by Japanese attack, despite U.S. breaking of Japanese codes. U.S. enters World War II. June 1942 Battle of Midway where U.S. partial break into Dec 41 edition of JN-25 leads to turning-point victory over Japan April 1943 Admiral Yamamoto, architect of Pearl Harbor attack, is assassinated by U.S. forces who know his itinerary from decoded messages April 1943 Max Newman, Wynn-Williams, and their team (including Alan Turing) at the secret Government Code and Cypher School ('Station X'), Bletchley Park, Bletchley, England, complete the "Heath Robinson". This is a specialized machine for cipher-breaking, not a general-purpose calculator or computer. December 1943 The Colossus computer was built, by Thomas Flowers at The Post Office Research Laboratories in London, to crack the German Lorenz cipher (SZ42). Colossus was used at Bletchley Park during World War II as a successor to the April 1943 "Heath Robinson" machines. Although ten were eventually built, they were destroyed immediately after they had finished their work, because the design was so advanced that there was to be no possibility of it falling into the wrong hands. 1944 Patent application filed on SIGABA code machine used by U.S. in World War II. Kept secret, it finally issues in 2001 1946 The Venona project's first break into Soviet espionage traffic from the early 1940s 1948 Claude Shannon writes a paper that establishes the mathematical basis of information theory. 1949 Shannon's Communication Theory of Secrecy Systems published in the Bell System Technical Journal 1950–1999 1952 U.S. National Security Agency founded. KL-7 rotor machine introduced sometime thereafter. 1957 First production order for KW-26 electronic encryption system. August 1964 Gulf of Tonkin Incident leads U.S. into Vietnam War, possibly due to misinterpretation of signals intelligence by NSA. 1967 David Kahn's The Codebreakers is published. 1968 John Anthony Walker walks into the Soviet Union's embassy in Washington and sells information on the KL-7 cipher machine. The Walker spy ring operates until 1985. 1969 The first hosts of ARPANET, Internet's ancestor, are connected. 
1970 Using quantum states to encode information is first proposed: Stephen Wiesner invents conjugate coding and applies it to design “money physically impossible to counterfeit” (still technologically unfeasible today). 1974? Horst Feistel develops Feistel network block cipher design. 1976 The Data Encryption Standard published as an official Federal Information Processing Standard (FIPS) for the United States. 1976 Diffie and Hellman publish New Directions in Cryptography. 1977 RSA public key encryption invented. 1978 Robert McEliece invents the McEliece cryptosystem, the first asymmetric encryption algorithm to use randomization in the encryption process. 1981 Richard Feynman proposed quantum computers. The main application he had in mind was the simulation of quantum systems, but he also mentioned the possibility of solving other problems. 1984 Based on Stephen Wiesner's idea from the 1970s, Charles Bennett and Gilles Brassard design the first quantum cryptography protocol, BB84. 1985 Walker spy ring uncovered. Remaining KL-7's withdrawn from service. 1986 After an increasing number of break-ins to government and corporate computers, United States Congress passes the Computer Fraud and Abuse Act, which makes it a crime to break into computer systems. The law, however, does not cover juveniles. 1988 African National Congress uses computer-based one-time pads to build a network inside South Africa. 1989 Tim Berners-Lee and Robert Cailliau built the prototype system which became the World Wide Web at CERN. 1989 Quantum cryptography experimentally demonstrated in a proof-of-the-principle experiment by Charles Bennett et al. 1991 Phil Zimmermann releases the public key encryption program PGP along with its source code, which quickly appears on the Internet. 1994 Bruce Schneier's Applied Cryptography is published. 1994 Secure Sockets Layer (SSL) encryption protocol released by Netscape. 1994 Peter Shor devises an algorithm which lets quantum computers determine the factorization of large integers quickly. This is the first interesting problem for which quantum computers promise a significant speed-up, and it therefore generates a lot of interest in quantum computers. 1994 DNA computing proof of concept on toy travelling salesman problem; a method for input/output still to be determined. 1994 Russian crackers siphon $10 million from Citibank and transfer the money to bank accounts around the world. Vladimir Levin, the 30-year-old ringleader, uses his work laptop after hours to transfer the funds to accounts in Finland and Israel. Levin stands trial in the United States and is sentenced to three years in prison. Authorities recover all but $400,000 of the stolen money. 1994 Formerly proprietary, but un-patented, RC4 cipher algorithm is published on the Internet. 1994 First RSA Factoring Challenge from 1977 is decrypted as The Magic Words are Squeamish Ossifrage. 1995 NSA publishes the SHA1 hash algorithm as part of its Digital Signature Standard. July 1997 OpenPGP specification (RFC 2440) released 1997 Ciphersaber, an encryption system based on RC4 that is simple enough to be reconstructed from memory, is published on Usenet. October 1998 Digital Millennium Copyright Act (DMCA) becomes law in U.S., criminalizing production and dissemination of technology that can circumvent technical measures taken to protect copyright. October 1999 DeCSS, a computer program capable of decrypting content on a DVD, is published on the Internet. 2000 and beyond January 14, 2000 U.S. 
Government announces that restrictions on the export of cryptography are relaxed (although not removed). This allows many US companies to stop the long-running process of having to create US and international copies of their software. March 2000 President of the United States Bill Clinton says he doesn't use e-mail to communicate with his daughter, Chelsea Clinton, at college because he doesn't think the medium is secure. September 6, 2000 RSA Security Inc. releases the RSA algorithm into the public domain, a few days in advance of the RSA patent expiring. Following the relaxation of the U.S. government export restrictions, this removed one of the last barriers to the worldwide distribution of much software based on cryptographic systems. 2000 UK Regulation of Investigatory Powers Act requires anyone to supply their cryptographic key to a duly authorized person on request 2001 Belgian Rijndael algorithm selected as the U.S. Advanced Encryption Standard (AES) after a five-year public search process by the National Institute of Standards and Technology (NIST) 2001 Scott Fluhrer, Itsik Mantin and Adi Shamir publish an attack on WiFi's Wired Equivalent Privacy security layer September 11, 2001 U.S. response to terrorist attacks hampered by lack of secure communications November 2001 Microsoft and its allies vow to end "full disclosure" of security vulnerabilities by replacing it with "responsible" disclosure guidelines 2002 NESSIE project releases final report / selections August 2002 PGP Corporation formed, purchasing assets from NAI. 2003 CRYPTREC project releases 2003 report / recommendations 2004 The hash MD5 is shown to be vulnerable to practical collision attack 2004 The first commercial quantum cryptography system becomes available from id Quantique. 2005 Potential for attacks on SHA1 demonstrated 2005 Agents from the U.S. FBI demonstrate their ability to crack WEP using publicly available tools May 1, 2007 Users swamp Digg.com with copies of a 128-bit key to the AACS system used to protect HD DVD and Blu-ray video discs. The user revolt was a response to Digg's decision, subsequently reversed, to remove the keys, per demands from the motion picture industry that cited the U.S. DMCA anti-circumvention provisions. November 2, 2007 NIST hash function competition announced. 2009 The Bitcoin network is launched. 2010 The master key for High-bandwidth Digital Content Protection (HDCP) and the private signing key for the Sony PlayStation 3 game console are recovered and published using separate cryptanalytic attacks. PGP Corp. is acquired by Symantec. 2012 NIST selects the Keccak algorithm as the winner of its SHA-3 hash function competition. 2013 Edward Snowden discloses a vast trove of classified documents from NSA. See Global surveillance disclosures (2013–present) 2013 Dual_EC_DRBG is discovered to have an NSA backdoor. 2013 NSA publishes Simon and Speck lightweight block ciphers. 2014 The Password Hashing Competition accepts 24 entries. 2015 Year by which NIST suggests that 80-bit keys be phased out. See also History of cryptography References External links Timeline of Cipher Machines Cryptography History of cryptography Cryptography lists and comparisons
18588994
https://en.wikipedia.org/wiki/Usenet
Usenet
Usenet is a worldwide distributed discussion system available on computers. It was developed from the general-purpose Unix-to-Unix Copy (UUCP) dial-up network architecture. Tom Truscott and Jim Ellis conceived the idea in 1979, and it was established in 1980. Users read and post messages (called articles or posts, and collectively termed news) to one or more categories, known as newsgroups. Usenet resembles a bulletin board system (BBS) in many respects and is the precursor to the Internet forums that later became widely used. Discussions are threaded, as with web forums and BBSs, though posts are stored on the server sequentially. A major difference between a BBS or web forum and Usenet is the absence of a central server and dedicated administrator. Usenet is distributed among a large, constantly changing conglomeration of news servers that store and forward messages to one another via "news feeds". Individual users may read messages from and post messages to a local server, which may be operated by anyone. Usenet is culturally and historically significant in the networked world, having given rise to, or popularized, many widely recognized concepts and terms such as "FAQ", "flame", sockpuppet, and "spam". In the early 1990s, shortly before access to the Internet became commonly affordable, Usenet connections via Fidonet's dial-up BBS networks made long-distance or worldwide discussions and other communication widespread, not needing a server, just (local) telephone service. The name Usenet comes from the term "users network". The first Usenet group was NET.general, which quickly became net.general. The first commercial spam on Usenet was from immigration attorneys Canter and Siegel advertising green card services. Introduction Usenet was conceived in 1979 and publicly established in 1980, at the University of North Carolina at Chapel Hill and Duke University, over a decade before the World Wide Web went online (and thus before the general public received access to the Internet), making it one of the oldest computer network communications systems still in widespread use. It was originally built on the "poor man's ARPANET", employing UUCP as its transport protocol to offer mail and file transfers, as well as announcements through the newly developed news software such as A News. The name "Usenet" emphasizes its creators' hope that the USENIX organization would take an active role in its operation. The articles that users post to Usenet are organized into topical categories known as newsgroups, which are themselves logically organized into hierarchies of subjects. For instance, sci.math and sci.physics are within the sci.* hierarchy, while talk.origins and talk.atheism are in the talk.* hierarchy. When a user subscribes to a newsgroup, the news client software keeps track of which articles that user has read. In most newsgroups, the majority of the articles are responses to some other article. The set of articles that can be traced to one single non-reply article is called a thread. Most modern newsreaders display the articles arranged into threads and subthreads. For example, in the wine-making newsgroup rec.crafts.winemaking, someone might start a thread called "What's the best yeast?", and that thread or conversation might grow to dozens of replies by perhaps six or eight different authors. Over several days, that conversation about different wine yeasts might branch into several sub-threads in a tree-like form. 
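The threading just described is driven entirely by headers: every article carries a unique Message-ID, and a reply lists its ancestors' IDs, oldest first, in a References header, so a newsreader can rebuild the tree without help from the server. The following Python sketch shows the idea in its simplest form; Message-ID and References are the real header names, but the data structures and example IDs are illustrative.

# Group a flat list of articles into threads using Message-ID / References.
def build_threads(articles):
    by_id = {a["message_id"]: dict(a, replies=[]) for a in articles}
    roots = []
    for article in by_id.values():
        parent_id = article["references"][-1] if article["references"] else None
        if parent_id in by_id:
            by_id[parent_id]["replies"].append(article)   # attach under its direct parent
        else:
            roots.append(article)                         # no known parent: starts a new thread
    return roots

def print_thread(article, depth=0):
    print("  " * depth + article["subject"])
    for reply in article["replies"]:
        print_thread(reply, depth + 1)

example = [
    {"message_id": "<1@a.example>", "references": [], "subject": "What's the best yeast?"},
    {"message_id": "<2@b.example>", "references": ["<1@a.example>"], "subject": "Re: What's the best yeast?"},
    {"message_id": "<3@c.example>", "references": ["<1@a.example>", "<2@b.example>"], "subject": "Re: What's the best yeast?"},
]
for root in build_threads(example):
    print_thread(root)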
When a user posts an article, it is initially only available on that user's news server. Each news server talks to one or more other servers (its "newsfeeds") and exchanges articles with them. In this fashion, the article is copied from server to server and should eventually reach every server in the network. The later peer-to-peer networks operate on a similar principle, but for Usenet it is normally the sender, rather than the receiver, who initiates transfers. Usenet was designed for conditions in which networks were much slower and not always available. Many sites on the original Usenet network would connect only once or twice a day to batch-transfer messages in and out. This is largely because the POTS network was typically used for transfers, and phone charges were lower at night. The format and transmission of Usenet articles is similar to that of Internet e-mail messages. The difference between the two is that Usenet articles can be read by any user whose news server carries the group to which the message was posted, as opposed to email messages, which have one or more specific recipients. Today, Usenet has diminished in importance with respect to Internet forums, blogs, mailing lists and social media. Usenet differs from such media in several ways: Usenet requires no personal registration with the group concerned; information need not be stored on a remote server; archives are always available; and reading the messages does not require a mail or web client, but a news client. However, it is now possible to read and participate in Usenet newsgroups to a large degree using ordinary web browsers since most newsgroups are now copied to several web sites. The groups in the alt.binaries.* hierarchy are still widely used for data transfer. ISPs, news servers, and newsfeeds Many Internet service providers, and many other Internet sites, operate news servers for their users to access. ISPs that do not operate their own servers directly will often offer their users an account from another provider that specifically operates newsfeeds. In early news implementations, the server and newsreader were a single program suite, running on the same system. Today, one uses separate newsreader client software, a program that resembles an email client but accesses Usenet servers instead. Not all ISPs run news servers. A news server is one of the most difficult Internet services to administer because of the large amount of data involved, small customer base (compared to mainstream Internet service), and a disproportionately high volume of customer support incidents (frequently complaining of missing news articles). Some ISPs outsource news operations to specialist sites, which will usually appear to a user as though the ISP itself runs the server. Many of these sites carry a restricted newsfeed, with a limited number of newsgroups. Commonly omitted from such a newsfeed are foreign-language newsgroups and the alt.binaries.* hierarchy, which largely carries software, music, videos and images, and accounts for over 99 percent of article data. There are also Usenet providers that offer a full unrestricted service to users whose ISPs do not carry news, or that carry a restricted feed. Newsreaders Newsgroups are typically accessed with newsreaders: applications that allow users to read and reply to postings in newsgroups. These applications act as clients to one or more news servers. Historically, Usenet was associated with the Unix operating system developed at AT&T, but newsreaders are now available for all major operating systems. 
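A minimal reading session can be sketched with Python's nntplib module (part of the standard library up to Python 3.12; it has since been removed, so an older interpreter or the PyPI re-release is assumed). The server name below is a placeholder for a real provider's NNTP host.

import nntplib

server = nntplib.NNTP("news.example.com")          # placeholder host, default port 119

# Select a group; the server replies with the range of article numbers it holds.
resp, count, first, last, name = server.group("comp.lang.python")
print("%s carries %d articles, numbered %d to %d" % (name, count, first, last))

# Fetch overview data (subject, author, date, ...) for the ten newest articles.
resp, overviews = server.over((last - 9, last))
for number, fields in overviews:
    print(number, nntplib.decode_header(fields["subject"]))

server.quit()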
Modern mail clients or "communication suites" commonly also have an integrated newsreader. Often, however, these integrated clients are of low quality, compared to standalone newsreaders, and incorrectly implement Usenet protocols, standards and conventions. Many of these integrated clients, for example the one in Microsoft's Outlook Express, are disliked by purists because of their misbehavior. With the rise of the World Wide Web (WWW), web front-ends (web2news) have become more common. Web front ends have lowered the technical entry barrier requirements to that of one application and no Usenet NNTP server account. There are numerous websites now offering web based gateways to Usenet groups, although some people have begun filtering messages made by some of the web interfaces for one reason or another. Google Groups is one such web based front end and some web browsers can access Google Groups via news: protocol links directly. Moderated and unmoderated newsgroups A minority of newsgroups are moderated, meaning that messages submitted by readers are not distributed directly to Usenet, but instead are emailed to the moderators of the newsgroup for approval. The moderator is to receive submitted articles, review them, and inject approved articles so that they can be properly propagated worldwide. Articles approved by a moderator must bear the Approved: header line. Moderators ensure that the messages that readers see in the newsgroup conform to the charter of the newsgroup, though they are not required to follow any such rules or guidelines. Typically, moderators are appointed in the proposal for the newsgroup, and changes of moderators follow a succession plan. Historically, a mod.* hierarchy existed before Usenet reorganization. Now, moderated newsgroups may appear in any hierarchy, typically with .moderated added to the group name. Usenet newsgroups in the Big-8 hierarchy are created by proposals called a Request for Discussion, or RFD. The RFD is required to have the following information: newsgroup name, checkgroups file entry, and moderated or unmoderated status. If the group is to be moderated, then at least one moderator with a valid email address must be provided. Other information which is beneficial but not required includes: a charter, a rationale, and a moderation policy if the group is to be moderated. Discussion of the new newsgroup proposal follows, and is finished with the members of the Big-8 Management Board making the decision, by vote, to either approve or disapprove the new newsgroup. Unmoderated newsgroups form the majority of Usenet newsgroups, and messages submitted by readers for unmoderated newsgroups are immediately propagated for everyone to see. Minimal editorial content filtering vs propagation speed form one crux of the Usenet community. One little cited defense of propagation is canceling a propagated message, but few Usenet users use this command and some news readers do not offer cancellation commands, in part because article storage expires in relatively short order anyway. Almost all unmoderated Usenet groups tend to accumulate large volumes of spam. Technical details Usenet is a set of protocols for generating, storing and retrieving news "articles" (which resemble Internet mail messages) and for exchanging them among a readership which is potentially widely distributed. These protocols most commonly use a flooding algorithm which propagates copies throughout a network of participating servers. 
Whenever a message reaches a server, that server forwards the message to all its network neighbors that haven't yet seen the article. Only one copy of a message is stored per server, and each server makes it available on demand to the (typically local) readers able to access that server. The collection of Usenet servers thus has a certain peer-to-peer character in that they share resources by exchanging them; however, the granularity of exchange is on a different scale than in a modern peer-to-peer system, and this characteristic excludes the actual users of the system, who connect to the news servers with a typical client-server application, much like an email reader. RFC 850 was the first formal specification of the messages exchanged by Usenet servers. It was superseded by RFC 1036 and subsequently by RFC 5536 and RFC 5537. In cases where unsuitable content has been posted, Usenet has support for automated removal of a posting from the whole network by creating a cancel message, although due to a lack of authentication and resultant abuse, this capability is frequently disabled. Copyright holders may still request the manual deletion of infringing material using the provisions of World Intellectual Property Organization treaty implementations, such as the United States Online Copyright Infringement Liability Limitation Act, but this would require giving notice to each individual news server administrator. On the Internet, Usenet is transported via the Network News Transfer Protocol (NNTP) on TCP port 119 for standard, unprotected connections and on TCP port 563 for SSL encrypted connections. Organization The major set of worldwide newsgroups is contained within nine hierarchies, eight of which are operated under consensual guidelines that govern their administration and naming. The current Big Eight are: comp.* – computer-related discussions (comp.software, comp.sys.amiga) humanities.* – fine arts, literature, and philosophy (humanities.classics, humanities.design.misc) misc.* – miscellaneous topics (misc.education, misc.forsale, misc.kids) news.* – discussions and announcements about news (meaning Usenet, not current events) (news.groups, news.admin) rec.* – recreation and entertainment (rec.music, rec.arts.movies) sci.* – science-related discussions (sci.psychology, sci.research) soc.* – social discussions (soc.college.org, soc.culture.*) talk.* – talk about various controversial topics (talk.religion, talk.politics, talk.origins) See also the Great Renaming. The alt.* hierarchy is not subject to the procedures controlling groups in the Big Eight, and it is as a result less organized. Groups in the alt.* hierarchy tend to be more specialized or specific—for example, there might be a newsgroup under the Big Eight which contains discussions about children's books, but a group in the alt.* hierarchy may be dedicated to one specific author of children's books. Binaries are posted in alt.binaries.*, making it the largest of all the hierarchies. Many other hierarchies of newsgroups are distributed alongside these. Regional and language-specific hierarchies such as japan.*, malta.* and ne.* serve specific countries and regions such as Japan, Malta and New England. Companies and projects administer their own hierarchies to discuss their products and offer community technical support, such as the historical gnu.* hierarchy from the Free Software Foundation. Microsoft closed its news server in June 2010, and now provides support for its products through web forums. 
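The store-and-forward flooding described under Technical details above can be reduced to a few lines. The sketch below is purely illustrative: real servers exchange articles over NNTP (for example with the IHAVE command) and record the Message-IDs they have already accepted in a history database, but the essential rule – keep one copy, then offer the article to every neighbour that has not yet seen it – looks like this.

# Toy model of Usenet article flooding between peered servers.
class Server:
    def __init__(self, name):
        self.name = name
        self.neighbours = []    # peers this server exchanges news with
        self.seen = set()       # Message-IDs already stored (the "history" file)

    def peer_with(self, other):
        self.neighbours.append(other)
        other.neighbours.append(self)

    def receive(self, message_id):
        if message_id in self.seen:
            return                        # duplicate: already stored, do not re-flood
        self.seen.add(message_id)         # keep exactly one local copy
        for peer in self.neighbours:
            peer.receive(message_id)      # offer the article to every neighbour

a, b, c = Server("A"), Server("B"), Server("C")
a.peer_with(b)
b.peer_with(c)
a.receive("<unique-id@example>")          # posted at A, flooded to B and then to C
print(sorted(s.name for s in (a, b, c) if "<unique-id@example>" in s.seen))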
Some users prefer to use the term "Usenet" to refer only to the Big Eight hierarchies; others include alt.* as well. The more general term "netnews" incorporates the entire medium, including private organizational news systems. Informal sub-hierarchy conventions also exist. *.answers are typically moderated cross-post groups for FAQs. An FAQ would be posted within one group and a cross post to the *.answers group at the head of the hierarchy seen by some as a refining of information in that news group. Some subgroups are recursive—to the point of some silliness in alt.*. Binary content Usenet was originally created to distribute text content encoded in the 7-bit ASCII character set. With the help of programs that encode 8-bit values into ASCII, it became practical to distribute binary files as content. Binary posts, due to their size and often-dubious copyright status, were in time restricted to specific newsgroups, making it easier for administrators to allow or disallow the traffic. The oldest widely used encoding method for binary content is uuencode, from the Unix UUCP package. In the late 1980s, Usenet articles were often limited to 60,000 characters, and larger hard limits exist today. Files are therefore commonly split into sections that require reassembly by the reader. With the header extensions and the Base64 and Quoted-Printable MIME encodings, there was a new generation of binary transport. In practice, MIME has seen increased adoption in text messages, but it is avoided for most binary attachments. Some operating systems with metadata attached to files use specialized encoding formats. For Mac OS, both BinHex and special MIME types are used. Other lesser known encoding systems that may have been used at one time were BTOA, XX encoding, BOO, and USR encoding. In an attempt to reduce file transfer times, an informal file encoding known as yEnc was introduced in 2001. It achieves about a 30% reduction in data transferred by assuming that most 8-bit characters can safely be transferred across the network without first encoding into the 7-bit ASCII space. The most common method of uploading large binary posts to Usenet is to convert the files into RAR archives and create Parchive files for them. Parity files are used to recreate missing data when not every part of the files reaches a server. Binary retention time Each news server allocates a certain amount of storage space for content in each newsgroup. When this storage has been filled, each time a new post arrives, old posts are deleted to make room for the new content. If the network bandwidth available to a server is high but the storage allocation is small, it is possible for a huge flood of incoming content to overflow the allocation and push out everything that was in the group before it. The average length of time that posts are able to stay on the server before being deleted is commonly called the retention time. Binary newsgroups are only able to function reliably if there is sufficient storage allocated to handle the amount of articles being added. Without sufficient retention time, a reader will be unable to download all parts of the binary before it is flushed out of the group's storage allocation. This was at one time how posting undesired content was countered; the newsgroup would be flooded with random garbage data posts, of sufficient quantity to push out all the content to be suppressed. 
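The roughly 30 percent saving quoted for yEnc above comes from the encoding being almost a raw copy of the data: every byte is shifted by 42 modulo 256, and only the few values that would break an article (NUL, line feed, carriage return and the '=' escape character itself) are escaped. The following Python sketch shows that core transformation only, leaving out yEnc's line wrapping, its =ybegin/=yend framing and its CRC32 checksum.

# Core yEnc byte transformation (framing, line wrapping and checksums omitted).
CRITICAL = {0x00, 0x0A, 0x0D, ord("=")}   # NUL, LF, CR and the escape character

def yenc_encode(data: bytes) -> bytes:
    out = bytearray()
    for byte in data:
        value = (byte + 42) % 256
        if value in CRITICAL:
            out.append(ord("="))           # escape marker
            out.append((value + 64) % 256)
        else:
            out.append(value)
    return bytes(out)

def yenc_decode(data: bytes) -> bytes:
    out = bytearray()
    escaped = False
    for byte in data:
        if escaped:
            out.append((byte - 64 - 42) % 256)
            escaped = False
        elif byte == ord("="):
            escaped = True
        else:
            out.append((byte - 42) % 256)
    return bytes(out)

assert yenc_decode(yenc_encode(bytes(range(256)))) == bytes(range(256))

Because most bytes map to themselves plus an offset, the encoded article is only slightly larger than the original, in contrast to the roughly one-third overhead of Base64 or uuencode.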
Such flooding has since been rendered ineffective by service providers allocating enough storage to retain everything posted each day, including spam floods, without deleting anything. Modern Usenet news servers have enough capacity to archive years of binary content even when flooded with new data at the maximum daily speed available. In part because of such long retention times, as well as growing Internet upload speeds, Usenet is also used by individual users to store backup data. While commercial providers offer easier-to-use online backup services, storing data on Usenet is free of charge (although access to Usenet itself may not be). The method requires the uploader to cede control over the distribution of the data; the files are automatically disseminated to all Usenet providers exchanging data for the news group it is posted to. In general the user must manually select, prepare and upload the data. The data is typically encrypted because the backup files are available for anyone to download. After the files are uploaded, having multiple copies spread to different geographical regions around the world on different news servers decreases the chances of data loss. Major Usenet service providers have a retention time of more than 12 years. This results in more than 60 petabytes (60,000 terabytes) of storage. When using Usenet for data storage, providers that offer longer retention times are preferred, to ensure the data survives for longer periods of time than it would with services offering lower retention times. Legal issues While binary newsgroups can be used to distribute completely legal user-created works, open-source software, and public domain material, some binary groups are used to illegally distribute commercial software, copyrighted media, and pornographic material. ISP-operated Usenet servers frequently block access to all alt.binaries.* groups to both reduce network traffic and to avoid related legal issues. Commercial Usenet service providers claim to operate as a telecommunications service, and assert that they are not responsible for the user-posted binary content transferred via their equipment. In the United States, Usenet providers can qualify for protection under the DMCA Safe Harbor regulations, provided that they establish a mechanism to comply with and respond to takedown notices from copyright holders. Removal of copyrighted content from the entire Usenet network is a nearly impossible task, due to the rapid propagation between servers and the retention done by each server. Petitioning a Usenet provider for removal only removes it from that one server's retention cache, but not any others. It is possible for a special post cancellation message to be distributed to remove it from all servers, but many providers ignore cancel messages by standard policy, because they can be easily falsified and submitted by anyone. For a takedown petition to be most effective across the whole network, it would have to be issued to the origin server to which the content has been posted, before it has been propagated to other servers. Removal of the content at this early stage would prevent further propagation, but with modern high-speed links, content can be propagated as fast as it arrives, allowing no time for content review and takedown issuance by copyright holders. Establishing the identity of the person posting illegal content is equally difficult due to the trust-based design of the network. Like SMTP email, servers generally assume the header and origin information in a post is true and accurate. 
However, as in SMTP email, Usenet post headers are easily falsified so as to obscure the true identity and location of the message source. In this manner, Usenet is significantly different from modern P2P services; most P2P users distributing content are typically immediately identifiable to all other users by their network address, but the origin information for a Usenet posting can be completely obscured and unobtainable once it has propagated past the original server. Also unlike modern P2P services, the identity of the downloaders is hidden from view. On P2P services a downloader is identifiable to all others by their network address. On Usenet, a downloader connects directly to a server, and only that server knows the address of the client connecting to it. Some Usenet providers do keep usage logs, but not all make this logged information casually available to outside parties such as the Recording Industry Association of America. The existence of anonymising gateways to Usenet further complicates tracing a posting's true origin.
History
UUCP/Usenet Logical Map — June 1, 1981 / mods by S. McGeady November 19, 1981
[ASCII-art diagram of the early network: several dozen hosts — including ucbvax, decvax, duke, research, purdue, teklabs and utzoo — joined by UUCP links, with Berknet and ARPANET connections marked separately in the legend.]
UUCP/Usenet Logical Map, original by Steven McGeady. Copyright © 1981, 1996 Bruce Jones, Henry Spencer, David Wiseman. Copied with permission from The Usenet Oldnews Archive: Compilation.
Newsgroup experiments first occurred in 1979. Tom Truscott and Jim Ellis of Duke University came up with the idea as a replacement for a local announcement program, and established a link with the nearby University of North Carolina using Bourne shell scripts written by Steve Bellovin. The public release of news was in the form of conventional compiled software, written by Steve Daniel and Truscott.
In 1980, Usenet was connected to ARPANET through UC Berkeley, which had connections to both Usenet and ARPANET. Mark Horton, the graduate student who set up the connection, began "feeding mailing lists from the ARPANET into Usenet" with the "fa" ("From ARPANET") identifier. Usenet gained 50 member sites in its first year, including Reed College, University of Oklahoma, and Bell Labs, and the number of people using the network increased dramatically; however, it was still a while longer before Usenet users could contribute to ARPANET. Network UUCP networks spread quickly due to the lower costs involved, and the ability to use existing leased lines, X.25 links or even ARPANET connections. By 1983, thousands of people participated from more than 500 hosts, mostly universities and Bell Labs sites but also a growing number of Unix-related companies; the number of hosts nearly doubled to 940 in 1984. More than 100 newsgroups existed, more than 20 devoted to Unix and other computer-related topics, and at least a third to recreation. As the mesh of UUCP hosts rapidly expanded, it became desirable to distinguish the Usenet subset from the overall network. A vote was taken at the 1982 USENIX conference to choose a new name. The name Usenet was retained, but it was established that it only applied to news. The name UUCPNET became the common name for the overall network. In addition to UUCP, early Usenet traffic was also exchanged with FidoNet and other dial-up BBS networks. By the mid-1990s there were almost 40,000 FidoNet systems in operation, and it was possible to communicate with millions of users around the world, with only local telephone service. Widespread use of Usenet by the BBS community was facilitated by the introduction of UUCP feeds made possible by MS-DOS implementations of UUCP, such as UFGATE (UUCP to FidoNet Gateway), FSUUCP and UUPC. In 1986, RFC 977 provided the Network News Transfer Protocol (NNTP) specification for distribution of Usenet articles over TCP/IP as a more flexible alternative to informal Internet transfers of UUCP traffic. Since the Internet boom of the 1990s, almost all Usenet distribution has been over NNTP. Software Early versions of Usenet used Duke's A News software, designed for one or two articles a day. Matt Glickman and Horton at Berkeley produced an improved version called B News that could handle the rising traffic (about 50 articles a day as of late 1983). With a message format that offered compatibility with Internet mail and improved performance, it became the dominant server software. C News, developed by Geoff Collyer and Henry Spencer at the University of Toronto, was comparable to B News in features but offered considerably faster processing. In the early 1990s, InterNetNews by Rich Salz was developed to take advantage of the continuous message flow made possible by NNTP, versus the batched store-and-forward design of UUCP. Since that time INN development has continued, and other news server software has also been developed. Public venue Usenet was the first Internet community and the place for many of the most important public developments in the pre-commercial Internet. It was the place where Tim Berners-Lee announced the launch of the World Wide Web, where Linus Torvalds announced the Linux project, and where Marc Andreessen announced the creation of the Mosaic browser and the introduction of the image tag, which revolutionized the World Wide Web by turning it into a graphical medium.
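NNTP, mentioned above as the protocol that has carried almost all Usenet traffic since the 1990s, is simple enough to exercise in a few lines. The following sketch uses Python's standard nntplib module (available through Python 3.12, removed in 3.13); the server name is a placeholder, and the newsgroup is only an example — any text group on a reachable server would do.

    # Minimal NNTP reader: select a group and list recent subjects.
    # "news.example.com" is a placeholder, not a real server.
    from nntplib import NNTP

    with NNTP("news.example.com") as server:
        # GROUP returns the article count and the low/high article numbers.
        resp, count, first, last, name = server.group("comp.lang.python")
        print(f"{name}: {count} articles, numbered {first}-{last}")

        # OVER returns header summaries for a range of article numbers.
        resp, overviews = server.over((max(first, last - 10), last))
        for number, fields in overviews:
            print(number, fields.get("subject", ""))

Newsreaders are built on this same handful of commands (GROUP, OVER, ARTICLE, POST); server-to-server feeds use related commands such as IHAVE.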
Internet jargon and history Many jargon terms now in common use on the Internet originated or were popularized on Usenet. Likewise, many conflicts which later spread to the rest of the Internet, such as the ongoing difficulties over spamming, began on Usenet. Decline Sascha Segan of PC Magazine said in 2008 that "Usenet has been dying for years". Segan said that some people pointed to the Eternal September in 1993 as the beginning of Usenet's decline, when AOL began offering Usenet access. He argues that when users began putting large (non-text) files on Usenet by the late 1990s, Usenet disk space and traffic increased correspondingly. Internet service providers questioned why they needed to host space for binary articles. AOL discontinued Usenet access in 2005. In May 2010, Duke University, whose implementation had started Usenet more than 30 years earlier, decommissioned its Usenet server, citing low usage and rising costs. On February 4, 2011, the Usenet news service link at the University of North Carolina at Chapel Hill (news.unc.edu) was retired after 32 years. In response, John Biggs of TechCrunch said "As long as there are folks who think a command line is better than a mouse, the original text-only social network will live on". While there are still some active text newsgroups on Usenet, the system is now primarily used to share large files between users, and the underlying technology of Usenet remains unchanged. Usenet traffic changes Over time, the amount of Usenet traffic has steadily increased. the number of all text posts made in all Big-8 newsgroups averaged 1,800 new messages every hour, with an average of 25,000 messages per day. However, these averages are minuscule in comparison to the traffic in the binary groups. Much of this traffic increase reflects not an increase in discrete users or newsgroup discussions, but instead the combination of massive automated spamming and an increase in the use of newsgroups in which large files are often posted publicly. A small sampling of the change (measured in feed size per day) follows: In 2008, Verizon Communications, Time Warner Cable and Sprint Nextel signed an agreement with Attorney General of New York Andrew Cuomo to shut down access to sources of child pornography. Time Warner Cable stopped offering access to Usenet. Verizon reduced its access to the "Big 8" hierarchies. Sprint stopped access to the alt.* hierarchies. AT&T stopped access to the hierarchies. Cuomo never specifically named Usenet in his anti-child pornography campaign. David DeJean of PC World said that some worry that the ISPs used Cuomo's campaign as an excuse to end portions of Usenet access, as it is costly for the Internet service providers and not in high demand by customers. In 2008 AOL, which no longer offered Usenet access, and the four providers that responded to the Cuomo campaign were the five largest Internet service providers in the United States; they had more than 50% of the U.S. ISP market share. On June 8, 2009, AT&T announced that it would no longer provide access to the Usenet service as of July 15, 2009. AOL announced that it would discontinue its integrated Usenet service in early 2005, citing the growing popularity of weblogs, chat forums and on-line conferencing. The AOL community had a tremendous role in popularizing Usenet some 11 years earlier. In August 2009, Verizon announced that it would discontinue access to Usenet on September 30, 2009. 
JANET announced it would discontinue Usenet service, effective July 31, 2010, citing Google Groups as an alternative. Microsoft announced that it would discontinue support for its public newsgroups (msnews.microsoft.com) from June 1, 2010, offering web forums as an alternative. Primary reasons cited for the discontinuance of Usenet service by general ISPs include the decline in volume of actual readers due to competition from blogs, along with cost and liability concerns of increasing proportion of traffic devoted to file-sharing and spam on unused or discontinued groups. Some ISPs did not include pressure from Cuomo's campaign against child pornography as one of their reasons for dropping Usenet feeds as part of their services. ISPs Cox and Atlantic Communications resisted the 2008 trend but both did eventually drop their respective Usenet feeds in 2010. Archives Public archives of Usenet articles have existed since the early days of Usenet, such as the system created by Kenneth Almquist in late 1982. Distributed archiving of Usenet posts was suggested in November 1982 by Scott Orshan, who proposed that "Every site should keep all the articles it posted, forever." Also in November of that year, Rick Adams responded to a post asking "Has anyone archived netnews, or does anyone plan to?" by stating that he was, "afraid to admit it, but I started archiving most 'useful' newsgroups as of September 18." In June 1982, Gregory G. Woodbury proposed an "automatic access to archives" system that consisted of "automatic answering of fixed-format messages to a special mail recipient on specified machines." In 1985, two news archiving systems and one RFC were posted to the Internet. The first system, called keepnews, by Mark M. Swenson of the University of Arizona, was described as "a program that attempts to provide a sane way of extracting and keeping information that comes over Usenet." The main advantage of this system was to allow users to mark articles as worthwhile to retain. The second system, YA News Archiver by Chuq Von Rospach, was similar to keepnews, but was "designed to work with much larger archives where the wonderful quadratic search time feature of the Unix ... becomes a real problem." Von Rospach in early 1985 posted a detailed RFC for "archiving and accessing usenet articles with keyword lookup." This RFC described a program that could "generate and maintain an archive of Usenet articles and allow looking up articles based on the article-id, subject lines, or keywords pulled out of the article itself." Also included was C code for the internal data structure of the system. The desire to have a fulltext search index of archived news articles is not new either, one such request having been made in April 1991 by Alex Martelli who sought to "build some sort of keyword index for [the news archive]." In early May, Mr. Martelli posted a summary of his responses to Usenet, noting that the "most popular suggestion award must definitely go to 'lq-text' package, by Liam Quin, recently posted in alt.sources." The Alt Sex Stories Text Repository (ASSTR) site archives and indexes erotic and pornographic stories posted to the Usenet group alt.sex.stories. The archiving of Usenet has led to fears of loss of privacy. An archive simplifies ways to profile people. This has partly been countered with the introduction of the X-No-Archive: Yes header, which is itself controversial. 
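Because a Usenet article is just a headers-plus-body message, the X-No-Archive header mentioned above is added when the post is composed, like any other header. Below is a minimal sketch in Python using the standard email and nntplib modules; the server name, From address and the choice of misc.test are placeholders for illustration, and posting only succeeds on a server that accepts posts.

    # Compose a test article that asks archives not to keep it, then post it.
    # Server name and From address are placeholders.
    from email.message import EmailMessage
    from nntplib import NNTP

    msg = EmailMessage()
    msg["From"] = "Example User <user@example.invalid>"
    msg["Newsgroups"] = "misc.test"
    msg["Subject"] = "Test post, please ignore"
    msg["X-No-Archive"] = "Yes"   # advisory only; archives are free to ignore it
    msg.set_content("This article requests exclusion from archives.")

    with NNTP("news.example.com") as server:
        server.post(msg.as_bytes())   # the server may still refuse the post

The header is only a request; compliance by archives is voluntary, which is part of why it remains controversial.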
Archives by Google Groups and DejaNews Web-based archiving of Usenet posts began in 1995 at Deja News with a very large, searchable database. In 2001, this database was acquired by Google. Google Groups hosts an archive of Usenet posts dating back to May 1981. The earliest posts, which date from May 1981 to June 1991, were donated to Google by the University of Western Ontario with the help of David Wiseman and others, and were originally archived by Henry Spencer at the University of Toronto's Zoology department. The archives for late 1991 through early 1995 were provided by Kent Landfield from the NetNews CD series and Jürgen Christoffel from GMD. The archive of posts from March 1995 onward was started by the company DejaNews (later Deja), which was purchased by Google in February 2001. Google began archiving Usenet posts for itself starting in the second week of August 2000. Google has been criticized by Vice and Wired contributors as well as former employees for its stewardship of the archive and for breaking its search functionality. See also Usenet II PLATO Notes Usenet Celebrity Usenet newsreaders Newsreader (Usenet) Comparison of Usenet newsreaders List of Usenet newsreaders Usenet/newsgroup service providers Astraweb Easynews Giganews Supernews Usenet history Legion of Net.Heroes Scientology and the Internet Serdar Argic Usenet administrators Usenet as a whole has no administrators; each server administrator is free to do whatever pleases him or her as long as the end users and peer servers tolerate and accept it. Nevertheless, there are a few famous administrators: Chris Lewis Gene (Spaf) Spafford Henry Spencer Kai Puolamäki Mary Ann Horton References Further reading External links IETF working group USEFOR (USEnet article FORmat), tools.ietf.org A-News Archive: Early Usenet news articles: 1981 to 1982., quux.org UTZoo Archive: 2,000,000 articles from early 1980s to July 1991 Social Accounting Reporting Tool Living Internet A comprehensive history of the Internet, including Usenet. livinginternet.com Usenet Glossary A comprehensive list of Usenet terminology Computer-mediated communication Computer networks History of the Internet Internet Protocol based network software Internet protocols Internet Standards Internet culture Online chat Pre–World Wide Web online services Wikipedia articles with ASCII art Computer-related introductions in 1980 1980 establishments in North Carolina
28387289
https://en.wikipedia.org/wiki/The%20Girl%20with%20the%20Dragon%20Tattoo%20%282011%20film%29
The Girl with the Dragon Tattoo (2011 film)
The Girl with the Dragon Tattoo is a 2011 neo-noir psychological thriller film based on the 2005 novel by Swedish writer Stieg Larsson. It was directed by David Fincher with a screenplay by Steven Zaillian. Starring Daniel Craig as journalist Mikael Blomkvist and Rooney Mara as Lisbeth Salander, it tells the story of Blomkvist's investigation to find out what happened to a girl from a wealthy family who disappeared 40 years prior. He recruits the help of Salander, a computer hacker. Sony Pictures Entertainment began development on the film, a co-production of the United States, United Kingdom, Sweden and Germany, in 2009. It took the company a few months to obtain the rights to the novel, while also recruiting Zaillian and Fincher. The casting process for the lead roles was exhaustive and intense; Craig faced scheduling conflicts, and a number of actresses were sought for the role of Lisbeth Salander. The script took over six months to write, which included three months of analyzing the novel. The film premiered at Odeon Leicester Square in London on December 12, 2011. A critical and commercial success, the film grossed $232.6 million on a $90 million budget and received highly positive reviews from critics, who praised Craig and Mara's performances as well as the film's somber tone. The film was chosen by National Board of Review as one of the top ten films of 2011 and was a candidate for numerous awards, winning, among others, the Academy Award for Best Film Editing, while Mara's performance earned her an Academy Award nomination for Best Actress. Plot In Stockholm, disgraced journalist Mikael Blomkvist is recovering from the legal and professional fallout of a libel suit brought against him by businessman Hans-Erik Wennerström, straining Blomkvist's relationship with his business partner and married lover, Erika Berger. Lisbeth Salander, a young, brilliant but asocial investigator and hacker, compiles an extensive background check on Blomkvist for the wealthy Henrik Vanger, who offers Blomkvist evidence against Wennerström in exchange for an unusual task: investigate the 40-year-old disappearance and presumed murder of Henrik's grandniece, 16-year-old Harriet. Every year, Vanger has received a framed pressed flower, the same type Harriet always gave him on his birthday before she disappeared, leading him to believe that Harriet's killer is taunting him. Blomkvist agrees, and moves into a cottage on the Vanger family estate on Hedestad Island. Salander's state-appointed guardian, Holger Palmgren, suffers a stroke and is replaced by Nils Bjurman, a sadist who controls Salander's finances and extorts sexual favors by threatening to have her institutionalized. Unaware she is secretly recording one of their meetings, Bjurman chains Salander to his bed and brutally rapes her. At their next meeting, Salander tasers Bjurman, binds him, anally rapes him with a metallic dildo, and tattoos "I'm a rapist pig" across his chest. Using the secret recording she made, she blackmails him into securing her financial independence and having no further contact with her. Blomkvist explores the island and interviews various Vanger family members, learning some were Nazi sympathizers during World War II. He uncovers a list of names and numbers that his visiting daughter, Pernilla, notices are Bible verse references. Blomkvist discovers that Salander had researched him illegally but, rather than report it, he recruits her as his research assistant. 
She uncovers a connection between the list and numerous young women brutally murdered from 1947 to 1967, indicating a serial killer; she also notes that many of the victims have Jewish names, theorizing that the murders could have been motivated by antisemitism. One morning, Blomkvist finds the mutilated corpse of his adopted cat on the doorstep. Another night, while walking outdoors, a bullet grazes his forehead; after Salander tends his wounds, they have sex. Blomkvist begins to suspect Martin, Harriet's brother and operational head of the Vanger empire. Salander also uncovers evidence that Harriet's late father, Gottfried, and later Martin, committed the murders. Blomkvist breaks into Martin's house to obtain more proof, but Martin arrives and catches him. He forces Blomkvist into his specially prepared basement, knocking him unconscious and placing him in restraints. Martin brags about killing and raping women for decades, as did his father, but makes it clear he does not know what happened to Harriet. As Martin is about to kill Blomkvist, Salander arrives, attacking Martin and forcing him to flee in his SUV. She pursues him on her motorcycle until he runs off the road and hits a propane tank, blowing up the car and killing him. Salander nurses Blomkvist back to health and tells him that, as a child, she was institutionalized after attempting to burn her father alive. They deduce that Harriet is alive and in hiding; traveling to London, they confront Harriet's cousin, Anita, only to discover she is Harriet. Harriet reveals that Gottfried sexually abused her when she was 14 for an entire year before she was able to defend herself, accidentally killing him in the process. Martin continued the abuse after Gottfried's death. Her cousin, Anita, smuggled her off the island and let Harriet assume her identity in London, though Anita and her husband were later killed in a car accident. Finally free of her brother, Harriet returns to Sweden and tearfully reunites with Henrik. As promised, Henrik gives Blomkvist information against Wennerström, but it proves to be outdated and useless. Salander reveals that she had hacked Wennerström's accounts, and gives Blomkvist evidence of Wennerström's crimes, which Blomkvist publishes in a scathing editorial, ruining him and bringing Blomkvist to national prominence. Salander, in disguise, travels to Switzerland and removes two billion euro from Wennerström's secret accounts. Wennerström is later murdered in an apparent gangland shooting. On her way to give Blomkvist a Christmas present, Salander sees him with Erika. She discards the gift and rides away on her motorbike. Cast Daniel Craig as Mikael Blomkvist: A co-owner for Swedish magazine Millennium, Blomkvist is devoted to exposing the corruptions and malfeasance of government, attracting infamy for his tendency to "go too far". Craig competed with George Clooney, Johnny Depp, Viggo Mortensen, and Brad Pitt as candidates for the role. Initial concerns over schedule conflicts with the production of Cowboys & Aliens (2011) and Skyfall (2012) prompted Craig to postpone the casting process. Given the uncertainty surrounding Skyfall following Metro-Goldwyn-Mayer's bankruptcy, Sony Pictures Entertainment and DreamWorks worked out a schedule and Craig agreed to take the part. The British actor was required to gain weight and adopted a neutral accent to befit Stockholm's worldly cultural fabric. 
Having read the book amid its "initial craze", Craig commented, "It's one of those books you just don't put down" [...] There's just this immediate feeling that bad things are going to happen and I think that's part of why they've been so readable for people." Rooney Mara as Lisbeth Salander: Salander is a computer hacker who has survived severe emotional and sexual abuse. The character was a "vulnerable victim-turned-vigilante with the "take-no-prisoners" attitude of Lara Croft and the "cool, unsentimental intellect" of Spock. Fincher felt that Salander's eccentric persona was enthralling, and stated, "there's a kind of wish fulfillment to her in the way that she takes care of things, the way she will only put up with so much, but there are other sides to her as well." Casting was complicated by the raft of prominent candidates such as Emily Browning, Eva Green, Anne Hathaway, Scarlett Johansson, Keira Knightley, Jennifer Lawrence, Carey Mulligan, Elliot Page, Natalie Portman, Léa Seydoux, Vanessa Hudgens, Sophie Lowe, Sarah Snook, Kristen Stewart, Olivia Thirlby, Mia Wasikowska, Emma Watson, Evan Rachel Wood and Yolandi Visser; Lowe, Mara, Seydoux, and Snook were the final four candidates. Despite the hype, some eventually withdrew from consideration due to the time commitment and low pay. Mara had worked with Fincher in his 2010 film The Social Network. Fincher, while fond of the actress' youthful appearance, found it difficult at first to mold her to match Salander's antisocial demeanor, which was a vast contrast from her earlier role as the personable Erica. Mara went through multiple changes in her appearance to become Salander. Her hair was dyed black and cut into various jagged points, giving the appearance that she cut it herself. In addition to her transgressive appearance, which was described as a "mash-up of brazen Seventies punk and spooky Eighties goth with a dash of S&M temptress" by Lynn Hirschberg of W, Mara participated in a formal screening and was filmed by Fincher on a subway in Los Angeles in an effort to persuade the executives of Sony Pictures that she was a credible choice. Christopher Plummer as Henrik Vanger: Vanger is a wealthy businessman who launches an extensive investigation into his family's affairs. Despite calling the Vanger family "dysfunctional", Plummer said of the character: "I love the character of the old man, and I sympathize with him. He's really the nicest old guy in the whole book. Everybody is a bit suspect, and still are at the end. Old Vanger has a nice straight line, and he gets his wish." Plummer wanted to imbue the character with irony, an element he found to be absent from the novel's Henrik. "I think that the old man would have it," he opined, "because he's a very sophisticated old guy [...] used to a great deal of power. So in dealing with people, he would be very good [...] he would be quite jokey, and know how to seduce them." Julian Sands portrays a young Henrik Vanger. Stellan Skarsgård as Martin Vanger: Martin is the current CEO of Vanger Industries. Skarsgård was allured by the character's dual nature, and was fascinated that he got to portray him in "two totally different ways". Regarding Martin's "very complex" and "complicated" personality, the Swedish actor said, "He can be extremely charming, but he also can seem to be a completely different person at different points in the film." While consulting with Fincher, the director wanted Skarsgård to play Martin without reference to the book. 
Steven Berkoff as Dirch Frode, Head Legal Counsel for Vanger Industries Robin Wright as Erika Berger: Blomkvist's business partner and editor-in-chief of Millennium magazine. She's also Blomkvist's occasional lover. Yorick van Wageningen as Nils Bjurman: As Salander's legal guardian, he uses his position to sexually abuse and eventually rape her. Salander turns the tables on him, torturing him and branding him across the torso with the words I AM A RAPIST PIG. Fincher wanted the character to be worse than a typical antagonist, although he did not want to emulate the stereotypical "mustache-twirling pervert". The director considered Van Wageningen to be the embodiment of a versatile actor—one who was a "full-fledged human being" and a "brilliant" actor. "He was able to bring his performance from a logical place in Bjurman's mind and find the seething morass of darkness inside," Fincher stated. Bjurman's multifaceted psyche was the main reason Van Wageningen wanted to play the role. The Dutch actor said, "This character goes through a lot and I wasn't quite sure I wanted to go through all that. I started out half way between the elation of getting to work with David Fincher and the dread of this character, but I was able to use both of those things. We both thought the most interesting route would be for Bjurman to seem half affable. The challenge was not in finding the freak violence in the guy but finding the humanity of him." Joely Richardson as Harriet Vanger: Henrik's long-lost grandniece who went into hiding posing as her cousin Anita. In performing her "tricky" character, Richardson recalled that Fincher wanted her to embrace a "darker, edgier" persona, without sugarcoating, and not "resolved or healed". "Even if you were starting to move towards the direction of resolved or healed, he still wanted it edgy and dark. There are no straightforward emotions in the world of this film." Moa Garpendal portrays a young Harriet Vanger. Goran Višnjić as Dragan Armansky, head of Milton Security, Salander's employer Donald Sumpter as Detective Morell. David Dencik portrays a young Morell. Ulf Friberg as Hans-Erik Wennerström, CEO of the Wennerström Group Geraldine James as Cecilia Vanger Embeth Davidtz as Annika Giannini, Mikael's sister and a lawyer Josefin Asplund as Pernilla Blomkvist, Mikael's daughter Per Myrberg as Harald Vanger. Gustaf Hammarsten portrays a young Harald. Tony Way as Plague, Salander's hacker friend Fredrik Dolk as Bertil Camnermarker, Counsel for the Wennerström Group Alan Dale as Detective Isaksson Leo Bill as Trinity, another of Salander's hacker friends Élodie Yung as Miriam Wu, Salander's occasional lover Joel Kinnaman as Christer Malm Production Conception and writing The success of Stieg Larsson's novel created Hollywood interest in adapting the book, as became apparent in 2009 when Sony's Michael Lynton and Amy Pascal pursued the idea of developing an "American" version unrelated to the Swedish film adaptation released that year. By December, two major developments occurred for the project: Steven Zaillian, who had recently completed the script for Moneyball (2011), became the screenwriter, while producer Scott Rudin finalized a partnership allocating full copyrights to Sony. Zaillian, who was unfamiliar with the novel, got a copy from Rudin. The screenwriter recalled, "They sent it to me and said, 'We want to do this. We will think of it as one thing for now. 
It's possible that it can be two and three, but let's concentrate on this one.'" After reading the book, the screenwriter did no research on the subject. Fincher, who was requested with partner Cean Chaffin by Sony executives to read the novel, was astounded by the series' size and success. As they began to read, the duo noticed that it had a tendency to take "readers on a lot of side trips"—"from detailed explanations of surveillance techniques to angry attacks on corrupt Swedish industrialists," professed The Hollywood Reporter Gregg Kilday. Fincher recalled of the encounter: "The ballistic, ripping-yarn thriller aspect of it is kind of a red herring in a weird way. It is the thing that throws Salander and Blomkvist together, but it is their relationship you keep coming back to. I was just wondering what 350 pages Zaillian would get rid of." Because Zaillian was already cultivating the screenplay, the director avoided interfering. After a conversation, Fincher was comfortable "they were headed in the same direction". The writing process consumed approximately six months, including three months creating notes and analyzing the novel. Zaillian noted that, as time progressed, the writing accelerated. "As soon as you start making decisions," he explained, "you start cutting off all of the other possibilities of things that could happen. So with every decision that you make you are removing a whole bunch of other possibilities of where that story can go or what that character can do." Given the book's sizable length, Zaillian deleted elements to match Fincher's desired running time. Even so, Zaillan took significant departures from the book. To Zaillian, there was always a "low-grade" anxiety, "but I was never doing anything specifically to please or displease," he continued. "I was simply trying to tell the story the best way I could, and push that out of my mind. I didn't change anything just for the sake of changing it. There's a lot right about the book, but that part, I thought we could do it a different way, and it could be a nice surprise for the people that have read it." Zaillian discussed many of the themes in Larsson's Millennium series with Fincher, taking the pair deeper into the novel's darker subjects, such as the psychological dissimilarities between rapists and murderers. Fincher was familiar with the concept, from projects such as Seven (1995) and Zodiac (2007). Zaillian commented, "A rapist, or at least our rapist, is about exercising his power over somebody. A serial killer is about destruction; they get off on destroying something. It's not about having power over something, it's about eliminating it. What thrills them is slightly different." The duo wanted to expose the novels' pivotal themes, particularly misogyny. "We were committed to the tack that this is a movie about violence against women about specific kinds of degradation, and you can't shy away from that. But at the same time you have to walk a razor thin line so that the audience can viscerally feel the need for revenge but also see the power of the ideas being expressed." Instead of the typical three-act structure, they reluctantly chose a five-act structure, which Fincher pointed out is "very similar to a lot of TV cop dramas." Filming Fincher and Zaillian's central objective was to maintain the novel's setting. 
To portray Larsson's vision of Sweden, and the interaction of light on its landscape, Fincher cooperated with an artistic team that included cinematographer Jeff Cronenweth and production designer Donald Graham Burt. The film was wholly shot using Red Digital Cinema Camera Company's RED MX digital camera, chosen to help evoke Larsson's tone. The idea, according to Cronenweth, was to employ unorthodox light sources and maintain a realistic perspective. "So there may be shadows, there may be flaws, but it's reality. You allow silhouettes and darkness, but at the same time we also wanted shots to counter that, so it would not all be one continuous dramatic image." Sweden's climate was a crucial element in enhancing the mood. Cronenweth commented, "It's always an element in the background and it was very important that you feel it as an audience member. The winter becomes like a silent character in the film giving everything a low, cool-colored light that is super soft and non-direct." To get acquainted with Swedish culture, Burt set out on a month-long expedition across the country. He said of the process, "It takes time to start really taking in the nuances of a culture, to start seeing the themes that recur in the architecture, the landscape, the layouts of the cities and the habits of the people. I felt I had to really integrate myself into this world to develop a true sense of place for the film. It was not just about understanding the physicality of the locations, but the metaphysics of them, and how the way people live comes out through design." Principal photography began in Stockholm, Sweden in September 2010. Production mostly took place at multiple locations in the city's central business district, including at the Stockholm Court House. One challenge was realizing the Vanger estate. They picked an eighteenth-century French architecture mansion Hofsta located approximately southwest of Stockholm. Filmmakers wanted to use a typical "manor from Småland" that was solemn, formal, and "very Old Money". "The Swedish are very good at the modern and the minimal but they also have these wonderful country homes that can be juxtaposed against the modern city—yet both speak to money." Principal photography relocated in October to Uppsala. On Queen Street, the facade of the area was renovated to mimic the Hotel Alder, after an old photograph of a building obtained by Fincher. From December onward, production moved to Zurich, Switzerland, where locations were established at Dolder Grand Hotel and the Zurich Airport. Because of the "beautiful" environment of the city, Fincher found it difficult to film in the area. Principal photography concluded in Oslo, Norway, where production took place at Oslo Airport, Gardermoen. Recorded for over fifteen hours, twelve extras were sought for background roles. Filming also took place in the United Kingdom and the United States. In one sequence the character Martin Vanger plays the song "Orinoco Flow" by Enya before beginning his torture of Mikael Blomkvist. David Fincher, the director, said that he believed that Martin "doesn't like to kill, he doesn't like to hear the screams, without hearing his favorite music" so therefore the character should play a song during the scene. Daniel Craig, the actor who played Blomkvist, selected "Orinoco Flow" on his iPod as a candidate song. Fincher said "And we all almost pissed ourselves, we were laughing so hard. No, actually, it's worse than that. 
He said, ‘Orinoco Flow!’ Everybody looked at each other, like, what is he talking about? And he said, ‘You know, "Sail away, sail away..."’ And I thought, this guy is going to make Blomkvist as metro as we need." Title sequence Tim Miller, creative director for the title sequence, wanted to develop an abstract narrative that reflected the pivotal moments in the novel, as well as the character development of Lisbeth Salander. It was arduous for Miller to conceptualize the sequence abstractly, given that Salander's occupation was a distinctive part of her personality. His initial ideas were modeled after a keyboard. "We were going to treat the keyboard like this giant city with massive fingers pressing down on the keys," Miller explained, "Then we transitioned to the liquid going through the giant obelisks of the keys." Among Miller's many vignettes was "The Hacker Inside", which revealed the character's inner disposition and melted them away. The futuristic qualities in the original designs provided for a much more cyberpunk appearance than the final product. In creating the "cyber" look for Salander, Miller said, "Every time I would show David a design he would say, 'More Tandy!' It's the shitty little computers from Radio Shack, the Tandy computers. They probably had vacuum tubes in them, really old technology. And David would go 'More Tandy', until we ended up with something that looked like we glued a bunch of computer parts found at a junkyard together." Fincher wanted the vignette to be a "personal nightmare" for Salander, replaying her darkest moments. "Early on, we knew it was supposed to feel like a nightmare," Miller professed, who commented that early on in the process, Fincher wanted to use an artwork as a template for the sequence. After browsing through various paintings to no avail, Fincher chose a painting that depicted the artist, covered in black paint, standing in the middle of a gallery. Many of Miller's sketches contained a liquid-like component, and were rewritten to produce the "gooey" element that was so desired. "David said let's just put liquid in all of them and it will be this primordial dream ooze that's a part of every vignette," Miller recalled. "It ties everything together other than the black on black." The title sequence includes abundant references to the novel, and exposes several political themes. Salander's tattoos, such as her phoenix and dragon tattoos, were incorporated. The multiple flower representations signified the biological life cycle, as well as Henrik, who received a pressed flower each year on his birthday. "One had flowers coming out of this black ooze," said Fincher, "it blossoms, and then it dies. And then a different flower, as that one is dying is rising from the middle of it. It was supposed to represent this cycle of the killer sending flowers." Ultimately, the vignette becomes very conceptual because Miller and his team took "a whole thought, and cut it up into multiple different shots that are mixed in with other shots". In one instance, Blomkvist is strangled by strips of newspaper, a metaphor for the establishment squelching his exposes. In the "Hot Hands" vignette, a pair of rough, distorted hands that embrace Salander's face and melt it represent all that's bad in men. The hands that embrace Blomkvist's face and shatter it, represent wealth and power. 
Themes of domestic violence become apparent as a woman's face shatters after a merciless beating; this also ties in the brutal beating of Salander's mother by her father, an event revealed in the sequel, The Girl Who Played with Fire (2006). A cover of Led Zeppelin's "Immigrant Song" (1970) plays throughout the title sequence. The rendition was produced by soundtrack composers Atticus Ross and Nine Inch Nails member Trent Reznor, and features vocals from Yeah Yeah Yeahs lead singer Karen O. Fincher suggested the song, but Reznor agreed only at his request. Led Zeppelin licensed the song only for use in the film's trailer and title sequence. Fincher stated that he sees title sequences as an opportunity to set the stage for the film, or to get an audience to let go of its preconceptions. Software packages that were primarily used are 3ds Max (for modeling, lighting, rendering), Softimage (for rigging and animation), Digital Fusion (for compositing), Real Flow (for fluid dynamics), Sony Vegas (for editorial), ZBrush and Mudbox (for organic modeling), and VRAY (for rendering). Soundtrack Fincher recruited Reznor and Ross to produce the score; aside from their successful collaboration on The Social Network, the duo had worked together on albums from Nine Inch Nails' later discography. They dedicated much of the year to work on the film, as they felt it would appeal to a broad audience. Akin to his efforts in The Social Network, Reznor experiments with acoustics and blends them with elements of electronic music, resulting in a forbidding atmosphere. "We wanted to create the sound of coldness—emotionally and also physically," he asserted, "We wanted to take lots of acoustic instruments [...] and transplant them into a very inorganic setting, and dress the set around them with electronics." Even before viewing the script, Reznor and Ross opted to use a redolent approach to creating the film's score. After discussing with Fincher the varying soundscapes and emotions, the duo spent six weeks composing. "We composed music we felt might belong," stated the Nine Inch Nails lead vocalist, "and then we'd run it by Fincher, to see where his head's at and he responded positively. He was filming at this time last year and assembling rough edits of scenes to see what it feels like, and he was inserting our music at that point, rather than using temp music, which is how it usually takes place, apparently." Finding a structure for the soundtrack was arguably the most strenuous task. "We weren't working on a finished thing, so everything keeps moving around, scenes are changing in length, and even the order of things are shuffled around, and that can get pretty frustrating when you get precious about your work. It was a lesson we learned pretty quickly of, 'Everything is in flux, and approach it as such. Hopefully it’ll work out in the end.'" Release Pre-release A screening for The Girl with the Dragon Tattoo took place on November 28, 2011, as part of a critics-only event hosted by the New York Film Critics Circle. Commentators at the event predicted that while the film would become a contender for several accolades, it would likely not become a forerunner in the pursuit for Academy Award nominations. A promotional campaign commenced thereafter, including a Lisbeth Salander-inspired collection, designed by Trish Summerville for H&M. 
The worldwide premiere was at the Odeon Leicester Square in London on December 12, 2011, followed by the American opening at the Ziegfeld Theatre in New York City on December 14 and Stockholm the next day. Sony's target demographics were men and women over the age of 25 and 17–34. The film went into general release in North America on December 21, at 2,700 theaters, expanding to 2,974 theaters on its second day. The United Kingdom release was on December 26, Russia on January 1, 2012, and Japan on February 13. India and Vietnam releases were abandoned due to censorship concerns. A press statement from the Central Board of Film Certification stated: "Sony Pictures will not be releasing The Girl with the Dragon Tattoo in India. The censor board has judged the film unsuitable for public viewing in its unaltered form and, while we are committed to maintaining and protecting the vision of the director, we will, as always, respect the guidelines set by the board." In contrast, the National Film Board of Vietnam insisted that the film's withdrawal had no relation to rigid censorship guidelines, as it had not been reviewed by the committee. Home media Sony Pictures Home Entertainment released the film in a DVD and Blu-ray disc combo pack in the United States on March 20, 2012. Bonus features include a commentary from Fincher, featurettes on Blomkvist, Salander, the sets and locations, etc. The disc artwork for the DVD version of the film resembles a Sony brand DVD-R, a reference to the hacker Lisbeth Salander. This caused a bit of confusion in the marketplace with consumers thinking they had obtained a bootleg copy. The release sold 644,000 copies in its first week, in third place behind The Muppets and Hop. The following week, the film sold an additional 144,000 copies generating $2.59 million in gross revenue. , 1,478,230 units had been sold, grossing $22,195,069. Reception Box office Fincher's film grossed $232.6 million during its theatrical run. The film's American release grossed $1.6 million from its Tuesday night screenings, a figure that increased to $3.5 million by the end of its first day of general release. It maintained momentum into its opening weekend, accumulating $13 million for a total of $21 million in domestic revenue. The film's debut figures fell below media expectations. Aided by positive word of mouth, its commercial performance remained steady into the second week, posting $19 million from 2,914 theaters. The third week saw box office drop 24% to $11.3 million, totaling $76.8 million. The number of theaters slightly increased to 2,950. By the fifth week, the number of theaters shrank to 1,907, and grosses to $3.7 million, though it remained within the national top ten. The film completed its North American theatrical run on March 22, 2012, earning over $102.5 million. The international debut was in six Scandinavian markets on December 19–25, 2011, securing $1.6 million from 480 venues. In Sweden the film opened in 194 theaters to strong results, accounting for more than half of international revenue at the time ($950,000). The first full week in the United Kingdom collected $6.7 million from 920 theaters. By the weekend of January 6–8, 2012, the film grossed $12.2 million for a total of $29 million; this included its expansion into Hong Kong, where it topped the box office, earning $470,000 from thirty-six establishments. The film similarly led the field in South Africa. 
It accumulated $6.6 million from an estimated 600 theaters over a seven-day period in Russia, placing fifth. The expansion continued into the following week, opening in nine markets. The week of January 13–15 saw the film yield $16.1 million from 3,910 locations in over forty-three territories, thus propelling the international gross to $49.3 million. It debuted at second place in Austria and Germany, where in the latter, it pulled $2.9 million from 525 locations. Similar results were achieved in Australia, where it reached 252 theaters. The film's momentum continued throughout the month, and by January 22, it had hit ten additional markets, including France and Mexico, from which it drew $3.25 million from 540 venues and $1.25 million from 540 theaters, respectively. In its second week in France it descended to number three, with a total gross of $5.8 million. The next major international release came in Japan on February 13, where it opened in first place with $3.68 million (¥288 million) in 431 theaters. By the weekend of February 17–19, the film had scooped up $119.5 million from international markets. The total international gross for The Girl with the Dragon Tattoo was $130.1 million. MGM, one of the studios involved in the production, posted a "modest loss" and declared that they had expected the film to gross at least 10% more. Critical response The Girl with the Dragon Tattoo received positive reviews from critics, with particular note to the cast, tone, score and cinematography. Review aggregation website Rotten Tomatoes reported an approval rating of 86% based on 258 reviews, with an average rating of 7.60/10. The site's critics consensus states, "Brutal yet captivating, The Girl with the Dragon Tattoo is the result of David Fincher working at his lurid best with total role commitment from star Rooney Mara." Metacritic assigned the film a weighted average score of 71 out of 100, based on 41 critics, indicating "generally favorable reviews". Audiences polled by CinemaScore have the film an average grade of "A" on an A+ to F scale. David Denby of The New Yorker asserted that the austere, but captivating installment presented a "glancing, chilled view" of a world where succinct moments of loyalty coexisted with constant trials of betrayal. To USA Today columnist Claudia Puig, Fincher captures the "menace and grim despair in the frosty Scandinavian landscape" by carefully approaching its most gruesome features. Puig noted a surfeit of "stylistic flourishes" and "intriguing" changes in the narrative, compared to the original film. In his three-and-a-half star review, Chris Knight of the National Post argued that it epitomized a so-called "paradoxical position" that was both "immensely enjoyable and completely unnecessary". Rene Rodriguez of The Miami Herald said that the "fabulously sinister entertainment" surpassed the original film "in every way". The film took two and a half stars from Rolling Stone commentator Peter Travers, who concluded: "Fincher's Girl is gloriously rendered but too impersonal to leave a mark." A. O. Scott, writing for The New York Times, admired the moments of "brilliantly orchestrated" anxiety and confusion, but felt that The Girl with the Dragon Tattoo was vulnerable to the "lumbering proceduralism" that he saw in its literary counterpart, as evident with the "long stretches of drab, hackneyed exposition that flatten the atmosphere". 
The Wall Street Journal Joe Morgenstern praised Cronenweth's cinematography, which he thought provided for glossy alterations in the film's darkness; "Stockholm glitters in nighttime exteriors, and its subway shines in a spectacular spasm of action involving a backpack." Rex Reed of The New York Observer professed that despite its occasional incomprehensibility, the movie was "technically superb" and "superbly acted". In contrast, Kyle Smith of New York Post censured the film, calling it "rubbish" and further commenting that it "demonstrates merely that masses will thrill to an unaffecting, badly written, psychologically shallow and deeply unlikely pulp story so long as you allow them to feel sanctified by the occasional meaningless reference to feminism or Nazis." The performances were a frequent topic in the critiques. Mara's performance, in particular, was admired by commentators. A revelation in the eyes of Entertainment Weekly Owen Gleiberman, he proclaimed that her character was more important than "her ability to solve a crime". Her "hypnotic" portrayal was noted by Justin Chang of Variety, as well as Salon critic Andrew O'Hehir, who wrote, "Rooney Mara is a revelation as Lisbeth Salander, the damaged, aggressive computer geek and feminist revenge angel, playing the character as far more feral and vulnerable than Noomi Rapace’s borderline-stereotype sexpot Goth girl." Scott Tobias of The A.V. Club enjoyed the chemistry between Mara and Craig, as did David Germain of the Associated Press; "Mara and Craig make an indomitable screen pair, he nominally leading their intense search into decades-old serial killings, she surging ahead, plowing through obstacles with flashes of phenomenal intellect and eruptions of physical fury." Although Puig found Mara inferior to Rapace in playing Salander, with regard to Craig's performance, he said that the actor shone. This was supported by Morgenstern, who avouched that Craig "nonetheless finds welcome humor in Mikael's impassive affect". Roger Ebert of the Chicago Sun-Times said the film was given a more assured quality than the original because of Fincher's direction and the lead performances, although he believed this did not always work to the film's advantage, preferring the original version's "less confident surface" where "emotions were closer to the surface." Accolades In addition to numerous awards, The Girl with the Dragon Tattoo was included on several year-end lists by film commentators and publications. It was named the best film of 2011 by MTV and James Berardinelli of ReelViews. The former wrote, "The director follows up the excellent Social Network with another tour de force, injecting the murder mystery that introduces us to outcast hacker Lisbeth Salander [...] and embattled journalist [...] with style, intensity and relentless suspense. Mara is a revelation, and the film's daunting 160-minute runtime breezes by thanks to one heart-racing scene after the next. Dark and tough to watch at times, but a triumph all around." The film came second in indieWire list of "Drew Taylor's Favorite Films Of 2011", while reaching the top ten of seven other publications, including the St. Louis Post-Dispatch, San Francisco Chronicle, and the New Orleans Times-Picayune. The Girl with the Dragon Tattoo was declared one of the best films of the year by the American Film Institute, as well as the National Board of Review of Motion Pictures. 
Sequel In December 2011, Fincher stated that the creative team involved planned to film the sequels The Girl Who Played with Fire and The Girl Who Kicked the Hornets' Nest, "back to back." There was an announced release date of 2013 for a film version of The Girl Who Played with Fire, although by August 2012 it was delayed due to changes being done to the script, being written by Steven Zaillian. By July 2013, Andrew Kevin Walker was hired to re-write the script. The following year, Fincher stated that a script for The Girl Who Played with Fire had been written and that it was "extremely different from the book," and that despite the long delay, he was confident that the film would be made given that the studio "already has spent millions of dollars on the rights and the script". Mara was less optimistic about the production of the sequels, though she stated that she was still contractually signed on to reprise the lead role. By November 2015, it was announced that Sony was considering rebooting the franchise, before settling on continuing the film series with an adaptation of The Girl in the Spider's Web. The story is based on a 2015 novel by David Lagercrantz that was a continuation of the original Millennium trilogy after series creator Stieg Larsson died in 2004. Looking for a new lead in the series, Alicia Vikander was considered by the studio. The following year, Fede Álvarez was announced by Sony as director, as well as co-screenwriter with Steven Knight and Jay Basu. The Girl in the Spider's Web was notably the first adaptation of an installment in the book series to be produced into an English-language film upon its initial release. By March 2017, Álvarez announced that the film would have an entirely new cast, as he wanted the entire film to be his interpretation of the story. In September of the same year, Claire Foy was cast as Lisbeth Salander, replacing Mara. The film was released in the U.S. on November 9, 2018. 
References External links Dragon Tattoo Stories (film series) 2011 films 2011 crime thriller films 2011 LGBT-related films 2011 psychological thriller films 2011 thriller drama films 2010s mystery drama films 2010s mystery thriller films 2010s serial killer films American crime thriller films American detective films American films American LGBT-related films American mystery drama films American mystery thriller films American psychological thriller films American serial killer films American thriller drama films Columbia Pictures films 2010s English-language films Female bisexuality in film Films about journalists Films about missing people Films about violence against women Films based on crime novels Films based on Swedish novels Films directed by David Fincher Films produced by Scott Rudin Films scored by Atticus Ross Films scored by Trent Reznor Films set in London Films set in Stockholm Films set in Switzerland Films set on fictional islands Films shot in London Films shot in Los Angeles Films shot in Norway Films shot in Stockholm Films shot in Zürich Films whose editor won the Best Film Editing Academy Award Films with screenplays by Steven Zaillian Incest in film Lesbian-related films LGBT-related thriller drama films Metro-Goldwyn-Mayer films Patricide in fiction Rape and revenge films British crime thriller films British films British LGBT-related films British mystery drama films British mystery thriller films British thriller drama films Swedish crime thriller films Swedish films Swedish LGBT-related films Swedish mystery drama films Swedish mystery thriller films Swedish thriller drama films German crime thriller films German films German LGBT-related films German mystery drama films German mystery thriller films German thriller drama films English-language German films English-language Swedish films
51385716
https://en.wikipedia.org/wiki/Descartes%20Systems%20Group
Descartes Systems Group
The Descartes Systems Group Inc. (commonly referred to as Descartes) is a Canadian multinational technology company specializing in logistics software, supply chain management software, and cloud-based services for logistics businesses. Descartes is perhaps best known for its abrupt and unexpected turnaround in the mid-2000s after coming close to bankruptcy in the wake of the dot-com bubble collapse. It is also known as one of the earliest logistics technology companies to adopt an on-demand business model and sell its software as a service (SaaS) via the Internet. The company operates the Global Logistics Network, an extensive electronic messaging system used by freight companies, manufacturers, distributors, retailers, customs brokers, government agencies, and other interested parties to exchange logistics and customs information. Headquartered in Waterloo, Ontario, Canada, Descartes is a publicly traded company with shares listed on the NASDAQ Stock Market (NASDAQ: DSGX) and Toronto Stock Exchange (TSX: DSG). It has offices in the Americas, Europe, the Middle East, Africa, and the Asia Pacific region. History Descartes was founded in 1981. In 1998, the company made an initial public offering on the Toronto Stock Exchange, where its common shares trade under the stock symbol DSG. Descartes was first listed on the NASDAQ Stock Market in 1999, with common shares trading under the symbol DSGX. Descartes’ share price peaked during the dot-com bubble and then fell precipitously in the subsequent crash. In 2001, Descartes switched its business model from selling full-featured enterprise software licenses to providing on-demand software on a subscription basis, becoming one of the first SaaS providers in the logistics sector. After years of losses, Descartes came close to bankruptcy in 2004, prompting it to aggressively restructure. The company cut 35% of its workforce and initiated a sweeping transformation of its corporate culture under CEO Arthur Mesher, who was appointed in 2005. The company returned to profitability in 2005, with one analyst describing this as "one of the most dramatic turnarounds in Canadian corporate history." In December 2013, Descartes was added to the S&P/TSX Composite Index, an index of the stock (equity) prices of the largest companies on the Toronto Stock Exchange. By January 2015, Descartes had posted 41 straight profitable quarters and was supplying logistics software and services to more than 10,000 logistics-centric businesses, such as ground transportation companies, airlines, ocean carriers, freight forwarders, manufacturers, distributors, and retailers. Its customers included American Airlines, Delta Air Lines, Air Canada, British Airways, Maersk Group, Hapag-Lloyd, Con-way, Kuehne + Nagel, DHL, The Home Depot, Sears Brands, Hallmark Cards, Hasbro, Volvo, Ferrellgas, Del Monte, and The Coca-Cola Company. In 2018, Gartner ranked Descartes 6th in its list of the Top 20 Supply Chain Management Software Suppliers, based on revenue of $221 million. In July 2020, Descartes Systems Group confirmed that dnata was expanding its use of Descartes Core Bluetooth Low Energy (BLE) readers across its regional freight operations to enable foreign mail, shipment, and freight monitoring. Acquisitions Acquisitions have played a key role in Descartes’ growth.
By acquiring niche technology companies, Descartes has expanded its line of logistics software and services, enlarged its customer base, and extended its business geographically. Acquisitions going back to 2006 are listed below. Global Logistics Network Descartes’ cloud-based logistics messaging system, the Global Logistics Network (GLN), connects more than 13,000 customers in over 160 countries, making it one of the world’s largest logistics networks. Each year, the GLN carries more than 4.5 billion messages and manages more than 30 million shipping routes. Companies use the network to oversee shipping orders, file customs paperwork, comply with security regulations, share information across international supply chains, and automate logistics processes. In 2015, Descartes signed a deal with German business software giant SAP SE that allows users of SAP’s transportation management software to access the GLN. Conference Descartes holds an annual logistics technology conference for its users and partners. The eleventh Descartes "Evolution" conference was held in 2016. References External links Companies listed on the Toronto Stock Exchange 1981 establishments in Ontario Software companies established in 1981 Companies based in Waterloo, Ontario Software companies of Canada Companies listed on the Nasdaq Multinational companies headquartered in Canada Business software companies Supply chain software companies 1998 initial public offerings Canadian companies established in 1981
29450903
https://en.wikipedia.org/wiki/University%20of%20Library%20Studies%20and%20Information%20Technologies
University of Library Studies and Information Technologies
The University of Library Studies and Information Technologies was granted university status by a resolution of the National Assembly of Bulgaria of 29 September 2010. It is, successively, the successor of the Specialised Higher School of Library Studies and Information Technologies (est. 2 Sept 2004), the College of Library Studies and Information Technologies (CLSIT), the College of Library Studies (CLS), the Institute of Library Studies (ILS) and the State Institute of Library Studies (est. 1950). SULSIT is a member of the Balkan Universities Network. Structure As of 2012, the structure of SULSIT is as follows: The Faculty of Library Studies and Cultural Heritage Library Sciences Dept. Library Management Dept. Book and Society Dept. Cultural and Historical Heritage Dept. The Faculty of Information Sciences Information Systems and Technologies Dept. Communications and Security Dept. Computer Sciences Dept. Department of Comprehensive Studies Foreign Language Preparation Centre Institutes, Research Centres and Labs Institute for Scientific Research and Doctoral Programs (PhD School) Research Institute in Organisation, Management and Protection of Cultural and Historical Heritage Centre for Continuing Education Centre for Distance Education Centre for Information Security and Protection Centre for Career Orientation and Student Development Scientific and Research Laboratory for Cybernetic Security John Atanasoff Computer Lab Oracle Lab Publishing House „Za Bukvite – O Pismeneh” Facilities: Microsoft Developers Network Academic Alliance Library and Information Centre (with a reading-room for 150 people) Education and demonstration museum collection „Spirit and Leadership” Chapel of St Nicholas the Miracle-Maker Sports Complex (indoor and outdoor facilities for tennis, mini-football, gym) Programmes The curricula for the Bachelor's and Master's degrees are in line with the requirements of the Higher Education Act and the European Credit Transfer System (ECTS).
As of academic year 2014/2015, SULSIT offers the following Bachelor Programs: Faculty of Library Studies and Cultural Heritage: Library Studies and Bibliography (LSB) Library and Information Management (LIM) Press Communications (PC) Archives and Document Studies (ADS) Information Funds of the Cultural and Historical Heritage (IFCHH) Information Resources of Tourism (IRT) Communications and Informing (CI) Public policies and practices (PPP) Faculty of Information Sciences: Information Technologies (IT) Information Brokering (IB) Information Security (IS) Computer Sciences (CS) Information Technologies in Law Administration (ITLA) National Security and Cultural and Historical Heritage (NSCHH) National Security (NS) SULSIT trains students in the following M.A. Degree Programs: Faculty of Library Studies and Cultural Heritage: Library-Informing and Cultural Management Publishing Business and Electronic Resources Media Information and Advertising Cultural and Historical Heritage in the Modern Information Environment Protection of Cultural and Historical Heritage in the Republic of Bulgaria Culture Tourism Electronic content: innovations and politics Management of documents and archives Museum and Art Management Applied Bulgaristics Science and Technology Research Business and Administration Information Technologies and Communications Strategic Communications and Informing Faculty of Information Sciences: Information Technologies IT in the Media Industry (in co-operation with the Moscow State University of Printing Arts) Technical Entrepreneurship and Information Technologies Innovation Electronic Business and Electronic Management Software Engineering Information Technologies and Finance Engineering National security: State, Spirituality and Leadership National Security Information Security See also List of universities in Bulgaria References External links State University of Library Studies and Information Technologies Moscow State University of Printing Arts Universities in Sofia Technical universities and colleges in Bulgaria Educational institutions established in 1950 1950 establishments in Bulgaria
170316
https://en.wikipedia.org/wiki/CP/M-86
CP/M-86
CP/M-86 was a version of the CP/M operating system that Digital Research (DR) made for the Intel 8086 and Intel 8088. The system commands are the same as in CP/M-80. Executable files used the relocatable .CMD file format. Digital Research also produced a multi-user multitasking operating system compatible with CP/M-86, MP/M-86, which later evolved into Concurrent CP/M-86. When an emulator was added to provide PC DOS compatibility, the system was renamed Concurrent DOS, which later became Multiuser DOS, of which REAL/32 is the latest incarnation. The FlexOS, DOS Plus, and DR DOS families of operating systems started as derivations of Concurrent DOS as well. History Digital Research's CP/M-86 was originally announced to be released in November 1979, but was delayed repeatedly. When IBM contacted other companies to obtain components for the IBM PC, the as-yet unreleased CP/M-86 was its first choice for an operating system because CP/M had the most applications at the time. Negotiations between Digital Research and IBM quickly deteriorated over IBM's non-disclosure agreement and its insistence on a one-time fee rather than DRI's usual royalty licensing plan. After discussions with Microsoft, IBM decided to use 86-DOS (QDOS), a CP/M-like operating system that Microsoft bought from Seattle Computer Products, renaming it MS-DOS. Microsoft adapted it for the PC and licensed it to IBM, which sold it under the name PC DOS. After learning about the deal, Digital Research founder Gary Kildall threatened to sue IBM for infringing DRI's intellectual property, and IBM agreed to offer CP/M-86 as an alternative operating system on the PC to settle the claim. Most of the BIOS drivers for CP/M-86 for the IBM PC were written by Andy Johnson-Laird. The IBM PC was announced on 12 August 1981, and the first machines began shipping in October the same year, ahead of schedule. CP/M-86 was one of three operating systems available from IBM, alongside PC DOS and the UCSD p-System. Digital Research's adaptation of CP/M-86 for the IBM PC was released six months after PC DOS, in spring 1982, and porting applications from CP/M-80 to either operating system was about equally difficult. In November 1981, Digital Research also released a version for the proprietary IBM Displaywriter. On some dual-processor 8-bit/16-bit computers, special versions of CP/M-86 could natively run both CP/M-86 and CP/M-80 applications. A version for the DEC Rainbow was named CP/M-86/80, whereas the version for the was named CP/M 8-16 (see also: MP/M 8-16). The version of CP/M-86 for the 8085/8088-based Zenith Z-100 supported running programs for both processors as well. When PC clones came about, Microsoft licensed MS-DOS to other companies as well. Experts found that the two operating systems were technically comparable, with CP/M-86 having better memory management but DOS being faster. BYTE speculated that Microsoft reserving multitasking for Xenix "appears to leave a big opening" for Concurrent CP/M-86. On the IBM PC, however, IBM's version of CP/M-86 was priced at several times the cost of PC DOS and sold poorly; one survey found that 96.3% of IBM PCs were ordered with DOS, compared to 3.4% with CP/M-86 or Concurrent CP/M-86. In mid-1982 Lifeboat Associates, perhaps the largest CP/M software vendor, announced its support for DOS over CP/M-86 on the IBM PC.
BYTE warned that IBM, Microsoft, and Lifeboat's support for DOS "poses a serious threat to" CP/M-86, and Jerry Pournelle stated in the magazine that "it is clear that Digital Research made some terrible mistakes in the marketing". By early 1983 DRI began selling CP/M-86 1.1 directly to end users. Advertisements called CP/M-86 a "terrific value", with "instant access to the largest collection of applications software in existence … hundreds of proven, professional software programs for every business and education need"; it also included the Graphics System Extension (GSX). In May 1983 the company announced that it would offer DOS versions of all of its languages and utilities. It stated that "obviously, PC DOS has made great market penetration on the IBM PC; we have to admit that", but claimed that "the fact that CP/M-86 has not done as well as DRI had hoped has nothing to do with our decision". By early 1984 DRI gave free copies of Concurrent CP/M-86 to those who purchased two CP/M-86 applications as a limited time offer, and advertisements stated that the applications were booters, which did not require loading CP/M-86 first. In January 1984, DRI also announced Kanji CP/M-86, a Japanese version of CP/M-86, for nine Japanese companies including Mitsubishi Electric Corporation, Sanyo Electric Co. Ltd., and Sord Computer Corp. In December 1984 Fujitsu announced a number of FM-16-based machines using Kanji CP/M-86. CP/M-86 and DOS had very similar functionality, but were not compatible because the system calls for the same functions and the program file formats were different, so two versions of the same software had to be produced and marketed to run under both operating systems. The command interface again had similar functionality but different syntax; where CP/M-86 (and CP/M) copied file SOURCE to TARGET with the command PIP TARGET=SOURCE, DOS used COPY SOURCE TARGET. Initially MS-DOS and CP/M-86 also ran on computers not necessarily hardware-compatible with the IBM PC, such as the Apricot and Sirius, the intention being that software would be independent of hardware by making standardised operating system calls to a version of the operating system custom tailored to the particular hardware. However, writers of software which required fast performance accessed the IBM PC hardware directly instead of going through the operating system, resulting in PC-specific software which performed better than other MS-DOS and CP/M-86 versions; for example, games would display faster by writing to video memory directly instead of suffering the delay of making a call to the operating system, which would then write to a hardware-dependent memory location. Non-PC-compatible computers were soon replaced by models with hardware which behaved identically to the PC's. A consequence of the universal adoption of the detailed PC architecture was that no more than 640 kilobytes of memory were supported; early machines running MS-DOS and CP/M-86 did not suffer from this restriction, and some could make use of nearly one megabyte of RAM. Reception PC Magazine wrote that CP/M-86 "in several ways seems better fitted to the PC" than DOS; however, because it cost six times more, for those who did not plan to program in assembly language "CP/M seems a less compelling purchase". It stated that CP/M-86 was strong in areas where DOS was weak, and vice versa, and that the level of application support for each operating system would be most important, although CP/M-86's lack of a run-time version for applications was a weakness.
Versions A given version of CP/M-86 has two version numbers. One applies to the whole system and is usually displayed at startup; the other applies to the BDOS kernel. Versions known to exist include: All known Personal CP/M-86 versions contain references to CP/M-86 Plus, suggesting that they are derived from the CP/M-86 Plus codebase. A number of 16-bit CP/M-86 derivatives existed in the former Eastern Bloc under the names SCP1700, CP/K, and K8918-OS. They were produced by the East German VEB Robotron Dresden and Berlin. Legacy Caldera permitted the redistribution and modification of all original Digital Research files, including source code, related to the CP/M family through Tim Olmstead's "The Unofficial CP/M Web site" since 1997. After Olmstead's death on 12 September 2001, the free distribution license was refreshed and expanded by Lineo, who had meanwhile become the owner of those Digital Research assets, on 19 October 2001. See also History of computing hardware (1960s-present) SpeedStart CP/M-86 DOS Plus Notes References Further reading External links The Unofficial CP/M Website, which has a licence from the copyright holder to distribute original Digital Research software. The comp.os.cpm FAQ Intel iPDS-100 Using CP/M-Video CP/M variants IBM PC compatibles Microcomputer software Digital Research operating systems Discontinued operating systems Floppy disk-based operating systems Free software operating systems X86 operating systems 1981 software
2642453
https://en.wikipedia.org/wiki/NetWare%20Core%20Protocol
NetWare Core Protocol
The NetWare Core Protocol (NCP) is a network protocol used in some products from Novell, Inc. It is usually associated with the client-server operating system Novell NetWare, which originally supported primarily MS-DOS client stations, but later support for other platforms such as Microsoft Windows, the classic Mac OS, Linux, Windows NT, Mac OS X, and various flavors of Unix was added. The NCP is used to access file, print, directory, clock synchronization, messaging, remote command execution and other network service functions. It originally took advantage of the easy network configuration and small memory footprint of the IPX/SPX protocol stack. Since the mid-1990s, a TCP/IP implementation has been available. Novell eDirectory uses NCP for synchronizing data changes between the servers in a directory service tree. Technical information The original IPX/SPX implementation was provided only for the Novell NetWare platform and is now obsolete. The TCP/IP implementation uses TCP/UDP port 524 and relies on SLP for name resolution. For NCP operation in IPX/SPX networks the bare IPX protocol was used, with the Packet Type field set to 17. On the workstation (client station) side the IPX socket number 0x4003 was used; on the server side, the socket number 0x0451. The NCP PDU consists of a small header followed by a Data field. The NCP Type field determines the type of operation. Individual requests are identified by the Sequence Number (modulo 256). The Connection Number identifies an individual client station connection on the server. Novell NetWare servers up to version 2.x supported up to 255 connections, and the Connection Number occupied only 1 octet; later it was extended to 2 octets. The Task Number has the value 3 in requests and 1 in replies. The Data field starts with the NCP Function number octet, which distinguishes individual services. The contents and the length of the rest of the Data field depend on the NCP Function. Client-side implementations Novell Client for Windows Vista from Novell. Novell Client for Windows 2000/XP/2003 from Novell. Novell Client for Windows 95/98 from Novell. Novell Client for Linux from Novell. NetWare Clients for DOS from Novell - no longer supported. NetWare Client for Mac OS X from Prosoft Engineering. ncpfs, an open-source NCP client implementation for Linux. Client Service for NetWare from Microsoft. External links NCP specification without description of underlying Netware RPC framework Wireshark (an open source protocol analyzer) Documentation - Fields of the NCP packet Making Mac OS X play nicely with Novell Network file systems Network protocols Novell NetWare Presentation layer protocols
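The field descriptions above can be illustrated in code. The following Python sketch packs an NCP-style request PDU from the fields just described (Type, Sequence Number, Connection Number, Task Number, Function number, Data). It is only a rough illustration: the exact field order and widths, the request-type constant 0x2222, and the helper name build_ncp_request are assumptions made for this example, not definitions taken from the protocol specification.

import struct

# Hypothetical layout, for illustration only: a 16-bit Type field, then single
# octets for Sequence, Connection (low), Task, Connection (high), and Function,
# followed by the function-specific Data bytes.
NCP_TYPE_REQUEST = 0x2222  # assumed value of the NCP Type field for a request

def build_ncp_request(sequence, connection, task, function, payload=b""):
    header = struct.pack(
        ">HBBBBB",
        NCP_TYPE_REQUEST,           # NCP Type
        sequence % 256,             # Sequence Number, modulo 256 as noted above
        connection & 0xFF,          # Connection Number, low octet
        task,                       # Task Number (3 in requests)
        (connection >> 8) & 0xFF,   # Connection Number, high octet (later 2-octet extension)
        function,                   # NCP Function number, first octet of the Data field
    )
    return header + payload

# Example with made-up values: connection 258, task 3, function 0x16.
print(build_ncp_request(sequence=7, connection=258, task=3, function=0x16).hex())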
14124773
https://en.wikipedia.org/wiki/High%20Performance%20Computing%20Modernization%20Program
High Performance Computing Modernization Program
The United States Department of Defense High Performance Computing Modernization Program (HPCMP) was initiated in 1992 in response to Congressional direction to modernize the Department of Defense (DoD) laboratories’ high performance computing capabilities. The HPCMP provides supercomputers, a national research network, and computational science experts that together enable the Defense laboratories and test centers to conduct research, development, test and technology evaluation activities. The program was administered by the Office of the Director, Defense Research and Engineering (now called the Assistant Secretary of Defense for Research and Engineering) through FY2011, at which point it was transferred to the office of the United States Assistant Secretary of the Army for Acquisition, Logistics, and Technology, where it is managed by the Deputy Assistant Secretary for Research and Technology. The program comprises three primary elements: DoD Supercomputing Resource centers, which provide large scale supercomputers and operations staff; DREN, a nationwide high speed, low latency, R&D network connecting the centers and major user communities; and a collection of efforts in software applications to develop, modernize, and maintain software to address DoD's science and engineering challenges. The current director of the program is David Horner, who was appointed to the position in January 2015. DoD Supercomputing Resource Centers The HPCMP funds and oversees the operation of five supercomputing centers, called DoD Supercomputing Resource Centers, or DSRCs. The centers are operated by the Engineer Research and Development Center in Vicksburg, MS (erdc.hpc.mil), the Army Research Laboratory in Aberdeen, MD (arl.hpc.mil), the Naval Meteorology and Oceanography Command in Stennis Space Center, MS (navydsrc.hpc.mil), and the Air Force Research Laboratory in Dayton, OH (afrl.hpc.mil), and Maui High Performance Computing Center in Maui, HI (mhpcc.hpc.mil). The Arctic Region Supercomputing Center (ARSC) in Fairbanks, AK was a sixth DSRC until funding for it was discontinued in 2011. Each center hosts large-scale supercomputers, high-speed networks, multi-petabyte archival mass storage systems, and computational experts. The centers are managed by the HPCMP Assistant Director for Centers, who also funds program-wide activities in user support (the HPC Help Desk) and scientific visualization (the Data Analysis and Assessment Center, or DAAC). Defense Research and Engineering Network The Defense Research and Engineering Network (DREN) — a high-speed national computer network for computational research, engineering, and testing — is a significant program within the HPCMP. The DREN is the United States Department of Defense’s research and engineering computer network. The DREN is a high-speed, high-capacity, low-latency nationwide computer network for computational scientific research, engineering, and testing in support of the DoD's Science and Technology and Test and Evaluation communities. The DREN connects scientists and engineers at the HPCMP's geographically dispersed high performance computing (HPC) user sites — including the five DoD Supercomputing Resource Centers — and more than 150 user sites at other government laboratories, test centers, universities, and industrial locations throughout the United States (including Hawaii and Alaska). The DREN wide area networking (WAN) capability is provided under a commercial contract, currently awarded to CenturyLink. 
It has been awarded in the past to AT&T, MCI/WorldCom, and Verizon. The DREN WAN service provider has built DREN as a virtual private network based on its commercial infrastructure. Capabilities provided by DREN III include digital data transfer services at speeds from 50 Mbit/s through 100 Gbit/s. DREN III is also fully IPv6 enabled, with legacy support for IPv4. In 2003, DREN was designated the Department of Defense's first IPv6 network by the Assistant Secretary of Defense for Networks & Information Integration. Other research networks of interest: CANARIE DANTE Energy Sciences Network High Performance Wireless Research and Education Network Internet2 Network NASA Research and Engineering Network Software Applications In addition to supercomputers and the national wide area research network, the HPCMP funds software applications that help the DoD achieve its research objectives by ensuring that important DoD applications run effectively on the large-scale supercomputers it deploys. HPC Software Applications Institutes HPC Software Applications Institutes, or HSAIs, are cross-disciplinary projects funded by the HPCMP but executed in the DoD labs. The HSAI program both develops tools to solve important computational problems facing the DoD and build organic HPC expertise within the department. Productivity Enhancement, Technology Transfer, and Training The Productivity Enhancement, Technology Transfer, and Training, or PETTT, project provides training, computational tools and libraries, as well as application development and algorithm enhancement specifically designed to improve the productivity of the DoD’s HPC user community. PETTT projects are generally both shorter in duration and smaller in scale that HSAIs. The PETTT program provides the HPCMP and DoD users with access to expertise and new technologies emerging from over 100 university and industry partners. Computational Research and Engineering Acquisition Tools and Environments The Computational Research and Engineering Acquisition Tools and Environments, or CREATE, project is designed to bring the benefits of modern scientific computing to the development and acquisition of next-generation Defense weapons systems, and modernization and enhancement of currently deployed weapons systems. CREATE is developing large-scale, multiphysics software to support the simulation of aircraft (CREATE-AV), ship survivability and maneuverability (CREATE-Ships), radio frequency antenna design (CREATE-RF), and military ground vehicles (CREATE-GV). The program is supported by a fifth, cross-cutting effort to develop a common suite of tools for geometry and mesh generation (CREATE-MG). Current and Former Program Directors The following table summarizes the chronology of program directors. The HPCMP in the DoD Budget Since FY2012, the HPCMP's base funding has been provided on two lines in the Army budget, which is itself a portion of the Department of Defense Budget submitted each year by the President and approved by Congress. PE 0603461A provides for RDT&E funds that operate the centers and DREN, and funds R&D efforts in support of program goals. Line item number B66501 (line 103, BA 02, BSA 92) provides procurement funds for the annual purchase of new supercomputing hardware (both supercomputers and related systems). Prior to FY2012, the HPCMP's RDT&E funding was provided on PE 0603755D8Z, while procurement was funded on PE 0902198D8Z (P011). 
The following table summarizes requested and committee-approved funding amounts for the RDT&E portion of the program for the most recent federal fiscal years (procurement funding, which is supplied on a different line in the federal budget, is not included in this table). The temporary change in Program Element number for FY2004 reflects a planned transition of the program from management by the Office of the Secretary of Defense to the Air Force; this transition did not ultimately occur. References External links DREN IPv6 Information High Performance Computing Modernization Program United States Department of Defense information technology Wide area networks Supercomputer sites
661675
https://en.wikipedia.org/wiki/S/KEY
S/KEY
S/KEY is a one-time password system developed for authentication to Unix-like operating systems, especially from dumb terminals or untrusted public computers on which one does not want to type a long-term password. A user's real password is combined in an offline device with a short set of characters and a decrementing counter to form a single-use password. Because each password is used only once, the passwords are useless to password sniffers. Because the short set of characters does not change until the counter reaches zero, it is possible to prepare a list of single-use passwords, in order, that can be carried by the user. Alternatively, the user can present the password, characters, and desired counter value to a local calculator to generate the appropriate one-time password that can then be transmitted over the network in the clear. The latter form is more common and practically amounts to challenge–response authentication. S/KEY is supported in Linux (via pluggable authentication modules), OpenBSD, NetBSD, and FreeBSD, and a generic open-source implementation can be used to enable its use on other systems. OpenSSH has also implemented S/KEY since version 1.2.2, released on December 1, 1999. One common implementation is called OPIE. S/KEY is a trademark of Telcordia Technologies, formerly known as Bell Communications Research (Bellcore). S/KEY is also sometimes referred to as Lamport's scheme, after its author, Leslie Lamport. It was developed by Neil Haller, Phil Karn and John Walden at Bellcore in the late 1980s. With the expiration of the basic patents on public-key cryptography and the widespread use of laptop computers running SSH and other cryptographic protocols that can secure an entire session, not just the password, S/KEY is falling into disuse. Schemes that implement two-factor authentication, by comparison, are growing in use. Password generation The server is the computer that will perform the authentication. This step begins with a secret key W. This secret can either be provided by the user, or can be generated by a computer. Either way, if this secret is disclosed, then the security of S/KEY is compromised. A cryptographic hash function H is applied n times to W, thereby producing a hash chain of n one-time passwords. The passwords are the results of the application of the cryptographic hash function: H(W), H(H(W)), ..., H^n(W). The initial secret W is discarded. The user is provided with the n passwords, printed out in reverse order: H^n(W), H^(n−1)(W), ..., H(H(W)), H(W). The passwords H(W), H(H(W)), ..., H^(n−1)(W) are discarded from the server. Only the password H^n(W), the one at the top of the user's list, is stored on the server. Authentication After password generation, the user has a sheet of paper with n passwords on it. If n is very large, either storing all n passwords or calculating a given password from H(W) becomes inefficient. There are methods to efficiently calculate the passwords in the required order, trading a small number of hash calculations per step against storing only a subset of the passwords. More ideally, though perhaps less commonly in practice, the user may carry a small, portable, secure, non-networked computing device capable of regenerating any needed password given the secret passphrase, the salt, and the number of iterations of the hash required, the latter two of which are conveniently provided by the server requesting authentication for login. In any case, the first password will be the same password that the server has stored.
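As a concrete illustration of the hash chain described above and of the verification step described next, the following Python sketch generates a chain of one-time passwords and shows the server-side check. It is a minimal sketch under stated assumptions: plain SHA-1 from hashlib stands in for the MD4/MD5 digests (folded to 64 bits) used by the actual S/KEY specifications, and the function names are illustrative.

import hashlib

def H(data):
    # Stand-in hash; real S/KEY folds an MD4/MD5 digest down to 64 bits.
    return hashlib.sha1(data).digest()

def generate_chain(secret, n):
    # Returns H(W), H(H(W)), ..., H^n(W) in generation order.
    chain, value = [], secret
    for _ in range(n):
        value = H(value)
        chain.append(value)
    return chain

n = 5
chain = generate_chain(b"example secret W", n)
stored = chain[-1]          # server keeps only H^n(W)

# Authentication: the user presents the previous password in the chain,
# H^(n-1)(W); the server hashes it once and compares with its stored value.
candidate = chain[-2]
if H(candidate) == stored:
    stored = candidate      # accepted; becomes the new reference for next time
    print("authentication succeeded")
else:
    print("authentication failed")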
This first password will not be used for authentication (the user should scratch this password on the sheet of paper); the second one will be used instead: The user provides the server with the second password on the list and scratches that password. The server attempts to compute H(pwd), where pwd is the password supplied. If H(pwd) produces the first password (the one the server has stored), then the authentication is successful. The server will then store pwd as the current reference. For subsequent authentications, the user will provide pwd_i. (The last password on the printed list, pwd_n, is the first password generated by the server, H(W), where W is the initial secret.) The server will compute H(pwd_i) and will compare the result to pwd_(i−1), which is stored as the reference on the server. Security The security of S/KEY relies on the difficulty of reversing cryptographic hash functions. Assume an attacker manages to get hold of a password that was used for a successful authentication. Supposing this is pwd_i, this password is already useless for subsequent authentications, because each password can only be used once. It would be interesting for the attacker to find out pwd_(i−1), because this password is the one that will be used for the next authentication. However, this would require inverting the hash function that produced pwd_(i−1) using pwd_i (H(pwd_(i−1)) = pwd_i), which is extremely difficult to do with current cryptographic hash functions. Nevertheless, S/KEY is vulnerable to a man in the middle attack if used by itself. It is also vulnerable to certain race conditions, such as where an attacker's software sniffs the network to learn the first N − 1 characters in the password (where N equals the password length), establishes its own TCP session to the server, and in rapid succession tries all valid characters in the N-th position until one succeeds. These types of vulnerabilities can be avoided by using ssh, SSL, SPKM, or another encrypted transport layer. Since each iteration of S/KEY doesn't include the salt or count, it is feasible to find collisions directly without breaking the initial password. This has a complexity of 2^64, which can be pre-calculated with the same amount of space. The space complexity can be optimized by storing chains of values, although collisions might reduce the coverage of this method, especially for long chains. Someone with access to an S/KEY database can break all of them in parallel with a complexity of 2^64. While they wouldn't get the original password, they would be able to find valid credentials for each user. In this regard, it is similar to storing unsalted 64-bit hashes of strong, unique passwords. The S/KEY protocol can loop. If such a loop were created in the S/KEY chain, an attacker could use the user's key without finding the original value, and possibly without tipping off the valid user. The pathological case of this would be an OTP that hashes to itself. Usability Internally, S/KEY uses 64-bit numbers. For human usability purposes, each number is mapped to six short words, of one to four characters each, from a publicly accessible 2048-word dictionary. For example, one 64-bit number maps to "ROY HURT SKI FAIL GRIM KNEE". See also OTPW OPIE Authentication System PGP biometric word list uses two lists of 256 words, each word representing 8 bits. References External links The S/KEY One-Time Password System (RFC 1760) A One-Time Password System (RFC 2289) jsotp: JavaScript OTP & S/Key Calculator Introduction to the system Cryptographic software Password authentication
5829782
https://en.wikipedia.org/wiki/Accordance
Accordance
Accordance is a Bible study program for Apple Macintosh and iPhone, and now Windows and Android, developed by OakTree Software, Inc. Although originally written exclusively for the Mac OS (and then iOS), Accordance was then released in a Windows-native version, although it was available prior to this by using the Basilisk II emulator. Since 2018 there has also been an Accordance app for Android. The program is used for both private and academic study. OakTree Software OakTree Software, Inc. is based in Altamonte Springs, Florida, USA. The company has focused on areas of special interest, particularly the study of Biblical texts. Program history Roy Brown, OakTree Software's president and application developer, created one of the first Bible programs available for the Macintosh, known as ThePerfectWord, in 1988. ThePerfectWord was later bought by another company and renamed MacBible. By the early 1990s there were a number of good general Bible study programs for the Mac. However, Brown saw the need for a new program which would make it easy to engage in more sophisticated kinds of Bible study, enabling scholars and pastors to do in-depth analysis of the original Greek and Hebrew texts of the Bible. Accordance 1.0 was released in February 1994, and welcomed for its power and ease of use. The translators of the Holman Christian Standard Bible, completed in 2004, used it for word studies, comparisons and instant searches. A version of Accordance 5.0 rewritten to run natively under Mac OS X was released at the start of 2002. The company has continued to add and improve features, such as the native Quartz rendering system. Version 8 was released in May 2008 and introduced a universal binary for Intel-based Macs. Version 9 was released in September 2010, version 10 in 2012, version 11 in 2014 and version 12 in 2016. Version 12 also saw a complete refreshing of the packages available, where, apart from individual modules, users must purchase collections from a specialized stream (Hebrew, Greek, English or Graphics). Version 13 was released in November 2019, new features including new look, built-in training, and import from PDF. Accordance for iOS was released on December 30, 2010 as a free app for iPhone, iPod Touch and iPad. After running through an emulator for many years, 2013 saw Accordance released as Windows-native software, with upgrades and updates generally running in parallel with the Mac software. After a beta in 2017, Accordance for Android was released in early 2018. Modules available The program is centered on the Biblical text, but has many additional texts. There are many optional modules, detailed study tools for the original Hebrew and Greek, commentaries and reference dictionaries, with a unique cross-reference system to interconnect the Biblical text with entire libraries of ancient extra-biblical material . 
Some modules available include: Mishnah, Targums, Pseudepigrapha, Josephus and early Christian writings such as the Didache Biblia Hebraica Stuttgartensia with the Groves-Wheeler Westminster Hebrew Morphology Louw & Nida Semantic Domain Lexicon GNT Robinson Byzantine Dead Sea Scrolls Bible Atlas and Bible Lands Photo Guide Comprehensive Cross Reference interactive module for Dead Sea Scrolls, Josephus, Philo, Nag Hammadi Library, Pseudepigrapha, Old Testament Apocrypha, New Testament Apocrypha, Plato, Pythagoras, Dhammapada, Egyptian Book of the Dead, Tacitus, Talmud, New and Old Testaments, Apostolic and Early Church Fathers Although the product has many modules, purchasing is simplified by packaging these into "Collections" (which have replaced all of the earlier "bundles" and "libraries"). References External links Accordance Home Accordance Blog Accordance Forums Accordance Exchange Electronic Bibles
19401142
https://en.wikipedia.org/wiki/Kot%20Addu%2C%20Pakistan
Kot Addu, Pakistan
Kot Addu () is a city and tehsil in the Muzaffargarh District of the southern part of the Punjab province of Pakistan. This city is subdivided into 30 Union Councils and has a population of over 104 thousand, making it the 67th largest city in Pakistan. It is located just east of the Indus River, about from Karachi, from Islamabad, 100 km from Multan, 80 km from D.G. Khan, from Muzaffargarh, 60 km from Layyah, and from Taunsa Barrage. Kot Addu City attracts a large number of tourists every year, due to the Indus river and public gardens among other things. The city is served by Kot Addu Junction railway station. The zip code of Kot Addu is 34050. It is the 69th largest city of Pakistan according to the 2017 census. Demography The tehsil has a total population of 808,438 persons and 424,521 acres of area under cultivation; its location on Google Maps is N 30° 28' 34", E 70° 57' 52". The city contains a total of 31 Union councils, which are listed below. Haider Ghazi Bait Qaimwala Bharri Hog Budh Chak No. 547/TDA Chak No. 565/TDA Chak No. 632/TDA Chowk Sarwar Shaheed D.D. Pannah Dogar Kalasra Ghazi Ghatt Hinjrai Ihsanpur Kot Adu No. 2 Kot Adu No. 1 Kot Adu No. 3 Manhan Sharif Mehmood Kot Mirpur Bhagal Pattal Monda Pattal Kot Adu Patti Ghulam Ali Sanawan Shadi Khan Monda Sheikh Umer Thatha Gurmani U.C. 22 Gujrat Wahandur Pirhar cha larr wala NoorShah Basti Sirai Location The city of Kot Addu is located in the southern area of Punjab province, almost at the exact center of Pakistan. The area around the city is a flat plain and is ideal for agriculture. There are two main canals (Muzaffar, and T.P. link) and eight sub-canals that cross Kot Addu, providing water from the Indus River. The geographical coordinates of the city, according to Google Maps, are: N 30° 28' 34" E 70° 57' 52". Geography and climate Kot Addu is located almost exactly at the geographical center of Pakistan. The closest major city is Multan. The area around the city is a flat alluvial plain and is ideal for agriculture, with many citrus and mango farms. There are also canals that cut across the Muzaffargarh District, providing water to farms. During the monsoon season, the land close to the Taunsa Barrage is usually flooded. Kot Addu has an arid climate with very hot summers and mild winters. The city has experienced some of the most extreme weather in Pakistan. The highest recorded temperature was approximately 51 °C (129 °F), and the lowest recorded temperature was approximately −1 °C (30 °F). The average rainfall is roughly 127 millimeters (5.0 inches). Dust storms are a common occurrence within the city. Education Like other major cities in Punjab, Kot Addu features a rich educational landscape. In the last few years, the city has observed a surge in the number of educational institutions. Colleges include private commerce and science colleges, schools, academies like Concordia Colleges (A Project of Beaconhouse), Punjab Inter Science Academy, and a cadet college. Government degrees are provided for both men and women. Several colleges are affiliated with Bahauddin Zakariya University (B.Z.U.) Multan, Pakistan, and Punjab University Lahore, Pakistan. The names of some private institutions are Kot Addu School of Economics and Management Sciences (Mr. Fiaz Hussain Malik, director), Punjab Higher Secondary School, Bismillah Inter Science Academy, Punjab Group of Colleges, My School System, Dar-E-Arqam School, Oxford Grammar School, Professor Academy etc. A Government Technical College is also under construction.
School and Colleges Ahmad Schools (PTV) Limited Allied School Al-Rehman College of Technology Kot Addu. Bismillah Grammar School Kot Addu. BrightWay School System C.C.U. The Creator children university City Public School Zia Colony, Kot Addu City School Concordia Colleges (a Beaconhouse project) Dar-e-Arqam School Kot Addu Campus. Elementary Teachers Training College Kot Addu Ghalib Science Secondary School Government College of Commerce Kot Addu Government College of Technology Lal Meer Kot ADDU Government Post Graduate College Kot Addu Govt. boys High School Govt. High School #01 Govt. P/s School Basti sirai Hira Girls Science Academy Kot Addu. Ideal Public School IIUI (School) Kot Adu campus Learners House school (Kot Addu) Little Angel's Garden Mairi Darsgah Secondary School for Boys and Girls Maryam Public Secondary School Moon of Heaven Academy www.moonofheaven.com Moon of Heaven Commerce College (Regd) www.moonofheaven.com My School System Kot Addu New Ibn-e-Sena School Kot Addu New Punjab Higher Secondary School Kot Addu Punjab Group of Colleges Kot Addu Sunny Public School Superior Group of Colleges Kot Addu Vocational Training Institute Kot Addu. The Scholars Inn School Taunsa Barrage Taunsa Barrage is a barrage on the River Indus. It is situated southeast of Taunsa Sharif and from Kot Addu. This barrage controls water flow in the River Indus for irrigation and flood control purposes. This barrage serves 2.351 million acres (951,400 hectares) besides diverting flows from Indus River to the Chenab River through Taunsa Panjnad (TP) Link Canal. The barrage also serves as an arterial road bridge, a railway bridge, and crossing for gas and oil pipelines, telephone line and extra high voltage (EHV) transmission lines. In 2011, the rehabilitation of the Taunsa Barrage was blamed for devastation of the Muzaffargarh district during the 2010 Pakistan floods. Critics blamed the rehabilitation of the barrage, alleging that it failed to raise its height and strengthen protective embankments, used dysfunctional computer control system of the hoist gates and ignored hill-torrent management. Industry Kot Addu is a prominent commercial and industrial city in the Punjab province. It is connected by road and rail with Lahore, Karachi, Multan, Rawalpindi, Islamabad, Quetta, and Faisalabad; and also by air from Multan Airport to all Pakistani airports. Main Industries Pak Arab Oil Refinery (PARCO) Power Stations Kot Adu Power Company The Kot Addu Power Company Limited (KAPCO) was incorporated in 1996, located in Kot Addu, District Muzaffargarh, Punjab, Pakistan. Kot Addu Power Plant was built by the Pakistan Water and Power Development Authority (WAPDA). In April 2000, the Company was incorporated as a public limited company. Kot Addu Power Plant produces 1,600 MW of electricity. On 18 April 2005 the Company was formally listed on all three Stock Exchanges of Pakistan.[21] Lal-Peer Thermal Power Station Sugar Mills Sheikhoo Sugar Mill Fatima Sugar Mill Gellani flour Mill LTD Shoib Qasim Floor Mill Besides these, cotton factories, foundries, cotton, woolen, and silk textile mills, flour, and oil mills are also located in this region. This area is famous for its handicrafts (Kundra work), and cottage industries. Information technology industry With the constantly changing environment in the world and as information technology has become increasingly important, Kot Addu has also adopted this change with a great positive response. 
The first professional information technology center, named I.TECH, was introduced in 2002, and it also opened a branch in Dubai, U.A.E., in 2006. It is a prominent institution in the region, focusing mainly on web design and development. There are many other institutions contributing towards education in Information Technology. It is clear from local interest that the region needs a major government I.T. institution. In November 2009 the Government of Pakistan also opened a technical college, the Lal-Meer Technical College, in the union of Kot Addu. Health centers There is one government civil hospital and several private hospitals in the city, besides many small government and private hospitals in the union councils. Agriculture Kot Addu is an important agricultural area. Of the total area of Tehsil Kot Addu, 424,521 acres are under cultivation. Main crops of the area include corn, cotton, rice, sugarcane, tobacco, wheat, and vegetables. Bajra, moong, mash, masoor, and oil seeds (such as mustard and sunflower seeds) are also grown in the district. Mangoes, citrus, guavas, and pomegranates are Kot Addu's most important fruit crops. Minor fruit crops such as dates, jaman, pears, falsa, and bananas are also grown. Another local fruit is called a bare (berry); it is one of the main fruits grown in this region. Due to flooding, some crops have become rare, especially cotton and wheat, so a majority of the agricultural land is now planted with sugarcane. People either sell this sugarcane to sugar mills or use it to produce jaggery. KG Gurmani Agriculture Farm (Thatta Gurmani Bate Esan Wala Kot Addu). Notable people Inayat Hussain Bhatti - film industry Mushtaq Ahmed Gurmani - former Governor of West Pakistan Pathanay Khan - poet and Pride of Performance recipient Ghulam Mustafa Khar - former Governor and Chief Minister of Punjab Hina Rabbani Khar - former Foreign Minister of Pakistan Sultan Mehmood - former Minister Mian Shabbir Ali Qureshi - Federal Minister Milkha Singh - Indian athlete also known as the "Flying Sikh" See also Kot Addu Tehsil Dera Ghazi Khan Division References Populated places in Muzaffargarh District
1887719
https://en.wikipedia.org/wiki/HP%20Time-Shared%20BASIC
HP Time-Shared BASIC
HP Time-Shared BASIC (HP TSB) is a BASIC programming language interpreter for Hewlett-Packard's HP 2000 line of minicomputer-based time-sharing computer systems. TSB is historically notable as the platform that released the first public versions of the game Star Trek. The system implements a dialect of BASIC as well as a rudimentary user account and program library that allows multiple people to use the system at once. The systems were a major force in the early-to-mid 1970s and generated a large number of programs. HP maintained a database of contributed-programs and customers could order them on punched tape for a nominal fee. Most BASICs of the 1970s trace their history to the original Dartmouth BASIC of the 1960s, but early versions of Dartmouth did not handle string variables and vendors added their own solutions. This led to two general styles; DEC introduced the MID/LEFT/RIGHT functions, while TSB used a system more akin to Fortran and other languages with array slicing. As microcomputers began to enter the market in the mid-1970s, many new BASICs appeared that based their parsers on DEC's or HP's syntax. Altair BASIC, the original version of what became Microsoft BASIC, was patterned on DEC's BASIC-PLUS. Others, including Apple's Integer BASIC, Atari BASIC and North Star BASIC were patterned on the HP style. This made conversions between these platforms somewhat difficult if string handling was encountered. Nomenclature The software was also known by its versioned name, tied to the hardware version on which it ran, such as HP 2000C Time-Shared BASIC and the operating system came in different varieties — 2000A, 2000B, 2000C, High-Speed 2000C, 2000E, and 2000F. HP also referred to the language as "Access BASIC" in some publications. This matched the naming of the machines on which it ran, known as the "2000/Access" in some publications. This terminology appears to have been used only briefly when the platform was first launched. Platform details Except for the 2000A and 2000E systems, the system is implemented using a dual-processor architecture. One fully configured HP 2100-series processor is used for the execution of most of the system code and all of the user code, while a second, smaller HP 2100-series processor is used to handle the RS-232 serial lines through which the time-sharing users connected. Depending on the hardware configuration, the system supports up to 16 or up to 32 simultaneous remote users. The usual terminal for a TSB system was a Teletype Model 33 ASR and connected directly to the I/O processor or through a modem or acoustic coupler. Account names are a combination of one alphabetic character, followed by three decimal digits, e.g., B001. Privileged accounts started with the letter "A" and had some additional command and program storage capabilities. The superuser account is A000. This scheme allows up to 26,000 user accounts. During execution, user programs are swapped to a fixed head drive — physically a disk, but operating like a magnetic drum. When not executing, user programs are stored on moving-head cartridge- or pack-loaded disk storage. Privileged users can also store programs on the much-faster drum. The hard drive was backed up to magnetic tape. Program and file names consist of a mix of up to six alphabetic characters (A-Z) and numbers (0-9). Programs are stored in a tokenized format, using the SAVE command. They can also be stored in a semi-compiled format, using the CSAVE command, which allows them to start quicker. 
Since the system was closely tied to the use of commonly available teleprinters, line endings in files consisted of the carriage return character (ASCII CR, 0D hexadecimal), followed by the linefeed character (ASCII LF, 0A hexadecimal). Syntax The language is a fairly standard implementation of BASIC, providing an integrated editing and runtime environment. Statements are analyzed for correct syntax as they are entered and then stored in tokenized form. Each BASIC statement has to be on a uniquely numbered line, e.g. 10 PRINT "HELLO WORLD" Line numbers are mandatory, and statements are automatically placed in ascending numeric sequence. TSB lines can contain one statement, chaining multiple statements with the colon as in MS BASIC is not supported. Multiple variable assignments are allowed, e.g., 20 LET A=B=C=42. As in most versions of BASIC, use of the word "LET" was optional. In the earliest version (2000A), the language supported the following features. Later versions added many more features. Unconditional program flow-control via GOTO statements, and subroutines via the GOSUB and RETURN statements Conditional flow-control via IF/THEN statement Calculated flow-control via the GOTO/OF and GOSUB/OF statements Variable-based block loop FOR and NEXT statements In-code data storage via DATA, READ, and RESTORE statements Input from and output to the user or a disc file via INPUT, READ #, PRINT, PRINT #, and IF END # statements Numeric variables of the form "A" or "An" (where A is a single letter and n is a single, optional digit) stored as 32-bit floating-point numbers String variables of the form "A$" (where A is a single letter), storing from 0 to 72 characters One- or two-dimensional matrix (array) variables of the form "A[x]" or "A[x,y]" Matrix operations via statements (MAT READ, MAT INPUT, MAT PRINT, MAT=) and operations (+, -, *, ZER, CON, IDN, INV, TRN) Boolean operators (AND, OR, NOT) and relational operators (<, <=, =, #, <>, >=, and >) Built-in mathematical functions including trigonometric (SIN, COS, TAN, ATN), logarithms (LOG, EXP), square root (SQR), random number generator (RND), others (ABS, INT, SGN, MIN, MAX), and user-defined functions Punched tape operations using Teletype Model 33 electromechanical teleprinter remote terminals String handling Strings in TSB are treated as an array of characters, rather than a single multi-character object. By default, they are allocated one character in memory, and if a string of longer length is needed, they have to be mentioned before use. For instance, DIM A$[10] will set up a string that can hold a maximum of 10 characters. The maximum length of a string in TSB is 72 characters. Substrings within strings are accessed using a "slicing" notation: A$(L,R) or A$[L,R], where the substring begins with the leftmost character specified by the index L and continues to the rightmost character specified by the index R, or the A$[L] form where the substring starts at the leftmost character specified by the index L and continues to the end of the string. TSB accepts () or [] interchangeably. Array and substring indices start with 1. This is in sharp contrast to BASICs following the DEC pattern that use functions such as LEFT$(), MID$(), and RIGHT$() to access substrings, although ANSI BASIC continues to use a similar substring syntax to that introduced by Hewlett-Packard. 
HP's notation can also be used on the destination side of a LET or INPUT statement to modify part of an existing string value; for example, assigning into A$[2,4] replaces just those character positions, which cannot be done with early implementations of LEFT/MID/RIGHT. The main advantage to this style of string access is that it eliminates the need for complex memory management that is otherwise required when string lengths change. MS BASIC had a lengthy library to handle the compression of memory by removing dead space in the string heap when the system ran out of memory. It was also notoriously slow, and was modified several times over its lifetime in order to improve performance or fix bugs. The downside to the TSB style is that the string always takes up the full amount of DIMed space even if the string inside is empty, and simple tasks like concatenation can potentially overflow the string unless it was set to a large size to begin with. Later versions of Dartmouth BASIC did include string variables, based on the same pattern found in BASIC-PLUS and MS BASIC. However, this version did not use the LEFT/MID/RIGHT functions for manipulating strings, but instead used the CHANGE command which converted the string to and from equivalent ASCII values. HP included identical functionality, changing only the name to CONVERT. Additionally, one could use the single-quote to convert a numeric constant to an ASCII character, allowing one to build up a string in parts; A$='23 '64 '49 "DEF" produced the string "ABCDEF", without the need for the CHR$() function. MAT commands Later versions of Dartmouth BASIC included a suite of MAT commands that allowed operations on entire arrays (matrices) with a single statement. These were also available in later versions of TSB. In their simplest form, MAT is used like an alternate form of LET, applying an expression to all the elements in an array. For instance: 100 DIM A(20),B(20) ... 200 MAT A=A+B will add the value of every element in B to the corresponding element in A, in the same fashion as: 100 DIM A(20),B(20) ... 200 FOR I=1 TO 20 210 A[I]=A[I]+B[I] 220 NEXT I As well as making the code shorter and more obvious, these commands also have the advantage of being highly optimized, easily outperforming the use of FOR/NEXT. Additional functions and statements modify PRINT and INPUT, invert arrays, and build identity matrices and such in a single statement. Other differences TSB also includes a number of more minor differences from other dialects. Among the most important are: # is an optional form of the not-equal comparison, identical to <>. Computed GOTO using the ON...GOTO/GOSUB syntax is not supported. Instead, the GOTO expression OF 1,2,3... performs the same function by picking a line number from the list based on its ordinal position. For instance, GOTO 1 OF 10,20,30 will always go to line 10, whereas GOSUB A OF 100,200,300 will branch to different lines if the value of A is 1, 2 or 3. Boolean and relational operators can be used in any mathematical expression, returning 0 for false or 1 for true, which was unusual for BASIC languages of that time, but popular in languages like C. For instance, IF C+D THEN 1600 will branch to line 1600 if either C or D is greater than zero, because the expression C+D will evaluate to 'true' in the IF. If C and D are both zero, the IF will evaluate it to 'false' and the branch will not be taken. TSB includes ENTER, a variation on the standard INPUT statement that continues after a time limit is reached.
ENTER takes three inputs: a time limit in seconds, a return variable containing the actual time elapsed (or a status code), and finally the user input itself. For instance, ENTER 15,T,A$[1,1] will wait 15 seconds for the user to type in a single character. T will contain the actual time they took, -256 if the timer expired, or -257 or -258 to indicate problems with the terminal.
When printing string constants (literals), semicolons are not needed within the line. For instance, PRINT "THE NUMBER IS"A", TRY A LARGER VALUE." does not require semicolons between the string constants and the variable A. Some other BASICs, including MS, also supported this syntax; others, like Atari BASIC or Integer BASIC, did not.
Commas in PRINT use tab stops every 15 characters, leaving 12 at the end of the line to total 72.
The LIN function operates as a vertical counterpart to TAB. LIN(3) will insert three carriage returns, potentially on the existing line if a trailing semicolon or comma was active, while the special-case LIN(-1) will always advance to the next line. Integer BASIC had a similar feature, called VTAB.
See also Rocky Mountain BASIC, another but very different dialect of BASIC created at Hewlett-Packard Notes References Citations Bibliography Part No. 22687-90001, Part No. 22687-90009 External links www.bitsavers.org — Archived HP documentation (scanned into PDF) HP 2000 compatible Basic Interpreter HP Computer Museum BTI Computer Systems History HP software Time-sharing operating systems BASIC programming language BASIC interpreters BASIC programming language family
37580119
https://en.wikipedia.org/wiki/Juan%20Pav%C3%B3n
Juan Pavón
Juan Pavón (born 19 November 1962) is a Spanish computer scientist and full professor at the Complutense University of Madrid (UCM). He is a pioneering researcher in the field of Software Agents, co-creator of the FIPA MESSAGE and INGENIAS methodologies, and founder and director of the research group GRASIA: GRoup of Agent-based, Social and Interdisciplinary Applications at UCM. He is known for his work in the field of Artificial Intelligence, specifically in agent-oriented software engineering.
Biography
Education
Pavón belongs to the first generation in Spain to receive formal university education in Computer Science, during the 1980s. He studied Computer Science at the Technical University of Madrid, graduating in 1985. In 1988, he obtained his PhD in this area with the thesis "Synthesis of communication protocols from service specifications". While working on this thesis, he was an Assistant Professor at the same university. After his PhD, he joined Alcatel's R&D team, where he worked for 10 years. At the end of 1997 he obtained an Associate Professor position at UCM, and in 2006 he achieved the Habilitation à diriger des recherches qualification in Computer Science at the Université Pierre et Marie Curie (Paris VI) with the thesis "INGENIAS : Développement Dirigé par Modèles des Systèmes Multi-Agents" (in French).
Career
He joined the Alcatel R&D department as a systems engineer. There, he worked on the development of software component-based architectures for distributed systems and their applications to multimedia services over broadband networks and new-generation mobile phones. During the 10 years he worked for the company, he spent periods in Alcatel centers outside Spain, such as France (Lannion and Vélizy) and Belgium (Namur and Antwerp). In this period, he also worked several years in the labs of Bellcore in Red Bank, New Jersey (USA), as part of the TINA-C Core Team, helping to produce architectural models for telecommunication services. As a result, he published several popular works. He then returned to the academic world, as an Associate Professor at the Computer Science School of the Complutense University of Madrid (1997). There, he researched Multi-Agent Systems within the Eurescom project P815 "Communications Management Process Integration Using Software Agents" (1999), working with Telefónica R+D. His work in several projects combines software engineering practices and MAS. In the Eurescom P907 project "Methodology for Engineering Systems of Software Agents", he co-created the MESSAGE methodology, currently part of the main FIPA software agent methodologies. In 2000 he established the research group GRASIA for research in Software Agents and Artificial Intelligence at the Complutense University of Madrid. He also held several management positions in the university, serving as Vice Dean for four years (1998–2002). He is currently a full professor at the Complutense University of Madrid.
Work
Software Agents Research
Pavón's work has focused on Software Agents and Agent-based simulation. His work on Software Agents started at the end of the 1990s, after they emerged as a new paradigm. His first project in the area was funded by Telefónica R+D, "Communications Management Process Integration Using Software Agents" (1999). Later he attracted funding for research projects from the 5th European Framework Programme (PSI3, DEMOS) and from Eurescom (P815, P907), all considered early applications of agents to software engineering processes.
In the Eurescom P907 project "Methodology for Engineering Systems of Software Agents", he co-created the MESSAGE methodology, currently part of the main FIPA software agent methodologies.
INGENIAS
His main contribution combining agent concepts with software engineering is the INGENIAS methodology and toolkit. INGENIAS (Engineering for Software Agents) is a complete framework for the analysis, design and implementation of multi-agent systems (MAS). As a result of the research in these years, Jorge J. Gómez-Sanz published in 2002 his PhD thesis "A Methodology for the Development of Multi-Agent Systems" (in Spanish), advised by Francisco Garijo and Juan Pavón. This work constitutes the first version of the INGENIAS methodology and its meta-models. INGENIAS has adopted a model-driven engineering (MDE) approach since its inception. Model-driven engineering organizes development around the specification of systems through models that are automatically transformed to generate other artefacts, e.g., code, tests, or documentation. INGENIAS follows these principles by specifying the MAS meta-models that define its modeling language and allow its development tools, distributed as the INGENIAS Development Kit (IDK), to be generated automatically. This approach supports research in different areas characterized by the use of modeling languages and requiring flexibility to adapt them to new requirements. Thus, it has also been used in Agent-based simulation. INGENIAS is one of the few agent-oriented methodologies whose development process has been formally specified with SPEM, a language of the Object Management Group (OMG). Currently, there is one development process based on the Unified Process and another based on Scrum. The INGENIAS modeling language and the open-source tools for its application made it a popular methodology in the agent literature. It has been included in relevant surveys and comparisons in the field. The associated open-source INGENIAS tools have also been successful in the agent community, as assessed by their number of downloads.
Agent-based social simulation
The level of maturity reached by the INGENIAS framework and related tools, mainly INGENME, in its application to Software Agents allowed the GRASIA group to consider its application to other domains. Agent-based simulation has been one with an immediate application. As in Software Agents, agent-based simulation (ABS) relies on the concept of agent, in this case as the basic block to build computational simulations. The conceptual similarities between the concept of agent in both disciplines, and the suitability of models for working with simulations, made agent-based simulation a sensible extension of the INGENIAS work. This line of research started in GRASIA with a direct application of the INGENIAS work with software agents to ABS in the PhD thesis of Candelaria Sansores, "A Methodology for the Study of Artificial Societies" (2007), advised by Pavón. This work pointed out the suitability of applying the MDE approach of INGENIAS to this field, but also the problems that researchers without a background in Software Engineering face when using INGENIAS. These first attempts were oriented to simulations intended to verify behavioral principles described by well-known laws, but not to explore simulations based on the use of large amounts of raw data.
This data-driven agent-based social simulation was the subject of the interdisciplinary PhD thesis of Samer Hassan, "Towards a Data Driven Approach for Agent Based Modelling: Simulating Spanish Postmodernization", also advised by Juan Pavón. The use of INGENIAS for ABS is based on the adoption of MDE to build simulations. This approach was developed and validated by GRASIA in different domains, e.g., urbanism and group work.
Other research and activity
He has established multiple agreements with relevant research groups (ICAR-CNR, SenSysCal.it, INSISOC, SMAC) and with industry such as Telefónica R+D and Boeing Research and Technology Europe. His topics of interest cover several disciplines, including simulation of complex systems, agent-oriented software engineering, artificial intelligence applications, Responsible Research and Innovation, Ethics in AI, Smart Cities, Legal Tech and inclusion tech.
Scientific recognitions
Pavón joined the European research networks on Software Agents and Agent-based simulation, AgentLink and AgentCities, and has contributed to different international projects funded by the European Commission: VITAL, MOMOCS, PSI3, DEMOS, AGENTCITIES.NET, P2Pvalue. He has worked on more than 30 research projects, leading 20 of them, with public and private funding. He has more than 260 scientific publications and an H-index of 31. He is a research consultant for dozens of committees and a member of the editorial board of several journals in the field of computer science. He is a member of the Spanish Association for Artificial Intelligence, the European Social Simulation Association, the FIPA board, the European Association for the Study of Science and Technology, and the IEEE Computer Society, and a scientific project evaluator for the European Commission. In 2006 he received an honorary PhD from the Université Pierre et Marie Curie (Paris 6). His research group GRASIA has achieved some recognition through several awards:
The AAMAS 2008 Best Academic Software Demo award for INGENIAS.
Second prize in the Robotrader contest organized by the Technical University of Madrid in 2012.
Selection by the IEEE Special Technical Community on Social Networks as a featured article in October 2012.
Best paper award at the Semantic Search Workshop of the 19th World Wide Web Conference (2010).
Selected publications
Pavón is the author of several books and more than 260 scientific articles on Software Agents, Artificial Intelligence and Software Engineering. A selection of highly cited works are:
References
External links
Juan Pavón personal page
GRASIA research group
1962 births Living people Spanish computer scientists Multi-agent systems Formal methods people Artificial intelligence researchers Researchers in distributed computing Computer science writers Alcatel-Lucent Agent-based software Modeling and simulation Complutense University of Madrid faculty Polytechnic University of Madrid alumni Pierre and Marie Curie University alumni
14957171
https://en.wikipedia.org/wiki/TableCurve%202D
TableCurve 2D
TableCurve 2D is a linear and nonlinear curve fitting software package for engineers and scientists. It automates the curve fitting process: in a single processing step it fits and ranks more than 3,600 built-in, frequently encountered equations, enabling users to find the ideal model for their 2D data within seconds. Once the user has selected the best-fit equation, they can output high-quality function and test programming code, or generate comprehensive reports and publication-quality graphs.
TableCurve 2D was originally developed by Ron Brown of AISN Software. The first version of TableCurve 2D, released in 1989, was a DOS product. The first Windows-based product was introduced in the last quarter of 1992. It was distributed by Jandel Scientific Software in the late 1980s; in January 2004, Systat Software acquired from SPSS, Inc. the exclusive worldwide rights to distribute SigmaPlot and other Sigma Series products. Systat Software is now based in San Jose, California. TableCurve 2D saves time by taking the endless trial and error out of curve fitting, which can help solve complex science and engineering problems faster.
Related links
SYSTAT
PeakFit
TableCurve 3D
External links
Systat Webpage
TableCurve 2D Support Webpage
Plotting software Regression and curve fitting software
77926
https://en.wikipedia.org/wiki/DECnet
DECnet
DECnet is a suite of network protocols created by Digital Equipment Corporation. Originally released in 1975 in order to connect two PDP-11 minicomputers, it evolved into one of the first peer-to-peer network architectures, thus transforming DEC into a networking powerhouse in the 1980s. Initially built with three layers, it later (1982) evolved into a seven-layer OSI-compliant networking protocol. DECnet has been built into the DEC flagship operating system OpenVMS since its inception. Digital later ported it to Ultrix, as well as to the Apple Macintosh and to IBM PCs running variants of DOS and Microsoft Windows, under the name DEC Pathworks, allowing these systems to connect to DECnet networks of VAX machines as terminal nodes. While the DECnet protocols were designed entirely by Digital Equipment Corporation, DECnet Phase II (and later) were open standards with published specifications, and several implementations were developed outside DEC, including ones for FreeBSD and Linux. DECnet code in the Linux kernel was marked as orphaned on February 18, 2010.
Evolution
DECnet refers to a specific set of hardware and software networking products which implement the DIGITAL Network Architecture (DNA). The DIGITAL Network Architecture has a set of documents which define the network architecture in general, state the specifications for each layer of the architecture, and describe the protocols which operate within each layer. Although network protocol analyzer tools tend to categorize all protocols from DIGITAL as "DECnet", strictly speaking, non-routed DIGITAL protocols such as LAT, SCS, AMDS, and LAST/LAD are not DECnet protocols and are not part of the DIGITAL Network Architecture. To trace the evolution of DECnet is to trace the development of DNA. The beginnings of DNA were in the early 1970s. DIGITAL published its first DNA specification at about the same time that IBM announced its Systems Network Architecture (SNA). Since that time, development of DNA has evolved through the following phases:
1970-1980
Phase I (1974). Support limited to two PDP-11s running the RSX-11 operating system only, with communication over point-to-point (DDCMP) links between nodes.
Phase II (1975). Support for networks of up to 32 nodes with multiple, different implementations which could inter-operate with each other. Implementations expanded to include RSTS, TOPS-10 and TOPS-20, with communications between processors still limited to point-to-point links only. Introduction of downline loading (MOP), and file transfer using File Access Listener (FAL), remote file access using Data Access Protocol (DAP), task-to-task programming interfaces and network management features.
Phase III (1980). Support for networks of up to 255 nodes over point-to-point and multi-drop links. Introduction of adaptive routing capability, record access, a network management architecture, and gateways to other types of networks including IBM's SNA and CCITT Recommendation X.25.
1981-1986
Phase IV and Phase IV+ (1982). Phase IV was released initially for RSX-11 and VMS systems; later TOPS-20, TOPS-10, ULTRIX, VAXELN, and RSTS/E gained support. Support for networks of up to 64,449 nodes (63 areas of 1,023 nodes), datalink capabilities expanded beyond DDCMP to include Ethernet local area network support as the datalink of choice, expanded adaptive routing capability to include hierarchical routing (areas, level 1 and level 2 routers), VMScluster support (cluster alias) and host services (CTERM).
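As a rough illustration of the Phase IV address structure just described (63 areas of 1,023 nodes each, 64,449 nodes in total), the following Python sketch packs an area/node pair into the 16-bit node address and derives the AA-00-04-00-xx-yy Ethernet address discussed further below under the Phase IV Ethernet implementation; the function names are illustrative only, and the little-endian byte ordering shown is the commonly documented convention rather than something stated in this article:

def decnet_address(area, node):
    # A Phase IV node address is a 16-bit value: 6 bits of area, 10 bits of node.
    assert 1 <= area <= 63 and 1 <= node <= 1023
    return (area << 10) | node                     # equivalently area * 1024 + node

def decnet_mac(area, node):
    # Phase IV hosts set their Ethernet MAC to AA-00-04-00 followed by the
    # 16-bit node address, low byte first.
    addr = decnet_address(area, node)
    return "AA-00-04-00-%02X-%02X" % (addr & 0xFF, addr >> 8)

print(decnet_mac(1, 1))    # AA-00-04-00-01-04 for area 1, node 1 (address 1025)
print(63 * 1023)           # 64449, the total number of addressable nodes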
CTERM allowed a user on one computer to log into another computer remotely, performing the same function that Telnet does in the TCP/IP protocol stack. Digital also released a product called the PATHWORKS client, more commonly known as the PATHWORKS 32 client, that implemented much of DECnet Phase IV for DOS and for 16- and 32-bit Microsoft Windows platforms (all the way through to Windows Server 2003). Phase IV implemented an 8-layer architecture, similar to the OSI (7-layer) model especially at the lower levels. Since the OSI standards were not yet fully developed at the time, many of the Phase IV protocols remained proprietary. The Ethernet implementation was unusual in that the software changed the physical address of the Ethernet interface on the network to AA-00-04-00-xx-yy, where xx-yy reflected the DECnet network address of the host. This allowed ARP-less LAN operation because the LAN address could be deduced from the DECnet address. This precluded connecting two NICs from the same DECnet node onto the same LAN segment, however. The initial implementations released were for VAX/VMS and RSX-11; later this expanded to virtually every operating system DIGITAL ever shipped, with the notable exception of RT-11. DECnet stacks are found on Linux, SunOS and other platforms, and Cisco and other network vendors offer products that can cooperate with and operate within DECnet networks. Full DECnet Phase IV specifications are available. At the same time that DECnet Phase IV was released, the company also released a proprietary protocol called LAT for serial terminal access via terminal servers. LAT shared the OSI physical and datalink layers with DECnet, and LAT terminal servers used MOP for the server image download and related bootstrap processing. Enhancements made to DECnet Phase IV eventually became known as DECnet Phase IV+, although systems running this protocol remained completely interoperable with DECnet Phase IV systems.
1987 & beyond
Phase V and Phase V+ (1987). Support for very large (architecturally unlimited) networks, a new network management model, local or distributed name service, and improved performance over Phase IV. A move from a proprietary network architecture to Open Systems Interconnection (OSI) through the integration of ISO standards provided multi-vendor connectivity together with compatibility with DNA Phase IV; these last two features resulted in a hybrid network architecture (DNA and OSI) with separate "towers" sharing an integrated transport layer. Transparent transport-level links to TCP/IP were added via the IETF RFC 1006 (OSI over IP) and RFC 1859 (NSP over IP) standards. It was later renamed DECnet/OSI to emphasize its OSI interconnectibility, and subsequently DECnet-Plus as TCP/IP protocols were incorporated.
Notable installations
DEC Easynet
DEC's internal corporate network was a DECnet network called Easynet, which had evolved from DEC's Engineering Net (E-NET). It included over 2,000 nodes as of 1984, 15,000 nodes (in 39 countries) as of 1987, and 54,000 nodes as of 1990.
The DECnet Internet
DECnet was used at various scientific research centers which linked their networks to form an international network called the DECnet Internet. This included the U.S. Space Physics Analysis Network (US-SPAN), the European Space Physics Analysis Network (E-SPAN), and other research and education networks. The network consisted of over 17,000 nodes as of 1989.
Routing between networks with different address spaces involved the use of either "poor man's routing" (PMR) or address translation gateways. In December 1988, VAX/VMS hosts on the DECnet Internet were attacked by the Father Christmas worm.
CCNET
CCNET (Computer Center Network) was a DECnet network that connected the campuses of various universities in the eastern regions of the United States during the 1980s. A key benefit was the sharing of systems software developed by the operations staff at the various sites, all of which were using a variety of DEC computers. As of March 1983, it included Columbia University, Carnegie Mellon University, and Case Western Reserve University. By May 1986, New York University, Stevens Institute of Technology, Vassar College and Oberlin College had been added. Several other universities joined later.
Hobbyist DECnet networks
Hobbyist DECnet networks have been in use during the 21st century. These include:
HECnet
Italian Retro DECnet
See also
Protocol Wars
References
General references
Carl Malamud, Analyzing DECnet/OSI Phase V. Van Nostrand Reinhold, 1991.
James Martin, Joe Leben, DECnet Phase V: An OSI Implementation. Digital Press, 1992.
DECnet-Plus manuals for OpenVMS are available at http://www.hp.com/go/openvms/doc/
DECnet Phase IV OpenVMS manuals for DECnet Phase IV; these Phase IV manuals are archived on the OpenVMS Freeware V5.0 distribution, at http://www.hp.com/go/openvms/freeware and other sites.
DECnet Phase IV architecture manuals (including DDCMP, MOP, NICE, NSP, DAP, CTERM, routing); at https://web.archive.org/web/20140221225835/http://h71000.www7.hp.com/wizard/decnet/ (the originals are mirrored at DECnet for Linux).
Cisco documentation of DECnet, at http://docwiki.cisco.com/wiki/DECnet
Network protocols Digital Equipment Corporation OpenVMS
66487319
https://en.wikipedia.org/wiki/Home%20video%20game%20console%20generations
Home video game console generations
In the video game industry, the market for home video game consoles has frequently been segmented into generations, grouping consoles that are considered to have shared a competitive marketspace. Since the first home consoles in 1972, there have been nine defined home console generations. A new console generation has typically occurred approximately every five years, keeping pace with Moore's law for technology, though more recent generations have had extended periods due to the use of console revisions rather than completely new designs. Not all home consoles are defined as part of these generations; only those considered to be significant competitive consoles are classed into generations, and systems such as microconsoles are often omitted from them.
Background and origins
Like most consumer electronics, home video game consoles are developed based on improving the features offered by an earlier product with advances made by newer technology. For video game consoles, these improvements typically occur every five years, following a Moore's law progression where a rough aggregate measure of processing power doubles every 18 months, or increases roughly ten-fold after five years. This cyclic market has resulted in an industry-wide adoption of the razor-and-blades model: selling consoles at minimal profit margin while making revenue from the sale of games produced for that console, and then transitioning users to the next console model around the fifth year as the successor console enters the market. This approach incorporates planned obsolescence into the products to continue to bring consumers towards purchasing the newer models. Because of the industry dynamics, many console manufacturers release their new consoles in roughly the same time period, with their consoles typically offering similar processing power and capabilities as their competitors. This systematic market has created the nature of console generations, categorizing the primary consoles into segmented time periods that represent consoles with similar capabilities which shared the same competitive space. Like consoles, these generations typically start five years after the prior one, though they may have long tails as popular consoles remain viable well beyond five years. The use of the generation label came after the start of the 21st century as console technology started to mature, with the terminology applied retroactively to earlier consoles. However, no exact definition and delineation of console generations has been consistently developed in the industry or academic literature since that point. Some schemes have been based on direct market data (including a seminal work published in an IEEE journal in 2002), while others are based on technology shifts. Wikipedia itself has been noted for creating its own version of console generation definitions that differs from other academic sources; the definitions from Wikipedia have been adopted by other sources, but without any true rationale behind them. The discrepancies between how consoles are grouped into generations and how these generations are named have caused confusion when trying to compare shifts in the video game marketplace with other consumer markets. Kemerer et al. (2017) provide a comparative analysis of these different generations through systems released up to 2010, as shown below.
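As a quick check of the Moore's law arithmetic cited above, a doubling every 18 months compounds to roughly a ten-fold increase over a five-year generation; a one-line Python calculation (purely illustrative) confirms this:

# 60 months / 18 months per doubling = about 3.33 doublings over five years
print(2 ** (60 / 18))   # about 10.08, i.e. roughly a ten-fold increase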
Console generation timeline
For purposes of organization, the generations described here and in subsequent pages maintain the Wikipedia breakdown of generations, generally breaking consoles apart by technology features whenever possible, incorporating other consoles released in that same period within the same generation, and starting with the Odyssey and Pong-style home consoles as the first generation, an approach that has generally been adopted and extended by video game journalism. In this approach the generation "starts" with the release of the first console considered to have those features, and is considered to end with the last known discontinuation of a console in that generation. For example, the third generation is considered to end in 2003 with the formal discontinuation of the Nintendo Entertainment System that year. This can create years with overlaps between multiple generations, as shown. This approach uses the concept of "bits", or the size of the individual word length handled by the processors on the console, for the earlier console generations. Longer word lengths generally led to improved gameplay concepts, graphics, and audio capabilities compared with shorter ones. The use of bits to market consoles to consumers started with the TurboGrafx-16, a console that used an 8-bit central processing unit similar to the Nintendo Entertainment System (NES), but included a 16-bit graphics processing unit. NEC, the console's manufacturer, took to marketing the console as a "16-bit" system over the NES's "8-bit" to establish it as a superior system. Other advertisers followed suit, creating a period known as the "bit wars" that lasted through the fifth generation, where console manufacturers tried to outsell each other simply on the bit-count of their system. Aside from some "128 Bit" advertising slogans at the beginning of the sixth generation, marketing with bits largely stopped after the fifth generation. Though the bit terminology was no longer used in newer generations, the use of bit-count helped to establish the idea of console generations, and the earlier generations gained alternate names based on the dominant bit-count of the major systems of that era, such as the third generation being the 8-bit era or generation. Later console generations are based on groupings of release dates rather than common hardware, as base hardware configurations between consoles have greatly diverged, generally following trends in generation definition given by video game and mainstream journalism. Handheld consoles and other gaming systems and innovations are frequently grouped within the release years associated with the home console generations; for example, the growth of digital distribution is associated with the seventh generation.
Console generation overview
The following table provides an overview of the major hardware technical specifications of the consoles of each major generation by central processing unit (CPU), graphics processing unit (GPU), memory, game media, and other features.
History
The development of video game consoles primarily follows the history of video gaming in the North American and Japanese markets. Few other markets saw any significant console development of their own, such as Europe, where personal computers tended to be favored alongside imports of video game consoles. Clones of video game consoles in less-developed markets like China and Russia are not considered here.
First generation (1972–1980)
The first generation of home consoles was generally limited to dedicated consoles with just one or two games pre-built into the console hardware, with limited means to alter gameplay factors. In the case of the Odyssey, while it did ship with "game cards", these did not have any programmed games on them but instead acted as jumpers to alter the existing circuitry pathways, and did not extend the capabilities of the console. Unlike most future console generations, the first generation of consoles was typically built in limited runs rather than as an ongoing product line. The first home console was the Magnavox Odyssey in September 1972, based on Baer's "Brown Box" design. Originally built from solid-state circuits, Magnavox transitioned to integrated circuit chips that were inexpensive, and developed a new line of consoles in the Odyssey series from 1975 to 1977. At the same time, Atari had successfully launched Pong as an arcade game in 1972, and began work on a home console version in late 1974, eventually partnering with Sears to bring the new home Pong console to market by the 1975 Christmas season. Pong offered several technological advantages over the Odyssey, including an internal sound chip and the ability to track score. Baer, who was struggling with Magnavox's management over how to market the console, gave his colleague Arnold Greenberg of Coleco a heads-up about a new low-cost chip ideal for home consoles, which led Coleco to develop the first Telstar console in 1976. With Magnavox, Atari and Coleco all vying in the console space by 1976 and further cost reductions in key processing chips from General Instrument, numerous third-party manufacturers entered the console market by 1977, most simply cloning Pong or other games, and of poor quality. This led to market saturation by 1977, with several hundred consoles on the market, and the industry's first market crash. Atari and Coleco attempted to make dedicated consoles with wholly new games to remain competitive, including Atari's Video Pinball series and Coleco's Telstar Arcade, but by this point the first steps of the market's transition to the second generation of consoles had begun, making these units obsolete near release. The Japanese market for gaming consoles followed a similar path at this point. Nintendo had already been a business partner of Magnavox by 1971 and helped to design the early light guns for the console. Dedicated home game consoles in Japan appeared in 1975 with Epoch Co.'s TV Tennis Electrotennis, which it had made in partnership with Magnavox as well. As in the United States, numerous clones of these dedicated consoles began to appear, most made by large television manufacturers like Toshiba and Sharp, and these games would be called TV geemu or terebi geemu (TV game), the designation for "video games" in Japan. Nintendo became a major player when Mitsubishi, having lost their manufacturer Systek to bankruptcy, turned to the company to help continue to build their Color TV-Game line, which went on to sell about 1.5 million units across five different models between 1977 and 1980.
Second generation (1976–1992)
The second generation of home consoles was distinguished by the introduction of the game cartridge, where the game's code is stored in read-only memory (ROM) within the cartridge. When the cartridge is slotted into the console, the electrical connections allow the main console's processors to read the game's code from the ROM.
While ROM cartridges had been used in other computer applications prior, the ROM game cartridge was first implemented in the Fairchild Video Entertainment System (VES) in November 1976. Additional consoles during this generation, all of which used cartridge-based systems, included the Atari 2600 (known as the Atari Video Computer System (VCS) at launch), the Magnavox Odyssey 2, Mattel Electronics' Intellivision, and the ColecoVision. In addition, newer processor technology allowed games to support up to 8 colors and up to 3-channel audio effects. With the introduction of cartridge-based consoles came the need to develop a wide array of games for them. Atari was at the forefront of development for its Atari 2600. Atari marketed the console across multiple regions including Japan, and retained control of all development aspects of the games. Game development coincided with the golden age of arcade video games that started in 1978–1979 with the releases of Space Invaders and Asteroids, and home versions of these arcade games were ideal targets. The Atari 2600 version of Space Invaders, released in 1980, was considered the killer app for home video game consoles, helping to quadruple the console's sales that year. Similarly, Coleco had beaten Atari to a key licensing deal with Nintendo to bring Donkey Kong as a pack-in game for the ColecoVision, helping to drive its sales. At the same time, Atari had been acquired by Warner Communications, and internal policies led to the departure of four key programmers, David Crane, Larry Kaplan, Alan Miller, and Bob Whitehead, who went on to form Activision. Activision proceeded to develop their own Atari 2600 games as well as games for other systems. Atari attempted legal action to stop this practice but ended up settling out of court, with Activision agreeing to pay royalties but otherwise able to continue game development, making Activision the first third-party game developer. Activision quickly found success with titles like Pitfall!, and within 18 months was able to turn its modest startup funds into substantial revenue. Numerous other companies saw Activision's success and jumped into game development to try to make fast money on the rapidly expanding North American video game market. This led to a loss of publishing control and dilution of the game market by the early 1980s. Additionally, following on the success of Space Invaders, Atari and other companies had remained eager for licensed video game possibilities. Atari had banked heavily on commercial sales of E.T. the Extra-Terrestrial in 1982, but it was rushed to market and poorly received, and failed to meet Atari's sales estimates. Along with competition from inexpensive home computers, the North American home console market crashed in 1983. For the most part, the 1983 crash signaled the end of this generation, as Nintendo's introduction of the Famicom the same year brought the start of the third generation. When Nintendo brought the Famicom to North America under the name "Nintendo Entertainment System", it helped to revitalize the industry, and Atari, now owned by Jack Tramiel, pushed sales of the previously successful Atari 2600 under new branding to keep the company afloat for many more years while he transitioned the company more towards the personal computer market. The Atari 2600 stayed in production until 1992, marking the end of the second generation.
Third generation (1983–2003)
Frequently called the "8-bit generation", the third generation's consoles used 8-bit processors, which allowed up to five bits of color (2^5, or 32, colors), five audio channels, and more advanced graphics capabilities including sprites and tiles rather than the block-based graphics of the second generation. Further, the third generation saw market dominance shift from the United States to Japan as a result of the 1983 crash. Both the Sega SG-1000 and the Nintendo Famicom launched nearly simultaneously in Japan in 1983. The Famicom, after some initial technical recalls, soon gained traction and became the best-selling console in Japan by the end of 1984. By that point Nintendo wanted to bring the console to North America but recognized the faults that the video game crash had caused. It took several steps to redesign the console to make it look less like a game console, and rebranded it as the "Nintendo Entertainment System" (NES) for North America to avoid the stigma of the "video game" label. The company also wanted to avoid the loss of publishing control that had occurred both in North America and in Asia after the Famicom's release, and created a lockout system that required all game cartridges manufactured for the console to include a special Nintendo chip. If this chip was not present, the console would fail to play the game. This gave Nintendo direct control over the titles published for the system, rejecting those it felt were too mature. The NES launched in North America in 1985, and helped to revitalize the video game market there. Sega attempted to compete with the NES with its own Master System, released later in 1985 in both the US and Japan, but it did not gain enough traction to compete. Similarly, Atari's attempts to compete with the NES via the Atari 7800 in 1987 failed to knock the NES from its dominant position. The NES remained in production until 2003, when it was discontinued along with its successor, the Super Nintendo Entertainment System.
Fourth generation (1987–2004)
The fourth generation of consoles, also known as the "16-bit generation", further advanced core console technology with 16-bit processors, improving the available graphics and audio capabilities of games. NEC's TurboGrafx-16 (or PC Engine, as released in Japan), first released in 1987, is considered the first fourth-generation console even though it still had an 8-bit CPU. The console's 16-bit graphics processor gave it capabilities comparable to the other fourth-generation systems, and NEC's marketing had pushed the console as a "16-bit" advancement over the NES. Both Sega and Nintendo entered the fourth generation with true 16-bit systems in the 1988 Sega Genesis (Mega Drive in Japan) and the 1990 Super Nintendo Entertainment System (SNES, Super Famicom in Japan). SNK also entered the competition with the Neo Geo, released in 1990, a modified version of their Neo Geo MVS arcade system, which attempted to bridge the gap between arcade and home console systems through the shared use of common game cartridges and memory cards. This generation was notable for the so-called "console wars" between Nintendo and Sega, primarily in North America. Sega, to try to challenge Nintendo's dominant position, created the mascot character Sonic the Hedgehog, who exhibited a cool personality to appeal to Western youth in contrast to Nintendo's Mario, and bundled the Genesis with the game of the same name.
The strategy succeeded, with Sega becoming the dominant player in North America until the mid-1990s. During this generation, the technology costs of using optical discs in the form of CD-ROMs had dropped sufficiently to make them desirable for shipping computer software, including video games for personal computers. CD-ROMs offered more storage space than game cartridges and could allow full-motion video and other detailed audio-video works to be used in games. Console manufacturers adapted by creating hardware add-ons for their consoles that could read and play CD-ROMs, including NEC's TurboGrafx-CD add-on (as well as the integrated TurboDuo system) in 1988, the Sega CD add-on for the Genesis in 1991, and the Neo Geo CD in 1994. Costs of these add-ons were generally high, nearing the same price as the console itself, and with the introduction of disc-based consoles in the fifth generation starting in 1993, these fell by the wayside. Nintendo had initially worked with Sony to develop a similar add-on for the SNES, the Super NES CD-ROM, but just before its introduction the business relationship between Nintendo and Sony broke down, and Sony took the idea on to develop the fifth-generation PlayStation. Additionally, Philips attempted to enter the market with a dedicated CD-ROM format, the CD-i, also released in 1990, which included other uses for CD-ROM media beyond video games, but the console never gained traction. The fourth generation had a long tail that overlapped with the fifth generation, with the SNES's discontinuation in 2003 marking the end of the generation. To keep their console competitive with the new fifth-generation ones, Nintendo took to the use of coprocessors manufactured into the game cartridges to enhance the capabilities of the SNES. This included the Super FX chip, which was first used in the game Star Fox in 1993, generally considered one of the first games to use real-time polygon-based 3D rendering on consoles.
Fifth generation (1993–2006)
During this time home computers gained greater prominence as a way of playing video games. The video game console industry nonetheless continued to thrive alongside home computers, due to the advantages of much lower prices, easier portability, circuitry specifically dedicated towards video games, the ability to be played on a television set (which PCs of the time could not do in most cases), and intensive first-party software support from manufacturers who were essentially banking their entire future on their consoles. Besides the shift to 32-bit processors, the fifth generation of consoles also saw most companies except Nintendo shift to dedicated optical media formats instead of game cartridges, given their lower cost of production and higher storage capacity. Initial consoles of the fifth generation attempted to capitalize on the potential power of CD-ROMs, including the Amiga CD32, 3DO and the Atari Jaguar in 1993. However, early in the cycle, these systems were far more expensive than existing fourth-generation models and had much smaller game libraries. Further, Nintendo's use of co-processors in late SNES games kept the SNES one of the best-selling systems over the new fifth-generation ones. Two of the key consoles of the fifth generation were introduced in 1995: the Sega Saturn and the Sony PlayStation, both of which challenged the SNES's ongoing dominance.
While the Saturn sold well, it did have a number of technical flaws, but it established Sega with a number of key game series going forward. The PlayStation, in addition to using optical media, also introduced the use of memory cards to save the state of a game. Though memory cards had been used by the Neo Geo to allow players to transfer game information between home and arcade systems, the PlayStation's approach allowed games to have much longer gameplay and narrative elements, leading to highly successful role-playing games like Final Fantasy VII. By 1996, the PlayStation had become the best-selling console over the SNES. Nintendo released their next console, the Nintendo 64, in late 1996. Unlike other fifth-generation units, it still used game cartridges, as Nintendo believed the load-time advantages of cartridges over CD-ROMs were still essential, as well as their ability to continue to use lockout mechanisms to protect copyrights. The system also included support for memory cards, and Nintendo developed a strong library of first-party titles for the console, including Super Mario 64 and The Legend of Zelda: Ocarina of Time, that helped to drive its sales. While the Nintendo 64 did not match the PlayStation's sales, it kept Nintendo a key competitor in the home console market alongside Sony and Sega. As with the transition from the fourth to the fifth generation, the fifth generation had a long overlap with the sixth console generation, with the PlayStation remaining in production until 2005.
Sixth generation (1998–2013)
By the sixth generation, console technology had begun to catch up to the performance of personal computers of the time, and the use of bits as a selling point fell by the wayside. The console manufacturers focused instead on the individual strengths of their game libraries in their marketing. The consoles of the sixth generation saw further adoption of optical media, expanding into the DVD format for even greater data storage capacity, additional internal storage solutions to function as memory cards, as well as support, either directly or through add-ons, for connecting to the Internet for online gameplay. Consoles began to move towards a convergence with the features of other electronic living room devices and away from single-feature systems. By this point, there were only three major players in the market: Sega, Sony, and Nintendo. Sega got an early lead with the Dreamcast, first released in Japan in 1998. It was the first home console to include a modem to allow players to connect to the Sega network and play online games. However, Sega found several technical issues that had to be resolved before its Western launch in 1999. Though its Western release was more successful than in Japan, the console was soon outperformed by Sony's PlayStation 2, released in 2000. The PlayStation 2 was the first console to add support for DVD playback in addition to CD-ROM, as well as maintaining backward compatibility with games from the PlayStation library, which helped to draw consumers who remained on the long tail of the PlayStation. While other consoles of the sixth generation had not anticipated this step, the PlayStation 2's introduction of backwards compatibility became a major design consideration of future generations. Along with a strong game library, the PlayStation 2 went on to sell 155 million units before it was discontinued in 2013, and it remains the best-selling home console of all time.
Unable to compete with Sony, Sega discontinued the Dreamcast in 2001 and left the hardware market, instead focusing on its software properties. Nintendo's entry in the sixth generation was the GameCube in 2001, its first system to use optical discs, based on the miniDVD format. A special Game Boy Player attachment allowed the GameCube to use any of the Game Boy cartridges as well, and adapters were available to allow the console to connect to the Internet via broadband or modem. At this point Microsoft also entered the console market with its first Xbox system, released in 2001. Microsoft considered the PlayStation 2's success a threat to the personal computer in the living room space, and had developed the Xbox to compete. As such, the Xbox was designed based more on Microsoft's experience with personal computers, using an operating system built out from its Microsoft Windows and DirectX features, utilizing a hard disk for saved-game storage and built-in Ethernet functionality, and introducing the first console online service, Xbox Live, to support multiplayer games.
Seventh generation (2005–2017)
Video game consoles had become an important part of the global IT infrastructure by the mid-2000s. It was estimated that video game consoles represented 25% of the world's general-purpose computational power in the year 2007. By the seventh generation, Sony, Microsoft, and Nintendo had all developed consoles designed to interface with the Internet, adding networking support for wired or wireless connections, online services to support multiplayer games, digital storefronts for digital purchases of games, and both internal storage and support for external storage on the console for these games. These consoles also added support for digital television resolutions through HDMI interfaces, but as the generation occurred in the midst of the high-definition optical disc format war between Blu-ray and HD-DVD, a standard for high-definition playback had yet to be fixed. A further innovation came with the use of motion controllers, either built into the console or offered as add-ons afterwards. Microsoft entered the seventh generation first with the Xbox 360 in 2005. The Xbox 360 saw several hardware revisions over its lifetime, which became a standard practice for Microsoft going forward; these revisions offered different features such as a larger internal hard drive or a faster processor at a higher price point. As shipped, the Xbox 360 supported DVD discs, and Microsoft had opted to support the HD-DVD format with an add-on for playback of HD-DVD films. However, this format ended up deprecated in favor of Blu-ray. The Xbox 360 was backward compatible with about half of the original Xbox library. Through its lifetime, the Xbox 360 was troubled by a consistent hardware fault known as "the Red Ring of Death" (RROD), and Microsoft spent over $1 billion correcting the problem. Sony's PlayStation 3 was released in 2006. The PlayStation 3 represented a shift of the internal hardware from Sony's custom Emotion Engine to the new Cell processor architecture. Initial PlayStation 3 units shipped with a special Emotion Engine daughterboard that allowed for backwards compatibility with PlayStation 2 games, but later revisions of the unit removed this, leaving software-based emulation for PlayStation games available. Sony banked on the Blu-ray format, which was included from the start. With the PlayStation 3, Sony introduced the PlayStation Network for its online services and storefront.
Nintendo introduced the Wii in 2006, around the same time as the PlayStation 3. Nintendo lacked the same manufacturing capabilities and relationships with major hardware suppliers as Sony and Microsoft, and to compete it diverged from a feature-for-feature approach and instead developed the Wii around the novel use of motion controls in the Wii Remote. This "blue ocean strategy", releasing a product where there was no competition, was considered part of the unit's success, and it drove Microsoft and Sony to develop their own motion control accessories to compete. Nintendo provided various online services that the Wii could connect to, including the Virtual Console, where players could purchase emulated games from Nintendo's past consoles as well as games for the Wii. The Wii used regular-sized DVDs for its game medium but also directly supported GameCube discs. The Wii was generally considered a surprising success that many developers had initially overlooked. Based on the success of the Wii Remote controller, both Microsoft and Sony released similar motion detection controllers for their consoles. Microsoft introduced the Kinect motion controller device for the Xbox 360, which served as a camera, microphone, and motion sensor for numerous games. Sony released the PlayStation Move, a system consisting of a camera and lit handheld controllers, which worked with its PlayStation 3. The seventh generation concluded with the discontinuation of the PlayStation 3 in 2017.
Eighth generation (2012–present)
Aside from the usual hardware enhancements, consoles of the eighth generation focus on further integration with other media and increased connectivity. Hardware improvements pushed for higher frame rates at up to 4k resolutions. The Wii U, introduced in 2012, was considered by Nintendo to be a successor to the Wii but geared to more serious players. The console supported backward compatibility with the Wii, including its motion controls, and introduced the Wii U GamePad, a tablet/controller hybrid that acted as a second screen. Nintendo further refined its network offerings to develop the Nintendo Network service, combining storefront and online connectivity services. The Wii U did not sell as well as Nintendo had planned, as they found people mistook the GamePad for a tablet they could take with them away from the console, and the console struggled to draw third-party developers as the Wii had. Both the PlayStation 4 and Xbox One came out in 2013. Both were similar improvements over the previous generation's respective consoles, providing more computational power to support up to 60 frames per second at 1080p resolution for some games. Each unit also saw a similar set of revisions and repackaging to develop higher- and lower-cost versions. In the case of the Xbox One, the console's initial launch had included the Kinect device, but this became highly controversial in terms of potential privacy violations and lack of developer support, and by its mid-generation refresh the Kinect had been dropped and discontinued as a game device. Later in the eighth generation, Nintendo released the Nintendo Switch in 2017. The Switch is considered the first hybrid game console. It uses a special CPU/GPU combination that can run at different clock frequencies depending on how it is used.
It can be placed into a special docking unit that is hooked to a television and a permanent power supply, allowing faster clock frequencies to be used so that games can be played at higher resolutions and frame rates, making it more comparable to a home console. Alternatively, it can be removed and used either with the attached Joy-Con controllers as a handheld unit, or even played as a tablet-like system via its touchscreen. In these modes, the CPU/GPU run at lower clock speeds to conserve battery power, and the graphics are not as robust as in the docked mode. A larger suite of online services was added through the Nintendo Switch Online subscription, including several free NES and SNES titles, replacing the past Virtual Console system. The Switch was designed to address many of the hardware and marketing faults around the Wii U's launch, and has become one of the company's fastest-selling consoles after the Wii.
Ninth generation (2020–current)
Both Microsoft and Sony released successors to their home consoles in November 2020. Both console families target 4k and 8k resolution televisions at high frame rates, support real-time ray tracing rendering, and use high-performance solid-state drives (SSDs) as internal high-speed storage to make delivering game content much faster than reading from optical discs or standard hard drives, which can eliminate loading times and make open world games appear seamless. Microsoft released the fourth generation of Xbox with the Xbox Series X and Series S on November 10, 2020. The Series X has a base performance target of 60 frames per second at 4k resolution and is intended to be four times as powerful as the Xbox One X. One of Microsoft's goals with both units was to assure backward compatibility with all games supported by the Xbox One, including those original Xbox and Xbox 360 titles that are backward compatible with the Xbox One, allowing the Xbox Series X and Series S to support four generations of games. Sony's PlayStation 5 was released on November 12, 2020, and offers a similar performance boost over the PlayStation 4. The PlayStation 5 uses a custom SSD solution with much higher input/output rates, comparable to RAM chip speeds, significantly improving rendering and data streaming speeds. The chip architecture is comparable to the PlayStation 4's, allowing backwards compatibility with most of the PlayStation 4 library, while select games need chip timing tweaks to make them compatible.
Sales comparison
Below is a timeline of each generation with the top three home video consoles of each generation based on worldwide sales. For a complete list of home video consoles released in each generation please see the respective article of each generation.
Notes
References
Home video game consoles
4380161
https://en.wikipedia.org/wiki/Streamtuner
Streamtuner
Streamtuner is a streaming media directory browser. Through the use of a C/Python plugin system, it offers a GTK+ 2.0 interface to Internet radio directories. Streamtuner does not actually play any files; it downloads a list of online radio streams and then tells a Unix audio player (of the user's choice) to play the selected stream. Streamtuner offers hundreds of thousands of music resources in a common interface. Streamtuner is free software, released under the terms of the revised BSD license. There is also a version for the Nokia 770 Internet tablet.
Features of Streamtuner
Browse the SHOUTcast Yellow Pages
Browse the Live365 directory
Browse the Xiph.org (aka icecast.org, aka Oddsock) directory
Browse the basic.ch DJ mixes
Manage your local music collection, with full support for ID3 and Vorbis metadata editing
Listen to streams (through a Unix player), browse their web pages, or record them using programs such as Streamripper
Implement new directory handlers as tiny Python scripts or as dynamically loadable modules written in C
Retain your favourite streams by bookmarking them
Manually add streams to your collection
Streamtuner in the press
UnixReview.com: Marcel's Linux App of the Month: Streamtuner (July 2005)
Tux Magazine: Streamtuner (July 2005)
Orange Crate: Audiophiles' Solution For Net Radio (April 2004)
External links
Streamtuner homepage
Unix Internet software Free audio software
8129
https://en.wikipedia.org/wiki/Digitalis
Digitalis
Digitalis ( or ) is a genus of about 20 species of herbaceous perennial plants, shrubs, and biennials, commonly called foxgloves. Digitalis is native to Europe, western Asia, and northwestern Africa. The flowers are tubular in shape, produced on a tall spike, and vary in colour with species, from purple to pink, white, and yellow. The scientific name means "finger". The genus was traditionally placed in the figwort family, Scrophulariaceae, but phylogenetic research led taxonomists to move it to the Veronicaceae in 2001. More recent phylogenetic work has placed it in the much enlarged family Plantaginaceae. The best-known species is the common foxglove, Digitalis purpurea. This biennial is often grown as an ornamental plant due to its vivid flowers which range in colour from various purple tints through pink and purely white. The flowers can also possess various marks and spottings. Other garden-worthy species include D. ferruginea, D. grandiflora, D. lutea, and D. parviflora. The term digitalis is also used for drug preparations that contain cardiac glycosides, particularly one called digoxin, extracted from various plants of this genus. Foxglove has medicinal uses but is also very toxic to humans and other animals, and consumption can even lead to death. Etymology The generic epithet Digitalis is from the Latin digitus (finger). Leonhart Fuchs first invented the name for this plant in his 1542 book De historia stirpium commentarii insignes, based upon the German vernacular name Fingerhut, which translates literally as 'finger hat', but actually means 'thimble'. The name is recorded in Old English as 'foxes glofe/glofa' or 'fox's glove'. Over time, folk myths obscured the literal origins of the name, insinuating that foxes wore the flowers on their paws to silence their movements as they stealthily hunted their prey. The woody hillsides where the foxes made their dens were often covered with the toxic flowers. Some of the more menacing names, such as "witch's glove", reference the toxicity of the plant. Henry Fox Talbot (1847) proposed 'folks' glove', where 'folk' means fairy. Similarly, R. C. A. Prior (1863) suggested an etymology of 'foxes-glew', meaning 'fairy music'. However, neither of these suggestions account for the Old English form foxes glofa. Taxonomy Species The Flora Europaea originally recognised a number of species now seen as synonyms of Digitalis purpurea, or others: D. dubia, D. leucophaea, D. micrantha and D. trojana. As of 2017, Plants of the World Online recognises the following 27 species (and a number of hybrids): Digitalis atlantica Pomel Digitalis canariensis L. Digitalis cariensis Boiss. ex Jaub. & Spach Digitalis cedretorum (Emb.) Maire Digitalis chalcantha (Svent. & O'Shan.) Albach, Bräuchler & Heubl Digitalis ciliata Trautv. Digitalis davisiana Heywood Digitalis ferruginea L. Digitalis fuscescens Waldst. & Kit. Digitalis grandiflora Mill. Digitalis ikarica (P.H.Davis) Strid Digitalis isabelliana (Webb) Linding. Digitalis laevigata Waldst. & Kit. Digitalis lamarckii Ivanina Digitalis lanata Ehrh. Digitalis lutea L. Digitalis mariana Boiss. Digitalis minor L. Digitalis nervosa Steud. & Hochst. ex Benth. Digitalis obscura L. Digitalis parviflora Jacq. Digitalis purpurea L. Digitalis sceptrum L.f. Digitalis subalpina Braun-Blanq. Digitalis thapsi L. Digitalis transiens Maire Digitalis viridiflora Lindl. Hybrids Digitalis × coutinhoi Samp. Digitalis × fulva Lindl. 
Digitalis × macedonica Heywood Digitalis × media Roth Digitalis × pelia Zerbst & Bocquet Digitalis × purpurascens Roth Digitalis × sibirica (Lindley) Werner, which had been considered a valid species since it was first described by the English botanist and gardener John Lindley in 1821, but was reclassified as a hybrid of D. grandiflora and D. laevigata by a German botanist in 1960. Systematics The first full monograph regarding this genus was written by Lindley in 1821. He included two sections: a section Isoplexis containing two species, and the main section Digitalis with three subsections, including 2Y species, a number of which are now seen as synonyms or hybrids. In the last full monograph of the genus, in 1965, Werner classified the 19 recognised species in five sections (four species from Macaronesia were separated in the genus Isoplexis at the time): In the section Digitalis, along with the type species D. purpurea, four other species (as recognised at the time) were placed: D. thapsi, D. dubia, D. heywoodii and D. mariana. The monotypic section Frutescentes contained only D. obscura. The section Grandiflorae, also called section Macranthae by Vernon Hilton Heywood, included, along with the type species D. grandiflora, the species D. atlantica, D. ciliata and D. davisiana. Globiflorae included five species: D. laevigata, D. nervosa, D. ferruginea, D. cariensis and D. lanata. Tubiflorae included four species: D. subalpina, D. lutea, D. viridiflora and D. parviflora. In their 2000 book about Digitalis, Luckner and Wichtl continued to uphold Werner's classification of the 19 species, but molecular studies into the phylogeny of the genus published in 2004 found that although four of Werner's sections were supported by the genetic data, the section Tubiflorae was polyphyletic, and that the species D. lutea and D. viridiflora should be placed in the section Grandiflorae. This study, as well as a number of other studies published around that time, reunited the genus Isoplexis with Digitalis, increasing the number of species to 23. Peter Hadland Davis, an expert on the flora of Turkey, had used a different circumscription from Werner in his works, and recognised eight species in the country. A 2016 molecular phylogenetic study into the relationships of the Turkish species in the section Globiflorae aimed to reconcile this discrepancy, finding that the classification as proposed by Davis was largely correct: Globiflorae contained as distinct species D. cariensis, D. ferruginea, D. lamarckii, D. lanata and D. nervosa, while D. trojana was subsumed at the infraspecific rank as D. lanata subsp. trojana. This study listed 23 species; D. transiens, D. cedretorum, D. ikarica and D. fuscescens were not mentioned. D. parviflora and D. subalpina were not tested in this study, but the 2004 study found these two species situated within the section Globiflorae. Ecology Larvae of the foxglove pug, a moth, consume the flowers of the common foxglove for food. Other species of Lepidoptera eat the leaves, including the lesser yellow underwing. Uses Cardiac Digitalis is an example of a drug derived from a plant that was formerly used by herbalists; herbalists have largely abandoned its use because of its narrow therapeutic index and the difficulty of determining the amount of active drug in herbal preparations.
Once the usefulness of digitalis in regulating the human pulse was understood, it was employed for a variety of purposes, including the treatment of epilepsy and other seizure disorders, uses that are now considered inappropriate. A group of medicines extracted from foxglove plants is called digitalin. The use of D. purpurea extract containing cardiac glycosides for the treatment of heart conditions was first described in the English-speaking medical literature by William Withering in 1785, which is considered the beginning of modern therapeutics. In contemporary medicine, digitalis (usually digoxin) is obtained from D. lanata. It is used to increase cardiac contractility (it is a positive inotrope) and as an antiarrhythmic agent to control the heart rate, particularly in the irregular (and often fast) rhythm of atrial fibrillation. Digitalis is hence often prescribed for patients in atrial fibrillation, especially if they have been diagnosed with congestive heart failure. Digoxin was approved for heart failure in 1998 under current regulations by the Food and Drug Administration on the basis of prospective, randomized clinical trials. It was also approved for the control of the ventricular response rate in patients with atrial fibrillation. American College of Cardiology/American Heart Association guidelines recommend digoxin for symptomatic chronic heart failure in patients with reduced systolic function, in those with preserved systolic function, and/or for rate control in atrial fibrillation with a rapid ventricular response. Heart Failure Society of America guidelines for heart failure provide similar recommendations. Despite its relatively recent approval by the Food and Drug Administration and the guideline recommendations, the therapeutic use of digoxin is declining in patients with heart failure, likely the result of several factors. The main factor is the more recent introduction of several drugs shown in randomised controlled studies to improve outcomes in heart failure. Safety concerns regarding a proposed link between digoxin therapy and increased mortality seen in observational studies may also have contributed to the decline; however, a systematic review of 75 studies, covering four million patient-years of follow-up, showed that in properly designed randomised controlled studies, mortality was no higher in patients given digoxin than in those given placebo. Variations A group of pharmacologically active compounds is extracted mostly from the leaves of the second year's growth, and in pure form these are referred to by common chemical names, such as digitoxin or digoxin, or by brand names such as Crystodigin and Lanoxin, respectively. The two drugs differ in that digoxin has an additional hydroxyl group at the C-3 position on the B-ring (adjacent to the pentane). This results in digoxin having a half-life of about one day (longer with impaired kidney function), whereas digitoxin's half-life is about 7 days and is not affected by kidney function. Both molecules include a lactone and a triple-repeating sugar called a glycoside. Mechanism of action Digitalis works by inhibiting sodium-potassium ATPase. This results in an increased intracellular concentration of sodium ions and thus a decreased concentration gradient across the cell membrane.
This increase in intracellular sodium causes the Na/Ca exchanger to reverse potential, i.e., transition from pumping sodium into the cell in exchange for pumping calcium out of the cell, to pumping sodium out of the cell in exchange for pumping calcium into the cell. This leads to an increase in cytoplasmic calcium concentration, which improves cardiac contractility. Under normal physiological conditions, the cytoplasmic calcium used in cardiac contractions originates from the sarcoplasmic reticulum, an intracellular organelle that stores calcium. Human newborns, some animals, and patients with chronic heart failure lack well developed and fully functioning sarcoplasmic reticula and must rely on the Na/Ca exchanger to provide all or a majority of the cytoplasmic calcium required for cardiac contraction. For this to occur, cytoplasmic sodium must exceed its typical concentration to favour a reversal in potential, which naturally occurs in human newborns and some animals primarily through an elevated heart rate; in patients with chronic heart failure it occurs through the administration of digitalis. As a result of increased contractility, stroke volume is increased. Ultimately, digitalis increases cardiac output (cardiac output = stroke volume x heart rate). This is the mechanism that makes this drug a popular treatment for congestive heart failure, which is characterized by low cardiac output. Digitalis also has a vagal effect on the parasympathetic nervous system, and as such is used in re-entrant cardiac arrhythmias and to slow the ventricular rate during atrial fibrillation. The dependence on the vagal effect means digitalis is not effective when a patient has a high sympathetic nervous system drive, which is the case with acutely ill persons, and also during exercise. Digoxigenin Digoxigenin (DIG) is a steroid found in the flowers and leaves of Digitalis species, and is extracted from D. lanata. Digoxigenin can be used as a molecular probe to detect mRNA in situ and label DNA, RNA, and oligonucleotides. It can easily be attached to nucleotides such as uridine by chemical modifications. DIG molecules are often linked to nucleotides; DIG-labelled uridine can then be incorporated into RNA via in vitro transcription. Once hybridisation occurs, RNA with the incorporated DIG-U can be detected with anti-DIG antibodies conjugated to alkaline phosphatase. To reveal the hybridised transcripts, a chromogen can be used which reacts with the alkaline phosphatase to produce a coloured precipitate. Toxicity Depending on the species, the digitalis plant may contain several deadly physiological and chemically related cardiac and steroidal glycosides. Thus, the digitalis plants have earned several, more sinister, names: dead man's bells and witch's gloves. Digitalis intoxication, known as digitalism, results from an overdose of digitalis and causes gastrointestinal disturbances and pain, severe headache, nausea, vomiting and diarrhoea, cardiac arrhythmias, as well as sometimes resulting in xanthopsia (jaundiced or yellow vision). The toxins can be absorbed via the skin or ingestion; early symptoms of digitalism include nausea, vomiting, diarrhoea, abdominal pain, wild hallucinations, delirium, and severe headache. 
Depending on the severity of the toxicosis, the victim may later suffer irregular and slow pulse, tremors, various cerebral disturbances, especially of a visual nature (unusual colour visions (see xanthopsia) with objects appearing yellowish to green, and blue halos around lights), convulsions, and deadly disturbances of the heart. Other oculotoxic effects of digitalis include generalized blurry vision, as well as the appearance of blurred outlines ('halos'). Other reported signs include dilated pupils, drooling, weakness, collapse, seizures, and even death. Because a frequent side effect of digitalis is reduction of appetite, some individuals have used the drug as a weight-loss aid. Digitalis poisoning can cause heart block and either bradycardia (decreased heart rate) or tachycardia (increased heart rate), depending on the dose and the condition of one's heart. Electric cardioversion (to "shock" the heart) is generally not indicated in ventricular fibrillation in digitalis toxicity, as it can increase the dysrhythmia. Furthermore, the classic drug of choice for ventricular fibrillation in the emergency setting, amiodarone, can worsen the dysrhythmia caused by digitalis; therefore, the second-choice drug lidocaine is more commonly used. The entire plant is toxic (including the roots and seeds). Mortality is rare, but case reports do exist. Most plant exposures occur in children younger than six years and are usually unintentional and without associated significant toxicity. More serious toxicity occurs with intentional ingestion by adolescents and adults. In some instances, people have confused foxglove with the relatively harmless comfrey (Symphytum) plant, which is sometimes brewed into a tea, with fatal consequences. Other fatal accidents involve children drinking the water in a vase containing digitalis plants. Drying does not reduce the toxicity of the plant. The plant is toxic to animals, including all classes of livestock and poultry, as well as felines and canines. Trivia According to speculation published in 1981, Vincent van Gogh's "Yellow Period" may have been influenced by digitalis, because it had been proposed as a therapy to control epilepsy around that time, and the plant appears in two of his paintings. Other studies immediately questioned this: there are many other possible explanations for van Gogh's choice of palette, there is no evidence that van Gogh was ever given the drug or that his physician prescribed it, he was tested and had no xanthopsia, and in his many letters of the time he makes it clear that he simply liked using the colour yellow. Nevertheless, the idea has remained popular. In the American television crime drama series Columbo, digitalis is frequently suspected as a drug used to murder victims. The James Bond film Casino Royale features Bond being poisoned with digitalis by the terrorist financier Le Chiffre. References External links Molecule of the Month - Digitalis eMedicine link Plantaginaceae genera Antiarrhythmic agents Medicinal plants
35976086
https://en.wikipedia.org/wiki/Hatoful%20Boyfriend
Hatoful Boyfriend
Hatoful Boyfriend is a Japanese dōjin soft otome visual novel released in 2011 for Microsoft Windows and OS X, in which all the characters other than the protagonist are sapient birds. It was developed by manga artist Hato Moa's dōjin circle PigeoNation Inc., and is the successor of a Flash game of the same name she created for April Fools' Day in 2011. A free demo version of Hatoful made with the FamousWriter engine was released later that year, followed by a full commercial version released on 30 October 2011 at COMITIA 98, and an English version released in February 2012. An international remake by developer Mediatonic and publisher Devolver Digital, dubbed Hatoful Boyfriend HD in Japan, was released on 4 September 2014 for Microsoft Windows, OS X, and Linux, and for PlayStation 4 and PlayStation Vita on 21 July 2015 in North America and 22 July 2015 in Europe. A port for iOS was released on 25 May 2016. Hatoful Boyfriend received a generally positive reception; reviewers praised the game's replay value as well as its writing and characterization, while repetitive gameplay and the accessibility of the game's Bad Boys Love mode received a more mixed response. A sequel, Hatoful Boyfriend: Holiday Star, was released on 29 December 2011, with an English version being released on Christmas Day the following year. In addition to the main games in the series, Hatoful Boyfriend has made transitions into other media: a monthly webcomic was serialized in the anthology Manga Life WIN+, several supplementary materials and official dōjin works have been released, and four drama CDs based on the series have been made. An episodic web series began in 2014. Gameplay Hatoful Boyfriend is an interactive text-based visual novel that follows a branching plot line, with the player's decisions determining which of the game's multiple endings they receive. The title is a pun on the wasei-eigo word , and the Japanese word , as the game features pigeons and other birds as major characters. The game is set in a version of Earth populated by sapient birds, and its main story follows the player character and protagonist—the only human attending St. PigeoNation's Institute, an elite school for birds—as she finds love among her avian acquaintances. Bad Boys Love, a hidden alternate story mode, opens with the discovery of the protagonist's corpse, after which the player follows her best friend Ryouta Kawara as he investigates the circumstances of her death and unravels darker conspiracies surrounding the school. Gameplay in Hatoful Boyfriend is similar to most other visual novels for the PC, with the controls limited to the mouse and the only interactions being clicking to advance the game's narrative or to choose between multiple plot branches. The keyboard can also be used instead of the mouse, with the 'enter' key serving the same purpose as clicking. The save function can be used at any point during the game, which also features several pages of save slots, allowing gameplay to be resumed easily from just before a choice the player made. An arrow button in the upper right corner also allows the player to skip dialogue and interactions they have already experienced. The player assumes control of the protagonist, a teenage human girl. As the game follows a branching plot line with multiple endings, at various points during gameplay the player makes choices that determine which character's romance route they will encounter.
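Purely as an illustration of how a branching route structure with stat-gated endings (described here and in the following paragraph) can be modelled, the sketch below uses hypothetical names throughout; it is not based on the game's FamousWriter or Unity implementations.

```python
# Generic illustration of a branching visual-novel structure.
# Not Hatoful Boyfriend's actual code; all names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class GameState:
    stats: dict = field(default_factory=lambda: {"wisdom": 0, "vitality": 0, "charisma": 0})
    flags: set = field(default_factory=set)


def attend_class(state: GameState, subject: str) -> None:
    # Weekday choices raise one of the protagonist's stats.
    state.stats[subject] += 1


def ending_for(state: GameState, route: str) -> str:
    # A "good" ending is gated on a minimum stat value.
    required = {"route_a": ("wisdom", 3), "route_b": ("charisma", 2)}
    stat, threshold = required[route]
    return "good ending" if state.stats[stat] >= threshold else "normal ending"


if __name__ == "__main__":
    state = GameState()
    for _ in range(3):
        attend_class(state, "wisdom")
    print(ending_for(state, "route_a"))  # -> good ending
    print(ending_for(state, "route_b"))  # -> normal ending
```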
On weekdays, the player can also choose which classes to attend, which raises one of the protagonist's three stats depending on the activity chosen. Certain stat values are required to obtain the good endings for each love interest and to otherwise advance along certain routes. There are thirteen (fourteen in the 2014 remake) endings in total: one ending for each of the main love interests, three extended endings for three of the love interests based on stat values, one ending for the gaiden-esque Torimi Café storyline, and one ending attained if the player fails to romance any character. When routes are completed, documents are unlocked that provide insights into the game's overarching storyline. These documents can be viewed at any time in the game's archive feature, which is accessed from the title screen. After obtaining the four specific endings required to trigger it, the player is given a new prompt upon starting a new game to either "fulfill the promise" or live "a normal life". Choosing to live a normal life results in a normal playthrough, while choosing to fulfill the promise locks the player into the game's true route, the Bad Boys Love scenario, or BBL (also known as Hurtful Boyfriend), which explores the full extent of the underlying plot alluded to by the documents and various points of foreshadowing in the dating simulation portion of the game. If the player chooses to fulfill the promise, aside from several dream sequences, gameplay at first appears to continue normally until the in-game date reaches 2 September. The player's perspective then switches from the protagonist to the protagonist's best friend, and the events of the scenario begin regardless of any other choices made by the player up to that point. If the player obtains all other possible endings prior to starting Bad Boys Love, an extended epilogue plays after the game's credits upon completion of the scenario. In a departure from the generally lighthearted romantic routes, Bad Boys Love is presented as a murder mystery and psychological thriller, and is significantly longer than any other route in Hatoful Boyfriend, making up most of the game's actual length. There are several changes to gameplay and the way text is displayed during Bad Boys Love in the original version of the game: saving is disabled except at certain points in the story, the function to skip dialogue and interactions is removed, and plot-important dialogue and narrative are highlighted with colored text, usually yellow, though text of particularly critical importance is highlighted in red. In the 2014 remake, however, the option to save is available at all times, the skip function is retained, and text is no longer highlighted. In both versions, the game's interface and controls change during certain segments of the narrative from those of a standard visual novel to something resembling a 1990s-era turn-based role-playing game. Plot Setting Hatoful Boyfriend is set in an alternate version of Earth in which sapient birds have seemingly taken the place of humans in society, for reasons that are hinted at but not fully explained in the dating simulation portion of the game. In Bad Boys Love, it is revealed that Hatoful is set in a post-apocalyptic, dystopian future in which a pandemic of a deadly, mutated strain of the H5N1 virus, or bird flu, nearly wipes out mankind in the year 2068.
The release of a counter-virus, cultivated to destroy the virus' avian carriers in a desperate attempt to stop the spread of the disease, ends up backfiring, as birds that resisted the counter-virus instead develop human-level intelligence. War soon breaks out between the newly uplifted birds and the remnants of humanity, resulting in birds emerging as the planet's new dominant lifeforms as humans continue to succumb to the disease. Following several terrorist attacks by a human insurgency, all remaining humans have been forced to live in the wilderness, away from civilization, in a form of apartheid-like segregation. The game's story takes place primarily at the fictional —a bird-only high school located in the fictional Japanese town of —long after open warfare between humans and birds has ended. Society has adjusted to the avian conquest, though with minor bird-related cultural changes—for example, while some holidays such as Christmas and Tanabata are celebrated much as they are in the present day, a major event in the game is Legumentine's Day, an amalgamation of the traditions of Valentine's Day and Setsubun. In a grimmer case, the terms war dove and war hawk have been re-purposed as labels for two opposing political factions divided over the ongoing mutual hostility between birds and the human minority: the altruistic Dove Party, who advocate for cooperation and peace between the two groups, and the militant Hawk Party, whose goal is to exterminate humanity altogether. By the time Hatoful Boyfriend's narrative begins, the Dove Party, the Hawk Party, and their respective schools of thought dominate much of the world's politics. Characters The primary playable character in Hatoful Boyfriend is the human protagonist, a boisterous hunter-gatherer who lives in a cave in the wilderness. Her eight potential love interests in the original version of the game, who together form the rest of the main cast, are: Ryouta Kawara, a rock dove and the protagonist's sickly but hardworking childhood friend; Sakuya Le Bel Shirogane, a fantail pigeon and snobbish French aristocrat; Sakuya's older half-brother Yuuya Sakazaki, a popular and flirtatious but strangely secretive fantail pigeon; Nageki Fujishiro, a quiet, bookish mourning dove who never seems to leave the library; San Oko, an athletic, hyperactive fantail pigeon who is obsessed with pudding; Anghel Higure, an eccentric Luzon bleeding-heart who behaves as if he were in some kind of fantasy role-playing game; Kazuaki Nanaki, a kind but narcoleptic button quail and the protagonist's homeroom teacher; and Shuu Iwamine, a creepy, antisocial chukar partridge who serves as the school's doctor. The 2014 remake added two more possible love interests: Azami Koshiba, a no-nonsense Java sparrow and takoyaki saleswoman; and Tohri Nishikikouji, a golden pheasant and Iwamine's long-forgotten work rival. While most of the characters are normally represented in-game with pictures of birds, if the player toggles on the ICPSS (Intra-Cerebral Playback Synchro System) feature at the start of the game, or in Japan, each of the possible love interests is shown with a version of what they would look like as a human when first introduced. Although the ICPSS feature also lists voice credits for each of the main love interests in the original version of the game, the game itself is unvoiced; however, most of the voice actors who were credited later signed on to actually voice their respective characters in the drama CDs based on the series.
Story The events of Hatoful Boyfriend begin in the year 2188, when the protagonist, a teenage human girl invited to attend the prestigious bird-only St. PigeoNation's Institute, starts her second year of high school. After a hectic and surreal freshman year of attendance at St. PigeoNation's, the protagonist has grown accustomed to the confusion of being the only human in a school full of birds, and is looking forward to her sophomore year. The story of the dating simulation portion of the game follows the protagonist, and the inter-species love and hijinks—of both the mundane high school and quasi-anthropomorphic bird-specific varieties—that ensue as she draws the attention of and attempts to romance one of a number of eligible birds she comes in contact with over the course of the year. Bad Boys Love Should the player choose to fulfill the promise, the protagonist begins her sophomore year at St. PigeoNation's largely as normal, but with one exception—she begins to have recurring dreams of her younger self and Ryouta, and her parents lying dead in front of an unfamiliar house. A mysterious man approaches them, promising to grant any wish that they make. On 2 September, she decides to check on Ryouta, who had gone to the infirmary earlier that day; the next morning, she fails to show up for class. Kazuaki asks Ryouta to retrieve their class' box of print handouts, and upon retrieving it, blood is discovered leaking from a corner of the box. Ryouta opens the lid, and it is revealed that the box contains the protagonist's severed head. A siren sounds and there is an order to evacuate to the gymnasium, where Ryouta overhears other students mention that more pieces of a human corpse were found in the other print boxes. Doubting the headmaster's explanation of a natural disaster occurring, Sakuya and Ryouta resolve to figure out the identity of the protagonist's killer and leave the gymnasium, discovering a large metal dome surrounding the school. Upon returning to their classroom and finding the box empty, Yuuya explains that the protagonist's body had been gathered in the chemistry lab, where Shuu performs an autopsy concluding that the protagonist died of asphyxiation caused by illness or poison with the dismemberment occurring afterwards. Assisted by the school janitor Mister One, and pursued by a grotesque scarecrow-like being named Labor 9 who suddenly appears on the school grounds, Ryouta and Sakuya begin investigating the dome and the murder. They visit the lab and compare alibis; Shuu, who Ryouta distrusts, asks if Ryouta has forgotten anything important, to which he replies that he hasn't. Upon investigating the headmaster's office, they discover the headmaster had likewise been poisoned to death, what they saw earlier being merely pre-recorded footage; they also find a computer and a pair of documents, one titled The Human Representative and a torn, unreadable one titled Operation Hatoful. The Human Representative reveals that if the protagonist, a symbol of humanity, were to die, the campus would be sealed off and the birds inside handed over to humans as sacrifices—something confirmed when the computer is used to open a small hatch in the dome and students are shot dead as they attempt to flee—when the dome is lifted twelve hours after her death is reported. 
In trying to find a way to escape before the dome rises, Ryouta and Sakuya uncover records in the library mentioning a medical center that was shut down due to a fire, and that the ghostly Nageki, who Ryouta previously encountered, died in that fire. Sakuya deduces that an unused building on campus was the medical center and after investigating, they find its basement blocked off. They also encounter Anghel, who recalls the protagonist going into the infirmary the day before, contradicting Yuuya and Shuu's shared alibi. As Ryouta searches the infirmary for clues, he finds medical records for himself, the protagonist, Nageki, and Sakuya, but is knocked out immediately after. When he regains consciousness, he discovers the protagonist's bloody student ID—now with concrete evidence, Ryouta prepares to confront the doctor and Yuuya, only to find that Sakuya had left to do so alone. Ryouta returns to the infirmary as Yuuya shields Sakuya from Shuu's attempts to kill him; the doctor tells Ryouta that he will be waiting for him in the medical center's basement before escaping with Labor 9. Yuuya apologizes, affirming that while neither he nor Shuu killed the protagonist, they were the ones who dismembered her, and asks to speak to Sakuya alone. He reveals that they are full-blooded siblings, with Shuu using knowledge of Sakuya's true heritage to blackmail Yuuya into assisting him. Yuuya seemingly dies, Shuu's scalpel having been laced with the same neurotoxin that killed the headmaster, leaving Sakuya in a state of shock. Ryouta, searching for a way into the medical center basement, seeks out Nageki, a ghost, in the library to ask him about his death. Upon discovering documents revealing that Operation Hatoful was a Hawk Party project into developing biological weapons for use against humans using the school as an experimental facility—with a focus on a strain of H5N1 almost immediately lethal to humans dubbed the Charon virus—Nageki recalls that the fire was caused by his committing suicide by self-immolation after months of forced experimentation in order to destroy and remove any trace of the virus, which was isolated in his body, and that researchers often went in through the chemistry preparation room. Ryouta, Kazuaki, and Anghel make their way into the medical center's basement through the chemistry lab and encounter Labor 9, electrocuting it using a stun gun given to Ryouta earlier by Mister One. They confront Shuu, who imprisons Kazuaki and Anghel, leaving them to die of poison gas before leading Ryouta away. Meanwhile, San comforts Sakuya, and the two of them arrive to break Anghel and Kazuaki out of the prison. Alone with Shuu, Ryouta finally remembers what he had forgotten due to the traumatic nature of the events: he recognizes the doctor as a man who promised to grant his wish for peace between birds and humans after he and the protagonist witnessed a human terrorist incident at a bird orphanage in which the protagonist's parents, crisis negotiators, were killed, and that the protagonist died when she visited him in the infirmary. It is revealed that she died by Charon virus after coming in contact with Ryouta, as Shuu had induced the virus into Ryouta's body though grafts from Nageki's remains for the purpose of using him to exterminate humanity—since there can be no more fighting between two factions if one is wiped out, this would grant Ryouta's wish. Shuu then remarks that Labor 9 was powered by the protagonist's now irreversibly damaged brain. 
A broken Ryouta submits to Shuu's offer of becoming a living weapon of mass destruction after these revelations, and a struggle ensues during which the protagonist's spirit intervenes. Ryouta then asks Shuu why he decided to grant his wish, to which it is implied that Shuu's affection towards Ryouta's deceased father, Ryuuji, was greater than Shuu himself would like to admit, and that he was motivated by Ryuuji's dying request: to do something for his son. Shuu admits defeat and offers to lead them out of the school through a safe passage, but Kazuaki pulls out a gun and shoots him as the group prepares to escape. Shuu then recalls that Nageki's only relative—his adoptive brother—was, like Kazuaki, a quail. The terrorist incident occurred at Kazuaki and Nageki's orphanage and left them as the only survivors; witnessing Nageki's subsequent suicide drove Kazuaki insane, leading him to fake his own death, assume a new identity, and join the school's faculty to take revenge against Shuu, the head of Operation Hatoful. Ryouta, channeling Nageki, eases Kazuaki's guilt and convinces him to move on. They reunite with the other characters and exit the school along with the other students and faculty brought there by Mister One; however Ryouta, now thoroughly infected by the Charon virus, elects to stay behind in cryonic storage until a cure is found. The scenario ends with Sakuya vowing to come back for Ryouta, and Ryouta offering to recap the day's events to the protagonist's spirit, within the remnants of Labor 9, as the door to the storage facility closes. If the extended epilogue is unlocked, it is revealed that Yuuya survived being poisoned long enough to receive an antidote, and the game's closing lines imply that with Shuu's cooperation, a cure for the Charon virus has been developed. Development Hatoful Boyfriend is the first game developed by manga artist and writer Hato Moa—author of the series Vairocana and a former Dengeki Comic Grand Prix honoree—under her dōjin circle PigeoNation Inc. As she had no experience with game development prior to Hatoful, Hato initially wanted to start with a visual novel, as she believed it was an easier game type for amateur developers to make; the format also allowed visuals to easily accompany her stories, something that she, as a manga artist, was accustomed to and viewed as being necessary in her work. She first came up with the concept for Hatoful as a 2011 April Fools' Day joke: despite her lack of familiarity with the genre, she initially intended to create a parody of otome game stereotypes. Birds in particular were used as a theme due to Hato's fondness for pigeons; however, this was also partially due to Hato's prioritization of writing over illustrating, as the use of bird photographs instead of hand-drawn sprites allowed her more freedom to focus on the script. The first incarnation of the game was created over the course of half a day and posted as a browser game made with Adobe Flash; but due to strong word of mouth from social media it was taken down after immense traffic caused the web server it was hosted on to crash on two separate occasions. Following the unexpected popularity of the Flash game, development began on a longer visual novel using the FamousWriter game engine. Most of the technical aspects of development—game direction, scripting, and programming—were handled by Hato alone, with fellow artist Damurushi assisting with some minor aspects of the script and art direction. 
Despite this, most of the roughly seven-month period Hato spent developing Hatoful from a one-off April Fools' gag to the finished product was dedicated to the construction of the narrative. Hato's approach towards the game's writing was "to create something that seems ridiculous and crazy at first glance, but that once you look into the world, you would fall into the depth"; however, resolving the darker elements of the plot, particularly the Bad Boys Love scenario, with facts established in the quickly conceived and largely comedic pilot proved to be difficult for her, with inconsistencies in the overall timeline becoming a major concern. During the writing process, Hato admitted that she was critical of the scenarios she had written and constantly doubted whether the final product "turned out well", later remarking in a postscript written for the game's official guidebook: Most of the background images and photography used for the characters' sprites in the original games and associated media are taken from royalty-free sources or fan submissions, though in some cases pictures of birds and backgrounds used are Hato's own artwork or photography—the character San Oko is depicted by and based on her real life pet bird Okosan, and several of the sprites featured in Hatoful Boyfriend were derived from pictures she had taken of birds kept at the Kobe Animal Kingdom or the Torimi Café, also located in Kobe. All of the music tracks and sound effects used in Hatoful are also similarly taken from royalty-free sources. Naming and allusions The title of the game is a multi-layered pun; the wasei-eigo word means "heartful", however it is also phonetically identical to the Japanese pronunciation of the English word "hurtful". This is referenced in an alternate name for the Bad Boys Love route, Hurtful Boyfriend, as well as in the subtitle for the full release of the original game, Hatoful Boyfriend: Hurtful Complete Edition. Additionally, the official English transliteration of the title also incorporates the Japanese word , which means "pigeon" or "dove," and is also one of the names of the game's creator, Hato Moa. Similarly, the names of several characters are puns on the Japanese names of their respective species of bird: for example, Ryouta Kawara is a rock dove, or ; Nageki Fujishiro is a mourning dove, or ; Shuu Iwamine is a chukar partridge, or ; and Kazuaki Nanaki is a button quail, or —the character being present in his last name, . Several locations and personalities featured in the game directly correspond to real life venues and people—for example, blogger Brian Pigeon is mentioned as one of the Hatoful world's most influential writers. Likewise, some aspects of Hatofuls narrative reference real world events, media, or people: the deadly H5N1 pandemic forming the basis of the game's post-apocalyptic setting was inspired by historical outbreaks of disease, most prominently the 1918 flu pandemic; depictions of Hitchcock's Winter, the in-universe war between humans and birds, bear several similarities to Alfred Hitchcock's film The Birds; and Operation Carneades—the codename given to the human countermeasure against H5N1 that instead granted intelligence to birds in the Hatoful universe—was named after Greek philosopher Carneades and one of his thought experiments, the Plank of Carneades. Release history Hatoful Boyfriend first release in its current visual novel format was a freeware demo released as a downloadable application on 31 July 2011. 
The demo version contains basic routes for seven of the love interests, and also functions as a benchmark for players to assess if the full game will run on their computer before purchasing it. The first commercial variant of the game, Hatoful Boyfriend: Plus, introducing Anghel as a love interest, was released on 14 August 2011. Plus, a precursor of the full game used as a debugging site for new content and additional scenes intended for the final release, was discontinued on 28 October 2011 when it was patched with the finalized full version. The completed full game itself, Hatoful Boyfriend: Hurtful Complete Edition, was released at COMITIA 98 on 30 October 2011, and includes all content in previous versions of the game as well as the Torimi Café and Bad Boys Love scenarios. In Japan, Plus and Hurtful Complete Edition were initially available only as physical CD-ROMs; a downloadable version of the full game in Japanese was eventually released three years later on 13 April 2014, where the Hurtful Complete Edition was renamed to simply Full version. Due to limitations of the FamousWriter game engine, the demo, Plus, and Hurtful Complete Edition versions of Hatoful Boyfriend are only supported on computers running Windows XP or OS X 10.1-10.5, but are playable—though unsupported—on computers running Windows Vista, Windows 7, or OS X 10.6 with Rosetta. 2014 remake Plans to remake the original game in high-definition first began to form when Ed Fear, a writer and creative producer at game developer Mediatonic, contacted Hato Moa about the possibility of translating any projects she was involved in to English. According to the remake's creative director Jeff Tanton, the decision was made to remake the game following several e-mail conversations between Fear and Hato, with Fear's positive experience with the Japanese version of the game and the incompatibility of FamousWriter-made games with newer operating systems—which had rendered the original game effectively unplayable on newer PCs—being major factors in the decision. An international remake of Hatoful developed by Mediatonic and published by Devolver Digital made using the Unity game engine—allowing the game to be fully compatible with computers that run Windows Vista, Windows 7 or OS X 10.6, and playable for the first time on those that run Linux or OS X 10.7 or newer—was first revealed to be in development on 6 June 2014, with a formal announcement coming shortly afterwards at Electronic Entertainment Expo 2014. The remake, known as Hatoful Boyfriend HD in Japan, was originally slated for release via Steam on 21 August 2014; however release was later postponed to 4 September 2014 to allow for final adjustments to the Japanese version. The remake includes a new route for Azami, full screen capability, and redrawn backgrounds. A collector's edition of the remake titled Hatoful Boyfriend Summer of Dove Collector's Edition was released for pre-order along with the normal edition, and bundles together the remake, the original Hatoful Boyfriend: Hurtful Complete Edition, a digital version of the game's soundtrack, a new comic illustrated by Hato, exclusive wallpapers of Okosan, and a St. PigeoNation's Class of 2014 yearbook. The remake was also included in the Humble Bundle pack for Valentine's Day 2015, which exclusively featured dating sim games, along with a Hatoful Boyfriend pillowcase for the highest price point option. A port of the remake for PlayStation 4 and PlayStation Vita was released on 21 July 2015. 
A port for iOS followed on 25 May 2016. English localization On 22 November 2011, freelance translator Nazerine released a fan translation patch of the free demo version of the game. The initial project involved Nazerine translating, writing, and revising the game text, while another person hacked the game so that the translated text displayed properly on the game screen. This English patch launched the first wave of Western interest in the game, with several video game publications reporting on it due to the game's unusual concept. The success of this translation attracted the attention of Hato Moa herself, who then offered Nazerine the opportunity to translate the full game—and later its sequel, Holiday Star—for an official English release. The translation of the full game was also a solo effort by Nazerine; however, Hato removed the need for hacking by directly supervising the translation and adjusting images in the game to fit English sentences. As the demo's English patch was made before Nazerine had access to the full version of the game, several lines of dialogue were translated differently to reflect context revealed in Bad Boys Love; for example, third-person pronouns from Kazuaki Nanaki's route that implied he had a female lover in the fan translation were replaced with gender-neutral ones in the official translation. Few dramatic changes were made, though several jokes were added in Nazerine's translations of the game that were not present in the original Japanese text. The official English version of the game was released for download on 15 February 2012. The 2014 remake has also been confirmed to use Nazerine's translation. Adaptations Books and publications Several official dōjin works and supplemental materials illustrated by Hato Moa and Damurushi have been released alongside the games. An official guidebook with extra information regarding the game's setting and characters was released on 29 December 2011 at Comiket 81. The second edition of the guidebook, re-branded as a "fanbook" (), and Absolute ZERO, an anthology about the fantasy universe perceived by Anghel Higure, were both released on 11 August 2012 at Comiket 82; an English version of Absolute ZERO was later released for Amazon Kindle on 27 August 2014. , featuring an alternate-universe retelling of events discussed in Holiday Star, was released in Japanese at COMIC CITY SPARK 7 on 7 October 2012, and on 23 December 2013 for Kindle in English. , an anthology featuring the Hawk Party researchers, was released in Japanese at Comiket 84 on 12 August 2013, and on 29 May 2014 for Kindle in English. , a side story featuring the Kobe Animal Kingdom and written as part of a fundraising event for the venue, was released on 1 February 2014. , a collection of haiku poems written from the perspective of the original Kazuaki Nanaki, was released on 15 August 2014 at Comiket 86. Webcomic A webcomic based on the series, written and illustrated by Hato Moa, was serialized in publisher Takeshobo's webcomic anthology Manga Life WIN+ from 8 June 2012 until the anthology's discontinuation, and contains sixteen chapters. Each chapter is composed of several four-panel comic strips, followed by a short story in which the characters are depicted in their human forms. The first twelve chapters have since been collected in one tankōbon volume (), which was released on 10 August 2013. The volume also contains a feature in which the series' characters answer questions sent in by fans.
A subsequent dōjin anthology containing chapters thirteen to sixteen plus a bonus ten-page comic, , was released on 30 December 2013 at Comiket 85 in Japanese, and on Kindle in English. Drama CDs Four drama CDs by Frontier Works based on the series have been released. The first CD, titled had a preliminary release on 29 December 2011 at Comiket 81, and was released for general distribution on 25 January 2012. The second CD, titled Primal Feather, was released on 25 April 2012, followed by a third CD, titled Summer Vacation, on 10 August 2012 at Comiket 82, which had a general release on 12 September 2012. A fourth CD with a Legumentine's Day theme, titled , was released on 14 February 2013. Web radio An internet radio show for the series titled was broadcast from 24 December 2011 to 25 January 2012 on the Animate TV website, with the voice cast from the drama CDs reprising their respective roles. The show was hosted by Shintarō Asanuma, who played Ryouta Kawara in the drama CDs, and Hirofumi Nojima, who played Kazuaki Nanaki. Each episode consisted of four segments: , a normal talk corner, , in which various questions regarding life as birds were answered, , in which lines from the game were read, and , a question and answer corner where the voice actors answered any questions from viewers in-character. Web series A trailer for the web series was released in Japanese on 20 October 2013, with an English-language translation of the trailer being released on 23 May 2014. The first episode, titled , was released on 19 May 2014. The series is released in visual novel format on the Adobe AIR platform, and takes place in a different universe than the game series. Plush production line On 3 November 2015, Erick Scarecrow of Esc-Toy Ltd. launched an official Kickstarter campaign, together with Hato Moa and Devolver Digital, with a set goal of $25,000 to create a production line of three characters from the Hatoful Boyfriend universe, namely, Shuu, Ryouta and Okosan. During the campaign, all stretch goals were reached, the last ending at $75,000, adding seven more characters to the production line. The campaign ended on 6 December 2015, with a total of $145,015 pledged by 2,514 backers. Another official campaign was launched a year later featuring a second series of characters manufactured as limited edition plush. The $50,000 goal was met by 459 backers whom altogether pledged a total of $54,455 by the campaigns end on 6 December 2016. Reception As a dōjin soft title, Hatoful Boyfriend was created on a limited budget and had even more limited promotion; however, due to strong word of mouth on Twitter and other social media Hatoful has enjoyed a degree of commercial success, especially considering its minimal production costs—with Mado no Mori reporting that the game was a "popular title" whose physical CD-ROM copies "consistently sold out at dōjin markets and wherever it became available for purchase", and 4Gamer.net noting that the game disk was difficult to purchase due to overwhelming demand. Outside Japan, where it is only available by download, the English release of the game is dōjin soft distributor DLsite English's best-selling title with 7,000 separate purchases as of 2014. Hatoful Boyfriend has received generally favorable reception, with reviewers focusing on the surprising depth of the game's writing and storyline. 
In a weekly game spotlight, Kouichi Kirishima from Mado no Mori recommends the game to "not just pigeon-lovers, but anyone who enjoys visual novels", remarking that the game is "at times surprisingly serious and emotionally involved". On the other side of the Pacific, Julian Murdoch comments in a Gamers With Jobs analysis of the game that the scenarios featured in Hatoful are "elaborate and multifaceted", and that Hato Moa herself "isn’t just a storyteller, she’s actually a good storyteller". Hatoful Boyfriend was also named the best PC game of 2012 by GameCola; Paul Franzen explains the game's inclusion among higher budget and more technically sophisticated titles as being due to the strength of its storytelling and pathos, stating that "Hatoful Boyfriend isn't just a weird game about heathen human-animal relationships [...] there’s an actual, serious, emotional game here, too". In an article discussing the E3 announcement of the 2014 remake, Carly Smith for The Escapist remarks that Hatoful Boyfriend is "absolutely hilarious", but recommends that players "start the game for a laugh, but stick with it for a ride you wouldn't have expected by looking at the cover". Reviewers also praised the game's varied scenarios and replayability. Dora from Jay Is Games praised the game, saying that "with a huge amount of replay value, creativity to burn, and some of the most shocking plot lines you could ever hope to encounter, Hatoful Boyfriend is a fascinating and surprising text adventure well worth checking out", though she also observes that "the delayed payoff and the abruptness of some of the endings combined with the oddball concept may not appeal to every fan of the visual novel genre". Alexa Ray Corriea's Polygon review of the remake gave it an 8 out of 10, concluding that the "witty dialogue and absolutely bonkers scenarios are genuinely fun to discover, and the handful of different storylines make repeated playthroughs worthwhile". Some critics however expressed concerns over the presence of some repetitive aspects of gameplay—noted as being especially apparent when attempting multiple playthroughs—as well as the accessibility of the game's Bad Boys Love scenario to casual players: in his review of the remake for PC Gamer, Julian Murdoch states that "I suspect few will have the patience to ride the fast forward button and suss out the romantic proclivities of each cast member to get to the extended ending—really a second half—of the game". Much attention was drawn to Hatoful Boyfriends surreal concept in both its native Japan as well as overseas. Mentions of the game's "bird romance" spread through Japanese social media, leading several news agencies and publications to report on Hatoful and the "newness" of its premise. As translations began to make the game accessible to an English speaking audience, western media reacted similarly: Alec Meer for Rock, Paper, Shotgun commented on Hatofuls premise, citing it as being "reason enough to play it"; also for Rock, Paper, Shotgun, Craig Pearson stated that the game "could only be better if it was a secret game from Valve and BioWare". One of the main spurs to the game's popularity was a playthrough recorded by Angie Gallant on the Quarter to Three forums. 
In a retrospect, Jeffrey Matulef for Eurogamer remarks that Hatoful Boyfriends "outlandish premise caught on and the English speaking world demanded it not be left out of this surreal creation", while Robert Fenner of RPGFan compared it favorably to Hiroki Azuma's writings on database consumption, praising the game as a "fierce deconstruction as well as a tender celebration of dating sims". Several Japanese commentators have also noted the game's overseas success, especially following the E3 announcement of the remake by British developer Mediatonic and American publisher Devolver Digital. Legacy A sequel titled Hatoful Boyfriend: Holiday Star, was released in Japan on 29 December 2011, with an official English patch being released a year later on Christmas Day. The game is an episodic followup set around the holiday season and takes place in a separate universe from the first game, in which the events of Bad Boys Love do not occur. On 8 December 2015, it was announced that a remake would be released on 15 December 2015 for Microsoft Windows, OS X and Linux, and on 22 December 2015 for PlayStation 4 and PlayStation Vita. In 2018, Hato Moa made a blog post announcing the development of a third Hatoful game, titled Hatoful Boyfriend: MIRROR. This game would be set in an alternate universe where most of the main characters are alive, and where the events of the preceding two games did not occur. As of August, 2019, the game was still in development. See also Raptor Boyfriend References Sources External links 2010s webcomics 2011 video games 2014 video games Android (operating system) games Japanese comedy webcomics Dating sims Doujin video games Indie video games IOS games Linux games MacOS games Otome games Parody video games PlayStation Plus games PlayStation 4 games PlayStation Network games PlayStation Vita games Post-apocalyptic video games Romance video games School-themed video games Video games about amnesia Video games about birds Video games developed in Japan Video games developed in the United Kingdom Video games featuring female protagonists Visual novels Windows games Yonkoma Devolver Digital games Christmas video games Fiction set in the 2180s
28774596
https://en.wikipedia.org/wiki/TaxSlayer
TaxSlayer
TaxSlayer LLC (formerly known as TaxSlayer.com) is a privately held tax preparation and financial technology company based in Augusta, Georgia. The company offers online tax preparation technology for American consumers and tax professionals, allowing them to electronically file state and/or federal returns. TaxSlayer also offers business technology products and services for legal, bookkeeping and HR/payroll. According to the National Association of Tax Professionals (NATP), TaxSlayer Pro is one of the top-rated software packages for tax professionals in the U.S. In 2015, the IRS awarded TaxSlayer the exclusive contracts for its VITA and TCE programs, a five-year agreement that provides electronic tax preparation assistance for taxpayers who are low-income, elderly, disabled or who have limited English language proficiency, in over 9,500 locations worldwide. Over 90,000 tax preparers use TaxSlayer as part of the program. In 2010, the company built its headquarters building in Evans, Georgia, a large suburb of metro Augusta. In 2017, the company purchased a building in Downtown Augusta's Innovation Zone that will become its Innovation & Technology Campus and company headquarters in 2018. TaxSlayer plans to continue to operate from both buildings, refurbishing the Evans building as a dedicated operations unit known as the Customer Excellence Center. Between the two buildings, the company will be able to house 600 employees across the metro Augusta area. The company is also known for its sports sponsorships, such as the TaxSlayer Bowl, a major college football bowl game in Jacksonville, Florida, previously known as the Gator Bowl. Other sports sponsorships have included Dale Earnhardt Jr., NASCAR and the JR Motorsports team, as well as three PGA Tour golfers. History In the early 1960s, Aubrey Rhodes, Sr. founded Rhodes-Murphy & Co., a full-service tax preparation company that remains in operation in Georgia and South Carolina. In 1989, the company formed a subsidiary, Rhodes Computer Services, to start developing tax preparation software for others to use. Four years later, Rhodes Computer began selling taxation software known as "Taxslayer Pro" to tax preparers and accountants throughout the United States. TaxSlayer was named for the original email address of Jimmy Rhodes, son of Aubrey and the President and CEO at the time. In 1998, the firm began developing TaxSlayer.com to market its software to individuals. TaxSlayer is now one of the largest online tax preparation services and a direct competitor to Intuit's TurboTax. In 2017, the company reported record growth in tax e-files, with more than 10 million state and federal returns for the year, representing a 200% increase over the previous three years. Product offerings TaxSlayer produces software for several different market segments: consumers, professional tax preparers, and the IRS VITA (Volunteer Income Tax Assistance) and TCE (Tax Counseling for the Elderly) program volunteers. Consumer TaxSlayer's consumer products allow taxpayers to electronically file their taxes online each year. The product line offers several packages providing the varying levels of assistance and support customers require when filing. TaxSlayer Pro TaxSlayer Pro is designed to be licensed by members of tax preparation practices and small to mid-sized accounting firms. Government partnerships IRS VITA/TCE TaxSlayer maintains a continuing partnership with the Federal VITA and TCE programs.
Both programs provide tax preparation assistance to Americans who may need help filing. VITA provides IRS-trained tax preparers who help those who are disabled, have low incomes, or have limited English proficiency, while TCE provides a similar service to the elderly. Both use TaxSlayer-provided software to aid them in their work. Free File Alliance In addition to their paid offerings, TaxSlayer also participates in the IRS Free File Alliance, a nonprofit coalition of industry-leading tax software companies that partnered with the IRS to help millions of Americans prepare and e-file their federal tax returns for free. By participating in this program, TaxSlayer guarantees free preparation and e-filing to taxpayers who meet a set of income criteria. Sports sponsorships TaxSlayer Center The TaxSlayer Center is a 12,000-seat arena located in Moline, Illinois, in the Quad Cities region. The arena is home to the Quad City Storm, a minor league professional hockey team, and the Quad City Steamwheelers of the Champions Indoor Football League. TaxSlayer purchased the naming rights to the arena for a contract of more than $3.3 million over 10 years as part of a partnership enabling recreation and community in smaller cities, while promoting TaxSlayer’s brand in a burgeoning market. College football TaxSlayer is the title sponsor for the TaxSlayer Bowl, a college football bowl game held in Jacksonville, Florida. The game was previously known as the Gator Bowl and has been held continuously since 1946, making it the sixth oldest college bowl game. In 2014, the company struck a new six-year deal with Gator Bowl Sports to rename the bowl the TaxSlayer Bowl beginning in 2015. In keeping with its support of the military, TaxSlayer also began the Honoring Our Heroes initiative, which donates thousands of tickets to the TaxSlayer Bowl to servicemembers and their families. TaxSlayer.com has also been an associate sponsor of the Armed Forces Bowl and BBVA Compass Bowl. NASCAR TaxSlayer has been a primary sponsor of several top-tier NASCAR drivers, such as Bobby Labonte, Dale Earnhardt Jr. and Regan Smith. Professional golf TaxSlayer sponsors PGA Tour golfers Patrick Reed, Henrik Norlander and Kevin Kisner. References External links TaxSlayer IRS Free File Program Financial services companies established in 1965 Tax software of the United States Companies based in Augusta, Georgia
14901270
https://en.wikipedia.org/wiki/1437%20Diomedes
1437 Diomedes
1437 Diomedes is a large Jupiter trojan from the Greek camp, approximately in diameter. It was discovered on 3 August 1937, by astronomer Karl Reinmuth at the Heidelberg-Königstuhl State Observatory in southwest Germany. The dark D/P-type asteroid is among the largest Jupiter trojans and has a notably elongated shape and a longer-than-average rotation period of 24.49 hours. Diomedes was the first Jupiter trojan successfully observed during an occultation event of a star. It was named after the hero Diomedes from Greek mythology. Orbit and classification Diomedes is a dark Jovian asteroid orbiting in the leading Greek camp at Jupiter's Lagrangian point, 60° ahead of the Gas Giant's orbit in a 1:1 resonance. It is also a non-family asteroid in the Jovian background population. Jupiter trojans are thought to have been captured into their orbits during or shortly after the early stages of the formation of the Solar System. More than 4,500 Jupiter trojans in the Greek camp have already been discovered. Diomedes orbits the Sun at a distance of 5.0–5.4 AU once every 11 years and 10 months (4,329 days; semi-major axis of 5.2 AU). Its orbit has an eccentricity of 0.04 and an inclination of 20° with respect to the ecliptic. The asteroid was first observed as at Lowell Observatory in February 1931. The body's observation arc begins at Heidelberg with its official discovery observation in August 1937. Physical characteristics In the Tholen classification, Diomedes has an ambiguous spectral type, closest to the dark D-type asteroids and somewhat similar to the primitive P-type asteroids. Its V–I color index of 0.810 is also lower than that measured for most D-type Jupiter trojans (0.95). Rotation period Several rotational lightcurves of Diomedes have been obtained from photometric observations since the 1960s. The best-rated photometric observations so far, by Robert Stephens at the Goat Mountain Astronomical Research Station and Santana Observatory in November 2008, gave a longer-than-average rotation period of hours with a brightness variation of 0.34 magnitude. Diameter and albedo In the 1970s, radiometric observations published in the Tucson Revised Index of Asteroid Data (TRIAD) compilation gave a diameter of 173.0 kilometers with a radiometric albedo of 0.021. According to the space-based surveys carried out by the Infrared Astronomical Satellite IRAS, the Japanese Akari satellite and the NEOWISE mission of NASA's Wide-field Infrared Survey Explorer, Diomedes measures between 117.786 and 172.60 kilometers in diameter and its surface has an albedo between 0.028 and 0.061. The Collaborative Asteroid Lightcurve Link adopts the results obtained by IRAS, that is, an albedo of 0.0313 and a diameter of 164.31 kilometers based on an absolute magnitude of 8.30. Diomedes is the third largest Jupiter trojan according to IRAS and Akari, and the 9th largest based on NEOWISE data. Occultation and shape Diomedes was the first Jupiter trojan that was successfully observed during an asteroid occultation, when it occulted the star HIP 014402A over Japan on 7 November 1997. The silhouette was elongated with a major and minor occultation axis of kilometers (poor fit).
The ellipsoid dimensions of kilometers – corresponding to a mean-diameter of 132.5 kilometers, equivalent to the volume of a sphere – were estimated using follow-up photometry at Ondřejov Observatory and Mitaka Observatory that determined the body's rotational phase at the exact time of the occultation event. Naming This minor planet was named from Greek mythology after the hero Diomedes, King of Argos and known for his participation in the Trojan War, regarded as one of the best warriors of the Achaeans, just behind Achilles and alongside Ajax. The official naming citation was mentioned in The Names of the Minor Planets by Paul Herget in 1955. References External links Asteroid Lightcurve Database (LCDB), query form (info ) Dictionary of Minor Planet Names, Google books Discovery Circumstances: Numbered Minor Planets (1)-(5000) – Minor Planet Center 001437 Discoveries by Karl Wilhelm Reinmuth Minor planets named from Greek mythology Named minor planets 001437 19370803
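As a rough cross-check of the orbital elements quoted in the orbit section of the article above, Kepler's third law (period in years, semi-major axis in astronomical units, the asteroid's own mass neglected) reproduces the stated period:

P \approx \sqrt{a^{3}} = \sqrt{5.2^{3}} \approx 11.9 \text{ yr} \approx 11 \text{ yr } 10 \text{ mo}

in agreement with the 4,329-day orbital period given for Diomedes.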
5121482
https://en.wikipedia.org/wiki/Ankit%20Fadia
Ankit Fadia
Ankit Fadia (born 1985) is an Indian author, speaker, television host, a security charlatan, and self-proclaimed white-hat computer hacker. His work mostly involves OS and networking tips and tricks and proxy websites. A number of his claims regarding his achievements have been disputed by others within the security industry, and he was mocked with a "Security Charlatan of the Year" award at DEF CON 20 in 2012. Attrition.org also reviewed his alleged credentials and included him on their Security Charlatans list, calling into question the veracity of his marketing statements. He has been accused of plagiarism in his work. His claims of hacking feats have since been debunked by several magazines. Early life When he was 10, his parents gave him a computer, and he says that after a year of playing video games he took an interest in hacking when he read a newspaper article on the subject. He soon started a website, hackingtruths.box.sk, where he wrote hacking tutorials, which acquired many readers and encouraged him to write a book. The book received favorable responses in India, made Fadia popular in the country, and turned his hobby into a full-time profession. However, he was also accused of plagiarism. Career He wrote more books on computer security, and spoke at several seminars across schools and colleges in India. In addition, he started providing his own computer security courses, including the "Ankit Fadia Certified Ethical Hacker" programme. In 2009, Fadia stated that he was working in New York as an Internet security expert for "prestigious companies". Fadia also endorsed the Flying Machine jeans brand of Arvind Mills. Security and cryptography enthusiasts dismissed Fadia as a 'faker' making tall claims, attributing his success to the [tech-illiterate] media. A security professional, who uses the handle @FakeAnkitFadia on Twitter, told The Sunday Guardian, "The first book that Fadia 'wrote' at the age of 14, The Unofficial Guide to Ethical Hacking, was a little over 32% [plagiarised] from other security publications and websites." Fadia has dismissed the critics who question his credibility as an expert, saying "If I had been fake, my growth would have stopped 10 years ago". Debunked hacking claims In 2002, Fadia claimed that at the age of 17, he had defaced the website of an Indian magazine. Subsequently, he named the magazine as the Indian edition of CHIP magazine, and stated that the editor had offered him a job when informed about the defacement. In 2012, the Forbes India executive editor Charles Assisi (who was editor of CHIP India at the time of the supposed incident), denied that such an incident ever took place after verifying with his predecessor and successor at the magazine as well. In a 2002 interview published on rediff.com, he stated that at the age of 16, he foiled an attempt by Kashmiri separatist hackers to deface an Indian website. He stated he gathered information about the attackers, eavesdropped on their online chat using one of their identities, and then mailed the transcript to a US spy organisation that had hired him. He did not divulge the name of the organization he worked for, citing security reasons. The Pakistani hacker group Anti-India Crew (AIC) questioned Fadia's claims: along with WFD, the AIC hacked the Indian government website epfindia.gov.in, dedicating it to Fadia, mocking his capabilities.
AIC also announced that it would be defacing the website of the CBEC (www.cbec.gov.in) within the next two days, and challenged Fadia to prevent it by patching the vulnerability, which Fadia failed to do. In 2003, Fadia claimed to have infiltrated a group of hackers and stated that the Pakistani intelligence agencies were paying "westerners" to deface Indian websites with anti-India or pro-Pakistan content. Fadia's own website has been hacked multiple times. In 2009, he blamed the defacement on a vulnerability in the servers of his webhost net4india. Independent security experts contested his claim, stating that the problem was a loophole in his own website's code. His website was also hacked by the Indian hacker Himanshu Sharma, who had taken up a challenge issued by Fadia. In 2012, his website was defaced twice by hackers. In the first instance, the hackers rubbished his claims and stated that he was fooling people. Another hacker compromised it in response to a challenge that was issued by Fadia on the Tech Toyz show on CNBC-TV18. In 2012, DEF CON gave him the "Security Charlatan of the Year" award, calling him a fraudster and describing his presentations as outdated. The website attrition.org mentions him as a security charlatan and accuses him of plagiarism in his work. In September 2015, Fadia's official Facebook page posted a certificate which claimed that Fadia had been appointed brand ambassador for Prime Minister Narendra Modi's pet project, the Digital India Initiative. The government had then announced that it would select young tech entrepreneurs as its brand ambassadors, including the likes of Sachin Bansal and Binny Bansal of Flipkart, Snapdeal's Kunal Bahl and Micromax's Rahul Sharma. According to a report in India Today, government sources said there is "No such move to appoint a brand ambassador as reported". Television and web shows MTV What The Hack In 2008 he started a television show on MTV India called MTV What the Hack!, which he co-hosted with José Covaco. In October 2009 MTV India announced the launch of Fadia's new TV show on MTV, where Fadia gave tips on how to make use of the Internet, and answered people's questions. Internet users could email their problems to MTV India and Fadia gave them a solution on the show. Unzipped By Dell In 2012, Dell India partnered with Ankit Fadia to create a series of nearly 50 videos, each about one minute long, showing tips and tricks for the use of computers and mobile phones. These videos were shown on the Dell India Facebook page with an average of one video per week. People also had the opportunity to ask Fadia tech queries on topics like photography, video making, music composing, navigation assistance, gaming, messaging and others. Geek on the Loose In 2013, Ankit Fadia started a YouTube show, Geek on the Loose, in collaboration with PING networks, where he shared technology-related tips, tricks and apps. The show was based on situations mentioned in his book FASTER: 100 Ways To Improve Your Digital Life. The show has received more than 750,000 views on YouTube. Awards and recognition IT Youth Award from the Singapore Computer Society (2005) Global Ambassador for Cyber Security (National Telecom Awards 2011) Global Shaper, World Economic Forum Apart from the aforementioned recognition, Fadia's fabricated hacking claims also garnered him negative criticism: he was nominated as a security charlatan in 2012 and eventually received the DEF CON 20 "Security Charlatan of the Year" award.
References 1985 births Living people Indian technology writers Delhi Public School alumni
867860
https://en.wikipedia.org/wiki/Netbook
Netbook
The marketing term netbook identified small and inexpensive laptops that were sold from 2007 to around 2013; these were generally low-performance. While the name has fallen out of use, machines matching their description remain an important part of the market for laptops running Microsoft Windows. Similarly, most lower-end Chromebooks run on hardware which would have been described as "Netbooks" when the term was current, and inexpensive tablets (running either Windows or Android) when used with an external keyboard are functionally equivalent to netbooks. At their inception in late 2007, netbooks were smaller-than-typical notebooks optimized for low weight and low cost: they omitted certain features (such as an optical drive), featured smaller screens and keyboards, and offered reduced computing power compared to a full-sized laptop. Over the course of their evolution, netbooks have ranged in size from below 5" screen diagonal to 12". A typical weight is (). Often significantly less expensive than other laptops, by mid-2009, netbooks began to be offered by some wireless data carriers to their users "free of charge", with an extended service-contract purchase. Soon after their appearance, netbooks grew in size and features, and converged with smaller laptops and subnotebooks. By August 2009, when comparing two Dell models, one marketed as a netbook and the other as a conventional laptop, CNET called netbooks "nothing more than smaller, cheaper notebooks", noting: "the specs are so similar that the average shopper would likely be confused as to why one is better than the other", and "the only conclusion is that there really is no distinction between the devices". In an attempt to prevent cannibalizing the more lucrative laptops in their lineup, manufacturers imposed several constraints on netbooks; however, this would soon push netbooks into a niche where they had few distinctive advantages over traditional laptops or tablet computers (see below). By 2011 the increasing popularity of tablet computers (particularly the iPad)—a different form factor, but with improved computing capabilities and lower production cost—had led to a decline in netbook sales. At the high end of the performance spectrum, ultrabooks, ultra-light portables with a traditional keyboard and display, have been revolutionized by the 11.6-inch MacBook Air, which made fewer performance sacrifices albeit at a considerably higher production cost. Capitalizing on the success of the MacBook Air, and in response to it, Intel promoted Ultrabook as a new high-mobility standard, which some analysts have hailed as succeeding where netbooks failed. As a result of these two developments, netbooks of 2011 kept price as their only strong point, losing to tablets (and tablets with removable keyboards) on design, ease of use and portability, and to Ultrabook laptops on features and performance. By the end of 2012 few machines were marketed as "netbooks". Many netbook products were replaced on the market by Chromebooks, a hardware- and software-specification in the form of a netbook and a variation on the network-computer concept. HP re-entered the non-Chromebook netbook market with the Stream 11 in 2014, although the term "netbook" is seldom in use anymore. Some specialised computers have also been released more recently with form factors comparable to netbooks, such as the GPD Win and its successor, the GPD Win 2.
History While Psion had an unrelated netBook line of machines, use of the broad marketing term "netbook" began in 2007 when Asus unveiled the Asus Eee PC. Originally designed for emerging markets, the device weighed about and featured a display, a keyboard approximately 85% the size of a normal keyboard, a solid-state drive and a custom version of Linux with a simplified user interface geared towards consumer use. Following the Eee PC, Everex launched its Linux-based CloudBook; Windows XP and Windows Vista models were also introduced and MSI released the Wind—others soon followed suit. The OLPC project followed the same market goals laid down by the eMate 300 eight years earlier. Known for its innovation in producing a durable, cost- and power-efficient netbook for developing countries, it is regarded as one of the major factors that led more top computer hardware manufacturers to begin creating low-cost netbooks for the consumer market. When the first Asus Eee PC sold over 300,000 units in four months, companies such as Dell and Acer took note and began producing their own inexpensive netbooks. While the OLPC XO-1 targets a different audience than the other manufacturers' netbooks, OLPC now appears to be facing competition. Developing countries now have a wide choice of vendors from which to choose a low-cost netbook. By late 2008, netbooks began to take market share away from notebooks. The category was more successful than earlier "mini notebooks", most likely because of lower cost and greater compatibility with mainstream laptops. Having peaked at about 20% of the portable computer market, netbooks started to slightly lose market share (within the category) in early 2010, coinciding with the appearance and success of the iPad. Technology commentator Ross Rubin argued two and a half years later in Engadget that "Netbooks never got any respect. While Steve Jobs rebuked the netbook at the iPad's introduction, the iPad owes a bit of debt to the little laptops. The netbook demonstrated the potential of an inexpensive, portable second computing device, with a screen size of about 10 inches, intended primarily for media consumption and light productivity." Although some manufacturers directly blamed competition from the iPad, some analysts pointed out that larger, fully fledged laptops had entered the price range of netbooks at about the same time. The 11.6-inch MacBook Air, introduced in late 2010, compared favorably to many netbooks in terms of processing power but also ergonomics, at 2.3 pounds being lighter than some 10-inch netbooks, owing in part to the integration of the flash storage chips on the main logic board. It was described as a superlative netbook (or at least as what a netbook should be) by several technology commentators, even though Apple has never referred to it as such, sometimes describing it—in the words of Steve Jobs—as "the third kind of notebook." The entry-level model had an MSRP of $999, costing significantly more than the average netbook, as much as three or four times more. In 2011 tablet sales overtook netbooks for the first time, and in 2012 netbook sales fell by 25 percent, year-on-year. The sustained decline since 2010 had been most pronounced in the United States and in Western Europe, while Latin America was still showing some modest growth. In December 2011, Dell announced that it was exiting the netbook market. In May 2012, Toshiba announced it was doing the same, at least in the United States. An August 2012 article by John C.
Dvorak in PC Magazine claimed that the term "netbook" is "nearly gone from the lexicon already", having been superseded in the marketplace largely by the more powerful (and MacBook Air inspired) Ultrabook—described as "a netbook on steroids"—and to a lesser extent by tablets. In September 2012 Asus, Acer and MSI announced that they would stop manufacturing 10-inch netbooks. Simultaneously Asus announced they would stop developing all Eee PC products, instead focusing on their mixed tablet-netbook Transformer line. With the introduction of Chromebooks, major manufacturers produced the new laptops for the same segment of the market that netbooks serviced. Chromebooks, a variation on the network computer concept, in the form of a netbook, require internet connections for full functionality. Chromebooks became top-selling laptops in 2014. The threat of Google Chrome OS-based Chromebooks prompted Microsoft to revive and revamp netbooks with Windows 8.1 with Bing. HP re-entered the non-Chromebook netbook market with the Stream 11 in 2014. Educational use In Australia, the New South Wales Department of Education and Training, in partnership with Lenovo, provided Year 9 (high school) students in government high schools with Lenovo S10e netbooks in 2009, Lenovo Mini 10 netbooks in 2010, Lenovo Edge 11 netbooks in 2011 and a modified Lenovo X130e netbook in 2012, each preloaded with software including Microsoft Office and Adobe Systems' Creative Suite 4. These were provided under Prime Minister Kevin Rudd's Digital Education Revolution, or DER. The netbooks ran Windows 7 Enterprise. These netbooks were secured with Computrace LoJack for Laptops, which the police can use to track the device if it is lost or stolen. The NSW DET retains ownership of these netbooks until the student graduates from Year 12, when the student can keep it. The Government of Trinidad and Tobago, under Prime Minister Kamla Persad-Bissessar, is also providing HP laptops to Form 1 students (11-year-olds) with the same police-trackable software as above. Greece provided all 13-year-old students (middle school, or gymnasium, freshmen) and their teachers with netbooks in 2009 through the "Digital Classroom Initiative". Students were given one unique coupon each, with which they redeemed the netbook of their choice, up to a €450 price ceiling, in participating shops throughout the country. These netbooks came bundled with localised versions of either Windows XP (or higher) or open source (e.g. Linux) operating systems, wired and wireless networking functionality, antivirus protection, preactivated parental controls, and an educational software package. Trademarks In 1996 Psion started applying for trademarks for a line of netBook products that was later released in 1999. International trademarks were issued (including and ) but the models failed to gain popularity and are now discontinued (except for providing accessories, maintenance and support to existing users). Similar marks were recently rejected by the USPTO citing a "likelihood of confusion" under section 2(d). Despite expert analysis that the mark is "probably generic", Psion Teklogix issued cease and desist letters on 23 December 2008. This was heavily criticised, prompting the formation of the "Save the Netbooks" grassroots campaign which worked to reverse the Google AdWords ban, cancel the trademark and encourage continued generic use of the term.
While preparing a "Petition for Cancellation" of they revealed that Dell had submitted one day before on the basis of abandonment, genericness and fraud. They later revealed Psion's counter-suit against Intel, filed on 27 February 2009. It was also revealed around the same time that Intel had also sued Psion Teklogix (US & Canada) and Psion (UK) in the Federal Court on similar grounds. In addition to seeking cancellation of the trademark, Intel sought an order enjoining Psion from asserting any trademark rights in the term "netbook", a declarative judgment regarding their use of the term, attorneys' fees, costs and disbursements and "such other and further relief as the Court deems just and proper". On June 2, 2009, Psion announced that the suit had been settled out of court. Psion's statement said that the company was withdrawing all of its trademark registrations for the term "Netbook" and that Psion agreed to "waive all its rights against third parties in respect of past, current or future use" of the term. Hardware Netbooks typically have less powerful hardware than larger laptop computers and do not include the optical disc drive that contemporaneous laptops often had. Netbooks were some of the first machines to substitute solid-state storage devices for hard drives, as these were smaller, required less power, and were more shock-resistant. Unlike modern solid state drives, these early models often did not offer better performance. Almost all netbooks supported Wi-Fi and some supported mobile broadband. Some also included Ethernet and/or modems. Most netbooks used x86 processors. Most early netbooks used processors from the Intel Atom line, but some used competing processors from AMD, including Fusion netbook processors, or VIA Technologies, including the C7 and Nano. Some very low-cost netbooks used a system-on-a-chip Vortex86 processor meant for embedded systems. A few netbooks used non-x86 processors based on ARM or MIPS architectures. Operating systems Windows Microsoft announced on April 8, 2008 that, despite the impending end of retail availability for the operating system that June, it would continue to license low-cost copies of Windows XP Home Edition to OEMs through October 2010 (one year after the release of Windows 7) for what it defined as "ultra low-cost personal computers"—a definition carrying restrictions on screen size and processing power. The move served primarily to counter the use of low-cost Linux distributions on netbooks and create a new market segment for Windows devices, whilst ensuring that the devices did not cannibalize the sales of higher-end PCs running Windows Vista. In 2009, over 90% (96% claimed by Microsoft as of February 2009) of netbooks in the United States were estimated to ship with Windows XP. For Windows 7, Microsoft introduced a new stripped-down edition intended for netbooks known as "Starter", exclusively for OEMs. In comparison to Home Premium, Starter has reduced multimedia functionality, does not allow users to change their desktop wallpaper or theme, disables the "Aero Glass" theme, and does not have support for multiple monitors. For Windows 8, in a move to counter Chrome OS-based netbooks and low-end Android tablets, Microsoft began to offer no-cost Windows licenses to OEMs for devices with screens smaller than 9 inches in size. Additionally, Microsoft began to offer low-cost licenses for a variant of the operating system set up to use Microsoft's Bing search engine by default.
Windows CE has also been used in netbooks, due to its reduced feature set. Android Google's Android software platform, designed for mobile telephone handsets, has been demonstrated on an ASUS Eee PC, and its version of the Linux operating system contains policies for mobile internet devices, including the original Asus Eee PC 701. ASUS has allocated engineers to develop an Android-based netbook. In May 2009, a contractor of Dell announced it was porting Adobe Flash Lite to Android for Dell netbooks. Acer announced that Android netbooks would be available in Q3 2009. In July 2009, a new project, Android-x86, was created to provide an open source solution for Android on the x86 platform, especially for netbooks. Chrome OS In 2011, Google introduced Chrome OS, a Linux-based operating system designed particularly for netbook-like devices marketed as "Chromebooks". The platform is designed to leverage online services, cloud computing, and its namesake Chrome web browser as its shell—so much so that the operating system initially used a full screen web browser window as its interface, and contained limited offline functionality. Later versions of Chrome OS introduced a traditional desktop interface and a platform allowing "native" packaged software written in HTML, JavaScript, and CSS to be developed for the platform. Other Netbooks have sparked the development of several Linux variants or completely new distributions, which are optimized for small screen use and the limited processing power of the Atom or ARM processors which typically power netbooks. Examples include Ubuntu Netbook Edition, EasyPeasy, Joli OS and MeeGo. Both Joli OS and MeeGo purport to be "social oriented" or social networking operating systems rather than traditional "office work production" operating systems. Netbook users can also install other UNIX-based operating systems such as FreeBSD, NetBSD, OpenBSD, and Darwin. Since 2010, major netbook manufacturers no longer install or support Linux in the United States. The reason for this change of stance is unclear, although it coincides with the availability of a 'netbook' version of Windows XP, and later of Windows 7 Starter, and a strong marketing push for the adoption of this OS in the netbook market. However, companies targeting niche markets, such as System76 and ZaReason, continue to pre-install Linux on the devices they sell. The Cloud operating system attempts to capitalize on the minimalist aspect of netbooks. The user interface is limited to a browser application only. Mac OS X has been demonstrated running on various netbooks as a result of the OSx86 project, although this is in violation of the operating system's end-user license agreement. Apple has complained to sites hosting information on how to install OS X onto non-Apple hardware (including Wired and YouTube), which have reacted by removing content in response. One article nicknamed a netbook running OS X a "Hackintosh." The MacBook Air can be considered an expensive netbook. Use A June 2009 NPD study found that 60% of netbook buyers never take their netbooks out of the house. Special "children's" editions of netbooks have been released under Disney branding; their low cost (less at risk), lack of DVD player (less to break) and smaller keyboards (closer to children's hand sizes) are viewed as significant advantages for that target market. The principal objection to netbooks in this context is their lack of good video performance for streaming online video and their lack of speed with even simple games.
Adults browsing for text content are less dependent on video content than small children who cannot read. Netbooks are a growing trend in education for several reasons. The need to prepare children for 21st-century lifestyles, the hundreds of new educational tools available online, and a growing emphasis on student-centered learning are three of the biggest contributing factors to the rising use of netbook technology in schools. Dell was one of the first to mass-produce a ruggedised netbook for the education sector, featuring a rubber overlay, a touchscreen and a network activity light to show the teacher the netbook is online. Netbooks offer several distinct advantages in educational settings. First, their compact size and weight make for an easy fit in student work areas. Similarly, the small size makes netbooks easier to transport than heavier, larger traditional laptops. In addition, prices ranging from $200 to $600 mean the affordability of netbooks can be a relief to school budget makers. Despite the small size and price, netbooks are fully capable of accomplishing most school-related tasks, including word processing, presentations, access to the Internet, multimedia playback, and photo management. See also References External links "The rise of the Netbook" article at CNET "The State of the Netbook" article at Ars Technica "The Netbook Effect: How Cheap Little Laptops Hit the Big Time" article at Wired "Light and Cheap, Netbooks Are Poised to Reshape PC Industry" article at New York Times "5 Tips to Boost the Laptop Speed" Appropriate technology Cloud clients Information appliances Japanese inventions
29997106
https://en.wikipedia.org/wiki/Cable%20router
Cable router
"Cable router" has two basic meanings: Single Cable Router (SCR) - a down-conversion device for the radio data link. It converts RF signal from a satellite dish or TV antenna to the user-defined IF channel. Usually, many SCRs are connected to a single coaxial cable - each converting to a separate IF channel. The entire system referred to as Single Cable Distribution. A piece of computer network equipment located between cable modem and LAN, performing functions of the network router in a modem. Cable routers are usually integrated with the modem, frequently incorporating firewall, proxy, or network gateway functions as well. See also Cable/DSL router Networking hardware
29070755
https://en.wikipedia.org/wiki/Comparison%20of%20TLS%20implementations
Comparison of TLS implementations
The Transport Layer Security (TLS) protocol provides the ability to secure communications across networks. This comparison of TLS implementations compares several of the most notable libraries. There are several TLS implementations which are free software and open source. All comparison categories use the stable version of each implementation listed in the overview section. The comparison is limited to features that directly relate to the TLS protocol. Overview Protocol support Several versions of the TLS protocol exist. SSL 2.0 is a deprecated protocol version with significant weaknesses. SSL 3.0 (1996) and TLS 1.0 (1999) are successors with two weaknesses in CBC-padding that were explained in 2001 by Serge Vaudenay. TLS 1.1 (2006) fixed only one of the problems, by switching to random initialization vectors (IV) for CBC block ciphers, whereas the more problematic use of mac-pad-encrypt instead of the secure pad-mac-encrypt was addressed with RFC 7366. A workaround for SSL 3.0 and TLS 1.0, roughly equivalent to random IVs from TLS 1.1, was widely adopted by many implementations in late 2011, so from a security perspective, all existing versions of TLS 1.0, 1.1 and 1.2 provide equivalent strength in the base protocol and are suitable for 128-bit security according to NIST SP800-57 up to at least 2030. In 2014, the POODLE vulnerability of SSL 3.0 was discovered, which takes advantage of the known vulnerabilities in CBC, and an insecure fallback negotiation used in browsers. TLS 1.2 (2008) introduced a means to identify the hash used for digital signatures. While permitting the use of stronger hash functions for digital signatures in the future (rsa,sha256/sha384/sha512) over the SSL 3.0 conservative choice (rsa,sha1+md5), the TLS 1.2 protocol change inadvertently and substantially weakened the default digital signatures and provides (rsa,sha1) and even (rsa,md5). Datagram Transport Layer Security (DTLS or Datagram TLS) 1.0 is a modification of TLS 1.1 for a packet-oriented transport layer, where packet loss and packet reordering have to be tolerated. The revision DTLS 1.2 based on TLS 1.2 was published in January 2012. Note that there are known vulnerabilities in SSL 2.0 and SSL 3.0. With the exception of the predictable IVs (for which an easy workaround exists) all currently known vulnerabilities affect all versions of TLS 1.0/1.1/1.2 alike. NSA Suite B Cryptography Required components for NSA Suite B Cryptography (RFC 6460) are: Advanced Encryption Standard (AES) with key sizes of 128 and 256 bits. For traffic flow, AES should be used with either the Counter Mode (CTR) for low bandwidth traffic or the Galois/Counter Mode (GCM) mode of operation for high bandwidth traffic (see Block cipher modes of operation) — symmetric encryption Elliptic Curve Digital Signature Algorithm (ECDSA) — digital signatures Elliptic Curve Diffie–Hellman (ECDH) — key agreement Secure Hash Algorithm 2 (SHA-256 and SHA-384) — message digest Per CNSSP-15, the 256-bit elliptic curve (specified in FIPS 186-2), SHA-256, and AES with 128-bit keys are sufficient for protecting classified information up to the Secret level, while the 384-bit elliptic curve (specified in FIPS 186-2), SHA-384, and AES with 256-bit keys are necessary for the protection of Top Secret information. Certifications Note that certain certifications have received serious negative criticism from people who are actually involved in them.
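As an illustration of the protocol-support differences discussed above, the following minimal sketch shows how an application built on an OpenSSL-backed TLS implementation (here via Python's ssl module) can refuse the older protocol versions and disable TLS-level compression; the host example.com is only a placeholder, and other libraries in this comparison expose equivalent but differently named settings.

import socket
import ssl

# Require TLS 1.2 or newer and disable TLS-level compression
# (see the Compression section below regarding the CRIME exploit).
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.options |= ssl.OP_NO_COMPRESSION
context.load_default_certs()

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())  # negotiated protocol, e.g. "TLSv1.3"

Because such a context does not offer SSL 2.0 or SSL 3.0, and TLS 1.0/1.1 are excluded by the minimum-version setting, the handshake simply fails against servers that only support the deprecated versions.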
Key exchange algorithms (certificate-only) This section lists the certificate verification functionality available in the various implementations. Key exchange algorithms (alternative key-exchanges) Certificate verification methods Encryption algorithms Notes Obsolete algorithms Notes Supported elliptic curves This section lists the elliptic curves supported by each implementation. Notes Data integrity Compression Note that the CRIME security exploit takes advantage of TLS compression, so conservative implementations do not enable compression at the TLS level. HTTP compression is unrelated and unaffected by this exploit, but is exploited by the related BREACH attack. Extensions In this section the extensions each implementation supports are listed. Note that the Secure Renegotiation extension is critical for HTTPS client security. TLS clients not implementing it are vulnerable to attacks, irrespective of whether the client implements TLS renegotiation. Assisted cryptography This section lists the known ability of an implementation to take advantage of CPU instruction sets that optimize encryption, or utilize system-specific devices that allow access to underlying cryptographic hardware for acceleration or for data separation. System-specific backends This section lists the ability of an implementation to take advantage of the available operating system-specific backends, or even the backends provided by another implementation. Cryptographic module/token support Code dependencies Development environment API Portability concerns See also SCTP — with DTLS support DCCP — with DTLS support SRTP — with DTLS support (DTLS-SRTP) and Secure Real-Time Transport Control Protocol (SRTCP) References Cryptographic software TLS implementations
2153321
https://en.wikipedia.org/wiki/The%20Software%20Toolworks
The Software Toolworks
The Software Toolworks (commonly abbreviated as Toolworks) was an American software and video game developer based in Novato, California. The company was founded by Walt Bilofsky in 1980 out of his Sherman Oaks garage, which he converted into an office, to develop software for the Heathkit H89 microcomputer. It quickly expanded into video games, releasing Airport and MyChess in 1980; other notable games include Chessmaster 2000, Mavis Beacon Teaches Typing, and Mario Is Missing!. Toolworks merged with its distributor, Software Country, in 1986 and, after going public in 1988, acquired IntelliCreations, DS Technologies, and Mindscape. By 1994, Toolworks employed 600 people and had a revenue of . In May that year, it was acquired by Pearson plc for , which converted it to bear the Mindscape identity by November. History Early years (1979–1982) The Software Toolworks was founded by programmer Walt Bilofsky, who, after studying at Cornell University and the Massachusetts Institute of Technology (MIT), had worked for the Institute for Defense Analyses, as a programmer for RAND Corporation, and as a consultant. In 1979, he acquired and assembled a Heathkit H89 microcomputer; he found that the microcomputer lacked important software and thus began developing new software and ports of his own, including a fullscreen editor and a compiler for the C programming language entitled C/80, the latter based on Ron Cain's public-domain compiler Small-C. Bilofsky subsequently contacted the Heath Company, which made the Heathkit series of microcomputers, to have it market his software and, in response, was told that the operating system and the BASIC programming language Heathkit microcomputers came with were sufficient. He instead turned to advertising his software in BUSS, a Heathkit hobbyist newsletter, starting in 1980, quickly receiving orders for his software. Bilofsky eventually adopted the name "The Software Toolworks", using it publicly for the first time with an advertisement submitted to the magazine Byte in June 1980. He converted his garage in Sherman Oaks, California, to a two-room office, outfitting it with a disk duplicator, shelving, and a shipping area. This office was later relocated into a garden shed. By the end of the year, Toolworks had entered the video game business, having published Airport, an air traffic control game by Jim Gillogly, and MyChess, a chess game by Dave Kittinger. This continued in 1981, with Robert Wesson developing a clone of Pac-Man, the game Munchkin, and a port of Invaders for the H89, and Bilofsky adapting the artificial intelligence psychiatrist ELIZA. Other early non-game software included the spreadsheet editor Zencalc (later replaced by MyCalc), the text editor PIE, the text formatting application TEXT, and the spelling checker SPELL. One of Toolworks' major releases was a port of Adventure, a text adventure game developed by William Crowther in 1975 and later expanded by Don Woods. Gillogly made Bilofsky aware of the game and, by 1982, was able to get the game running on an H89 using Bilofsky's C/80 compiler. Although the game was in the public domain, Bilofsky decided to release an official version with the approval of Crowther and Woods.
This version was expanded so that, at the end of the game, the player is admitted into a fictional "Wizards' Guild" and given a password that could be posted to Toolworks in return for a "Certificate of Wizardness", underwritten by Crowther and Woods, and signed with the Toolworks corporate seal, the only time this seal was used. The game was released in 1982 and came with a manual packaged in a Ziploc bag. Expansion (1983–1987) In 1983, Toolworks was joined by Joe Abrams, Bilofsky's cousin. That same year, the company moved into a proper, three-room office on the 11th floor of a Sherman Oaks bank building, opposite the Sherman Oaks Galleria. This move was made possible by Toolworks' growing sales, and by this time, its products were sold through more than 50 Heathkit stores, and it had released a total of 40 products by 1984. That year, distributor Software Country and its manager, Les Crane, licensed Toolworks' versions of Adventure and ELIZA for a software compilation disk titled Golden Oldies Vol I, which was released the following year. Subsequently, Crane agreed with Abrams that Software Country would market a chess game developed by Toolworks; for this project, Toolworks brought on Mike Duffy, who had ported MyChess to IBM PC and PCjr, and the team developed Chessmaster 2000. Crane stepped up the marketing efforts for the game, paying for the cover photo. Bilofsky described this change as the "emphatic end of the Ziploc bag era". Chessmaster 2000 was released in 1986 and sold 100,000 copies within seven months. Building from this success, Toolworks and Software Country merged in October 1986, with Toolworks as the surviving entity. The merged company then bought Priority Software Packaging, a disk duplication and software packaging company, the following November. Following the merger, Crane conceived a typing application in which the user would be guided by Mavis Beacon, a fictional typing instructor who would correct the user's mistakes. The product, Mavis Beacon Teaches Typing, was developed by Bilofsky, Duffy and Norman Worthington from Bilofsky's home in six months, with Duffy often working more than 140 hours per week. The team aimed to make the application more fun to keep users engaged; thus it incorporated large quantities of text it deemed interesting, generated mistake analyses, and made the application visually appealing. Renée L'Espérance, a Haitian woman whom Crane and Abrams had met at a Saks Fifth Avenue store, was contracted to represent Mavis Beacon. Due to her darker skin, several stores initially refused to sell the application when it was released in 1987. This changed when a positive review of the application published in The New York Times generated much demand, restoring all of Toolworks' usual distribution channels within two weeks. As a public company and under Pearson (1987–1994) In February 1987, Toolworks signed a distribution deal with Electronic Arts (expanded for distribution in Europe in July), which required Toolworks to port each new game to Apple II, Apple III, Apple IIGS, Macintosh, PCjr, Atari 8-bit, Atari ST, Commodore 64, Amiga and IBM PC computers, the latter in both color and monochrome versions. Each team member at the company was tasked with developing one of the ports, but the undertaking eventually proved a financial strain and Toolworks ran out of funds by the end of 1987.
To raise new capital, the company became a public company in January 1988, through a reverse merger with Deseret-Western Venture Capital, an existing public shell corporation registered in Utah. By June 1988, Toolworks had 45 employees. Shortly thereafter, the company acquired developers IntelliCreations (of Chatsworth, California) in August 1988 and DS Technologies (of West Chicago, Illinois) in February 1989. With the acquisition of IntelliCreations, Toolworks announced that it would move its headquarters to Chatsworth. Toolworks also agreed with manufacturer Vendex to have Toolworks' games included with Vendex's machines. Life & Death, a surgery simulation game, was released in 1988. In 1989, the company released Beyond the Black Hole, a stereoscopic 3-D arcade game that came with 3-D glasses. By 1989, Chessmaster games and Mavis Beacon Teaches Typing had collectively sold 750,000 copies through retail and licensing deals. Looking to obtain a development license for Nintendo platforms, which were difficult to obtain, Toolworks acquired Mindscape, an existing license holder based in Northbrook, Illinois, in March 1990. Using Mindscape's license, Toolworks released a follow-up to Mavis Beacon Teaches Typing focused on piano teaching: Miracle Piano Teaching System. The application came with a physical pressure-sensitive keyboard, of which Toolworks had ordered 100,000. The required quantity was overestimated and many keyboards were damaged in transit, causing high financial losses for Toolworks. In April 1990, Elizabeth Barker became the president and chief operating officer (COO) of Toolworks, succeeding Crane (who remained chairman and chief executive officer) in the president role, and was succeeded herself in both roles by Robert Lloyd in November 1990. In September 1990, Toolworks moved from Chatsworth to Novato. While in talks with Japanese original equipment manufacturers (OEMs), the chief executive officer of Philips introduced Abrams to CD-ROM drives; CD-ROM discs could store high capacities of data but drives for them were uncommon in households at the time. Developer LucasArts had completed three CD-ROM games but struggled to sell them. In 1992, Toolworks licensed the games from LucasArts and had them distributed with new PCs by the Japanese OEMs. Within one month, this led to more sales of these games than LucasArts had achieved in the two years prior. The period from 1992 to 1993 saw the release of several titles: Star Wars Chess, Mario's Time Machine, Mario's Early Years!, Legend, San Diego Zoo Presents: The Animals!, and the PC version of Ultimate Domain. Toolworks continued to grow further, to 600 employees by 1994, when it was generating annual revenues of . That May, the company was acquired by British media company Pearson plc for . Shortly thereafter, by November, Toolworks had assumed the Mindscape moniker for all of its operations, which is considered the end of Toolworks. Games Airport (1980) Mychess (1980) ELIZA (1981) Munchkin (1981) Adventure (1982) Golden Oldies: Volume 1 - Computer Software Classics (1985) Chessmaster 2000 (1986) Mavis Beacon Teaches Typing!
(1987) The Fidelity Chessmaster 2100 (1988) Life & Death (1988) The Hunt for Red October (1988) Cribbage King / Gin King (1989) Beyond the Black Hole (1989) The Chessmaster (1989) Bruce Lee Lives (1989) Orb-3D (1990) The Games People Play: Gin ∙ Cribbage ∙ Checkers ∙ Backgammon (1990) Life & Death II: The Brain (1990) Miracle Piano Teaching System (1990) The Big Deal (1991) The Chessmaster 3000 (1991) Mario is Missing! (1992) Mario's Early Years: Fun With Letters (1993) Capitol Hill (1993) The Chessmaster 4000 Turbo (1993) Star Wars Chess (1993) Mario's Time Machine (1993) MegaRace (1993) Mario's Early Years: Fun with Numbers (1994) Mavis Beacon Teaches Typing! for Kids (1994) Mario's Early Years: Preschool Fun (1994) Ultimate Domain (1994) Evasive Action (1994) Maniac Sports (1994) Space Shuttle (1994) Al Unser Jr.'s Road to the Top (1994) NCAA Football (1994) References External links 1980 establishments in California 1994 disestablishments in California Video game companies based in California Video game companies disestablished in 1994 Video game companies established in 1980 Video game development companies
3011741
https://en.wikipedia.org/wiki/Software%20token
Software token
A software token (a.k.a. soft token) is a piece of a two-factor authentication security device that may be used to authorize the use of computer services. Software tokens are stored on a general-purpose electronic device such as a desktop computer, laptop, PDA, or mobile phone and can be duplicated. (Contrast hardware tokens, where the credentials are stored on a dedicated hardware device and therefore cannot be duplicated (absent physical invasion of the device).) Because software tokens are something one does not physically possess, they are exposed to unique threats based on duplication of the underlying cryptographic material - for example, computer viruses and software attacks. Both hardware and software tokens are vulnerable to bot-based man-in-the-middle attacks, or to simple phishing attacks in which the one-time password provided by the token is solicited, and then supplied to the genuine website in a timely manner. Software tokens do have benefits: there is no physical token to carry, they do not contain batteries that will run out, and they are cheaper than hardware tokens. Security architecture There are two primary architectures for software tokens: shared secret and public-key cryptography. For a shared secret, an administrator will typically generate a configuration file for each end-user. The file will contain a username, a personal identification number, and the secret. This configuration file is given to the user. The shared secret architecture is potentially vulnerable in a number of areas. The configuration file can be compromised if it is stolen and the token is copied. With time-based software tokens, it is possible to borrow an individual's PDA or laptop, set the clock forward, and generate codes that will be valid in the future. Any software token that uses shared secrets and stores the PIN alongside the shared secret in a software client can be stolen and subjected to offline attacks. Shared secret tokens can be difficult to distribute, since each token is essentially a different piece of software. Each user must receive a copy of the secret, which can create time constraints. Some newer software tokens rely on public-key cryptography, or asymmetric cryptography. This architecture eliminates some of the traditional weaknesses of software tokens, but does not affect their primary weakness (ability to duplicate). A PIN can be stored on a remote authentication server instead of with the token client, making a stolen software token no good unless the PIN is known as well. However, in the case of a virus infection the cryptographic material can be duplicated and then the PIN can be captured (via keylogging or similar) the next time the user authenticates. If there are attempts made to guess the PIN, it can be detected and logged on the authentication server, which can disable the token. Using asymmetric cryptography also simplifies implementation, since the token client can generate its own key pair and exchange public keys with the server. See also Authentication Electronic authentication Google Authenticator Multi-factor authentication Security token References External links Microsoft to abandon passwords, Banks to Use 2-factor Authentication by End of 2006 Cryptography Computer access control fr:Authentification forte
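As an illustration of the shared-secret, time-based architecture described in the article above, the sketch below follows the HOTP/TOTP construction standardized in RFC 4226 and RFC 6238; it is not the code of any particular commercial token, and the 30-second interval and six-digit output are merely common defaults.

import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    # The counter is the number of fixed-length intervals since the Unix epoch (RFC 6238).
    counter = int(time.time()) // interval
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): the low 4 bits of the last byte select an offset.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

Because the one-time code depends only on the shared secret and the clock, anyone who duplicates the secret, or who sets a borrowed device's clock forward, can compute codes that will be valid in the future, which is exactly the weakness of the shared-secret architecture noted above.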
2597939
https://en.wikipedia.org/wiki/PicoBlaze
PicoBlaze
PicoBlaze is the designation of a series of three free soft processor cores from Xilinx for use in their FPGA and CPLD products. They are based on an 8-bit RISC architecture and can reach speeds up to 100 MIPS on the Virtex-4 FPGA family. The processors have an 8-bit address and data port for access to a wide range of peripherals. The license of the cores allows their free use, albeit only on Xilinx devices, and they come with development tools. Third-party tools are available from Mediatronix and others. PacoBlaze, a behavioral and device-independent implementation of the cores, also exists and is released under the BSD License. The PauloBlaze is an open source VHDL implementation under the Apache License. The PicoBlaze design was originally named KCPSM which stands for "Constant(K) Coded Programmable State Machine" (formerly "Ken Chapman's PSM"). Ken Chapman was the Xilinx systems designer who devised and implemented the microcontroller. Instantiation When instantiating a PicoBlaze microcontroller in VHDL, the respective KCPSM component name must be used. For example, for a PicoBlaze3 processor:
component kcpsm3 is
  port (
    address       : out std_logic_vector(9 downto 0);
    instruction   : in  std_logic_vector(17 downto 0);
    port_id       : out std_logic_vector(7 downto 0);
    write_strobe  : out std_logic;
    out_port      : out std_logic_vector(7 downto 0);
    read_strobe   : out std_logic;
    in_port       : in  std_logic_vector(7 downto 0);
    interrupt     : in  std_logic;
    interrupt_ack : out std_logic;
    reset         : in  std_logic;
    clk           : in  std_logic
  );
end component;
Performance All instructions execute in two clock cycles, making performance of the core instruction set deterministic. Interrupt response is not more than five clock cycles. As a resource optimization, it is possible for two PicoBlaze cores to share the same 1k x 18 instruction PROM, taking advantage of the dual-ported implementation of this block on Xilinx FPGAs. Architectural notes Xilinx documents the PicoBlaze as requiring just 96 FPGA slices. The small implementation size is achieved in part through a fairly rigid separation of the instruction sequencing side (program counter, call-return stack, implied stack pointer, and interrupt enable bit) from the execution side (ALU, register file, scratchpad RAM, Z/C status bits). The only information which flows from the compute side to the sequencing side is the pair of zero and carry ALU status bits, when tested by the conditional JUMP and CALL instructions. It is not possible to implement computed jumps or function pointers. The only information which flows from the sequencing side to the execution side are operand fields: destination register (4 bits), ALU opcode (six bits), optional source register (4 bits), optional 8-bit immediate value/port-address, optional 6-bit scratchpad address. There is no mechanism to inspect the value of the stack pointer, the contents of the 31-entry stack, the interrupt enable bit, or the contents of program memory. The instruction sequencing side does not contain an adder, so relative branches and position independent code are not possible. All jump and call addresses are absolute. The PicoBlaze is poorly suited to programming in compiled languages such as C. In addition to the lack of support for function pointers, there are no instructions or addressing modes to expedite a stack-based calling convention. For PicoBlaze it takes two instructions to implement PUSH or POP and two instructions to implement relative addressing off a software-designated stack pointer.
The PicoBlaze is better suited to a hand-optimized register-based calling convention. This does not preclude the use of a Forth-like data stack, and in fact the PicoBlaze is well suited to this approach, if the 64-byte scratchpad memory offers sufficient space. See also MicroBlaze External links Processor and derivatives: PicoBlaze on the Xilinx website PicoBlaze user manual PicoBlaze user resources Implementation of picoblaze in LabVIEW FPGA on the Xilinx Spartan 3E Starter board PacoBlaze: an open source synthesizable and behavioral Verilog clone of PicoBlaze PacoBlaze implementation description NanoBlaze: a VHDL model with generics to define various sizes PauloBlaze: an open source VHDL model fully compatible with the ISA of the kcpsm6 Tools: Open source Picoblaze assembler PicoBlaze Debugger, Software and RTL Hardware development with ModelSim MDS, Professional IDE for Linux and Windows FIDEx, an assembler IDE for Linux, MAC and Windows pBlazASM, an open source assembler and simulator for Windows pBlazIDE, an assembler IDE for Windows kpicosim, an open source assembler IDE for Linux Opbasm, Cross-platform Open Picoblaze macro assembler for kcpsm3 and kcpsm6 PicoBlaze Simulator in JavaScript References Notes Bibliography Ivanov Vl. Using a PicoBlaze Processor to Traffic Light Control. Cybernetics and Information Technologies, 15, 5, Marin Drinov, 2015, Online , , pp. 131 – 139. SJR:0.212 Soft microprocessors
43043497
https://en.wikipedia.org/wiki/Mountain%20%28video%20game%29
Mountain (video game)
Mountain is a simulation video game developed by David OReilly and published by Double Fine Productions. It was released for Microsoft Windows, OS X, Linux, and iOS in July 2014. The game is an idle game in which the only influence the player can have on the game is at the start of the game where the player is tasked to draw objects. The game is designed to be played in the background while the player uses other applications. Gameplay Mountain is described by its creator David OReilly as a "Mountain Simulator, Relax em’ up, Art Horror etc." game, featuring little interactivity from the player. Upon starting the game, the player is asked to draw responses to a series of questions, described by OReilly as "more psychologically invasive than anything Facebook wants to know about you". The game uses that input to generate a model of a mountain, floating in space, surrounded by a small sphere of atmosphere. At this point, the game lacks significant interactivity; while the player can rotate the view around the mountain and zoom in and out, they cannot affect the mountain in any way. The game is set to be run in the background as the player does other activities on their computer. Over the course of the game, the mountain slowly rotates as accelerated time progresses through day and night cycles and through seasonal changes: the player will see snow form and melt on the mountain, plants and trees grow and wither out. Randomly, the mountain may be hit by everyday objects, termed "artefacts," which then become embedded in the mountain indefinitely. The mountain periodically offers its thoughts to the player as the game progresses. After around fifty hours while the game has been running, the mountain meets its fate when it crashes into a passing giant star, ending the game, at which point the player can start the game over with a new mountain. This can be avoided by repeatedly pressing buttons on one's keyboard which forms a shield around the mountain that protects it from getting destroyed. Development David OReilly had developed a number of fictional video game sequences for the movie Her (2013). Following his involvement, OReilly had interest in creating a real video game, wanting "to explore in patterns and iterations of patterns". He considered the idea of simulating a mountain as "an iconic zen thing", and that the size of mountains dwarf that of the human experience; mountains further "defy objectification because they can't be owned or put in a museum". OReilly described Mountain as "visual silence", and that it is "about letting go of control" while one watches the simulation. To develop the game, OReilly started learning the Unity engine himself. To complete Mountain, he gained assistance from Damien Di Fede who did most of the game's coding. OReilly had revealed the game during the Horizon video game showcase held at the Museum of Contemporary Art, Los Angeles during the same week in June as the Electronic Entertainment Expo 2014. The game was published by Double Fine Productions under their "Double Fine Presents" label aimed for small indie games. The title was initially released on July 1, 2014 for Microsoft Windows, OS X, Linux, and iOS platforms; an Android version was delayed until August 19, 2014 due to the cost of getting the Unity plugin for the operating system. However, official Android support for the game is no longer available. 
OReilly had initially envisioned the game to run as a background application for personal computers, and thus had not spent a great deal of time optimizing the iOS version through the Unity engine. However, within a week of its release, Mountain was one of the top-selling titles on various app store charts, prompting OReilly to develop more optimization for the iOS and pending Android versions. In December 2018, Mountain 2.0, a major update to the game, was released for free on its available platforms. Taking about a year to complete, the update was a greater endeavor than the game's first version. It contains numerous additions and improvements, including more artefacts, optimized shaders, and a Slow Motion mode. OReilly largely credits the update's conception to support for the game from its fan community. Reception The game was generally praised by reviewers as a novel concept for a game, though because of its limited interactivity, many players were discontent with the title, comparing it to a screensaver rather than a game. Zack Kotzer of Vice compared the game to the Tamagotchi toys, though lacking the need to constantly attend to the toys' demands and instead letting the player decide when to see how the mountain is progressing. Others described the title as a passive Katamari game, watching how the mountain accumulates stuff over the course of the game. Some reviewers found the title pretentious; Ben Kuchera of Polygon felt the game may have been a joke by OReilly, and expressed that he did not feel the same sense of wonder that other journalists had found in the game. References External links 2014 video games iOS games Linux games MacOS games Simulation video games Video games developed in Ireland Windows games Double Fine games Indie video games
7083694
https://en.wikipedia.org/wiki/Nick%20Montfort
Nick Montfort
Nick Montfort is a poet and professor of digital media at MIT, where he directs a lab called The Trope Tank. Among his publications are seven books of computer-generated literature and six books from the MIT Press, several of which are collaborations. His work also includes digital projects, many of them in the form of short programs. He lives in New York City. Computer-Generated Books Montfort's The Truelist (Counterpath, 2017) is a computer-generated book-length poem produced by a one-page computer program. The code is included at the end of the book. Montfort has also made a complete studio recording of himself reading The Truelist, available at PennSound. Among Montfort's computer-generated books is #! (pronounced "shebang"), in which he "chooses the programming languages Python, Ruby, and Perl (the last of which has a documented history as a poetic medium) to create impressions of an ideal—machines based on the rules of language." The book includes a Python version of "Taroko Gorge," which is available online in JavaScript and has been modified by many authors. Some of these "remixes" are collected in The Electronic Literature Collection: Volume 3. Montfort collaborated with six others on 2x6, a book published by Les Figues that includes six short programs and some of the short narrative poems these generate in English, Spanish, French, Polish, Japanese, and Russian. This project has also been exhibited and is available online on the web. Montfort's Autopia, which assembles short sentences from the names of automobiles, is another project that appears as a printed book (published by Troll Thread), a gallery installation, and a web page. These and others of his computer-generated books have been considered conceptual writing. Several of Montfort's printed computer-generated books were generated with programs he wrote during National Novel Generation Month (NaNoGenMo). These include three self-published books: Hard West Turn (2018 Edition), Megawatt, and World Clock, the last of which was written during the first NaNoGenMo in 2013. Translations of two of these were published by presses in Europe: World Clock was published in Polish translation by ha!art, and Megawatt in German translation by Frohmann. Montfort is the founder and series editor of Using Electricity, a series of computer-generated books published by Counterpath. In November 2019, Montfort announced "Nano-NaNoGenMo," calling for short computer programs within that year's National Novel Generation Month. His request was that people write programs of 256 characters or fewer to generate novels of 50,000 words or more. He contributed several such programs himself, as did several others. Poetry Montfort's poetry, in addition to computer-generated books and projects, includes digital poems that are collaborations with others. With Amaranth Borsuk and Jesper Juul, he wrote The Deletionist, a system for generating erasure poetry from any page on the web. With Stephanie Strickland he wrote Sea and Spar Between, a generator of about 225 trillion stanzas arranged in a grid and combining language from Herman Melville's Moby-Dick and Emily Dickinson's poems. Montfort and William Gillespie wrote 2002: A Palindrome Story, a 2002-word narrative palindrome published in 2002 in print and on the web. Riddle & Bind (Spineless Books, 2010), Montfort's first book of poetry, is a collection of poems written under constraint and literary riddles. Interactive fiction Montfort has written about interactive fiction and has created several interactive fiction games.
Book and Volume (2005) was a finalist in the 2007 Slamdance Guerilla Gamemaker Competition. However, after Super Columbine Massacre RPG!, which had also been named a finalist, was excluded from the festival, Montfort withdrew from the competition in protest. Released in 2000, Ad Verbum is a wordplay-based game in which the player has to figure out stylistic constraints in different locations and type certain commands in order to solve puzzles. It received the 2000 XYZZY Award for Best Puzzles. Twisty Little Passages: An Approach to Interactive Fiction, a 2003 book, was described by Steve Meretzky as "a thoroughly researched history of interactive fiction, as well as a brilliant analysis of the genre." Writing on Digital Media The group blog Grand Text Auto, active in the early 2000s, was one site where Montfort wrote with others about digital media. Montfort wrote Twisty Little Passages: An Approach to Interactive Fiction (MIT Press, 2003) and co-edited The Electronic Literature Collection: Volume 1 (ELO, 2006) and The New Media Reader (MIT Press, 2003) during that time. Montfort and Georgia Institute of Technology professor Ian Bogost wrote Racing the Beam: The Atari Video Computer System (MIT Press, 2009), a study of the world's first widespread gaming system, the Atari 2600. In the book, they analyze the platforms, or systems, that underlie the computing process. They also discuss the social and cultural implications of the system that dominated the video game market. In 2012, he and nine co-authors published a book about a one-liner program for the Commodore 64. His most recent book is The Future (MIT Press, 2017). A futures studies reviewer describes The Future as "written by an outsider to the foresight community" who "examines the works of artists, inventors, and designers and how they have imagined the future." The book was reviewed as "striking a balance between planning and poetry ... a sober, tight account of what 'the future' is and has been, as well as how to think and make it." Works Poetry The Truelist (2017), also in print, Counterpath Autopia (2016), also in print, Troll Thread 2x6 (collaboration with six others, 2016), also in print, Les Figues Taroko Gorge (2009) Ream/Rame (collaboration with Anick Bergeron, 2008) 2002: A Palindrome Story (collaboration with William Gillespie, 2002), also in print, Spineless Books Prose Grand Text Auto group blog Book and Volume (2005) Implementation (collaboration with Scott Rettberg, 2004) Ad Verbum (2000) Winchester's Nightmare (1999) In print Montfort, Nick (2017). The Future. Cambridge, MA: MIT Press. Montfort, Nick (2016). Exploratory Programming for the Arts and Humanities. Cambridge, MA: MIT Press. Montfort, Nick, and Ian Bogost (2009). Racing the Beam: The Atari Video Computer System. Cambridge, MA: MIT Press. References External links Nick Montfort's personal website Grand Text Auto Blog Taroko Gorge Remixes Ad Verbum 20th-century births Living people American computer scientists American mass media scholars Interactive fiction writers Electronic literature Game researchers Digital media educators Massachusetts Institute of Technology faculty Year of birth missing (living people)
744156
https://en.wikipedia.org/wiki/Discogs
Discogs
Discogs (short for discographies) is a website and crowdsourced database of information about audio recordings, including commercial releases, promotional releases, and bootleg or off-label releases. While the site was originally created with a goal of becoming the largest online database of electronic music, it now includes releases in all genres and on all formats. After the database was opened to contributions from the public, rock music began to become the most prevalent genre listed. Discogs contains over 14.9 million releases, by over 7.9 million artists, across over 1.8 million labels, contributed from over 617,000 contributor user accounts, with these figures constantly growing as users continually add previously unlisted releases to the site over time. The Discogs servers, currently hosted under the domain name discogs.com, are owned by Zink Media, Inc. and located in Portland, Oregon, United States. History The discogs.com domain name was registered on 30 August 2000, and Discogs itself was launched in November 2000 by programmer, DJ, and music fan Kevin Lewandowski; it was originally intended to be a large database of electronic music. Lewandowski's original goal was to build the most comprehensive database of electronic music, organized around the artists, labels, and releases available in electronic genres. In 2003, the Discogs system was completely rewritten, and in January 2004 it began to support other genres, starting with hip hop. Since then, it has expanded to include rock and jazz in January 2005 and funk/soul, Latin and reggae in October of the same year. In January 2006, blues and non-music (e.g. comedy records, field recordings, interviews) were added. Classical music started being supported in June 2007, and in September 2007 the "final genres were turned on" – adding support for the Stage & Screen, Brass & Military, Children's, and Folk, World, & Country music genres, allowing capture of virtually every single type of audio recording that has ever been released. On 30 June 2004, Discogs released a report claiming that it had 15,788 contributors and 260,789 releases. On 20 July 2007, a new system for sellers called Market Price History was introduced on the site. It gave users who paid for a subscription (though the most recent 60 days of information was free) access to the prices at which previous sellers had sold exactly the same release over the preceding 12 months. At the same time, the US$12 per year charge for advanced subscriptions was abolished, as it was felt that the extra features should be made available to all subscribers now that a different revenue stream had been found from sellers and purchasers. Later that year, all paid access features were discarded and full use of the site became free of charge, allowing all users to view the full 12-month Market Price History of each item. Milestones Discogs publishes information indicating the number of releases, labels, and artists presently in its database, along with its contributors. The Master Release function was made available from 30 April 2009. In mid-2019 the contributor pages were limited to show only the top 5000 users, with the total user count being made private, although the total user count figure was re-added sometime during early 2021 (the About Us page also mentions that "More than 592,000..." have contributed to the site). Other projects Discogs has so far created a further six online databases for collating information on related topics.
Only one of them, VinylHub, remains in use. VinylHub In mid-2014, a side project website called VinylHub was started for users to add worldwide information about record stores, including location, contact details, and the type of items they stocked. In August 2020 it was relocated as part of the main Discogs website, under the subdomain vinylhub.discogs.com. Previous projects Five other online databases were previously created; however, they have since closed. Filmogs In late 2014, the company released a new beta website called Filmogs. Users could add their physical film collections (on VHS, DVD, Blu-ray, LaserDisc, or any other type of physical film release) to the database, and buy and sell film releases in the global marketplace. The site was closed down on 31 August 2020. Gearogs Gearogs was launched as a beta in late 2014, at the same time as Filmogs. The site let users add and track music equipment, including items such as synths, drum machines, sequencers, samplers, audio software, and any other electronic music-making equipment. The site was closed down on 31 August 2020. Bookogs At the start of 2015, the company began Bibliogs as another beta project. Users could submit information about their books, physical or electronic, in different versions and editions, and also connect different credits (writers, illustrators, translators, publishers, etc.) to these books. 21,000 books had been submitted by the end of 2016. The project remained in beta until 15 August 2017, when it had reached more than 31,000 book titles; it was then rebranded, initially without explanation, as Bookogs.com because of legal issues with the old name Bibliogs, and the 'Beta' notice was removed from the main page. The next day the Marketplace Beta feature was presented. On 8 June 2019, the project reached a total of 100,000 books. The site was closed down on 31 August 2020, by which point it counted more than 154,000 books and 345,000 credits. Comicogs Comicogs launched around the same time as Bookogs, as a means for comic collectors and enthusiasts to catalog their collections and create an archive of comic releases. Similar to Bookogs, users could contribute comics, manga, graphic novels, and strips to the database, along with information on credits, publishers, writers, etc. 18,000 comics had been submitted by the start of 2018. The Comicogs marketplace was launched on 23 August 2017, allowing users to buy and sell comics from across the world. The site was closed down on 3 August 2020. Posterogs In September 2017, the company launched Posterogs. Posterogs was the only Discogs site to launch a database and marketplace simultaneously. The scope of Posterogs was left broad at the time of launch, with the company opting to let the community define what type of posters, flyers, or similar items should be included in the database. While non-music-related items were fully acceptable for inclusion, much of the primary focus seemed to be on music posters, such as gig/tour posters, album promo posters, and promotional flyers (in keeping with Discogs' music theme), though there were also many film posters in the database. As with all other databases, users could save items to their 'Collection' and 'Wantlist', in addition to buying and selling in the marketplace. The site was closed down on 31 August 2020. API In mid-August 2007, Discogs data became publicly accessible via a RESTful, XML-based API and a license that allowed specially attributed use, but did not allow anyone to "alter, transform, or build upon" the data.
The license has since been changed to a public domain one. Prior to the advent of this license and API, Discogs data was only accessible via the Discogs web site's HTML interface and was intended to be viewed only using web browsers. The HTML interface remains the only authorized way to modify Discogs data. On 7 June 2011, version 2 of the API was released. Notable in this release was that a license key was no longer required, the default response was changed from XML to JSON, and the 5000 queries per day limit was removed (although a limit of 2000 image lookups per day was introduced). On 1 November 2011, a major update to version 2 of the API was released. This new release dropped support for XML; data is now always returned in JSON format, although the monthly data dumps of new data are still provided only in XML format. On 1 February 2014, Discogs modified their API so that image requests require OAuth authorization, requiring each user of third-party applications to have a Discogs "application ID", with image requests now limited to 1,000 per day. Additionally, the Premium API service was dropped. On 24 June 2014, Discogs deprecated their XML API in favor of a JSON-formatted API. Discogs also allows full XML downloads of its Release, Artist, and Label data through the data.discogs.com subdomain. The recommendations API is not publicly available. Contribution system The data in Discogs comes from submissions contributed by users who have registered accounts on the site. The system has gone through four major revisions. Version One (V1) All incoming submissions were checked for formal and factual correctness by privileged users called "moderators", or "mods" for short, who had been selected by site management. Submissions and edits would not become visible or searchable until they received a single positive vote from a "mod". An even smaller pool of super-moderators called "editors" had the power to vote on proposed edits to artist and label data. Version Two (V2) This version introduced the concept of "submission limits", which prevented new users from submitting more than two or three releases for moderation. The number of possible submissions by a user increased on a logarithmic scale. The purpose of this was two-fold: 1) it helped keep the submission queue fairly small and manageable for moderators, and 2) it allowed new users to acclimatise slowly to the many formatting rules and guidelines of submitting to Discogs. Releases required a number of votes to be accepted into the database: initially, votes from four different moderators were required, but over time the number was decreased to three and then two. Version Three (V3) V3 launched in August 2007. Submission limits were eliminated, allowing each user to submit an unlimited number of updates and new entries. New releases added to the database were explicitly marked as "Unmoderated" with a top banner, and updates to existing items, such as releases, artists, or labels, were not shown (or available to search engines or casual visitors) until they were approved by the moderators. Version Four (V4) This system launched on 10 March 2008. New submissions and edits currently take effect immediately. Any time a new release is added or an old release edited, that entry becomes flagged as needing "votes" (initially, "review," but this term caused confusion).
A flagged entry is marked as a full yellow bar across a release in the list views and, like version three, a banner on the submission itself – although, initially, this banner was omitted. Any item can be voted on at any time, even if it isn't flagged. Votes consist of a rating of the correctness & completeness of the full set of data for an item (not just the most recent changes), as assessed by users who have been automatically determined, by an undisclosed algorithm, to be experienced and reliable enough to be allowed to cast votes. An item's "average" vote is displayed with the item's data. The ranking system has also changed in v4. In v3, rank points were only awarded to submitters when a submission was "Accepted" by moderator votes. While in v4, rank points are now awarded immediately when a submission is made, regardless of the accuracy of the information and what votes it eventually receives, if any. Discogs-aware metadata software Tag editors ASMT MP3 Tagger – single release tagger foobar2000 – freeware media player and music management software with a plugin Helium Music Manager – music management software with a plugin Jaikoz – shareware OS X/Windows/Linux spreadsheet-based tag editor Kid3 – open-source project, tagger for all common music formats Mp3tag – freeware tag editor, batch and spreadsheet interfaces OrangeCD Catalog – music management software puddletag – a free and open-source tag editor written for PyQt taghycardia – freeware, automated MP3 tagger Tagog – Linux audio file tagger TagScanner – freeware tag editor with Discogs, FreeDB, TrackType.org support The GodFather – freeware tag editor The Tagger – MP3 and AAC formats tag editor for OS X TigoTago – spreadsheet-based tag editor Other Album Art Downloader – Discogs cover art downloads Discogs Bar – Discogs navigation and search control toolbar for Firefox Discogs Enhancer – Discogs extension adding extra functionality to Google Chrome (inc. dark mode) Discographic for Discogs. Client for Apple devices for iOS MP3 Filenamer – online MP3 file name generator, based on Discogs release data Stecotec Musikverwaltung Pro – Music database software by stecotec.de Music Collector – Music database software by collectorz.com WWW::Discogs – Perl module for interfacing with the Discogs API XLD (X Lossless Decoder) – a CD ripper and audio file converter for OS X See also List of online music databases Global Electronic Music Marketplace References External links American music websites Companies based in Portland, Oregon Internet properties established in 2000 Online music and lyrics databases Social cataloging applications
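As a usage illustration of the JSON-based Discogs API described in the API section above, the following is a minimal Python sketch. The /releases endpoint path, the release ID, the response field names, and the need for a descriptive User-Agent header are assumptions made for the example rather than details taken from this article:

# Minimal sketch of a Discogs API lookup (hypothetical release ID and field names).
# Requires the third-party "requests" package.
import requests

API_ROOT = "https://api.discogs.com"            # assumed base URL
HEADERS = {"User-Agent": "ExampleApp/0.1"}      # a descriptive User-Agent is assumed to be required

def get_release(release_id):
    """Fetch a single release record and return the parsed JSON as a dict."""
    response = requests.get(f"{API_ROOT}/releases/{release_id}", headers=HEADERS)
    response.raise_for_status()
    return response.json()

release = get_release(1)                        # hypothetical release ID
print(release.get("title"))
print(", ".join(artist["name"] for artist in release.get("artists", [])))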
13038870
https://en.wikipedia.org/wiki/80%20Plus
80 Plus
80 Plus (trademarked 80 PLUS) is a voluntary certification program launched in 2004, intended to promote efficient energy use in computer power supply units (PSUs). Certification is acquirable for products that have more than 80% energy efficiency at 20%, 50% and 100% of rated load, and a power factor of 0.9 or greater at 100% load. History EPRI (the Electric Power Research Institute) and Ecos Consulting (promoter of the brand) developed the Generalized Internal Power Supply Efficiency Test Protocol for desktop-derived multi-output power supplies. March 2004: the 80 Plus idea was presented as an initiative at the ACEEE Market Transformation Symposium. February 2005: the first market-ready power supply was created by Seasonic. 2006: Energy Star added 80 Plus requirements to their then-upcoming (in effect since July 2007) Energy Star 4.0 computer specifications. November and February 2006: HP and Dell certified their PSUs to the 80 Plus specification. 20 July 2007: Energy Star Computer Specification 4.0 went into effect. The specification includes 80 Plus power supply efficiency levels for desktop computers. December 2007: over 200 PSUs on the market were 80 Plus certified and it was becoming the market standard. First quarter 2008: the standards were revised to add the higher-efficiency Bronze, Silver, and Gold certification levels. October 2009: a specification for the Platinum efficiency level was added. February 2012: Dell and Delta Electronics, working together, achieved the world's first 80 Plus Titanium server power supply. Efficiency level certifications There are four categories of certification: 115 V lists power supplies certified for desktop, workstation, and non-redundant server applications. 230 V lists power supplies certified for redundant, data center applications. 115 V Industrial lists power supplies for industrial applications; units may be any physical format (embedded, encapsulated, open frame, rack mount, DIN-mount). 230 V EU Internal power supplies are certified for desktop, workstation, and server applications in non-redundant configurations. For the higher certification levels, the requirement of 0.9 or better power factor was extended to apply to the 20% and 50% load levels, as well as at 100% load. The Platinum level requires 0.95 or better power factor for servers. The Climate Savers Computing Initiative set efficiency level targets for workstations for 2007 through 2011, corresponding to the 80 Plus certification levels. From July 2007 through June 2008, the target was the basic 80 Plus level (Energy Star 4.0); for the next year, the target was the 80 Plus Bronze level; the following year, 80 Plus Silver; then 80 Plus Gold; and finally Platinum. Redundancy is typically used in data centers. Misleading power supply advertising There have been instances where companies claim or imply that their supplies are 80 Plus when they have not been certified, and in some cases do not meet the requirements. When a company resells an OEM power supply under a new name, it must be certified under the new name and company, even if the OEM supply is certified. In some instances, a reseller has claimed a higher wattage than the supply can deliver – in which case, the reseller's supply would not meet 80 Plus requirements. The 80 Plus website has a list of all certified supplies, so it is possible to confirm that a supply meets the requirements. Some power supply manufacturers give their products similar names, such as "85 Plus", "90 Plus" and "95 Plus", but there is no such official certification or standard.
Verify certification Other than misleading consumers by falsely claiming certification, using similar names, or "reselling" without re-certification, another way companies try to confuse consumers is by claiming to meet a certain certification requirement when in fact they do not. For example, the highest 80 Plus level is 80+ Titanium (96% efficiency at 50% load). Some companies will claim to meet this requirement even when their supplies only come close (e.g., 95.xx%), and a test unit could also be modified to perform better than production models in order to slightly raise the measured numbers. The Plug Load Solutions official website provides lists of certifications for each company, allowing consumers to verify how many and which models are listed. For example, looking at the highest rating available (80 Plus Titanium) in the 230 V Internal category (most common for industrial PSUs), Compuware Corporation leads with 29 models, followed by Super Micro Computer, Inc. with 20 80 Plus Titanium units, with Delta Electronics a close third with 18 units. Technical overview The efficiency of a computer power supply is its output power divided by its input power; the remaining input power is converted into heat. For instance, a 600-watt power supply with 60% efficiency running at full load would draw 1000 W from the mains and would therefore waste 400 W as heat. On the other hand, a 600-watt power supply with 80% efficiency running at full load would draw 750 W from the mains and would therefore waste only 150 W as heat (a short worked example of this arithmetic is sketched below). For a given power supply, efficiency varies depending on how much power is being delivered. Supplies are typically most efficient at between half and three-quarters load, much less efficient at low load, and somewhat less efficient at maximum load. Older ATX power supplies were typically 60% to 75% efficient. To qualify for 80 Plus, a power supply must achieve at least 80% efficiency at three specified loads (20%, 50% and 100% of maximum rated power). However, 80 Plus supplies may still be less than 80% efficient at lower loads. For instance, an 80 Plus 520-watt supply could still be 70% or less efficient at 60 watts (a typical idle power for a desktop computer). Thus it is still important to select a supply with capacity appropriate to the device being powered. It is easier to achieve the higher efficiency levels with higher-wattage supplies, so Gold and Platinum supplies may be less available in consumer-level supplies of reasonable capacity for typical desktop machines. Typical computer power supplies may have power factors as low as 0.5 to 0.6. A higher power factor reduces the peak current draw, reducing load on the circuit or on an uninterruptible power supply. Reducing the heat output of the computer helps reduce noise, since fans do not have to spin as fast to cool the computer. Reduced heat output and the resulting lower cooling demands may also increase computer reliability. The testing conditions may give an unrealistic expectation of efficiency for heavily loaded, high-power (rated much larger than 300 W) supplies. A heavily loaded power supply and the computer it is powering generate significant amounts of heat, which may raise the power supply temperature, which is likely to decrease its efficiency. Since power supplies are certified at room temperature, this effect is not taken into account.
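The efficiency arithmetic described at the start of this technical overview is simple to reproduce. The following Python sketch uses the 600-watt example figures from the text; the function names are illustrative only:

# Input power, waste heat, and the basic 80 Plus check, from the definitions above.
def input_power(output_w, efficiency):
    """Power drawn from the mains for a given DC output and efficiency (as a fraction)."""
    return output_w / efficiency

def waste_heat(output_w, efficiency):
    """Portion of the input power that is converted into heat."""
    return input_power(output_w, efficiency) - output_w

print(input_power(600, 0.60), waste_heat(600, 0.60))   # 1000.0 W drawn, 400.0 W of heat
print(input_power(600, 0.80), waste_heat(600, 0.80))   # 750.0 W drawn, 150.0 W of heat

def meets_basic_80plus(eff_at_20, eff_at_50, eff_at_100):
    """Basic 80 Plus test: at least 80% efficiency at 20%, 50% and 100% of rated load."""
    return min(eff_at_20, eff_at_50, eff_at_100) >= 0.80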
80 Plus does not set efficiency targets for very low load. For instance, generation of standby power may still be relatively inefficient, and may not meet the requirements of the One Watt Initiative. Testing of 80 Plus power supplies shows that they vary considerably in standby efficiency. Some power supplies consume half a watt or less in standby with no load, while others consume several times as much at standby, even though they may meet higher 80 Plus certification requirement levels. See also AC adapter Green computing IT energy management Power management Performance per watt Quiet PC References External links Computer hardware standards Computers and the environment Power supplies Energy conservation
4946636
https://en.wikipedia.org/wiki/Mike%20Hull%20%28fullback%29
Mike Hull (fullback)
Michael Bruce Hull (born April 2, 1945) is a retired American football fullback that played in the National Football League. He played college football at the University of Southern California and was one of five USC Trojans players taken in the first round of the 1968 NFL Draft after his senior year. Biography Hull started his football career at 14 yrs old on the bench, as a reserve offensive tackle for Clark Junior H.S., which is now Crescenta Valley High School ("CVHS"). When in tenth grade, Hull was moved to starting tight-end, and defensive end. In his junior year, as the Falcon team began its first year of varsity football, Hull was moved to tailback by Head Coach Gary Hess, in the Falcon's single-wing offense. He returned the opening kickoff 88 yards for a touchdown the first time he touched the ball in a varsity regular season game and gained 142 yards rushing on just ten carries, not including his kickoff return. In his senior year at CVHS Hull earned First Team All-League honors, amassing over 1,000 yards total offense. In the last game beating Burbank HS, Mike had 137 yds. rushing, on 20 carries, with just 70 yds passing. He also ran the high and low hurdles, long jumped and ran the relays on the Falcon track team. At the close of his career at CVHS Hull was also the student body president, and held every Falcon varsity football rushing and total offense record. He was recruited by several university teams but decided to start locally at Glendale College and gain some experience in the T-formation, where he matured into a versatile fullback/halfback. He was named the offensive MVP Running Back, and All-Conference, on Glendale's Western Conference Championship Bowl team, before heading to the University of Southern California in the spring to run track as a Trojan freshman. At USC, Hull converted to full-time fullback, working himself into a starting role for the Trojans for three years, playing on two Rose Bowl teams and the 1967 USC National Championship Team. Between his sophomore year blocking for Heisman Trophy winner, Mike Garrett, and his senior year blocking for future Heisman winner O. J. Simpson, Hull led the USC Rose Bowl team in rushing avg. with 6.7 yds. per carry, winning the "Roy Bullet Baker" award as the Trojans' Most Valuable back. He was also selected as the Most Valuable Player in the USC vs. UCLA game, while rushing for 147 yds on 14 carries against the Bruins. Professional career Hull was drafted in the first round of the 1968 NFL draft by the Chicago Bears, the 16th player selected. He played for the Bears for three years, suffering through the Brian Piccolo tragedy, though helping All Pro and future Hall of Famer Gale Sayers, as the lead blocker. In 1971 Hull was traded to the Washington Redskins for three players, joining coach George Allen's "Over-the-Hill-Gang". He played in 86 consecutive games with the Redskins over five NFL seasons, six NFL playoff games, the NFL Championship and the Super Bowl, where he was a Special Teams leader. Law career After seven years in the NFL, Hull started his law school education at Georgetown University Law Center in Washington, D.C., earning his Juris Doctor degree in 1979. He became an Assistant Corporation Counsel for the District of Columbia, and then became an Assistant US Attorney for the US, a prosecutor for the United States. He met his wife Connie, then a lawyer representing the NFL, married and had a daughter, Michelle. 
After 20 years in the East as a professional football player, law student and lawyer, he moved back with Connie and Michelle to Southern California, where his son, Thomas, was born. His other son, Ernie, from a prior marriage, graduated from Cal Poly San Luis Obispo and holds a master's degree from Pepperdine University. Hull joined Coldwell Banker in 1989 and guided its Southwest team until his retirement in July 2020. He now resides in San Clemente, California, with his wife Connie and their daughter Michelle, who graduated magna cum laude from Harvard University and from Columbia Law School in New York. 1945 births Living people Sportspeople from Los Angeles County, California USC Trojans football players Chicago Bears players Washington Redskins players Players of American football from California People from La Crescenta-Montrose, California
4175227
https://en.wikipedia.org/wiki/HotDocs
HotDocs
HotDocs is a document automation (also known as document generation or document assembly) software company currently owned by AbacusNext. Version 1.0 of HotDocs was introduced in 1993. Description HotDocs transforms documents and graphical (PDF) forms into document-generation templates and deploys these templates to various server environments. Document modeling in HotDocs can range from simple variable insertion to the creation and insertion of complex, computed variables. Business logic consisting of IF/THEN statements and REPEAT loops can be built into the template to control the inclusion or exclusion of language blocks. HotDocs includes a variety of other scripting instructions and sets of pre-packaged functions using boolean logic. HotDocs also enables system architects to create custom functions. In use, a HotDocs template queries the user for the information necessary to generate a document (or set of documents) and saves the information in an answer file. The application then uses the saved information to assemble a custom version of the document, inserting and formatting variable information, inserting the right clauses based on transactional conditions, and inserting correct pronouns and verbs. The HotDocs technology stack includes a logic core, a set of development tools, platforms for deploying intelligent templates in any environment, and a wide range of user-layer technologies (web applications for consuming HotDocs templates). Logic Core At the base of the HotDocs stack is a logic core, which consists of more than one million lines of code. The logic core enables HotDocs to handle documents and forms of any complexity and any length. HotDocs Developer HotDocs Developer is a document-generation-process-modeling environment that allows a system architect to build business logic into a document. HotDocs Developer likewise allows a system architect to design interviews (sequences of interactive data-gathering forms) that gather all the information necessary to generate the underlying document or documents. In combination, a modeled document and its accompanying interview are the two parts of a document automation template. HotDocs Developer works within commercially available word processors such as Microsoft Word. This approach is useful for organizations that want to retain all the formatting attributes currently used in their word processing documents, including font faces, columns, pagination elements, etc. HotDocs includes a development environment for the automation of PDF-based graphical forms (fields, check boxes, etc.). HotDocs allows for shared components among any number of documents, meaning all the Word documents, WordPerfect documents, and PDF-based forms in a set can be generated from a single answer file. HotDocs Platforms The HotDocs stack includes platforms for desktop, client/server (on-premises), and cloud deployment of HotDocs document-generation templates. Multiple third-party developers build their own technologies on the HotDocs desktop API. HotDocs Server is designed for on-premises, server-based document generation. HotDocs Cloud Services is a multi-tenant, cloud version of HotDocs Server designed for enterprises that want to forgo the upfront cost and upkeep of HotDocs Server. HotDocs User-Layer Technologies HotDocs provides multiple off-the-shelf applications for using HotDocs templates, including a standard desktop application and several browser applications.
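The template-plus-answer-file model described above can be illustrated with a short, generic sketch. The following Python stand-in is not HotDocs syntax or its API; it only shows, under those assumptions, how an answer file can drive variable insertion and IF/THEN-style clause selection:

# Generic document-assembly sketch: an "answer file" drives variable insertion
# and conditional clause inclusion. Purely illustrative; not HotDocs code.
answers = {
    "client_name": "Jane Doe",
    "state": "Oregon",
    "include_arbitration": True,      # plays the role of an IF condition in a template
}

clauses = [
    "This agreement is made by {client_name}.",
    "It is governed by the laws of {state}.",
]
if answers["include_arbitration"]:
    clauses.append("Any dispute will be resolved by binding arbitration.")

document = "\n".join(clause.format(**answers) for clause in clauses)
print(document)

A real document-automation system adds what the description above mentions: REPEAT loops over lists of answers, grammar handling for pronouns and verbs, and preservation of the underlying word-processor formatting.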
History What is now HotDocs Corporation began as a research project in the mid-1970s at Brigham Young University Law School. Funded at the time by West Publishing, the project began as a code base, developed for the VAX mainframe computer running the VMS operating system. In the late 1980s, the project became commercial with the founding of Capsoft Development by Marshall Morrise. Capsoft Development licensed the technology from BYU and ported the code base into DOS. A few years later, the technology was re-birthed as HotDocs, a Windows-based version that reflected many of the original feature sets from the old VAX version. Version 1 of HotDocs was released in 1993. Graphical forms functionality was added in 1996. In 1998, HotDocs Corporation was purchased by Matthew Bender. HotDocs Corporation became the property of LexisNexis in 1999, when LexisNexis bought Matthew Bender. In 2009, Capsoft UK, the largest independent reseller of HotDocs software, bought the HotDocs business from LexisNexis. In October 2011, HotDocs announced that its software would be deployed to 15,000 members of the U.S. Department of Justice. The company also entered into a partnership with Thomson Reuters to provide the technology platform for Interactive Decision Tools on Checkpoint, a research and analysis tool for tax lawyers and accountancy firms. In January 2012, HotDocs released HotDocs Document Services, a software-as-a-service (SaaS) application designed to extend browser-based document generation to small-to-medium-sized law firms. HotDocs has recently launched HotDocs Author 1.0. This is a massive bundle service for big enterprise businesses. They are still retaining HotDocs Developer and User. These are being sold to smaller businesses. In 2017, HotDocs was acquired by AbacusNext. Products HotDocs Developer—used to transform documents and forms into templates. HotDocs User—a desktop application for organizing and accessing templates built with HotDocs Developer. HotDocs Server—designed for server-based deployment of HotDocs templates. HotDocs Cloud Services—launched in January 2012, a cloud-based SaaS (Software-as-a-Service) product which allows law firms and other entities already using HotDocs to transition from the desktop to the internet for document production. See also Document Automation Cloud computing References External links HotDocs (USA) HotDocs (International) Business software
27327422
https://en.wikipedia.org/wiki/SecMsg
SecMsg
eMudhra SecMsg is a mobile application designed to secure the SMS channel. It allows users to send SMS messages that are encrypted and signed using PKI technology and ensures that a message can be decrypted only by the intended recipient. The algorithms used for cryptographic processes such as signing, encryption and decryption are RSA/ECC, AES and SHA. Technical insight Key pair generation The RSA or ECC key pair is generated in the application and stored on the device. These keys are used for all cryptographic processes such as signing, encryption and decryption. Key pair generation is a one-time process and is completely user-driven. The user can also use X.509 digital certificates from any certification authority (CA) for the cryptographic processes mentioned above. Built-in security The user is required to provide the application PIN whenever they want to gain access to the application. The key pair is protected with a key PIN, which controls the signing and encryption/decryption processes. Usage areas Secure communication SecMsg can be used to send and receive confidential messages. Secure communication with peers can be established by exchanging public keys with friends, spouses or colleagues. The public keys received from peers are stored in the application. Messages composed by the user are encrypted with the recipient's public key and arrive in the SecMsg inbox, where they can be decrypted only with the intended recipient's private key. Secure Safe 'My Vault' is an organizer for storing personal information with additional security, protected by a PIN. All stored information is encrypted with the RSA private key. Two-factor authentication Any online transaction (fund transfer, adding a payee, online broking, end-of-day confirmation) can be acknowledged by digitally signing the transaction details. This ensures confidentiality, integrity and, more importantly, non-repudiation: acknowledgment of a transaction made this way cannot be repudiated later. Password/ATM PIN retrieval Users can use the application for resetting or receiving passwords. Banks can send ATM PINs or Internet banking passwords as an encrypted text message, for which a digitally signed acknowledgment can be received immediately. Features The application supports multiple X.509 certificates from any certification authority (CA); these certificates can also be used for digital signing and encryption/decryption. Logging of the history of transactions and messages, access to which requires the PIN. Remote data wiping for clearing the contents of the application if the device is lost. Uses the simplest and most ubiquitous communication mode, the SMS channel. Low operating cost. Extensive device support. SecMsg in the news Article about SecMsg in Network Computing SecMsg coverage in Economic Times SecMsg coverage in Dataquest See also Mobile Signature Digital signature Public-key cryptography References Cryptographic software Pocket PC software
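The sign-and-encrypt scheme described in the Technical insight section above (RSA key pairs, AES for the message body, SHA-based hashing) is a standard hybrid construction. The following Python sketch, which uses the third-party cryptography package, is a generic illustration of that construction; it is not the SecMsg implementation, and the message text and key sizes are arbitrary examples:

# Generic sign-then-encrypt sketch for a short (SMS-sized) message. Illustrative only.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# One-time, user-driven key pair generation for sender and recipient.
sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

message = b"Meet at 6pm. Ref 4411."                  # an SMS-sized payload

# 1. Sign the message with the sender's private key (RSA-PSS over SHA-256).
signature = sender_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# 2. Encrypt the message with a fresh AES key, then wrap that key for the recipient.
aes_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, message, None)
wrapped_key = recipient_key.public_key().encrypt(
    aes_key,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None),
)

# 3. The recipient unwraps the AES key with their private key, decrypts, and verifies.
unwrapped_key = recipient_key.decrypt(
    wrapped_key,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None),
)
plaintext = AESGCM(unwrapped_key).decrypt(nonce, ciphertext, None)
sender_key.public_key().verify(
    signature,
    plaintext,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print(plaintext.decode())                            # prints the original message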
13574643
https://en.wikipedia.org/wiki/Operating%20System%20Projects
Operating System Projects
OSP, an Environment for Operating System Projects, is a teaching operating system designed to provide an environment for an introductory course in operating systems. By selectively omitting specific modules of the operating system and having the students re-implement the missing functionality, an instructor can generate projects that require students to understand fundamental operating system concepts. The distribution includes the OSP project generator, which can be used to package a project and produce stubs (files that are empty except for required components, and that can be compiled) for the files that the students must implement. OSP includes a simulator that the student code runs on. See also Mobile operating system Network operating system Operating system References OSP: An Environment for Operating System Projects by Michael Kifer and Scott A. Smolka, Addison Wesley, 1991, 86 pages (2nd printing in 1992). External links 1992 paper (ACM portal) 1996 paper Discontinued operating systems
127759
https://en.wikipedia.org/wiki/End%20user
End user
In product development, an end user (sometimes end-user) is a person who ultimately uses or is intended to ultimately use a product. The end user stands in contrast to users who support or maintain the product, such as sysops, system administrators, database administrators, information technology experts, software professionals and computer technicians. End users typically do not possess the technical understanding or skill of the product designers, a fact easily overlooked and forgotten by designers, leading to features that create low customer satisfaction. In information technology, end users are not "customers" in the usual sense; they are typically employees of the customer. For example, if a large retail corporation buys a software package for its employees to use, even though the large retail corporation was the "customer" that purchased the software, the end users are the employees of the company, who will use the software at work. Certain American defense-related products and information require export approval from the United States Government under the ITAR and EAR. In order to obtain a license to export, the exporter must specify both the end user and the end use in an end-user certificate. In End-User License Agreements (EULAs), the end user is distinguished from the value-added reseller, who installs the software, and from the organization that purchases and manages the software. In the UK, there exist documents that accompany licenses for products, named end-user undertaking statements (EUU). Context End users are one of the three major factors contributing to the complexity of managing information systems. The end user's position has changed from a position in the 1950s (where end users did not interact with the mainframe; computer experts programmed and ran the mainframe) to one in the 2010s where the end user collaborates with and advises the management information system and Information Technology department about his or her needs regarding the system or product. This raises new questions, such as: Who manages each resource? What is the role of the MIS Department? What is the optimal relationship between the end user and the MIS Department? Empowerment The concept of "end-user" first surfaced in the late 1980s and has since then raised many debates. One challenge was the goal of giving users more freedom, by adding advanced features and functions (for more advanced users), while also adding more constraints (to prevent a neophyte user from accidentally erasing an entire company's database). This phenomenon appeared as a consequence of the "consumerization" of computer products and software. In the 1960s and 1970s, computer users were generally programming experts and computer scientists. However, in the 1980s, and especially in the mid-to-late 1990s and the early 2000s, everyday, regular people began using computer devices and software for personal and work use. IT specialists needed to cope with this trend in various ways. In the 2010s, users want to have more control over the systems they operate, to solve their own problems, and to be able to change, customize and "tweak" the systems to suit their needs. The apparent drawback was the risk of corruption of the systems and data under the users' control, due to their lack of knowledge of how to properly operate the computer or software at an advanced level. To appeal to users, companies took care to accommodate and consider end users in their new products, software launches, and updates.
A partnership needed to be formed between the programmer-developers and the everyday end users so both parties could maximize the effective use of the products. A major example of the public's effect on end-user requirements is the public library. Libraries have been affected by new technologies in many ways, ranging from the digitalization of their card catalogs to the shift to e-books and e-journals and the offering of online services. Libraries have had to undergo many changes in order to cope, ranging from training existing librarians in Web 2.0 and database skills to hiring IT and software experts. End user documentation The aim of end user documentation (e.g., manuals and guidebooks for products) is to help the user understand certain aspects of the systems and to provide all the answers in one place. A lot of documentation is available for users to help them understand and properly use a certain product or service. Because the available information is usually vast, inconsistent or ambiguous (e.g., a user manual with hundreds of pages, including guidance on using advanced features), many users suffer from information overload and therefore become unable to take the right course of action. This needs to be kept in mind when developing products and services and the necessary documentation for them. Well-written documentation is needed for a user to reference. Some key aspects of such documentation are: Specific titles and subtitles for subsections to aid the reader in finding sections Use of videos, annotated screenshots, text and links to help the reader understand how to use the device or program Structured provision of information, which goes from the most basic instructions, written in plain language, without specialist jargon or acronyms, progressing to the information that intermediate or advanced users will need (these sections can include jargon and acronyms, but each new term should be defined or spelled out upon its first use) A help guide that is easy to search, making information easy to find and access Clear end results described to the reader (e.g., "When the program is installed properly, an icon will appear in the left-hand corner of your screen and the LED will turn on...") Detailed, numbered steps, to enable users with a range of proficiency levels (from novice to advanced) to go step-by-step to install, use and troubleshoot the product or service Unique Uniform Resource Locators (URLs) so that the user can go to the product website to find additional help and resources. At times, users do not refer to the documentation available to them, for reasons ranging from finding the manual too large to not understanding the jargon and acronyms it contains. In other cases, users may find that the manual makes too many assumptions about a user having pre-existing knowledge of computers and software, and thus the directions may "skip over" these initial steps (from the users' point of view). Frustrated users may then report false problems because of their inability to understand the software or computer hardware. This in turn causes the company to focus on "perceived" problems instead of focusing on the "actual" problems of the software.
With the increasing role that computers are playing in people's lives, people are carrying laptops and smartphones with them and using them for scheduling appointments, making online purchases using credit cards, and searching for information. These activities can potentially be observed by companies, governments or individuals, which can lead to breaches of privacy, identity theft, blackmail and other serious concerns. Many businesses, ranging from small startups to huge corporations, use computers and software to design, manufacture, market and sell their products and services, and businesses also use computers and software in their back office processes (e.g., human resources, payroll, etc.). As such, it is important for people and organizations to know that the information and data they are storing, using, or sending over computer networks or storing on computer systems is secure. However, developers of software and hardware are faced with many challenges in developing a system that is user-friendly, accessible 24/7 on almost any device, and truly secure. Security leaks happen, even to individuals and organizations that have security measures in place to protect their data and information (e.g., firewalls, encryption, strong passwords). The complexity of creating such a secure system comes from the fact that the behaviour of humans is not always rational or predictable. Even in a very well-secured computer system, a malicious individual can telephone a worker and pretend to be a private investigator working for the software company, and ask for the individual's password, a dishonest practice called "phishing". Likewise, even with a well-secured system, if a worker decides to put the company's electronic files on a USB drive to take them home to work on over the weekend (against many companies' policies), and then loses this USB drive, the company's data may be compromised. Therefore, developers need to make systems that are intuitive to the user in order to have information security and system security. Another key step toward end user security is informing people and employees about security threats and what they can do to avoid them or protect themselves and the organization. Clearly underlining the capabilities and risks makes users more aware and informed while they are using the products. Some situations that could put the user at risk are: Auto-logon as administrator options Auto-fill options, in which a computer or program "remembers" a user's personal information and HTTP "cookies" Opening junk or suspicious emails and/or opening or running attachments or computer files contained in them Email that can be monitored by third parties, especially when using Wi-Fi connections Unsecured Wi-Fi, or use of a public Wi-Fi network at a coffee shop or hotel Weak passwords (using a person's own name, own birthdate, name or birthdate of children, or easy-to-guess passwords such as "1234") Malicious programs such as viruses Even if the security measures in place are strong, the choices users make and their behaviour have a major impact on how secure their information really is. Therefore, an informed user is one who can protect themselves and achieve the best security from the system they use.
Because of the importance of end-user security and the impact it can have on organisations, the UK government set out guidance for the public sector to help civil servants learn how to be more security aware when using government networks and computers. While this is targeted at a particular sector, this type of educational effort can be informative to any type of user; it helps developers meet security norms and helps end users become aware of the risks involved. Reimers and Andersson have conducted a number of studies on end user security habits and found that the same type of repeated education/training in security "best practices" can have a marked effect on the perception of compliance with good end user network security habits, especially concerning malware and ransomware.

Undertaking
An end user undertaking (EUU) is a document stating who the user is, why they are using a product and where they live (or where they work). This document must be completed and signed by a person in a position of authority in the end user's business. All documents should be in English or, if not, be accompanied by a valid English translation. Usually the EUU is sent together with the product license.

See also
End-user certificate
End-user computing
End-user development
End-user license agreement
Voice of the customer

Notes
References
Computing terminology Export and import control Consumer
https://en.wikipedia.org/wiki/FlexSim
FlexSim
FlexSim is a discrete-event simulation software package developed by FlexSim Software Products, Inc. The FlexSim product family currently includes the general-purpose FlexSim product and a healthcare systems modeling environment (FlexSim HC).

History
FlexSim development began in late 2001 as an unnamed development project of F&H Simulations, Inc., a U.S. distributor of F&H Holland's Taylor II and Taylor ED products. Development was initially led by Dr. Eamonn Lavery, with lead developer Anthony Johnson joining in April 2002. Before the end of 2002, the development project was renamed FlexSim, which coincided with F&H Simulations, Inc. changing its name to FlexSim Software Products, Inc. FlexSim 1.0 was released in February 2003. FlexSim used a major.minor.build software versioning scheme until version 7.7.4; beginning with version 16.0.0 on March 14, 2016, FlexSim transitioned to a year.update.bugfix versioning scheme.

Usage
Manufacturing
FlexSim has been used in a variety of simulation projects involving both standard and flexible manufacturing systems. Examples include studies to determine optimal buffer sizes, optimize blend components in feed production, address rescheduling problems in mixed-line production planning, optimize electronics assembly lines, and schedule steel production.

Industry 4.0
FlexSim has been used to automate simulation model development for more than a decade; a 2008 study described a FlexSim-based solution that communicates with Product Lifecycle Management (PLM) software to generate simulation models. With the ongoing trend of Industry 4.0 pushing manufacturers toward automation and improved communication, FlexSim has been used to develop computer simulation models for these applications. FlexSim can be extended through C++, which allows the software to be integrated into systems involving real-time data communication. The software has been used for near real-time production planning, which improves upon the master schedule approach (which can become outdated and miss on-site changes). In one study, FlexSim was integrated into a dynamic data-driven application system to automatically generate simulation models via XML.

Robotics and cranes
FlexSim's standard object library contains a 6-axis robot object that provides both pre-built motion logic and the ability to create customized motion paths. FlexSim has been used to model and analyze robotic cells in manufacturing environments, including the dynamic scheduling and control of a robotic assembly cell. The standard object library also contains a crane object, "designed to simulate rail-guided cranes such as gantry, overhead, or jib cranes." Through the use of the crane object, FlexSim has been used to evaluate solutions to crane scheduling in a shipbuilding environment.

Healthcare
In April 2009, FlexSim Software Products, Inc. released a standalone healthcare simulation product named FlexSim HC. It was developed as a simulation package focused on modeling patient flows and other healthcare processes. The final release in the original FlexSim HC development path was version 5.3.10 on February 19, 2019; beginning with FlexSim version 19.1.0 on April 29, 2019, FlexSim HC functionality was merged into core FlexSim development and became a modeling environment within the software. In practice, the FlexSim HC environment is used by healthcare organizations to evaluate different scenarios in their healthcare processes and validate them before they are implemented.
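To illustrate the discrete-event simulation approach that underlies tools of this kind, the following minimal Python sketch models a single-clinician patient queue and reports the average waiting time. This is not FlexSim code and does not use its API; the clinic-day length, arrival rate and consultation times are assumed values chosen purely for illustration.

```python
import heapq
import random

random.seed(1)

SIM_END = 480.0          # clinic day length in minutes (assumed)
MEAN_ARRIVAL_GAP = 12.0  # average minutes between patient arrivals (assumed)
MEAN_SERVICE = 10.0      # average consultation length in minutes (assumed)

# Event list: (time, kind). The simulation clock always jumps to the
# earliest pending event, which is the essence of discrete-event simulation.
events = []
t = 0.0
while True:
    t += random.expovariate(1.0 / MEAN_ARRIVAL_GAP)
    if t >= SIM_END:
        break
    heapq.heappush(events, (t, "arrival"))

waiting = []        # arrival times of patients not yet seen
server_free = True
waits = []

while events:
    now, kind = heapq.heappop(events)
    if kind == "arrival":
        waiting.append(now)
    else:  # "departure": the clinician becomes free
        server_free = True
    if server_free and waiting:
        arrived = waiting.pop(0)
        waits.append(now - arrived)          # time this patient spent queuing
        server_free = False
        heapq.heappush(events, (now + random.expovariate(1.0 / MEAN_SERVICE), "departure"))

print(f"patients seen: {len(waits)}, average wait: {sum(waits)/len(waits):.1f} min")
```

Running such a model many times with different staffing levels or arrival rates is, in miniature, the kind of scenario evaluation described above.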
The environment has been used in various patient care improvement initiatives, including studies to understand different treatment options in Labor & Delivery, deploying advanced practice nurses in treating non-urgent patients, and demonstrating simulation-based design of a breast-screening facility as both a process improvement tool and as a management training tool. During the COVID-19 pandemic, FlexSim HC was used to analyze vaccination rollout efforts and improve patient flow at vaccination sites. Outside of the traditional healthcare setting, FlexSim has been used to dynamically calculate and visualize radiation exposure. Academia FlexSim has been used extensively in academic research and conference proceedings worldwide. The software package is usually taught as part of an industrial engineering or systems engineering curriculum, often in a Systems Simulation course; however, FlexSim has also been introduced as part of undergraduate or graduate coursework in manufacturing engineering, operations research, business management, health systems engineering, and nursing. Other As general purpose simulation software, FlexSim is used in a number of fields: Material handling: Conveyor systems, AGV, packaging, warehousing Logistics and distribution: Container terminal operation, supply chain design, distribution center work flow, service and storage layout, etc. Transportation: Highway system traffic flow, transit station pedestrian flow, maritime vessel coordination, custom traffic congestion, etc. Others: Oil field or mining processes, networking data flow, etc. Main features Robust standard objects FlexSim includes a standard object library, with each object containing pre-built logic and task execution to mimic the resources found in real-world operations. FlexSim objects are defined and programmed in four classes: fixed resource class, task executer class, node class and visual object class. FlexSim uses an object-oriented design. Logic building tools The logic for a FlexSim model can be built using very little or no computer code. Most standard objects contain an array of drop-down lists, properties windows, and triggers that allow the user to customize the logic required for an accurate model of the system. FlexSim also includes a flowcharting tool to create the logic for a model using pre-built activity blocks. Drag-and-drop controls Users can build the model by dragging and dropping predefined 3D objects into a "model view" to layout and link the model. Experienced users also have the option to specify and modify object parameters and behaviors using FlexScript and C++ programming languages. See also List of computer simulation software List of discrete event simulation software Computer simulation References Further reading External links FlexSim Documentation Simulation software Software companies of the United States 2003 establishments in Utah
https://en.wikipedia.org/wiki/Remote%20graphics%20unit
Remote graphics unit
A remote graphics unit (RGU) is a device that allows a computer to be separated from some of its input/output devices, such as the keyboard, mouse, speakers and display monitors. The key part being moved to the remote location is the graphics sub-system of the computer.

History
RGUs may have their origin in experiments with graphics controllers on mainframe computers in the 1970s. Since the late 1990s, RGUs have mostly been associated with high-end workstations running Unix-like operating systems or Windows. Generally RGUs are used for specialized applications such as remote sensing, financial services commodity trading desks, computer-aided design, etc. Depending on how one chooses to define RGUs, dedicated X terminals may also be included.

Application
The usual reasons for separating the user interface of a computer from the computer itself are: securing computers away from users for corporate or government security, reducing heat and noise in rooms with many computer operators, or facilitating computer maintenance by placing all computers in very close proximity to one another.

KVM interoperability
Unlike other technologies used to achieve this, such as KVM extension (or remote KVM) and DVI extension, a remote graphics unit effectively splits a computer's PCI or PCI-Express bus and transmits only bus commands to the user side. With KVM extension and DVI extension, the graphics processing is done by a traditional graphics processing unit (GPU) on the computer side. Bus data is much smaller than rendered graphics data, so the theory behind the remote graphics unit is that it is possible to achieve higher resolutions and better graphics performance when there is a large physical separation between the user-side input/output devices and the computer (see the worked example below).

Examples
An example of a product line that was commercialized using RGU as the description of the technology is the Matrox Extio series. Extio is a brand name that is marketing shorthand for "External I/O".

Related products
Other products supporting the concept behind the remote graphics unit include bus extension technologies, where a standard graphics processing unit is plugged into a remote PCI slot via a standard graphics add-in card. Various types of bus extension technologies are available, including the DeTwo System from Amulet Hotkey as well as products from Avocent. Graphics hardware Computer peripherals
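To make the bandwidth comparison above concrete, here is a back-of-the-envelope calculation in Python. The resolution, refresh rate and assumed command-stream rate are illustrative assumptions only, not measurements of any particular RGU product.

```python
# Rough comparison behind the RGU idea: shipping rendered pixels needs far
# more bandwidth than shipping the bus/command stream that produced them.
width, height = 1920, 1200   # display resolution (assumed)
bits_per_pixel = 24
refresh_hz = 60

rendered_bits_per_s = width * height * bits_per_pixel * refresh_hz
print(f"uncompressed rendered video: {rendered_bits_per_s / 1e9:.1f} Gbit/s")

# The command/bus stream is workload-dependent; assume a generous
# 200 Mbit/s of drawing commands and texture updates for comparison.
command_bits_per_s = 200e6
print(f"assumed command stream:      {command_bits_per_s / 1e9:.2f} Gbit/s")
print(f"ratio: roughly {rendered_bits_per_s / command_bits_per_s:.0f}x")
```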
https://en.wikipedia.org/wiki/Harry%20Giles%20%28basketball%29
Harry Giles (basketball)
Harry Lee Giles III (born April 22, 1998) is an American professional basketball player for the Agua Caliente Clippers of the NBA G League. He played college basketball for the Duke Blue Devils.

High school career
Freshman and sophomore seasons
Giles attended Wesleyan Christian Academy in High Point, North Carolina. As a freshman, Giles averaged 12.5 points and 9.5 rebounds per game and led Wesleyan Christian to the 2013 NCISAA 3A state championship alongside Dallas Mavericks shooting guard Theo Pinson. Giles missed his entire sophomore year due to a left knee injury. During the summer of 2014, Giles participated in the Under Armour Elite 24 game in Brooklyn, New York, finishing with 16 points and 11 rebounds and earning co-MVP honors.

Junior season
In his junior year, Giles and Wesleyan were ranked the No. 2 team in the nation by USA Today. On November 14, 2014, in his second game back from injury, Giles scored a career-high 38 points and grabbed 19 rebounds in an 82–58 win over Northside Christian Academy. On December 21, Giles scored 29 points in a 67–62 win over Mater Dei. On December 24, Giles tallied 24 points and 14 rebounds in a 51–47 win over Trinity High School. Giles and the Trojans then played in the 2014–15 High School OT Holiday Invitational Tournament at Needham B. Broughton High School in Raleigh, North Carolina. On December 28, Giles scored 22 points and grabbed 12 rebounds in a 72–56 victory over the Carlisle School. On December 29, Giles and Wesleyan defeated Word of God Christian Academy 98–85 behind Giles's 31 points and 17 rebounds to advance to the championship game. On December 30, 2014, Giles went head to head with an Orangeville Prep team that featured Thon Maker, formerly the No. 1-ranked player in the 2016 class. The Wesleyan Trojans defeated Orangeville Prep 78–75, with Giles scoring 26 points and grabbing 14 rebounds while Maker had 24 points and 11 rebounds. On January 15, 2015, Giles posted 17 points, 12 rebounds, and 4 assists to help the Trojans defeat Malik Monk and Bentonville High School 63–55. On the season, Giles averaged 23.9 points, 12.5 rebounds, 2.0 assists, and 3.0 blocks per game while leading the Wesleyan Trojans to a 30–5 record and an appearance in the NCISAA 3A state championship game, which they lost to in-state rival Greensboro Day School. At the end of his junior season, Giles earned first-team All-USA honors from USA Today. During the summer of 2015, Giles joined his AAU team, the CP3 All Stars, sponsored by fellow Winston-Salem native and NBA superstar Chris Paul of the Phoenix Suns. Giles averaged 18.2 points and 12.0 rebounds per game in 16 games on the EYBL circuit, earning first-team All-EYBL honors. In August 2015, Slam Magazine named Giles to its Summer All-American Team.

Senior season
Before his senior season, Giles decided to attend and play for the high school basketball powerhouse Oak Hill Academy in Mouth of Wilson, Virginia. Giles had dominated for the majority of his high school career; however, his senior year ended with a torn ACL in his right knee, suffered during his first scrimmage game with Oak Hill. Weeks later, Giles enrolled at the (now defunct) Forest Trail Academy in Kernersville, North Carolina to take online courses and finish his senior year of high school while rehabilitating the knee injury. On November 6, 2015, Giles made his verbal commitment, live on ESPN, to attend Duke University and play for the Duke Blue Devils, joining fellow five-star 2016 recruits Jayson Tatum and Frank Jackson.
He was selected to play in the 2016 Jordan Brand Classic and the Nike Hoop Summit but was unable to participate due to injury. Giles was rated a five-star recruit and considered the best high school prospect of the 2016 class. He was ranked the No. 1 overall recruit and the No. 1 power forward in the 2016 high school class by ESPN, while Scout.com and Rivals ranked him No. 2 in the class, behind only Josh Jackson.

College career
Before the start of the 2016–17 season, Giles was named to both the Naismith and John R. Wooden Award preseason watchlists and finished third in voting for ACC Preseason Rookie of the Year. On October 3, 2016, it was announced that Giles would undergo knee surgery and would likely miss up to six weeks. On December 19, 2016, Giles made his college debut in a win against Tennessee State. On January 4, 2017, he recorded his first double-double with 10 points and 12 rebounds in a win over Georgia Tech. On March 10, 2017, in the ACC Tournament semifinals against rival North Carolina, Giles had 4 blocks, 7 rebounds, and 6 points in a 95–83 win. At the conclusion of his freshman season, Giles announced that he would forgo his final three years of collegiate eligibility at Duke and enter the 2017 NBA draft.

Professional career
Sacramento Kings (2017–2020)
Draft year injury (2017–18)
On June 22, 2017, Giles was selected by the Portland Trail Blazers with the 20th overall pick in the 2017 NBA draft. His rights were traded to the Sacramento Kings on draft night. Giles sat out the entire 2017 NBA Summer League. On July 8, 2017, Giles signed his rookie scale contract with the Kings, worth $10,621,750 over four years. On October 6, 2017, it was announced that Giles would be out of action and would make his NBA debut in January 2018. On January 18, 2018, it was announced that Giles would "not be introduced to NBA gameplay during the 2017–18 season but focus on more vigorous practice activity and individual workouts tailored to continue developing overall strength and aid ACL injury prevention."

Rookie season (2018–2019)
On May 14, 2018, The Sacramento Bee announced that Giles would participate in the California Classic Summer League on July 2, 3, and 5 in Sacramento. Giles then joined the Kings for the 2018 NBA Summer League. Before the start of the 2018–19 season, there were high expectations for Giles and the Kings, and Giles was mentioned as a possible candidate for Rookie of the Year. In his NBA debut on October 17, 2018, Giles scored 2 points in a 123–117 season-opening loss to the Utah Jazz. On November 10, 2018, Giles was assigned to the Stockton Kings, the G League affiliate of the Kings, where he scored 30 points in his debut for the team. On November 11, 2018, Giles was recalled by the Kings. On November 12, 2018, Giles scored 12 points and grabbed 6 rebounds in a 104–99 win against the San Antonio Spurs. On January 31, Giles recorded a career-high 20 points and 7 rebounds in a 135–113 victory over the Atlanta Hawks. On March 4, Giles tallied 17 points and 7 rebounds in a 115–108 win over the New York Knicks. On March 17, Giles scored 16 points and grabbed 6 rebounds in a 129–102 victory over the Chicago Bulls. On April 3, the Kings shut down Giles for the remainder of the season.

Second season (2019–20)
On October 31, 2019, the Kings declined Giles's option for the 2020–21 season, worth $4 million.
After the Kings signed centers Dewayne Dedmon and Richaun Holmes as free agents in the off-season, Giles found himself on the periphery of the team's big-man rotation, not appearing in game action from November 30 to December 28. With injuries to Holmes and former No. 2 pick Marvin Bagley, and with the team trading Dedmon to the Atlanta Hawks, Giles found himself as the team's starting center on February 7, 2020, against the Miami Heat. Giles scored a season-high 19 points in Sacramento's 112–108 loss to the Oklahoma City Thunder on February 27, 2020, and scored in double figures in four consecutive games from February 22 to 28.

Portland Trail Blazers (2020–2021)
On November 22, 2020, Giles signed with the Portland Trail Blazers. On April 4, 2021, he scored 12 points and grabbed 2 rebounds in a 133–85 win over the Oklahoma City Thunder.

Agua Caliente Clippers (2021–present)
On September 27, 2021, Giles signed with the Los Angeles Clippers, but he was waived on October 16, 2021. On October 27, Giles signed with the Agua Caliente Clippers as an affiliate player.

National team career
Giles competed for Team USA at the 2015 FIBA Under-19 World Cup in Greece. During the tournament, he finished third in points per 40 minutes with an average of 26.4, second in offensive rebounding percentage at 17.1%, and first in defensive rebounding percentage at 28.7%. He was named to the All-Tournament Team.

Career statistics
NBA
Regular season
Year     Team        GP   GS  MPG   FG%   3P%   FT%   RPG  APG  SPG  BPG  PPG
2018–19  Sacramento  58   0   14.1  .503  .000  .637  3.8  1.5  .5   .4   7.0
2019–20  Sacramento  46   17  14.5  .554  .000  .776  4.1  1.3  .5   .4   6.9
2020–21  Portland    38   0   9.2   .433  .348  .593  3.5  .8   .2   .3   2.8
Career               142  17  12.9  .511  .258  .710  3.8  1.2  .4   .4   5.9

Playoffs
Year     Team        GP   GS  MPG   FG%   3P%   FT%   RPG  APG  SPG  BPG  PPG
2021     Portland    1    0   4.0   .000  —     —     3.0  .0   .0   .0   .0

College
Year     Team        GP   GS  MPG   FG%   3P%   FT%   RPG  APG  SPG  BPG  PPG
2016–17  Duke        26   6   11.5  .577  .000  .500  3.9  .4   .4   .7   3.9

Personal life
Giles is the son of Harry and Melissa Giles. He has one brother and three sisters. Giles's father, Harry Giles II, played both college basketball and football at Winston-Salem State University. Giles is good friends with former Duke teammate and current NBA player Jayson Tatum.

References
External links
Duke Blue Devils bio
1998 births Living people 21st-century African-American sportspeople African-American basketball players Agua Caliente Clippers players American men's basketball players Basketball players from Winston-Salem, North Carolina Duke Blue Devils men's basketball players Portland Trail Blazers draft picks Portland Trail Blazers players Power forwards (basketball) Sacramento Kings players Stockton Kings players
https://en.wikipedia.org/wiki/Winston%20Smith%20Project
Winston Smith Project
The Winston Smith Project (PWS) is an informational and operational project for the defence of human rights on the Internet and in the digital era. The project was started in 1999 as an anonymous association and is characterised by the absence of a physical reference identity. It is named after the main character in George Orwell's novel "Nineteen Eighty-Four". The reference to Orwell's dystopia is embodied in the PWS motto "Unplug the Big Brother", which sits alongside its more general motto "Paranoia is a virtue". PWS aims to make users aware of the risks of privacy violations on the Internet and of threats to freedom of speech, and it works to spread the informational tools and counter-censorship technologies that allow users to maintain confidentiality in their communications, anonymity in the network and freedom of expression. PWS has generated and maintains the e-privacy, Big Brother Awards Italy, Privacy Box and Project 95% initiatives.

Objectives
PWS upholds the thesis that the Big Brother described in George Orwell's novel is gradually taking form, passively and with the silent acceptance of the people, inserted into everyday life through the false claim that "it is right to sacrifice one's privacy in exchange for greater security". According to security experts such as Bruce Schneier, and following events such as the SISMI-Telecom scandal, official bodies which monitor telecommunications are acquiring a de facto totalitarian power, whatever the official political situation. If the goal is really to increase public security, the mere presence of monitoring agencies constitutes an element of insecurity, and because citizens are largely uninformed about security, official agencies push for ever more monitoring, which damages human rights. PWS works to spread tools that protect users from this risk. Such tools exist because the Internet is based upon open technologies. Individual security can be achieved only by using preventive protection tools on private computers; it cannot be delegated to others such as Internet service providers. PWS aims to increase the use of technologies such as data encryption and anonymity. This can be achieved by using programs written according to the guidelines of secure software, such as: the software in use (including the operating system) must be an open system, allowing the user, if so inclined and capable, to verify its actual behaviour; and the cryptographic algorithms employed must be public, so that the community can perform mathematical analysis (cryptanalysis) and study potential attacks in order to achieve continuous improvement. If either of these conditions is missing, the software cannot be considered secure, as it is then based upon the concept of security through obscurity, which has never been shown to be a valid security paradigm. Cases such as JAP and PGP 5.x and 6.x have shown the unreliability of this model. To be consistent with the technologies it proposes, the PWS website is not available on the Internet, but through Freenet, with the following key: USK@RU-C2q5kN7K62WO3seMMjSTUY8izF2vCFyVFOnLf~Q0,wxvGO2QMT6IN9c7dNUhHeHnXVVwhq8YLbQL~DlMA7YE,AQACAAE/pws/3 A mirror is also available on the Internet to increase accessibility. To contact members of PWS, conventional e-mail addresses are not used, but rather the nym alias [email protected], whose PGP key is published on a keyserver.
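As a rough illustration of the layered ("onion"-style) encryption that mix networks and nym remailers rely on, the following Python sketch wraps a message in one encryption layer per relay and has each relay peel exactly one layer. It is a conceptual sketch only: it uses the third-party cryptography package's Fernet primitive (pip install cryptography) purely for convenience, and it is not the Mixminion or Tor protocol.

```python
from cryptography.fernet import Fernet

# Each relay has its own key; the sender wraps the message in one
# encryption layer per relay, with the innermost layer for the last hop.
node_keys = [Fernet.generate_key() for _ in range(3)]
nodes = [Fernet(key) for key in node_keys]

message = b"meet at the usual place"

# Sender: encrypt for relay 3 first, then relay 2, then relay 1.
packet = message
for node in reversed(nodes):
    packet = node.encrypt(packet)

# Network: each relay peels exactly one layer and forwards the rest,
# so no single relay sees both the sender and the final plaintext.
for i, node in enumerate(nodes, start=1):
    packet = node.decrypt(packet)
    print(f"relay {i} forwards {len(packet)} bytes")

assert packet == message
```

Real mix networks add padding, batching and reordering on top of the layering shown here, precisely so that the passive and active observers described below cannot correlate traffic.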
Project resources
Anonymity in the network is guaranteed by mix network (Mix-net) technology, first studied by David Chaum in 1981. This technology requires that users' resources be employed and shared in collaboration. The reciprocal sharing of resources through secure algorithms ensures that an attacker able to monitor the network passively (reading all traffic in all segments) or actively (generating arbitrary traffic) would be unable to discover the identities of individuals. Software such as anonymous remailers, Tor and Freenet is based on these concepts and has evolved through the years. The architecture of these networks is based on collaboration and the availability of shared resources. As a start, groups of volunteers like PWS are making eight servers available, dedicated to supporting this network.

Law proposal
At the 2005 annual convention organized by Bileta, an association active since 1986 in the study of laws concerning the use of technology in Britain and Ireland, PWS presented a study concerning data retention. Data retention is the automatic collection of network data in support of investigative bodies and law enforcement. Before several security-related reforms, it was necessary in some jurisdictions to obtain a warrant from a judicial authority before collecting data to be used in investigations. With the decentralization of technology brought by the spread of the Internet, many private bodies have been given responsibility for data collection. Such automatic collection of Internet traffic is possible using freely available software, such as Wireshark or tcpdump, originally conceived to aid network technicians in debugging and maintenance. Collection of personal data is forbidden in the European Union under the principle of secrecy of correspondence. For this reason a law proposal was drafted to regulate the collection of log and backup data, defining which data are to be considered sensitive and allowing technicians to perform maintenance operations while preventing unauthorized access to personal data by external parties. This law proposal was presented to the Italian Parliament by deputy Maurizio Turco in 2006. The proposal was not accepted; instead, the validity of Italian Law Decree 144/2005, due to expire on 31 December 2007, was extended in order to continue allowing the collection of personal data with a view to combating international terrorism.

E-Privacy conference
PWS organizes the annual E-Privacy conference, the first Italian conference on privacy in the network, with contributions from both technical and legal experts. The conference has been held in the Palazzo Vecchio in Florence, with the exception of the 2002 edition, which was hosted at the University of Florence. Each edition has had a main theme:
2002: E-privacy, confidentiality and individual rights in the network, opposing Big Brother in the third millennium. Topics covered: Italian Law 675/1996, political trends towards laws reducing freedom in the net, Freenet, PGP/GPG, anonymous remailers and steganography.
2003: Defending identity and freedom of expression against requests for more security. Topics included: data retention, TCPA, analysing threat models to define a minimum personal security standard, digital signatures, and cryptography as a basic user defence tool.
2004: Data retention and the right to oblivion. Topics included: data retention, RFID, the right to have sensitive data deleted, surveillance as an answer to terrorism, anonymous peer-to-peer (P2P) networking, abuses of video surveillance, and decentralized technologies.
2005: Data retention and privacy in the network. Darknets were considered, and the P-Box project was presented, along with free software, civil responsibilities and privacy violations, the OpenPGP standard, a law proposal to regulate automatic data collection, and biometrics.
2006: No single main theme was set. Topics discussed were: spyware, trusted computing, DRM, possible misuses of electronic voting, and dangers to privacy caused by search engines.
2007: Social control and technocontrol. Topics included: VoIP, personal identity and digital identity, accessibility, the Tanga articles and IT incidents.
The 2008 conference was expected to be held on 9 and 10 May in the Palazzo Vecchio in Florence. "e-privacy" is also the name of a mailing list; its e-mail address is [email protected], subscription is free and its archives are publicly available online.

P-Box project
Anonymity technologies are based on collaborative groups of users who reciprocally choose to share their resources. These anonymous networks can be accessed even from devices with low computing power and low communications bandwidth. To help spread these technologies PWS has introduced P-Boxes (Privacy Boxes), small and simple devices that help protect privacy. Three models have been developed:
P-Box Model I: a modified Xbox running the Linux operating system, standard services and the Mixminion remailer.
P-Box Model II: a Soekris 4501 PC running the Linux operating system, Mixminion, Tor, Mixmaster and the Postfix mail server.
P-Box Model III: based on a Soekris 4801, it includes the same applications as Model II; it can also be used as an access point and includes the IMAP and POP3 e-mail server protocols.

Big Brother Awards (Italian section)
Big Brother Awards (BBA) is an initiative of Privacy International with the motto "watching the watchman worldwide". PWS manages the polling and the assignment of the awards to the Italian bodies with the worst record in the field of human rights. Several categories exist, according to the rights violation achieved:
Lifelong threat: the body or agency which has caused the most damage to privacy throughout its existence.
Worst public agency: given to the public agency (government institution, public body, authority, etc.) which caused the most damage to privacy in the current year.
Worst private enterprise: awarded to the private or corporate institution with the worst privacy record in the current year.
Most invasive technology: the technology with the worst impact on privacy.
Boot mouth: the "best" (most terrifying, ridiculous, erroneous or falsely reassuring) statement said or printed about privacy in the current year.
People's lament: the nominee that received the most votes overall, across the different categories.

Project 95%
Project 95% (Ninety Five Percent – No False Privacy) is a project advocating awareness of Internet issues. The Internet was born as a free and decentralized network, but its most common use relies on a few centralized services. A blatant example is the number of users who are increasingly dependent on webmail services such as Gmail, Hotmail and Yahoo! Mail.
Even though there is an understandable preference for ease of use, since customers can access such services from many locations, the downside is the extensive use of profiling tools by free service providers in order to deliver more targeted web marketing. It is not PWS's intention to single out any specific commercial service as a danger to privacy, but to point out that greater confidentiality can be achieved using individual mail servers, private webmail programs and privately owned domains. This is perfectly achievable with freely available software, and its configuration can be automated even for users without technical skills. The figure of 95% refers to the reliability of a home-based server connected to the Internet through a flat-rate ADSL line, and is meant to demonstrate that it is not necessary to rely on the offerings of centralized enterprises to obtain good services. Hence the NFP project, which informs users about the technical possibilities a modern computer offers for connecting to the Internet fully and without undue effort on the part of the user. The P-Box is an example of a technological answer to these needs.

References
Digital rights organizations Privacy organizations Organizations established in 1999
https://en.wikipedia.org/wiki/Architecture%20Analysis%20%26%20Design%20Language
Architecture Analysis & Design Language
The Architecture Analysis & Design Language (AADL) is an architecture description language standardized by SAE. AADL was first developed in the field of avionics and was formerly known as the Avionics Architecture Description Language. It is derived from MetaH, an architecture description language made by the Advanced Technology Center of Honeywell. AADL is used to model the software and hardware architecture of an embedded, real-time system. Because of its emphasis on the embedded domain, AADL contains constructs for modeling both software and hardware components (with the hardware components named "execution platform" components within the standard). The resulting architecture model can then be used as design documentation, for analyses (such as schedulability and flow control; see the illustrative sketch below) or for code generation (of the software portion), much as UML models are.

AADL ecosystem
AADL is defined by a core language that provides a single notation for both system and software aspects. Having a single model eases analysis, because tools work on one single representation of the system. The language specifies system-specific characteristics using properties, and it can be extended in the following ways:
user-defined properties: users can extend the set of applicable properties and add their own to express their own requirements
language annexes: the core language is enhanced by annex languages that enrich the architecture description. To date, the following annexes have been defined:
Behavior annex: adds component behaviour with state machines
Error-model annex: specifies fault and propagation concerns
ARINC653 annex: defines modelling patterns for modelling avionics systems
Data-Model annex: describes the modelling of specific data constraints with AADL

AADL tools
AADL is supported by a wide range of tools:
OSATE includes a modeling platform, a graphical viewer and a constraint query language
Ocarina, an AADL toolchain for generating code from models
TASTE toolchain, supported by the European Space Agency
A complete list of the tool set can be found on the AADL public wiki.

Related projects
AADL has been used in the following research projects:
AVSI/SAVI: an initiative that leverages AADL (among other languages) to perform virtual integration of aerospace and defense systems
META: a DARPA project for improving software engineering methods
PARSEC: a French initiative to validate and implement avionics systems from architecture models
TASTE: a platform for designing safety-critical systems from models
A complete list of past and current projects and initiatives is no longer available, as the AADL public wiki page that listed them has been retired, with no replacement provided as of December 2020.

References
External links
AADL.info AADL public wiki AADL tools AADL at Axlog AADL at Ecole Nationale Supérieure des Télécommunications de Paris (ENST) AADL performance analysis with Cheddar, Univ. of Brest (real time scheduling and queueing system analysis) Industrial project support using Stood for AADL AADL In Practice, a book dedicated to the use of the language and its related modeling tools Systems architecture Architecture description language Software modeling language Modeling languages
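As an illustration of the kind of schedulability analysis mentioned above, the following Python sketch applies the classic Liu and Layland rate-monotonic utilisation bound to a hypothetical set of periodic threads. It is not AADL and not the algorithm of any particular AADL tool; the thread names, periods and execution times are assumptions chosen only for illustration.

```python
# Illustrative schedulability check of the kind an analysis tool might run
# over periodic thread components in an architecture model.
# Sufficient (not necessary) condition for rate-monotonic scheduling:
#   total utilisation U <= n * (2**(1/n) - 1)

threads = [
    # (name, period_ms, worst_case_execution_time_ms) -- assumed values
    ("sensor_read", 10.0, 2.0),
    ("control_law", 20.0, 5.0),
    ("telemetry",   50.0, 9.0),
]

utilisation = sum(wcet / period for _, period, wcet in threads)
n = len(threads)
bound = n * (2 ** (1 / n) - 1)

print(f"total utilisation: {utilisation:.3f}")
print(f"RM bound for n={n}: {bound:.3f}")
if utilisation <= bound:
    print("schedulable under rate-monotonic scheduling (sufficient test)")
else:
    print("bound exceeded: an exact response-time analysis is needed")
```

In practice the periods and worst-case execution times would come from the properties attached to thread components in the architecture model rather than from hard-coded values.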
https://en.wikipedia.org/wiki/Certified%20software%20manager
Certified software manager
Certified software manager (CSM) is a professional designation for IT asset management. The course was developed by the Software Publishers Association (now SIIA) in 1994 as a component of one of the first global anti-piracy efforts, led by Ken Wasch. In 2004, administration of the CSM, and of its successor class the Advanced Software Manager, moved to Washington, DC-based LicenseLogic. According to ITIL, a software manager is defined as a person who manages "…all of the infrastructure and processes necessary for the effective management, control, and protection of the software assets…throughout all stages of their lifecycle." The course syllabus takes the attendee from the basics of "What is Software Asset Management (SAM)?" through to developing a basic asset management program for their organization. The CSM is the basis for various other ITAM courses.

References
External links
LicenseLogic Information technology qualifications Computer occupations
https://en.wikipedia.org/wiki/MikroTik
MikroTik
MikroTik (officially SIA "Mikrotīkls") is a Latvian network equipment manufacturer. The company develops and sells wired and wireless network routers, network switches and access points, as well as operating systems and auxiliary software. The company was founded in 1996 with a focus on selling equipment in emerging markets. As of August 2019, the company website reported an estimated 280 employees. In 2021, with a value of EUR 1.24 billion, MikroTik was the third-largest company in Latvia and the first private Latvian company to surpass a value of EUR 1 billion.

History
MikroTik was founded in 1996 in Riga, Latvia as a PC software company. In 2002, MikroTik began producing its own hardware.

Product vulnerabilities
On 23 May 2018, the Cisco Talos Intelligence Group reported that some MikroTik devices were found to be vulnerable to the VPNFilter malware. On 3 August 2018, MikroTik routers were found to have been compromised by the Coinhive cryptocurrency-mining malware. RouterOS through version 6.42 allows unauthenticated remote attackers to read arbitrary files, and remote authenticated attackers to write arbitrary files, due to a directory traversal vulnerability in the WinBox interface (a minimal illustration of this class of flaw is sketched below).

Meris
Beginning in June 2021, a botnet composed of unprotected MikroTik devices generated huge volumes of application-layer traffic using HTTP pipelining, resulting in distributed denial-of-service (DDoS) attacks. The botnet was named Mēris (or Meris) by Qrator. Yandex reported attacks beginning 4 August 2021 (over 5 million requests per second), with a massive attack on 5 September 2021 reaching almost 22 million requests per second (RPS). Cloudflare acknowledged an attack of over 17 million RPS in July 2021. The botnet appeared to be composed of about 250,000 devices.

References
External links
Company website Companies of Latvia Telecommunications equipment vendors
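As a generic illustration of the directory traversal class of vulnerability mentioned above (not MikroTik's actual code), the following Python sketch shows the kind of path check whose absence lets a request such as "../../etc/passwd" escape the directory a service intends to expose. The base directory and function name are assumptions made for illustration.

```python
from pathlib import Path

BASE_DIR = Path("/srv/files").resolve()  # directory the service intends to expose (assumed)

def safe_open(requested: str) -> bytes:
    """Reject requests that escape BASE_DIR, e.g. '../../etc/passwd'."""
    target = (BASE_DIR / requested).resolve()
    # resolve() collapses '..' components; anything outside BASE_DIR is refused.
    if BASE_DIR not in target.parents and target != BASE_DIR:
        raise PermissionError(f"path traversal attempt: {requested}")
    return target.read_bytes()

# A vulnerable handler simply omits the check above and serves whatever
# path the client asked for; this demonstration shows the check firing.
try:
    safe_open("../../etc/passwd")
except PermissionError as err:
    print(err)
```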
https://en.wikipedia.org/wiki/List%20of%20moths%20of%20Kenya
List of moths of Kenya
There are about 2,100 known moth species of Kenya. The moths (mostly nocturnal) and butterflies (mostly diurnal) together make up the taxonomic order Lepidoptera. This is a list of moth species which have been recorded in Kenya. Alucitidae Alucita dohertyi (Walsingham, 1909) Anomoeotidae Anomoeotes elegans Pagenstecher, 1903 Staphylinochrous holotherma Hampson, 1920 Arctiidae Acantharctia atriramosa Hampson, 1907 Acantharctia bivittata (Butler, 1898) Acantharctia latifusca (Hampson, 1907) Acantharctia metaleuca Hampson, 1901 Acantharctia nigrivena Rothschild, 1935 Acantharctia tenuifasciata Hampson, 1910 Acanthofrontia lithosiana Hampson, 1910 Afraloa bifurca (Walker, 1855) Afrasura hyporhoda (Hampson, 1900) Afrasura indecisa (Walker, 1869) Afrasura obliterata (Walker, 1864) Afrasura peripherica (Strand, 1912) Afrasura violacea (Cieslak & Häuser, 2006) Afroarctia kenyana (Rothschild, 1933) Afrospilarctia lucida (Druce, 1898) Aglossosia deceptans Hampson, 1914 Aglossosia flavimarginata Hampson, 1900 Alpenus investigatorum (Karsch, 1898) Alpenus maculosa (Stoll, 1781) Alpenus nigropunctata (Bethune-Baker, 1908) Alpenus pardalina (Rothschild, 1910) Alpenus schraderi (Rothschild, 1910) Amata alicia (Butler, 1876) Amata chloroscia (Hampson, 1901) Amata cholmlei (Hampson, 1907) Amata congener (Hampson, 1901) Amata consimilis (Hampson, 1901) Amata cuprizonata (Hampson, 1901) Amata dissimilis (Bethune-Baker, 1911) Amata marinoides Kiriakoff, 1954 Amata phoenicia (Hampson, 1898) Amata rubritincta (Hampson, 1903) Amata stenoptera (Zerny, 1912) Amata williami Rothschild, 1910 Amerila affinis (Rothschild, 1910) Amerila androfusca (Pinhey, 1952) Amerila bipartita (Rothschild, 1910) Amerila brunnea (Hampson, 1901) Amerila bubo (Walker, 1855) Amerila luteibarba (Hampson, 1901) Amerila magnifica (Rothschild, 1910) Amerila mulleri Häuser & Boppré, 1997 Amerila niveivitrea (Bartel, 1903) Amerila puella (Fabricius, 1793) Amerila roseomarginata (Rothschild, 1910) Amerila thermochroa (Hampson, 1916) Amerila vidua (Cramer, 1780) Amerila vitrea Plötz, 1880 Amphicallia pactolicus (Butler, 1888) Amphicallia solai (Druce, 1907) Amphicallia thelwalli (Druce, 1882) Amsacta melanogastra (Holland, 1897) Amsactarctia pulchra (Rothschild, 1933) Anaphosia mirabilis (Bartel, 1903) Anapisa histrio (Kiriakoff, 1953) Anapisa melaleuca (Holland, 1898) Anapisa metarctioides (Hampson, 1907) Apisa canescens Walker, 1855 Apisa fontainei Kiriakoff, 1959 Apisa subargentea Joicey & Talbot, 1921 Archilema dentata Kühne, 2007 Archilema modiolus (Kiriakoff, 1958) Archilema nivea Kühne, 2007 Archilema uelleburgensis (Strand, 1912) Argina amanda (Boisduval, 1847) Asura friederikeae Kühne, 2007 Asura gubunica (Holland, 1893) Asura mutabilis Kühne, 2007 Asura naumanni Kühne, 2005 Asura pectinella Strand, 1922 Asura spinata Kühne, 2007 Asura spurrelli (Hampson, 1914) Asurgylla collenettei Kiriakoff, 1958 Automolis pallida (Hampson, 1901) Balacra compsa (Jordan, 1904) Balacra flavimacula Walker, 1856 Balacra preussi (Aurivillius, 1904) Balacra pulchra Aurivillius, 1892 Balacra rattrayi (Rothschild, 1910) Balacra rubricincta Holland, 1893 Balacra rubrostriata (Aurivillius, 1892) Binna penicillata Walker, 1865 Carcinarctia laeliodes Hampson, 1916 Carcinarctia metamelaena Hampson, 1901 Caripodia fuscicincta Hampson, 1914 Caripodia persimilis Hampson, 1914 Ceryx crawshayi Hampson, 1901 Ceryx semihyalina Kirby, 1896 Cragia adiastola (Kiriakoff, 1958) Cragia distigmata (Hampson, 1901) Cragia quadrinotata (Walker, 1864) Creatonotos leucanioides 
Holland, 1893 Ctenosia nephelistis Hampson, 1918 Cyana bigutta Karisch, 2005 Cyana flammeostrigata Karisch, 2003 Cyana margarethae (Kiriakoff, 1958) Cyana pretoriae (Distant, 1897) Cyana rejecta (Walker, 1854) Cyana ugandana (Strand, 1912) Diota rostrata (Wallengren, 1860) Disparctia vittata (Druce, 1898) Eilema aurantisquamata (Hampson, 1918) Eilema creatoplaga (Hampson, 1901) Eilema debilissima Kiriakoff, 1958 Eilema flavibasis Hampson, 1900 Eilema gracilipennis (Wallengren, 1860) Eilema intermixta Kühne, 2007 Eilema leia (Hampson, 1901) Eilema marwitziana Strand, 1912 Eilema melasonea Hampson, 1903 Eilema mesosticta Hampson, 1911 Eilema peperita (Hampson, 1901) Eilema polioplaga (Hampson, 1901) Eilema rufofasciata (Rothschild, 1912) Eilema sanguicosta (Hampson, 1901) Eilema tegudentata Kühne, 2007 Epitoxis albicincta Hampson, 1903 Epitoxis ansorgei Rothschild, 1910 Epitoxis procridia Hampson, 1898 Estigmene acrea (Drury, 1773) Estigmene ansorgei Rothschild, 1910 Estigmene atrifascia (Hampson, 1907) Estigmene multivittata Rothschild, 1910 Estigmene ochreomarginata Bethune-Baker, 1909 Estigmene tenuistrigata (Hampson, 1900) Estigmene trivitta (Walker, 1855) Euchromia amoena (Möschler, 1872) Euchromia folletii (Guérin-Méneville, 1832) Eugoa corniculata Kühne, 2007 Eugoa coronaria Kühne, 2007 Eurozonosia atricincta Hampson, 1918 Eurozonosia fulvinigra Hampson, 1914 Exilisia contrasta Kühne, 2007 Exilisia friederikeae Kühne, 2007 Exilisia gablerinus Kühne, 2008 Exilisia kruegeri Kühne, 2007 Exilisia prominentia Kühne, 2007 Eyralpenus scioana (Oberthür, 1880) Galtara aurivilii (Pagenstecher, 1901) Galtara elongata (Swinhoe, 1907) Hippurarctia ferrigera (Druce, 1910) Ilemodes astriga Hampson, 1916 Ischnarctia cinerea (Pagenstecher, 1903) Kiriakoffalia costimacula (Joicey & Talbot, 1924) Lepidilema unipectinata Aurivillius, 1910 Lepista pandula (Boisduval, 1847) Macrosia chalybeata Hampson, 1901 Mecistorhabdia haematoessa (Holland, 1893) Meganaclia sippia (Plötz, 1880) Metarctia benitensis Holland, 1893 Metarctia flavicincta Aurivillius, 1900 Metarctia flavivena Hampson, 1901 Metarctia fulvia Hampson, 1901 Metarctia fusca Hampson, 1901 Metarctia haematica Holland, 1893 Metarctia haematricha Hampson, 1905 Metarctia hypomela Kiriakoff, 1956 Metarctia inconspicua Holland, 1892 Metarctia lateritia Herrich-Schäffer, 1855 Metarctia paremphares Holland, 1893 Metarctia rubripuncta Hampson, 1898 Metarctia rufescens Walker, 1855 Metarctia sarcosoma Hampson, 1901 Metarctia schoutedeni Kiriakoff, 1953 Metarctia subpallens Kiriakoff, 1956 Metarctia unicolor (Oberthür, 1880) Micralarctia punctulatum (Wallengren, 1860) Muxta xanthopa (Holland, 1893) Nanna collinsii Kühne, 2007 Nanna eningae (Plötz, 1880) Nanna naumanni Kühne, 2005 Neuroxena ansorgei Kirby, 1896 Nyctemera apicalis (Walker, 1854) Nyctemera glauce (Fawcett, 1916) Nyctemera itokina (Aurivillius, 1904) Nyctemera leuconoe Hopffer, 1857 Nyctemera rattrayi (Swinhoe, 1904) Nyctemera restrictum (Butler, 1894) Ochrota unicolor (Hopffer, 1857) Onychipodia nigricostata (Butler, 1894) Ovenna guineacola (Strand, 1912) Palaeosiccia honeyi Kühne, 2007 Palaeosiccia punctata Hampson, 1900 Paralacydes arborifera (Butler, 1875) Paralacydes bivittata (Bartel, 1903) Paralacydes decemmaculata (Rothschild, 1916) Paralacydes fiorii (Berio, 1937) Paralacydes minorata (Berio, 1935) Paralpenus flavicosta (Hampson, 1909) Paralpenus ugandae (Hampson, 1916) Paramaenas nephelistis (Hampson, 1907) Paramaenas strigosus Grünberg, 1911 Paremonia argentata Hampson, 1914 
Pericaliella melanodisca (Hampson, 1907) Phryganopsis angulifascia (Strand, 1912) Phryganopsis kinuthiae Kühne, 2007 Phryganopsis parasordida Kühne, 2007 Phryganopsis punctilineata (Hampson, 1901) Phryganopsis tryphosa Kiriakoff, 1958 Popoudina linea (Walker, 1855) Popoudina pamphilia Kiriakoff, 1958 Pseudlepista holoxantha Hampson, 1918 Pseudodiptera alberici (Dufrane, 1945) Pseudonaclia bifasciata Aurivillius, 1910 Pseudonaclia puella (Boisduval, 1847) Pseudothyretes erubescens (Hampson, 1901) Pseudothyretes kamitugensis (Dufrane, 1945) Pseudothyretes nigrita (Kiriakoff, 1961) Pseudothyretes perpusilla (Walker, 1856) Pseudothyretes rubicundula (Strand, 1912) Pusiola chota (Swinhoe, 1885) Pusiola minutissima (Kiriakoff, 1958) Pusiola ochreata (Hampson, 1901) Pusiola poliosia (Kiriakoff, 1958) Pusiola roscidella (Kiriakoff, 1954) Pusiola sorghicolor (Kiriakoff, 1954) Pusiola straminea (Hampson, 1901) Pusiola tinaeella (Kiriakoff, 1958) Radiarctia jacksoni (Rothschild, 1910) Radiarctia lutescens (Walker, 1854) Radiarctia rhodesiana (Hampson, 1900) Rhabdomarctia rubrilineata (Bethune-Baker, 1911) Rhipidarctia crameri Kiriakoff, 1961 Rhipidarctia forsteri (Kiriakoff, 1953) Rhipidarctia pareclecta (Holland, 1893) Secusio strigata Walker, 1854 Seydelia ellioti (Butler, 1895) Siccia adiaphora Kiriakoff, 1958 Siccia anserina Kühne, 2007 Siccia chogoriae Kühne, 2007 Siccia conformis Hampson, 1914 Siccia cretata Hampson, 1914 Siccia duodecimpunctata Kiriakoff, 1958 Siccia elgona Kühne, 2007 Siccia grossagranularis Kühne, 2007 Siccia gypsia Hampson, 1914 Siccia margopuncta Kühne, 2007 Siccia melanospila Hampson, 1911 Siccia orbiculata Kühne, 2007 Siccia pallidata Kühne, 2007 Siccia rarita Kühne, 2007 Siccia ursulae Kühne, 2007 Siccia yvonneae Kühne, 2007 Spilosoma atridorsia Hampson, 1920 Spilosoma baxteri (Rothschild, 1910) Spilosoma bipartita Rothschild, 1933 Spilosoma clasnaumanni Kühne, 2005 Spilosoma curvilinea Walker, 1855 Spilosoma lineata Walker, 1855 Spilosoma nyasana Rothschild, 1933 Spilosoma pales (Druce, 1910) Spilosoma rava (Druce, 1898) Spilosoma sublutescens Kiriakoff, 1958 Spilosoma sulphurea Bartel, 1903 Spilosoma unipuncta (Hampson, 1905) Stenarctia abdominalis Rothschild, 1910 Stenarctia quadripunctata Aurivillius, 1900 Teracotona abyssinica (Rothschild, 1933) Teracotona alicia (Hampson, 1911) Teracotona approximans (Rothschild, 1917) Teracotona clara Holland, 1892 Teracotona jacksoni (Rothschild, 1910) Teracotona melanocera (Hampson, 1920) Teracotona pardalina Bartel, 1903 Teracotona pitmanni Rothschild, 1933 Teracotona rhodophaea (Walker, 1865) Teracotona subterminata Hampson, 1901 Thumatha africana Kühne, 2007 Thumatha kakamegae Kühne, 2007 Utetheisa amhara Jordan, 1939 Utetheisa pulchella (Linnaeus, 1758) Zobida trinitas (Strand, 1912) Autostichidae Autosticha euryterma Meyrick, 1920 Bombycidae Racinoa obliquisigna (Hampson, 1910) Racinoa spiralis Kühne, 2008 Racinoa versicolora Kühne, 2008 Racinoa zolotuhini Kühne, 2008 Vingerhoedtia ruficollis (Strand, 1910) Brahmaeidae Dactyloceras lucina (Drury, 1872) Dactyloceras neumayeri (Pagenstecher, 1885) Dactyloceras noellae Bouyer, 2006 Dactyloceras ocelligera (Butler, 1889) Carposinidae Carposina mesospila Meyrick, 1920 Choreutidae Anthophila massaicae Agassiz, 2008 Trichocirca tyrota Meyrick, 1920 Coleophoridae Blastobasis acirfa Adamski, 2010 Blastobasis aynekiella Adamski, 2010 Blastobasis catappaella Adamski, 2010 Blastobasis chuka Adamski, 2010 Blastobasis elgonae Adamski, 2010 Blastobasis glauconotata Adamski, 2010 
Blastobasis kenya Adamski, 2010 Blastobasis millicentae Adamski, 2010 Blastobasis mpala Adamski, 2010 Coleophora enchitis Meyrick, 1920 Neoblastobasis laikipiae Adamski, 2010 Neoblastobasis perisella Adamski, 2010 Neoblastobasis wangithiae Adamski, 2010 Neoblastobasis ximeniaella Adamski, 2010 Copromorphidae Rhynchoferella hoppei Mey, 2007 Rhynchoferella kuehnei Mey, 2007 Cosmopterigidae Gisilia conformata (Meyrick, 1921) Gisilia sclerodes (Meyrick, 1909) Gisilia stereodoxa (Meyrick, 1925) Cossidae Nomima chloroptera (Meyrick, 1920) Strigocossus capensis (Walker, 1856) Crambidae Adelpherupa albescens Hampson, 1919 Adelpherupa flavescens Hampson, 1919 Aethaloessa floridalis (Zeller, 1852) Agathodes musivalis Guenée, 1854 Alphacrambus prodontellus (Hampson, 1919) Analyta vansomereni Tams, 1932 Anania mesophaealis (Hampson, 1913) Anania phaeopastalis (Hampson, 1913) Anania piperitalis (Hampson, 1913) Ancylolomia atrifasciata Hampson, 1919 Ancylolomia capensis Zeller, 1852 Ancylolomia chrysographellus (Kollar, 1844) Ancylolomia croesus Hampson, 1919 Ancylolomia perfasciata Hampson, 1919 Ancylolomia planicosta Martin, 1956 Brihaspa chrysostomus (Zeller, 1852) Cadarena sinuata (Fabricius, 1781) Caffrocrambus undilineatus (Hampson, 1919) Calamotropha niveicostellus (Hampson, 1919) Charltona argyrastis Hampson, 1919 Chilo argyrogramma (Hampson, 1919) Chilo flavirufalis (Hampson, 1919) Chilo orichalcociliella (Strand, 1911) Cotachena smaragdina (Butler, 1875) Crambus acyperas Hampson, 1919 Crambus hampsoni Błeszyński, 1961 Crambus tessellatus Hampson, 1919 Donacaula rufalis (Hampson, 1919) Epichilo irroralis (Hampson, 1919) Euchromius labellum Schouten, 1988 Euclasta gigantalis Viette, 1957 Glaucobotys spiniformis Maes, 2008 Glyphodes capensis (Walker, 1866) Goniophysetis lactealis Hampson, 1916 Hendecasis apicefulva Hampson, 1916 Hendecasis fulviplaga Hampson, 1916 Lamprophaia ablactalis (Walker, 1859) Lamprosema ommatalis (Hampson, 1912) Lygropia amyntusalis (Walker, 1859) Nausinoe geometralis (Guenée, 1854) Nomophila noctuella ([Denis & Schiffermüller], 1775) Orphanostigma abruptalis (Walker, 1859) Palpita phaealis (Hampson, 1913) Palpita stenocraspis (Butler, 1898) Parapoynx diminutalis (Snellen, 1880) Parerupa africana (Aurivillius, 1910) Patissa atrilinealis Hampson, 1919 Patissa fractilinealis Hampson, 1919 Patissa geminalis Hampson, 1919 Powysia rosealinea Maes, 2006 Prionapteryx albescens (Hampson, 1919) Prionapteryx alternalis Maes, 2002 Prionapteryx ochrifasciata (Hampson, 1919) Prionapteryx rubrifusalis (Hampson, 1919) Prochoristis calamochroa (Hampson, 1919) Pycnarmon cribrata (Fabricius, 1794) Pyrausta bostralis (Hampson, 1919) Pyrausta centralis Maes, 2009 Pyrausta diatoma Hampson, 1913 Pyrausta flavimarginalis (Hampson, 1913) Pyrausta haematidalis Hampson, 1913 Pyrausta perfervidalis Hampson, 1913 Pyrausta sanguifusalis Hampson, 1913 Pyrausta sthenialis Hampson, 1916 Syllepte attenualis (Hampson, 1912) Udeoides muscosalis (Hampson, 1913) Udeoides nigribasalis (Hampson, 1913) Udeoides nolalis (Felder & Rogenhofer, 1875) Udeoides viridis Maes, 2006 Zebronia phenice (Cramer, 1780) Drepanidae Epicampoptera andersoni (Tams, 1925) Epicampoptera marantica (Tams, 1930) Gonoreta subtilis (Bryk, 1913) Elachistidae Elachista brevis Sruoga & De Prins, 2009 Elachista chelonitis Meyrick, 1909 Elachista kakamegensis Sruoga & De Prins, 2009 Elachista longispina Sruoga & De Prins, 2009 Elachista planca Sruoga & De Prins, 2009 Ethmia argomicta Meyrick, 1920 Ethmia ballistis Meyrick, 1908 Ethmia 
bicolorella (Guenée, 1879) Ethmia cirrhosoma Meyrick, 1920 Ethmia ditreta Meyrick, 1920 Ethmia epiloxa Meyrick, 1914 Ethmia glabra Meyrick, 1920 Ethmia hemicosma Meyrick, 1920 Ethmia melanocrates Meyrick, 1923 Perittia falciferella Sruoga & De Prins, 2009 Perittia gnoma Sruoga & De Prins, 2009 Perittia spatulata Sruoga & De Prins, 2009 Perittia tantilla Sruoga & De Prins, 2009 Sphecodora porphyrias Meyrick, 1920 Eupterotidae Acrojana scutaea Strand, 1909 Hoplojana rhodoptera (Gerstaecker, 1871) Hoplojana roseobrunnea Rothschild, 1917 Jana eurymas Herrich-Schäffer, 1854 Jana fletcheri Berger, 1980 Jana germana Rothschild, 1917 Jana preciosa Aurivillius, 1893 Parajana gabunica (Aurivillius, 1892) Phiala parabiota Kühne, 2007 Stenoglene obtusus (Walker, 1864) Stenoglene preussi (Aurivillius, 1893) Stenoglene roseus (Druce, 1886) Stenoglene sulphureoides Kühne, 2007 Vianga crowleyi (Aurivillius, 1904) Galacticidae Homadaula watamomaritima Mey, 2007 Gelechiidae Anarsia agricola Walsingham, 1891 Anarsia arsenopa Meyrick, 1920 Chilopselaphus ethicodes Meyrick, 1920 Deltophora angulella Sattler, 1979 Deltophora diversella Sattler, 1979 Encolpotis scioplasta Meyrick, 1920 Gelechia crudescens Meyrick, 1920 Hypatima mangiferae Sattler, 1989 Parallactis plaesiodes (Meyrick, 1920) Sphenogrypa syncosma Meyrick, 1920 Telphusa microsperma Meyrick, 1920 Telphusa phaulosema Meyrick, 1920 Thiognatha metachalca Meyrick, 1920 Trichotaphe melanosoma Meyrick, 1920 Geometridae Acanthovalva inconspicuaria (Hübner, 1796) Acidaliastis bicurvifera Prout, 1916 Acidaliastis micra Hampson, 1896 Acidaliastis subbrunnescens Prout, 1916 Acrostatheusis atomaria (Warren, 1901) Allochrostes impunctata (Warren, 1897) Antharmostes papilio Prout, 1912 Aphilopota calaria (Swinhoe, 1904) Aphilopota cardinalli Prout, 1954 Aphilopota confusata (Warren, 1902) Aphilopota cydno Prout, 1954 Aphilopota dicampsis Prout, 1934 Aphilopota nubilata Prout, 1954 Aphilopota ochrimacula (Warren, 1902) Aphilopota rufiplaga (Warren, 1902) Aphilopota semiusta (Distant, 1898) Aplochlora pseudossa Prout, 1932 Archichlora jacksoni Carcasson, 1971 Ascotis reciprocaria (Walker, 1860) Asthenotricha amblycoma Prout, 1935 Asthenotricha anisobapta Prout, 1932 Asthenotricha ansorgei Warren, 1899 Asthenotricha dentatissima Warren, 1899 Asthenotricha flavicoma Warren, 1899 Asthenotricha inutilis Warren, 1901 Asthenotricha pycnoconia Janse, 1933 Asthenotricha semidivisa Warren, 1901 Asthenotricha serraticornis Warren, 1902 Asthenotricha straba Prout, 1921 Asthenotricha strangulata Herbulot, 1953 Asthenotricha unipecten (Prout, 1915) Biston abruptaria (Walker, 1869) Biston gloriosaria Karisch, 2005 Biston pteronyma (Prout, 1938) Cabera andrica Prout, 1932 Cabera pictilinea (Warren, 1902) Caradrinopsis obscuraria Swinhoe, 1904 Cartaletis libyssa (Hopffer, 1857) Casilda lucidaria (Swinhoe, 1904) Casuariclystis latifascia (Walker, 1866) Centrochria unipunctata Gaede, 1917 Chiasmia amarata (Guenée, 1858) Chiasmia assimilis (Warren, 1899) Chiasmia ate (Prout, 1926) Chiasmia baringensis Agassiz, 2009 Chiasmia brongusaria (Walker, 1860) Chiasmia butaria (Swinhoe, 1904) Chiasmia cararia (Swinhoe, 1904) Chiasmia contaminata (Warren, 1902) Chiasmia costiguttata (Warren, 1899) Chiasmia featheri (Prout, 1922) Chiasmia feraliata (Guenée, 1858) Chiasmia fulvimargo (Warren, 1899) Chiasmia geminilinea (Prout, 1932) Chiasmia inconspicua (Warren, 1897) Chiasmia maculosa (Warren, 1899) Chiasmia majestica (Warren, 1901) Chiasmia marmorata (Warren, 1897) Chiasmia nubilata (Warren, 
1897) Chiasmia obliquilineata (Warren, 1899) Chiasmia olindaria (Swinhoe, 1904) Chiasmia procidata (Guenée, 1858) Chiasmia semialbida (Prout, 1915) Chiasmia sororcula (Warren, 1897) Chiasmia subcurvaria (Mabille, 1897) Chiasmia sufflata (Guenée, 1858) Chiasmia trinotata (Warren, 1902) Chiasmia trizonaria (Hampson, 1909) Chiasmia umbrata (Warren, 1897) Chiasmia umbratilis (Butler, 1875) Chiasmia velia Agassiz, 2009 Chiasmia warreni (Prout, 1915) Chiasmia zelota Prout, 1922 Chlorerythra rubriplaga Warren, 1895 Chlorissa attenuata (Walker, 1862) Chlorissa dialeuca Prout, 1930 Chlorissa malescripta (Warren, 1897) Chloroclystis consocer Prout, 1937 Chloroclystis grisea Warren, 1897 Chloroclystis muscosa (Warren, 1902) Chloroclystis schoenei Karisch, 2008 Chlorodrepana cryptochroma Prout, 1913 Chrysocraspeda leighata Warren, 1904 Cleora munda (Warren, 1899) Cleora thyris D. S. Fletcher, 1967 Cleora tulbaghata (Felder & Rogenhofer, 1875) Coenina aurivena Butler, 1898 Collix foraminata Guenée, 1858 Colocleora bipannosa Prout, 1938 Colocleora proximaria (Walker, 1860) Comibaena leucospilata (Walker, 1863) Comostolopsis coerulea Warren, 1902 Comostolopsis simplex Warren, 1902 Conchylia interstincta (Prout, 1923) Conolophia persimilis (Warren, 1905) Conolophia rectistrigaria Rebel, 1914 Ctenaulis albirupta Warren, 1902 Cyclophora lyciscaria (Guenée, 1857) Derambila hyperphyes (Prout, 1911) Derambila iridoptera (Prout, 1913) Derambila jacksoni Prout, 1915 Disclisioprocta natalata (Walker, 1862) Discomiosis anfractilinea Prout, 1915 Discomiosis synnephes Prout, 1915 Dithecodes delicata (Warren, 1899) Dithecodes ornithospila (Prout, 1911) Drepanogynis cambogiaria (Guenée, 1858) Drepanogynis somereni (Prout, 1926) Dysrhoe olbia (Prout, 1911) Dysrhoe rhiogyra (Prout, 1932) Ecpetala animosa Prout, 1935 Ecpetala carnifasciata (Warren, 1899) Ecpetala indentata (Warren, 1902) Ecpetala obtusa (Warren, 1902) Ectropis delosaria (Walker, 1862) Ectropis ocellata Warren, 1902 Encoma irisaria Swinhoe, 1904 Encoma pulviscula Prout, 1932 Eois alticola (Aurivillius, 1925) Eois diapsis Prout, 1932 Eois grataria (Walker, 1861) Epigynopteryx ansorgei (Warren, 1901) Epigynopteryx coffeae Prout, 1934 Epigynopteryx commixta Warren, 1901 Epigynopteryx flavedinaria (Guenée, 1857) Epigynopteryx mutabilis (Warren, 1903) Epirrhoe annulifera (Warren, 1902) Episteira frustrata Prout, 1935 Erastria albosignata (Walker, 1863) Erastria leucicolor (Butler, 1875) Erastria madecassaria (Boisduval, 1833) Eretmopus anadyomene (Townsend, 1952) Eretmopus nereis (Townsend, 1952) Ereunetea reussi Gaede, 1914 Eucrostes disparata Walker, 1861 Eupithecia albistillata Prout, 1932 Eupithecia amphiplex Prout, 1932 Eupithecia anguinata (Warren, 1902) Eupithecia atomaria (Warren, 1902) Eupithecia candicans Herbulot, 1988 Eupithecia celatisigna (Warren, 1902) Eupithecia devestita (Warren, 1899) Eupithecia dilucida (Warren, 1899) Eupithecia dohertyi Prout, 1935 Eupithecia ecplyta Prout, 1932 Eupithecia gradatilinea Prout, 1916 Eupithecia hemiochra Prout, 1932 Eupithecia immensa (Warren, 1902) Eupithecia isotenes Prout, 1932 Eupithecia jeanneli Herbulot, 1953 Eupithecia mecodaedala Prout, 1932 Eupithecia medilunata Prout, 1932 Eupithecia mendosaria (Swinhoe, 1904) Eupithecia nigribasis (Warren, 1902) Eupithecia oblongipennis (Warren, 1902) Eupithecia orbaria (Swinhoe, 1904) Eupithecia perculsaria (Swinhoe, 1904) Eupithecia picturata (Warren, 1902) Eupithecia proflua Prout, 1932 Eupithecia psiadiata Townsend, 1952 Eupithecia regulosa (Warren, 1902) 
Eupithecia resarta Prout, 1932 Eupithecia rigida Swinhoe, 1892 Eupithecia rubristigma Prout, 1932 Eupithecia semiflavata (Warren, 1902) Eupithecia semipallida Janse, 1933 Eupithecia tabacata D. S. Fletcher, 1951 Eupithecia tetraglena Prout, 1932 Eupithecia undiculata Prout, 1932 Exeliopsis tholera (Prout, 1932) Gonanticlea meridionata (Walker, 1862) Gymnoscelis acutipennis Warren, 1902 Gymnoscelis birivulata Warren, 1902 Gymnoscelis carneata Warren, 1902 Hemicopsis purpuraria Swinhoe, 1904 Hemidromodes unicolorata Hausmann, 1996 Heterorachis haploa (Prout, 1912) Heterostegane minutissima (Swinhoe, 1904) Hierochthonia migrata Prout, 1930 Horisme pallidimacula Prout, 1925 Hydrelia argyridia (Butler, 1894) Hypomecis assimilis (Warren, 1902) Idaea fylloidaria (Swinhoe, 1904) Idaea laciniata (Warren, 1902) Idaea lalasaria (Swinhoe, 1904) Idaea lilliputaria (Warren, 1902) Idaea macrostyla (Warren, 1900) Idaea minimaria (Warren, 1904) Idaea parallelaria (Warren, 1902) Idaea pulveraria (Snellen, 1872) Idaea subscutulata (Warren, 1899) Idaea tornivestis (Prout, 1932) Idaea umbricosta (Prout, 1913) Idiochlora approximans (Warren, 1897) Idiochlora subrufibasis (Prout, 1930) Idiodes flexilinea (Warren, 1898) Isoplenodia arabukoensis Sihvonen & Staude, 2010 Isturgia catalaunaria (Guenée, 1858) Isturgia deerraria (Walker, 1861) Isturgia disputaria (Guenée, 1858) Isturgia exerraria (Prout, 1925) Isturgia exospilata (Walker, 1861) Isturgia presbitaria (Swinhoe, 1904) Isturgia pulinda (Walker, 1860) Isturgia quadriplaga (Rothschild, 1921) Lasiochlora bicolor (Thierry-Mieg, 1907) Leucoxena lactea Warren, 1900 Lobidiopteryx veninotata Warren, 1902 Lomographa aridata (Warren, 1897) Lomographa indularia (Guenée, 1858) Lophorrhachia burdoni Townsend, 1958 Lycaugidia albatus (Swinhoe, 1885) Melinoessa amplissimata (Walker, 1863) Melinoessa pauper Warren, 1901 Menophra obtusata (Warren, 1902) Menophra olginaria (Swinhoe, 1904) Mesocoela obscura Warren, 1902 Mesocolpia lita (Prout, 1916) Mesocolpia nanula (Mabille, 1900) Mesocolpia protrusata (Warren, 1902) Microloxia ruficornis Warren, 1897 Milocera divorsa Prout, 1922 Milocera podocarpi Prout, 1932 Mimandria cataractae Prout, 1917 Mimoclystia cancellata (Warren, 1899) Mimoclystia pudicata (Walker, 1862) Mixocera albistrigata (Pagenstecher, 1893) Nothofidonia ansorgei (Warren, 1901) Nothofidonia bicolor Prout, 1915 Oaracta maculata (Warren, 1897) Obolcola petronaria (Guenée, 1858) Odontopera acyrthoria (Prout, 1938) Odontopera breviata (Prout, 1922) Odontopera curticosta (Prout, 1932) Odontopera xera (Prout, 1922) Oedicentra albipennis Warren, 1902 Omizodes rubrifasciata (Butler, 1896) Omphacodes pulchrifimbria (Warren, 1902) Omphalucha brunnea (Warren, 1899) Omphax nigricornis (Warren, 1897) Omphax plantaria Guenée, 1858 Orbamia octomaculata (Wallengren, 1872) Oreometra vittata Aurivillius, 1910 Orthonama obstipata (Fabricius, 1794) Pachypalpella subalbata (Warren, 1900) Paraptychodes tenuis (Butler, 1878) Piercia bryophilaria (Warren, 1903) Piercia fumitacta (Warren, 1903) Piercia kuehnei Karisch, 2008 Piercia myopteryx Prout, 1935 Piercia pallidifascia Karisch, 2008 Piercia prasinaria (Warren, 1901) Piercia spatiosata (Walker, 1862) Piercia subrufaria (Warren, 1903) Piercia subterlimbata (Prout, 1917) Piercia subtrunca (Prout, 1932) Pigiopsis parallelaria Warren, 1902 Pingasa distensaria (Walker, 1860) Pingasa rhadamaria (Guenée, 1858) Pingasa ruginaria (Guenée, 1858) Pitthea trifasciata Dewitz, 1881 Prasinocyma albisticta (Warren, 1901) Prasinocyma 
bifimbriata Prout, 1912 Prasinocyma centralis Prout, 1915 Prasinocyma differens (Warren, 1902) Prasinocyma dohertyi Warren, 1903 Prasinocyma geminata Prout, 1913 Prasinocyma immaculata (Thunberg, 1784) Prasinocyma leucophracta Prout, 1932 Prasinocyma loveridgei Prout, 1926 Prasinocyma nigrimacula Prout, 1915 Prasinocyma permitis Prout, 1932 Prasinocyma perpulverata Prout, 1916 Prasinocyma pulchraria Swinhoe, 1904 Prasinocyma pupillata (Warren, 1902) Prasinocyma salutaria (Swinhoe, 1904) Prasinocyma stictimargo (Warren, 1902) Prasinocyma tandi Bethune-Baker, 1913 Prasinocyma tricolorifrons (Prout, 1913) Prasinocyma unipuncta Warren, 1897 Prasinocyma vermicularia (Guenée, 1858) Problepsis aegretta Felder & Rogenhofer, 1875 Problepsis digammata Kirby, 1896 Problepsis flavistigma Swinhoe, 1904 Prosomphax anomala (Warren, 1902) Protosteira spectabilis (Warren, 1899) Pseudochesias neddaria (Swinhoe, 1904) Pseudolarentia arenaria (Warren, 1902) Pseudolarentia megalaria (Guenée, 1858) Pseudolarentia monosticta (Butler, 1894) Psilocerea cneca Prout, 1932 Psilocerea turpis Warren, 1902 Psilocladia diaereta Prout, 1923 Rheumaptera relicta (Herbulot, 1953) Rhodometra intervenata Warren, 1902 Rhodometra sacraria (Linnaeus, 1767) Rhodophthitus commaculata (Warren, 1897) Rhodophthitus rudicornis (Butler, 1898) Scardamia maculata Warren, 1897 Scopula accentuata (Guenée, 1858) Scopula acyma Prout, 1932 Scopula agrapta (Warren, 1902) Scopula alma Prout, 1920 Scopula argentidisca (Warren, 1902) Scopula atricapilla Prout, 1934 Scopula bigeminata (Warren, 1897) Scopula caducaria Swinhoe, 1904 Scopula candidaria (Warren, 1902) Scopula cassiaria (Swinhoe, 1904) Scopula cassioides Prout, 1932 Scopula commaria (Swinhoe, 1904) Scopula crawshayi Prout, 1932 Scopula curvimargo (Warren, 1900) Scopula dapharia (Swinhoe, 1904) Scopula dissonans (Warren, 1897) Scopula erinaria (Swinhoe, 1904) Scopula fibulata (Guenée, 1857) Scopula fimbrilineata (Warren, 1902) Scopula fragilis (Warren, 1903) Scopula fuscobrunnea (Warren, 1901) Scopula internata (Guenée, 1857) Scopula internataria (Walker, 1861) Scopula isomala Prout, 1932 Scopula longitarsata Prout, 1932 Scopula mesophaena Prout, 1923 Scopula metacosmia Prout, 1932 Scopula minoa (Prout, 1916) Scopula minorata (Boisduval, 1833) Scopula natalica (Butler, 1875) Scopula ocellicincta (Warren, 1901) Scopula recurvinota (Warren, 1902) Scopula rufisalsa (Warren, 1897) Scopula sagittilinea (Warren, 1897) Scopula sanguinisecta (Warren, 1897) Scopula sevandaria (Swinhoe, 1904) Scopula silonaria (Guenée, 1858) Scopula sinnaria Swinhoe, 1904 Scopula spoliata (Walker, 1861) Scopula technessa Prout, 1932 Scopula vitiosaria (Swinhoe, 1904) Scotopteryx nictitaria (Herrich-Schäffer, 1855) Sesquialtera ridicula Prout, 1916 Somatina ctenophora Prout, 1915 Somatina figurata Warren, 1897 Somatina vestalis (Butler, 1875) Somatina virginalis Prout, 1917 Sphingomima viriosa Prout, 1915 Syncollesis elegans (Prout, 1912) Synpelurga innocens (Warren, 1902) Terina rogersi Prout, 1915 Thalassodes albifimbria Warren, 1897 Thalassodes quadraria Guenée, 1857 Thelycera hemithales (Prout, 1912) Traminda acuta (Warren, 1897) Traminda neptunaria (Guenée, 1858) Traminda ocellata Warren, 1895 Traminda vividaria (Walker, 1861) Tricentroscelis protrusifrons Prout, 1916 Trimetopia aetheraria Guenée, 1858 Tropicollesis albiceris Prout, 1930 Xanthisthisa fulva (Warren, 1902) Xanthisthisa nigrocumulata (Warren, 1902) Xanthisthisa tarsispina (Warren, 1901) Xanthisthisa tumida (Warren, 1902) Xanthorhoe ansorgei 
(Warren, 1899) Xanthorhoe argenteolineata (Aurivillius, 1910) Xanthorhoe conchata Warren, 1898 Xanthorhoe exorista Prout, 1922 Xanthorhoe heliopharia (Swinhoe, 1904) Xanthorhoe poseata (Geyer, 1837) Xanthorhoe procne (Fawcett, 1916) Xanthorhoe scarificata Prout, 1932 Xanthorhoe sublesta (Prout, 1932) Xanthorhoe submaculata (Warren, 1902) Xanthorhoe tamsi D. S. Fletcher, 1963 Xanthorhoe transcissa (Warren, 1902) Xanthorhoe transjugata Prout, 1923 Xanthorhoe trientata (Warren, 1901) Xenimpia angusta Prout, 1915 Xenochroma candidata Warren, 1902 Xylopteryx arcuata (Walker, 1862) Xylopteryx aucilla Prout, 1926 Xylopteryx bifida Herbulot, 1984 Xylopteryx emunctaria (Guenée, 1858) Xylopteryx inquilina Agassiz, 2009 Xylopteryx prasinaria Hampson, 1909 Xylopteryx protearia Guenée, 1858 Xylopteryx sima Prout, 1926 Zamarada amicta Prout, 1915 Zamarada ansorgei Warren, 1897 Zamarada bonaberiensis Strand, 1915 Zamarada calypso Prout, 1926 Zamarada chrysopa D. S. Fletcher, 1974 Zamarada collarti Debauche, 1938 Zamarada crystallophana Mabille, 1900 Zamarada cucharita D. S. Fletcher, 1974 Zamarada deceptrix Warren, 1914 Zamarada delosis D. S. Fletcher, 1974 Zamarada delta D. S. Fletcher, 1974 Zamarada dentigera Warren, 1909 Zamarada differens Bastelberger, 1907 Zamarada ekphysis D. S. Fletcher, 1974 Zamarada erosa D. S. Fletcher, 1974 Zamarada erugata D. S. Fletcher, 1974 Zamarada euerces Prout, 1928 Zamarada euphrosyne Oberthür, 1912 Zamarada eurygnathus D. S. Fletcher, 1974 Zamarada excavata Bethune-Baker, 1913 Zamarada hyalinaria (Guenée, 1857) Zamarada iobathra Prout, 1932 Zamarada keraia D. S. Fletcher, 1974 Zamarada labrys D. S. Fletcher, 1974 Zamarada latilimbata Rebel, 1948 Zamarada longidens D. S. Fletcher, 1963 Zamarada mashariki Aarvik & Bjørnstad, 2007 Zamarada melasma D. S. Fletcher, 1974 Zamarada melpomene Oberthür, 1912 Zamarada mesotaenia Prout, 1931 Zamarada metrioscaphes Prout, 1912 Zamarada ochrata Warren, 1902 Zamarada phaeozona Hampson, 1909 Zamarada plana Bastelberger, 1909 Zamarada pringlei D. S. Fletcher, 1974 Zamarada psammites D. S. Fletcher, 1958 Zamarada psectra D. S. Fletcher, 1974 Zamarada pulverosa Warren, 1895 Zamarada reflexaria (Walker, 1863) Zamarada rufilinearia Swinhoe, 1904 Zamarada scintillans Bastelberger, 1909 Zamarada secutaria (Guenée, 1858) Zamarada torrida D. S. Fletcher, 1974 Zamarada townsendi D. S. Fletcher, 1974 Zamarada varii D. S. 
Fletcher, 1974 Zamarada vigilans Prout, 1915 Zamarada vulpina Warren, 1897 Zeuctoboarmia hyrax (Townsend, 1952) Zeuctoboarmia translata Prout, 1915 Zygophyxia erlangeri Prout, 1932 Zygophyxia palpata Prout, 1932 Zygophyxia relictata (Walker, 1866) Gracillariidae Caloptilia fera Triberti, 1989 Cameraria sokoke de Prins, 2012 Cameraria torridella de Prins, 2012 Conopobathra gravissima (Meyrick, 1912) Cremastobombycia kipepeo de Prins, 2012 Phyllonorycter achilleus de Prins, 2012 Phyllonorycter acutulus de Prins, 2012 Phyllonorycter agassizi de Prins, 2012 Phyllonorycter albertinus de Prins, 2012 Phyllonorycter grewiaecola (Vári, 1961) Phyllonorycter grewiaephilos de Prins, 2012 Phyllonorycter grewiella (Vári, 1961) Phyllonorycter hibiscina (Vári, 1961) Phyllonorycter hibiscola de Prins, 2012 Phyllonorycter kazuri de Prins, 2012 Phyllonorycter lantanae (Vári, 1961) Phyllonorycter loxozona (Meyrick, 1936) Phyllonorycter melanosparta (Meyrick, 1912) Phyllonorycter mida de Prins, 2012 Phyllonorycter obandai De Prins & Mozuraitis, 2006 Phyllonorycter ocimellus de Prins, 2012 Phyllonorycter ololua de Prins, 2012 Phyllonorycter rongensis de Prins, 2012 Phyllonorycter silvicola de Prins, 2012 Phyllonorycter tsavensis de Prins, 2012 Phyllonorycter turensis de Prins, 2012 Hepialidae Antihepialus keniae (Holland, 1892) Lasiocampidae Anadiasa fuscofasciata (Aurivillius, 1922) Anadiasa simplex Pagenstecher, 1903 Beralade bettoni Aurivillius, 1905 Beralade convergens Hering, 1932 Beralade pelodes (Tams, 1937) Beralade sorana Le Cerf, 1922 Bombycomorpha bifascia (Walker, 1855) Bombycopsis conspersa Aurivillius, 1905 Bombycopsis lepta (Tams, 1931) Braura elgonensis (Kruck, 1940) Catalebeda jamesoni (Bethune-Baker, 1908) Catalebeda tamsi Hering, 1932 Chionopsyche montana Aurivillius, 1909 Chrysopsyche jefferyi Tams, 1926 Chrysopsyche lutulenta Tams, 1923 Cleopatrina bilinea (Walker, 1855) Cleopatrina phocea (Druce, 1887) Dasychirinula chrysogramma Hering, 1926 Dollmania flavia (Fawcett, 1915) Epicnapteroides lobata Strand, 1912 Eucraera decora (Fawcett, 1915) Eucraera koellikerii (Dewitz, 1881) Eupagopteryx affinis (Aurivillius, 1909) Eutricha morosa (Walker, 1865) Euwallengrenia reducta (Walker, 1855) Filiola lanceolata (Hering, 1932) Gelo joannoui Zolotuhin & Prozorov, 2010 Gelo jordani (Tams, 1936) Gonometa nysa Druce, 1887 Gonometa postica Walker, 1855 Gonometa regia Aurivillius, 1905 Grellada imitans (Aurivillius, 1893) Lechriolepis griseola Aurivillius, 1927 Lechriolepis ochraceola Strand, 1912 Leipoxais batesi Bethune-Baker, 1927 Leipoxais compsotes Tams, 1937 Leipoxais fuscofasciata Aurivillius, 1908 Leipoxais humfreyi Aurivillius, 1915 Leipoxais marginepunctata Holland, 1893 Leipoxais peraffinis Holland, 1893 Leipoxais proboscidea (Guérin-Méneville, 1832) Leipoxais rufobrunnea Strand, 1912 Leipoxais siccifolia Aurivillius, 1902 Leipoxais tamsi D. S. 
Fletcher, 1968 Mallocampa audea (Druce, 1887) Mallocampa leucophaea (Holland, 1893) Metajana chanleri Holland, 1896 Mimopacha cinerascens (Holland, 1893) Mimopacha gerstaeckerii (Dewitz, 1881) Mimopacha tripunctata (Aurivillius, 1905) Morongea arnoldi (Aurivillius, 1909) Morongea lampara Zolotuhin & Prozorov, 2010 Odontocheilopteryx corvus Gurkovich & Zolotuhin, 2009 Odontocheilopteryx foedifragus Gurkovich & Zolotuhin, 2009 Odontocheilopteryx myxa Wallengren, 1860 Odontocheilopteryx pattersoni Tams, 1926 Odontocheilopteryx politzari Gurkovich & Zolotuhin, 2009 Odontocheilopteryx scilla Gurkovich & Zolotuhin, 2009 Odontocheilopteryx spicola Gurkovich & Zolotuhin, 2009 Odontocheilopteryx stokata Gurkovich & Zolotuhin, 2009 Odontogama nigricans Aurivillius, 1914 Opisthodontia budamara Zolotuhin & Prozorov, 2010 Opisthodontia vensani Zolotuhin & Prozorov, 2010 Pachymeta contraria (Walker, 1855) Pachymeta immunda (Holland, 1893) Pachymetana guttata (Aurivillius, 1914) Pachytrina diablo Zolotuhin & Gurkovich, 2009 Pachytrina flamerchena Zolotuhin & Gurkovich, 2009 Pachytrina okzilina Zolotuhin & Gurkovich, 2009 Pachytrina philargyria (Hering, 1928) Pallastica kakamegata Zolotuhin & Gurkovich, 2009 Pallastica lateritia (Hering, 1928) Pallastica meloui (Riel, 1909) Pallastica pallens (Bethune-Baker, 1908) Pallastica rubinia Zolotuhin & Gurkovich, 2009 Pallastica sericeofasciata (Aurivillius, 1921) Philotherma jacchus Möschler, 1887 Philotherma sordida Aurivillius, 1905 Pseudolyra despecta (Le Cerf, 1922) Pseudometa andersoni Tams, 1925 Pseudometa choba (Druce, 1899) Pseudometa pagetodes Tams, 1929 Sena donaldsoni (Holland, 1901) Sena prompta (Walker, 1855) Sena scotti (Tams, 1931) Sophyrita argibasis (Mabille, 1893) Stoermeriana callizona Tams, 1931 Stoermeriana cervina (Aurivillius, 1927) Stoermeriana coilotoma (Bethune-Baker, 1911) Stoermeriana fusca (Aurivillius, 1905) Stoermeriana graberi (Dewitz, 1881) Stoermeriana ocellata Tams, 1929 Stoermeriana sjostedti (Aurivillius, 1902) Stoermeriana tessmanni (Strand, 1912) Stoermeriana versicolora Kühne, 2008 Streblote butiti (Bethune-Baker, 1906) Streblote sodalium (Aurivillius, 1915) Theophasida kawai Zolotuhin & Prozorov, 2010 Trabala charon Druce, 1910 Lecithoceridae Eridachtha calamopis Meyrick, 1920 Eridachtha phaeochlora Meyrick, 1920 Lecithocera sceptrarcha Meyrick, 1920 Lemoniidae Sabalia jacksoni Sharpe, 1890 Sabalia picarina Walker, 1865 Limacodidae Caffricola kenyensis Talbot, 1932 Casphalia elongata Jordan, 1915 Chrysopoloma crawshayi Aurivillius, 1904 Coenobasis farouki Wiltshire, 1947 Coenobasis postflavida Hampson, 1910 Ctenolita melanosticta (Bethune-Baker, 1909) Delorhachis kitale West, 1940 Gavara camptogramma Hampson, 1910 Gavara velutina Walker, 1857 Halseyia bisecta (Butler, 1898) Halseyia similis (Hering, 1937) Latoia albicosta (Hampson, 1910) Lembopteris puella Butler, 1898 Macroplectra albescens Hampson, 1910 Macroplectra fuscifusa Hampson, 1910 Macroplectra obliquilinea Hampson, 1910 Narosa nephochloeropis Bethune-Baker, 1909 Niphadolepis auricincta Butler, 1898 Omocena syrtis (Schaus & Clements, 1893) Parapluda incincta (Hampson, 1909) Parapluda monogramma (Hampson, 1910) Scotinocerides conspersa (Kirby, 1896) Scotinocerides microsticta (Bethune-Baker, 1911) Scotinochroa inconsequens Butler, 1897 Zinara cymatoides West, 1937 Lymantriidae Aclonophlebia flavinotata Butler, 1898 Aclonophlebia poecilanthes (Collenette, 1931) Aclonophlebia triangulifera Hampson, 1910 Aroa discalis Walker, 1855 Aroa incerta Rogenhofer, 1891 
Bracharoa mixta (Snellen, 1872) Bracharoa quadripunctata (Wallengren, 1875) Carpenterella chionobosca Collenette, 1960 Casama hemippa Swinhoe, 1906 Casama impura (Hering, 1926) Collenettema crocipes (Boisduval, 1833) Creagra liturata (Guérin-Méneville, 1844) Cropera testacea Walker, 1855 Crorema evanescens (Hampson, 1910) Crorema fuscinotata (Hampson, 1910) Crorema setinoides (Holland, 1893) Dasychira aeschra (Hampson, 1926) Dasychira chorista Hering, 1926 Dasychira gonophoroides Collenette, 1939 Dasychira ilesha Collenette, 1931 Dasychira lulua Collenette, 1937 Dasychira ocellifera (Holland, 1893) Dasychira punctifera (Walker, 1857) Dasychira robusta (Walker, 1855) Dasychira sphaleroides Hering, 1926 Dasychira stegmanni Grünberg, 1910 Dasychira umbricolora Hampson, 1910 Eudasychira calliprepes (Collenette, 1933) Eudasychira dina (Hering, 1926) Eudasychira georgiana (Fawcett, 1900) Eudasychira proleprota (Hampson, 1905) Euproctis bigutta Holland, 1893 Euproctis confluens Hering, 1926 Euproctis conizona Collenette, 1933 Euproctis consocia Walker, 1865 Euproctis cryphia Collenette, 1960 Euproctis dewitzi (Grünberg, 1907) Euproctis molunduana Aurivillius, 1925 Euproctis neavei Tams, 1924 Euproctis nessa Swinhoe, 1903 Euproctis nigrifinis (Swinhoe, 1903) Euproctis pallida (Kirby, 1896) Euproctis perpusilla Hering, 1926 Euproctis rubricosta Fawcett, 1917 Euproctis sericaria (Tams, 1924) Euproctis utilis Swinhoe, 1903 Euproctis xanthosoma Hampson, 1910 Griveaudyria ila (Swinhoe, 1904) Homoeomeria flavicapilla (Wallengren, 1860) Hyaloperina nudiuscula Aurivillius, 1904 Jacksoniana striata (Collenette, 1937) Knappetra fasciata (Walker, 1855) Lacipa albula Fawcett, 1917 Lacipa flavitincta Hampson, 1910 Lacipa florida (Swinhoe, 1903) Lacipa gracilis Hopffer, 1857 Lacipa impuncta Butler, 1898 Lacipa jefferyi (Collenette, 1931) Lacipa melanosticta Hampson, 1910 Lacipa ostra (Swinhoe, 1903) Lacipa sundara (Swinhoe, 1903) Laelia bifascia Hampson, 1905 Laelia eutricha Collenette, 1931 Laelia extorta (Distant, 1897) Laelia figlina Distant, 1899 Laelia fracta Schaus & Clements, 1893 Laelia gigantea Hampson, 1910 Laelia gwelila (Swinhoe, 1903) Laelia lavia Swinhoe, 1903 Laelia rogersi Bethune-Baker, 1913 Leucoma flavifrons (Hampson, 1910) Leucoma melanochila (Hering, 1926) Leucoma monosticta (Butler, 1898) Leucoma parva (Plötz, 1880) Lymantria hemipyra Collenette, 1932 Lymantria tacita Hering, 1927 Marbla paradoxa (Hering, 1926) Marblepsis kakamega Collenette, 1937 Marblepsis macrocera (Sharpe, 1890) Marblepsis tiphia (Swinhoe, 1903) Mylantria xanthospila (Plötz, 1880) Naroma nigrolunata Collenette, 1931 Naroma varipes (Walker, 1865) Neomardara africana (Holland, 1893) Ogoa simplex Walker, 1856 Olapa fulviceps Hampson, 1910 Olapa tavetensis (Holland, 1892) Orgyia hopkinsi Collenette, 1937 Palasea arete (Fawcett, 1915) Palasea conspersa (Hering, 1927) Palasea gondona (Swinhoe, 1903) Palasea melia (Fawcett, 1915) Palasea melissa (Fawcett, 1915) Paramarbla beni (Bethune-Baker, 1909) Paramarbla catharia (Collenette, 1933) Paramarbla lindblomi (Aurivillius, 1921) Pirga bipuncta Hering, 1926 Pirga loveni Aurivillius, 1921 Pirga magna Swinhoe, 1903 Porthesaroa lacipa Hering, 1926 Porthesaroa noctua Hering, 1926 Pteredoa monosticta (Butler, 1898) Pteredoa siderea Hering, 1926 Rhodesana mintha Fawcett, 1917 Rhypopteryx diplogramma Hering, 1927 Rhypopteryx hemiphanta Collenette, 1955 Rhypopteryx pachytaenia (Hering, 1926) Rhypopteryx psoloconiama Collenette, 1960 Rhypopteryx summissa Hering, 1927 Rhypopteryx 
triangulifera (Hampson, 1910) Rhypopteryx xuthosticta (Collenette, 1938) Ruanda eleuteriopsis Hering, 1926 Sphragista kitchingi (Bethune-Baker, 1909) Stilpnaroma venosa Hering, 1926 Stracena kamengo Collenette, 1936 Stracena promelaena (Holland, 1893) Stracena striata Schultze, 1934 Stracilla ghesquierei Collenette, 1937 Tamsita habrotima (Tams, 1930) Tamsita ochthoeba (Hampson, 1920) Lyonetiidae Platacmaea cretiseca Meyrick, 1920 Metarbelidae Aethiopina argentifera Gaede, 1929 Kroonia dallastai Lehmann, 2010 Kroonia natalica (Hampson, 1910) Lebedodes endomela (Bethune-Baker, 1909) Lebedodes johni Lehmann, 2008 Lebedodes naevius Fawcett, 1916 Lebedodes velutina Le Cerf, 1914 Metarbela alluaudi Le Cerf, 1914 Metarbela cinereolimbata Le Cerf, 1914 Metarbela dialeuca Hampson, 1910 Metarbela diodonta Hampson, 1916 Metarbela distincta Le Cerf, 1922 Metarbela haberlandorum Lehmann, 1997 Metarbela latifasciata Gaede, 1929 Metarbela nubifera (Bethune-Baker, 1909) Metarbela pallescens Le Cerf, 1914 Metarbela perstriata Hampson, 1916 Metarbela shimonii Lehmann, 2008 Metarbela simillima (Hampson, 1910) Metarbelodes obliqualinea (Bethune-Baker, 1909) Mountelgonia arcifera (Hampson, 1909) Mountelgonia lumbuaensis Lehmann, 2013 Mountelgonia percivali Lehmann, 2013 Mountelgonia thikaensis Lehmann, 2013 Ortharbela rufula (Hampson, 1910) Ortharbela tetrasticta (Hampson, 1910) Paralebedella shimonii Lehmann, 2009 Salagena albonotata (Butler, 1898) Salagena atridiscata Hampson, 1910 Salagena bennybytebieri Lehmann, 2008 Salagena charlottae Lehmann, 2008 Salagena eustrigata Hampson, 1916 Salagena irrorata Le Cerf, 1914 Salagena narses Fawcett, 1916 Salagena quentinlukei Lehmann, 2008 Salagena tessellata Distant, 1897 Teragra simplicius Le Cerf, 1922 Teragra trimaculata Gaede, 1929 Nepticulidae Stigmella pelanodes (Meyrick, 1920) Noctuidae Abrostola brevipennis (Walker, 1858) Abrostola confusa Dufay, 1958 Abrostola triopis Hampson, 1902 Achaea catella Guenée, 1852 Achaea finita (Guenée, 1852) Achaea illustrata Walker, 1858 Achaea lienardi (Boisduval, 1833) Acontia aarviki Hacker, Legrain & Fibiger, 2008 Acontia albatrigona Hacker, Legrain & Fibiger, 2008 Acontia antica Walker, 1862 Acontia apatelia (Swinhoe, 1907) Acontia atripars Hampson, 1914 Acontia aurelia Hacker, Legrain & Fibiger, 2008 Acontia basifera Walker, 1857 Acontia binominata (Butler, 1892) Acontia caeruleopicta Hampson, 1916 Acontia caffraria (Cramer, 1777) Acontia callima Bethune-Baker, 1911 Acontia carnescens (Hampson, 1910) Acontia dichroa (Hampson, 1914) Acontia discoidea Hopffer, 1857 Acontia discoidoides Hacker, Legrain & Fibiger, 2008 Acontia ectorrida (Hampson, 1916) Acontia goateri Hacker, Legrain & Fibiger, 2010 Acontia guttifera Felder & Rogenhofer, 1874 Acontia hampsoni Hacker, Legrain & Fibiger, 2008 Acontia hausmanni Hacker, 2010 Acontia hemixanthia (Hampson, 1910) Acontia hoppei Hacker, Legrain & Fibiger, 2008 Acontia hortensis Swinhoe, 1884 Acontia insocia (Walker, 1857) Acontia leucotrigona (Hampson, 1905) Acontia mascheriniae (Berio, 1985) Acontia melaphora (Hampson, 1910) Acontia miogona (Hampson, 1916) Acontia natalis (Guenée, 1852) Acontia niphogona (Hampson, 1909) Acontia notha Hacker, Legrain & Fibiger, 2010 Acontia nubila Hampson, 1910 Acontia obliqua Hacker, Legrain & Fibiger, 2010 Acontia opalinoides Guenée, 1852 Acontia porphyrea (Butler, 1898) Acontia purpurata Hacker, Legrain & Fibiger, 2010 Acontia purpureofacta Hacker, Legrain & Fibiger, 2010 Acontia schreieri Hacker, Legrain & Fibiger, 2010 Acontia secta Guenée, 
1852 Acontia semialba Hampson, 1910 Acontia sublactea Hacker, Legrain & Fibiger, 2008 Acontia szunyoghyi Hacker, Legrain & Fibiger, 2010 Acontia tanzaniae Hacker, Legrain & Fibiger, 2010 Acontia torrefacta (Distant, 1898) Acontia trimaculata Aurivillius, 1879 Acontia versicolora Hacker, 2010 Acontia wiltshirei Hacker, Legrain & Fibiger, 2008 Acrapex brunnea Hampson, 1910 Acrapex curvata Hampson, 1902 Acrapex rhabdoneura Hampson, 1910 Adisura atkinsoni Moore, 1881 Aegocera brevivitta Hampson, 1901 Aegocera rectilinea Boisduval, 1836 Agoma trimenii (Felder, 1874) Agrotana jacksoni Bethune-Baker, 1911 Agrotis biconica Kollar, 1844 Agrotis longidentifera (Hampson, 1903) Agrotis segetum ([Denis & Schiffermüller], 1775) Aletia consanguis (Guenée, 1852) Aletia tincta (Walker, 1858) Amazonides ascia D. S. Fletcher, 1961 Amazonides epipyria (Hampson, 1903) Amazonides fuscirufa (Hampson, 1903) Amazonides griseofusca (Hampson, 1913) Amazonides ustula (Hampson, 1913) Amyna axis Guenée, 1852 Amyna magnifoveata Hampson, 1918 Amyna punctum (Fabricius, 1794) Androlymnia clavata Hampson, 1910 Anoba rufitermina Hampson, 1926 Anoba sinuata (Fabricius, 1775) Anomis erosa (Hübner, 1818) Anomis involuta Walker, 1857 Anomis polymorpha Hampson, 1926 Anomis sabulifera (Guenée, 1852) Apospasta fuscirufa (Hampson, 1905) Apospasta venata (Hampson, 1905) Ariathisa abyssinia (Guenée, 1852) Ariathisa semiluna (Hampson, 1909) Asota speciosa (Drury, 1773) Aspidifrontia binagwahoi Laporte, 1978 Aspidifrontia radiata Hampson, 1905 Aspidifrontia sagitta Berio, 1964 Athetis anomoeosis Hampson, 1909 Athetis glauca (Hampson, 1902) Athetis glaucopis (Bethune-Baker, 1911) Athetis hyperaeschra Hampson, 1909 Athetis ignava (Guenée, 1852) Athetis melanosticta Hampson, 1909 Athetis micra (Hampson, 1902) Athetis nitens (Saalmüller, 1891) Athetis pigra (Guenée, 1852) Athetis scotopis (Bethune-Baker, 1911) Audea fatilega (Felder & Rogenhofer, 1874) Audea melanoplaga Hampson, 1902 Autoba admota (Felder & Rogenhofer, 1874) Axylia coniorta (Hampson, 1903) Bamra glaucopasta (Bethune-Baker, 1911) Bocula horus (Fawcett, 1916) Brevipecten calimanii (Berio, 1939) Brevipecten cornuta Hampson, 1902 Brevipecten marmoreata Hacker & Fibiger, 2007 Brevipecten tessenei Berio, 1939 Calesia zambesita Walker, 1865 Calliodes pretiosissima Holland, 1892 Callopistria latreillei (Duponchel, 1827) Callopistria maillardi (Guenée, 1862) Caradrina atriluna Guenée, 1852 Carcharoda flavirosea Hampson, 1910 Carpostalagma pulverulentus Talbot, 1929 Catephia abrostolica Hampson, 1926 Catephia sciras Fawcett, 1916 Catephia metaleuca Hampson, 1926 Catephia scylla Fawcett, 1916 Catephia serapis Fawcett, 1916 Catephia sospita Fawcett, 1916 Cerocala masaica Hampson, 1913 Cerynea endotrichalis Hampson, 1910 Cerynea ignealis Hampson, 1910 Cerynea thermesialis (Walker, 1866) Chabuata amoeba Hampson, 1905 Chalciope delta (Boisduval, 1833) Chelecala trefoliata (Butler, 1898) Chrysodeixis acuta (Walker, [1858]) Chrysodeixis eriosoma (Doubleday, 1843) Corgatha drepanodes Hampson, 1910 Cortyta canescens Walker, 1858 Cortyta remigiana Hampson, 1913 Crameria amabilis (Drury, 1773) Cretonia atrisigna Hampson, 1910 Cretonia ethiopica Hampson, 1910 Cryphia leucomelaena (Hampson, 1908) Crypsotidia maculifera (Staudinger, 1898) Crypsotidia mesosema Hampson, 1913 Ctenoplusia fracta (Walker, 1857) Ctenoplusia limbirena (Guenée, 1852) Ctenoplusia camptogamma (Hampson, 1910) Cucullia rufescens Hampson, 1906 Cuneisigna cumamita (Bethune-Baker, 1911) Cuneisigna rivulata (Hampson, 1902) 
Cyligramma fluctuosa (Drury, 1773) Cyligramma latona (Cramer, 1775) Cyligramma limacina (Guérin-Méneville, 1832) Dicerogastra proleuca (Hampson, 1913) Digama africana Swinhoe, 1907 Digama serratula Talbot, 1932 Dysgonia abnegans (Walker, 1858) Dysgonia angularis (Boisduval, 1833) Dysgonia erectata (Hampson, 1902) Dysgonia torrida (Guenée, 1852) Ectolopha marginata Hampson, 1910 Ectolopha viridescens Hampson, 1902 Egnasia vicaria (Walker, 1866) Egybolis vaillantina (Stoll, 1790) Elyptron leucosticta (Hampson, 1909) Entomogramma pardus Guenée, 1852 Epharmottomena sublimbata Berio, 1894 Epischausia dispar (Rothschild, 1896) Erebus walkeri (Butler, 1875) Ericeia congregata (Walker, 1858) Ericeia inangulata (Guenée, 1852) Ethiopica hesperonota Hampson, 1909 Ethiopica umbra Le Cerf, 1922 Eublemma anachoresis (Wallengren, 1863) Eublemma baccalix (Swinhoe, 1886) Eublemma bicolora Bethune-Baker, 1911 Eublemma brunneosa Bethune-Baker, 1911 Eublemma chlorochroa Hampson, 1910 Eublemma cochylioides (Guenée, 1852) Eublemma decora (Walker, 1869) Eublemma exigua (Walker, 1858) Eublemma flavistriata Hampson, 1910 Eublemma foedosa (Guenée, 1852) Eublemma gayneri (Rothschild, 1901) Eublemma hypozonata Hampson, 1910 Eublemma leucozona Hampson, 1910 Eublemma minutoides Poole, 1989 Eublemma nyctichroa Hampson, 1910 Eublemma ornatula (Felder & Rogenhofer, 1874) Eublemma perobliqua Hampson, 1910 Eublemma psamathea Hampson, 1910 Eublemma ragusana (Freyer, 1844) Eublemma reducta Butler, 1894 Eublemma roseocincta Hampson, 1910 Eublemma seminivea Hampson, 1896 Eublemma therma Hampson, 1910 Eublemma xanthocraspis Hampson, 1910 Eublemmoides apicimacula (Mabille, 1880) Eudocima divitiosa (Walker, 1869) Eudocima materna (Linnaeus, 1767) Eulocastra aethiops (Distant, 1898) Eulocastra hypotaenia (Wallengren, 1860) Euplexia azyga Hampson, 1908 Euplexia chalybsa Hampson, 1908 Euplexia melanocycla Hampson, 1908 Euplexia rhoda Hampson, 1908 Eustrotia amydrozona Hampson, 1910 Eustrotia citripennis Hampson, 1910 Eustrotia cumalinea Bethune-Baker, 1911 Eustrotia decissima (Walker, 1865) Eustrotia diascia Hampson, 1910 Eustrotia megalena (Mabille, 1900) Eustrotia melanopis Hampson, 1910 Eustrotia trigonodes Hampson, 1910 Eustrotiopis chlorota Hampson, 1926 Eutelia amatrix Walker, 1858 Eutelia bowkeri (Felder & Rogenhofer, 1874) Eutelia discitriga Walker, 1865 Eutelia symphonica Hampson, 1902 Euxoa axiliodes Hampson, 1903 Euxootera atrisparsa (Hampson, 1903) Euxootera melanomesa (Hampson, 1913) Exathetis strigata (Hampson, 1911) Feliniopsis africana (Schaus & Clements, 1893) Feliniopsis connivens (Felder & Rogenhofer, 1874) Feliniopsis consummata (Walker, 1857) Feliniopsis duponti (Laporte, 1974) Feliniopsis nigribarbata (Hampson, 1908) Feliniopsis opposita (Walker, 1865) Feliniopsis parvuloides Hacker, 2010 Feliniopsis talhouki (Wiltshire, 1983) Grammodes congenita Walker, 1858 Grammodes exclusiva Pagenstecher, 1907 Grammodes geometrica (Fabricius, 1775) Grammodes stolida (Fabricius, 1775) Hadena bulgeri (Felder & Rogenhofer, 1874) Hadjina atrinota Hampson, 1909 Heliocheilus cana (Hampson, 1903) Heliocheilus discalis (Hampson, 1903) Heliocheilus multiradiata (Hampson, 1902) Heliocheilus perdentata (Hampson, 1903) Heliophisma klugii (Boisduval, 1833) Hemituerta mahdi (Pagenstecher, 1903) Heraclia africana (Butler, 1875) Heraclia aisha (Kirby, 1891) Heraclia flavisignata (Hampson, 1912) Heraclia gruenbergi (Wichgraf, 1911) Heraclia hypercompoides (Butler, 1895) Heraclia karschi (Holland, 1897) Heraclia monslunensis (Hampson, 1901) 
Heraclia nandi Kiriakoff, 1974 Heraclia perdix (Druce, 1887) Heraclia poggei (Dewitz, 1879) Heraclia superba (Butler, 1875) Heraclia thruppi (Butler, 1886) Heteropalpia robusta Wiltshire, 1988 Honeyia clearchus (Fawcett, 1916) Hopetounia marginata Hampson, 1926 Hypena obacerralis Walker, 1859 Hypena striolalis Aurivillius, 1910 Hypena vulgatalis Walker, 1859 Hypocala rostrata (Fabricius, 1794) Hypoperigea medionota Hampson, 1920 Hypopyra africana (Kirby, 1896) Hypopyra capensis Herrich-Schäffer, 1854 Hypopyra rufescens (Kirby, 1896) Hypotacha isthmigera Wiltshire, 1968 Hypotacha ochribasalis (Hampson, 1896) Iambia thwaitesi (Moore, 1885) Idia pernix (Townsend, 1958) Janseodes melanospila (Guenée, 1852) Leucania acrapex (Hampson, 1905) Leucania bilineata (Hampson, 1905) Leucania citrinotata (Hampson, 1905) Leucania clavifera (Hampson, 1907) Leucania confluens (Bethune-Baker, 1909) Leucania leucogramma (Hampson, 1905) Leucania melianoides Möschler, 1883 Leucania nebulosa Hampson, 1902 Leucania pectinata (Hampson, 1905) Leucania phaea Hampson, 1902 Leucania praetexta Townsend, 1955 Leucania sarca Hampson, 1902 Leucania tacuna Felder & Rogenhofer, 1874 Leucania tenebra (Hampson, 1905) Leucania usta Hampson, 1902 Lithacodia blandula (Guenée, 1862) Lophoptera litigiosa (Boisduval, 1833) Lophorache fulvirufa Hampson, 1910 Lophotidia trisema Hampson, 1913 Marathyssa cuneata (Saalmüller, 1891) Marca proclinata Saalmüller, 1891 Marcipa carcassoni Pelletier, 1975 Masalia albiseriata (Druce, 1903) Masalia beatrix (Moore, 1881) Masalia bimaculata (Moore, 1888) Masalia disticta (Hampson, 1902) Masalia fissifascia (Hampson, 1903) Masalia flavistrigata (Hampson, 1903) Masalia galatheae (Wallengren, 1856) Masalia latinigra (Hampson, 1907) Masalia leucosticta (Hampson, 1902) Masalia perstriata (Hampson, 1903) Masalia transvaalica (Distant, 1902) Matopo actinophora Hampson, 1909 Matopo descarpentriesi (Laporte, 1975) Maxera marchalii (Boisduval, 1833) Melanephia endophaea Hampson, 1926 Melanephia nigrescens (Wallengren, 1856) Mentaxya albifrons (Geyer, 1837) Mentaxya ignicollis (Walker, 1857) Mentaxya indigna (Herrich-Schäffer, 1854) Mentaxya muscosa Geyer, 1837 Mentaxya rimosa (Guenée, 1852) Metachrostis quinaria (Moore, 1881) Metappana ethiopica (Hampson, 1907) Micragrotis acydonta Hampson, 1903 Micragrotis cinerosa Bethune-Baker, 1911 Micragrotis lacteata Hampson, 1903 Mitrophrys ansorgei (Rothschild, 1897) Mitrophrys menete (Cramer, 1775) Mocis mayeri (Boisduval, 1833) Mocis mutuaria (Walker, 1858) Mocis repanda (Fabricius, 1794) Mocis undata (Fabricius, 1775) Mythimna poliastis (Hampson, 1902) Nodaria externalis Guenée, 1854 Nyodes viridirufa (Hampson, 1918) Odontestra albivitta Hampson, 1905 Oedicodia limbata Butler, 1898 Oedicodia violascens Hampson, 1910 Oligia ambigua (Walker, 1858) Omphalestra geraea (Hampson, 1907) Omphalestra submedianata (Hampson, 1905) Omphaletis ethiopica Hampson, 1909 Ophiusa finifascia (Walker, 1858) Oraesia emarginata (Fabricius, 1794) Oraesia provocans Walker, [1858] Oraesia wintgensi (Strand, 1909) Ozarba accincta (Distant, 1898) Ozarba apicalis Hampson, 1910 Ozarba atrifera Hampson, 1910 Ozarba bipartita (Hampson, 1902) Ozarba flavescens Hampson, 1910 Ozarba heliastis (Hampson, 1902) Ozarba hypoxantha (Wallengren, 1860) Ozarba isocampta Hampson, 1910 Ozarba lepida Saalmüller, 1891 Ozarba megaplaga Hampson, 1910 Ozarba nyanza (Felder & Rogenhofer, 1874) Ozarba sinua Hampson, 1910 Ozarba terribilis Berio, 1940 Ozarba tricuspis Hampson, 1910 Pandesma robusta (Walker, 1858) 
Parachalciope mahura (Felder & Rogenhofer, 1874) Parachalciope trigonometrica Hampson, 1913 Parafodina pentagonalis (Butler, 1894) Pericyma atrifusa (Hampson, 1902) Pericyma mendax (Walker, 1858) Pericyma metaleuca Hampson, 1913 Phaegorista enarges Tams, 1930 Phaegorista leucomelas (Herrich-Schäffer, 1855) Photedes homora (Bethune-Baker, 1911) Phytometra curvifera (Hampson, 1926) Phytometra magalium (Townsend, 1958) Phytometra rhodopa (Bethune-Baker, 1911) Plecoptera approximans Hampson, 1926 Plecoptera diplogramma Hampson, 1926 Plecoptera hypoxantha Hampson, 1926 Plecoptera melanoscia Hampson, 1926 Plecopterodes synethes Hampson, 1913 Plusiodonta basirhabdota Hampson, 1926 Plusiodonta macra Hampson, 1926 Plusiodonta megista Hampson, 1926 Polia atrirena Hampson, 1905 Polydesma umbricola Boisduval, 1833 Prionofrontia nyctiscia Hampson, 1926 Proconis abrostoloides Hampson, 1902 Procriosis albizona Hampson, 1918 Procriosis dileuca Hampson, 1910 Proschaliphora citricostata Hampson, 1901 Pseudcraspedia punctata Hampson, 1898 Pseudomicrodes rufigrisea Hampson, 1910 Pseudozarba mianoides (Hampson, 1893) Rhabdophera hansali (Felder & Rogenhofer, 1874) Rhanidophora piguerator Hampson, 1926 Rhesala moestalis (Walker, 1866) Rhesala punctisigna Hampson, 1926 Rhynchina taruensis Butler, 1898 Rivula lophosoma Hampson, 1926 Rougeotia osellai Berio, 1978 Rougeotia praetexta Townsend, 1956 Sesamia epunctifera Hampson, 1902 Sesamia roseoflammata Pinhey, 1956 Simplicia extinctalis (Zeller, 1852) Simplicia inflexalis Guenée, 1854 Soloe fumipennis Hampson, 1910 Soloe plicata Pinhey, 1952 Soloella orientis Kühne, 2007 Sommeria culta Hübner, 1831 Sphingomorpha chlorea (Cramer, 1777) Spodoptera cilium Guenée, 1852 Spodoptera exempta (Walker, 1857) Spodoptera exigua (Hübner, 1808) Spodoptera littoralis (Boisduval, 1833) Spodoptera mauritia (Boisduval, 1833) Stenosticta grisea Hampson, 1912 Stilbotis basalis (Berio, 1978) Stilbotis georgyi Laporte, 1984 Stilbotis jouanini Laporte, 1975 Syngrapha circumflexa (Linnaeus, 1767) Tathorhynchus leucobasis Bethune-Baker, 1911 Thiacidas berenice (Fawcett, 1916) Thiacidas fasciata (Fawcett, 1917) Thiacidas fuscomacula Hacker & Zilli, 2010 Thiacidas hampsoni (Hacker, 2004) Thiacidas orientalis Hacker & Zilli, 2010 Thiacidas permutata Hacker & Zilli, 2007 Thiacidas schausi (Hampson, 1905) Thiacidas senex (Bethune-Baker, 1911) Thiacidas smythi (Gaede, 1939) Thiacidas subhampsoni Hacker & Zilli, 2010 Thiacidas triangulata (Gaede, 1939) Thyas arcifera (Hampson, 1913) Thysanoplusia sestertia (Felder & Rogenhofer, 1874) Timora adamsoni Pinhey, 1956 Toana flaviceps Hampson, 1918 Tolpia atripuncta Hampson, 1926 Tracheplexia lucia (Felder & Rogenhofer, 1974) Trichoplusia ni (Hübner, [1803]) Trichoplusia orichalcea (Fabricius, 1775) Trigonodes hyppasia (Cramer, 1779) Tuerta cyanopasta Hampson, 1907 Tycomarptes inferior (Guenée, 1852) Tytroca alabuensis Wiltshire, 1970 Ugia albilinea Hampson, 1926 Ulotrichopus eugeniae Saldaitis & Ivinskis, 2010 Ulotrichopus phaeopera Hampson, 1913 Ulotrichopus primulina (Hampson, 1902) Uncula tristigmatias (Hampson, 1902) Vietteania torrentium (Guenée, 1852) Vittaplusia vittata (Wallengren, 1856) Xanthomera leucoglene (Mabille, 1880) Zalaca snelleni (Wallengren, 1875) Zethesides bettoni (Butler, 1898) Nolidae Arcyophora dives (Butler, 1898) Blenina squamifera (Wallengren, 1860) Bryophilopsis tarachoides Mabille, 1900 Characoma submediana Wiltshire, 1986 Earias biplaga Walker, 1866 Earias cupreoviridis (Walker, 1862) Earias insulana (Boisduval, 1833) 
Eligma laetepicta Oberthür, 1893 Garella nubilosa Hampson, 1912 Giaura leucotis (Hampson, 1905) Maurilia arcuata (Walker, [1858]) Meganola jacobi Agassiz, 2009 Meganola melanosticta (Hampson, 1914) Meganola reubeni Agassiz, 2009 Negeta luminosa (Walker, 1858) Nola chionea Hampson, 1911 Nola diplozona Hampson, 1914 Nola leucalea Hampson, 1907 Nola melaleuca (Hampson, 1901) Nola melanoscelis (Hampson, 1914) Nola phaeocraspis (Hampson, 1909) Nola progonia (Hampson, 1914) Nycteola malachitis (Hampson, 1912) Odontestis striata Hampson, 1912 Oedicraspis subfervida Hampson, 1912 Pardasena melanosticta Hampson, 1912 Pardasena roeselioides (Walker, 1858) Pardasena virgulana (Mabille, 1880) Risoba diplogramma Hampson, 1912 Risoba obstructa Moore, 1881 Risoba sticticraspis Hampson, 1912 Selepa leucogonia (Hampson, 1905) Selepa nephelozona (Hampson, 1905) Xanthodes dinarodes (Hampson, 1912) Notodontidae Achaera ochribasis (Hampson, 1910) Afrocerura leonensis (Hampson, 1910) Antheua liparidioides (Rothschild, 1910) Antheua simplex Walker, 1855 Antheua trifasciata (Hampson, 1909) Antheua woerdeni (Snellen, 1872) Bisolita rubrifascia (Hampson, 1910) Clostera solitaria Kiriakoff, 1962 Desmeocraera decorata (Wichgraf, 1922) Desmeocraera tripuncta Janse, 1920 Epicerura plumosa Kiriakoff, 1962 Epicerura steniptera (Hampson, 1910) Epidonta eroki Bethune-Baker, 1911 Eulavinia lavinia (Fawcett, 1916) Eutimia marpissa Wallengren, 1858 Nepheliphora nubifera (Hampson, 1910) Phalera princei Grünberg, 1909 Polienus capillata (Wallengren, 1875) Psalisodes atrifasciata Hampson, 1910 Psalisodes discalis (Hampson, 1910) Psalisodes xylochroa (Hampson, 1910) Rasemia macrodonta (Hampson, 1909) Simesia dasychiroides (Butler, 1898) Stenostaura harperi Agassiz, 2009 Tmetopteryx dorsimaculata Kiriakoff, 1965 Tmetopteryx maura Kiriakoff, 1965 Trotonotus bettoni Butler, 1898 Xanthodonta nigrovittata (Aurivilius, 1921) Xanthodonta unicornis Kiriakoff, 1961 Oecophoridae Tortilia rimulata (Meyrick, 1920) Plutellidae Genostele fornicata Meyrick, 1920 Paraxenistis africana Mey, 2007 Paraxenistis serrata Mey, 2007 Plutella xylostella (Linnaeus, 1758) Psychidae Acanthopsyche calamochroa (Hampson, 1910) Ctenocompa amydrota Meyrick, 1920 Ctenocompa famula Meyrick, 1920 Melasina hyacinthias Meyrick, 1920 Melasina ichnophora Meyrick, 1920 Melasina olenitis Meyrick, 1914 Melasina spumosa Meyrick, 1920 Melasina stabularia Meyrick, 1908 Melasina varicosa Meyrick, 1920 Narycia acharis Meyrick, 1920 Narycia exalbida Meyrick, 1920 Narycia nubilosa Meyrick, 1920 Typhonia bettoni (Butler, 1898) Pterophoridae Agdistis aberdareana Arenberger, 1988 Agdistis kenyana Arenberger, 1988 Agdistis korana Arenberger, 1988 Agdistis linnaei Gielis, 2008 Agdistis malitiosa Meyrick, 1909 Agdistis obstinata Meyrick, 1920 Agdistis riftvalleyi Arenberger, 2001 Amblyptilia direptalis (Walker, 1864) Apoxyptilus anthites (Meyrick, 1936) Bipunctiphorus etiennei Gibeaux, 1994 Crassuncus chappuisi Gibeaux, 1994 Emmelina amseli (Bigot, 1969) Emmelina bigoti Gibeaux, 1990 Emmelina monodactyla (Linnaeus, 1758) Exelastis atomosa (Walsingham, 1885) Exelastis caroli Gielis, 2008 Hellinsia conscius (Meyrick, 1920) Megalorhipida leptomeres (Meyrick, 1886) Megalorhipida leucodactylus (Fabricius, 1794) Merrifieldia improvisa Arenberger, 2001 Oxyptilus insomnis (Townsend, 1956) Picardia eparches (Meyrick, 1931) Platyptilia aarviki Gielis, 2008 Platyptilia humida Meyrick, 1920 Platyptilia molopias Meyrick, 1906 Platyptilia morophaea Meyrick, 1920 Platyptilia picta Meyrick, 1913 
Platyptilia rhyncholoba Meyrick, 1924 Platyptilia sciophaea Meyrick, 1920 Platyptilia thiosoma Meyrick, 1920 Pselnophorus jaechi (Arenberger, 1993) Pterophorus albidus (Zeller, 1852) Pterophorus candidalis (Walker, 1864) Pterophorus cleronoma (Meyrick, 1920) Pterophorus massai Gielis, 1991 Pterophorus rhyparias (Meyrick, 1908) Sphenarches anisodactylus (Walker, 1864) Stenodacma wahlbergi (Zeller, 1852) Stenoptilia conicephala Gielis, 1990 Stenoptilia ionota Meyrick, 1920 Stenoptilia melanoloncha Meyrick, 1927 Stenoptilia zophodactylus (Duponchel, 1840) Stenoptilodes taprobanes (Felder & Rogenhofer, 1875) Titanoptilus melanodonta Hampson, 1905 Walsinghamiella illustris (Townsend, 1958) Pyralidae Aglossa fumifusalis Hampson, 1916 Anobostra varians (Butler, 1898) Ematheudes straminella Snellen, 1872 Endotricha consobrinalis Zeller, 1852 Endotricha ellisoni Whalley, 1963 Endotricha vinolentalis Ragonot, 1891 Lamoria imbella (Walker, 1864) Mussidia nigrivenella Ragonot, 1888 Paraglossa atrisquamalis Hampson, 1906 Pempelia morosalis (Saalmüller, 1880) Phycitodes albistriata Hampson, 1917 Pithyllis metachryseis (Hampson, 1906) Pyralis galactalis Hampson, 1916 Saturniidae Argema besanti Rebel, 1895 Argema mimosae (Boisduval, 1847) Aurivillius arata (Westwood, 1849) Aurivillius seydeli Rougeot, 1962 Bunaea aslauga Kirby, 1877 Bunaeopsis hersilia (Westwood, 1849) Bunaeopsis jacksoni (Jordan, 1908) Bunaeopsis licharbas (Maassen & Weymer, 1885) Bunaeopsis oubie (Guérin-Méneville, 1849) Campimoptilum boulardi (Rougeot, 1974) Campimoptilum hollandi (Butler, 1898) Campimoptilum kuntzei (Dewitz, 1881) Cinabra hyperbius (Westwood, 1881) Decachorda aspersa (Felder, 1874) Decachorda bouvieri Hering, 1929 Decachorda fulvia (Druce, 1886) Decachorda mombasana Stoneham, 1962 Decachorda rosea Aurivillius, 1898 Eosia digennaroi Bouyer, 2008 Eosia insignis Le Cerf, 1911 Epiphora albidus (Druce, 1886) Epiphora antinorii (Oberthür, 1880) Epiphora bauhiniae (Guérin-Méneville, 1832) Epiphora congolana (Bouvier, 1929) Epiphora intermedia (Rougeot, 1955) Epiphora magdalena Grünberg, 1909 Epiphora mythimnia (Westwood, 1849) Epiphora rectifascia Rothschild, 1907 Gonimbrasia anna (Maassen & Weymer, 1885) Gonimbrasia conradsi (Rebel, 1906) Gonimbrasia hoehnelii (Rogenhofer, 1891) Gonimbrasia occidentalis Rothschild, 1907 Gonimbrasia rectilineata (Sonthonnax, 1899) Gonimbrasia tyrrhea (Cramer, 1775) Gonimbrasia wahlbergii (Boisduval, 1847) Gonimbrasia zambesina (Walker, 1865) Goodia unguiculata Bouvier, 1936 Gynanisa albescens Sonthonnax, 1904 Gynanisa kenya Darge, 2008 Gynanisa maja (Klug, 1836) Gynanisa westwoodi Rothschild, 1895 Holocerina angulata (Aurivillius, 1893) Holocerina istsariensis Stoneham, 1962 Holocerina smilax (Westwood, 1849) Imbrasia epimethea (Drury, 1772) Imbrasia ertli Rebel, 1904 Lobobunaea acetes (Westwood, 1849) Lobobunaea goodi (Holland, 1893) Lobobunaea jeanneli Rougeot, 1959 Lobobunaea kuehnei Naumann, 2008 Lobobunaea phaedusa (Drury, 1782) Ludia arguta Jordan, 1922 Ludia delegorguei (Boisduval, 1847) Ludia dentata (Hampson, 1891) Ludia hansali Felder, 1874 Ludia orinoptena Karsch, 1892 Ludia pseudovetusta Rougeot, 1978 Melanocera menippe (Westwood, 1849) Melanocera pinheyi Lemaire & Rougeot, 1974 Melanocera sufferti (Weymer, 1896) Melanocera widenti Terral & Darge, 1991 Micragone cana (Aurivillius, 1893) Nudaurelia anthinoides Rougeot, 1978 Nudaurelia belayneshae Rougeot, 1978 Nudaurelia capdevillei Rougeot, 1979 Nudaurelia dione (Fabricius, 1793) Nudaurelia eblis Strecker, 1876 Nudaurelia emini 
(Butler, 1888) Nudaurelia krucki Hering, 1930 Nudaurelia macrothyris (Rothschild, 1906) Orthogonioptilum adiegetum Karsch, 1892 Pselaphelia flavivitta (Walker, 1862) Pselaphelia vandenberghei Bouyer, 1992 Pseudaphelia apollinaris (Boisduval, 1847) Pseudobunaea cleopatra (Aurivillius, 1893) Pseudobunaea deaconi (Stoneham, 1962) Pseudobunaea epithyrena (Maassen & Weymer, 1885) Pseudobunaea irius (Fabricius, 1793) Pseudobunaea tyrrhena (Westwood, 1849) Rohaniella pygmaea (Maassen & Weymer, 1885) Tagoropsis flavinata (Walker, 1865) Tagoropsis hanningtoni (Butler, 1883) Tagoropsis rougeoti D. S. Fletcher, 1952 Urota sinope (Westwood, 1849) Usta angulata Rothschild, 1895 Usta wallengrenii (C. & R. Felder, 1859) Yatanga smithi (Holland, 1892) Sesiidae Camaegeria massai Bartsch & Berg, 2012 Chamanthedon leucocera Hampson, 1919 Homogyna alluaudi Le Cerf, 1911 Lophoceps abdominalis Hampson, 1919 Macrotarsipus albipunctus Hampson, 1893 Macrotarsipus microthyris Hampson, 1919 Melittia amblyphaea Hampson, 1919 Melittia haematopis Fawcett, 1916 Melittia lentistriata Hampson, 1919 Melittia natalensis Butler, 1874 Melittia xanthogaster Hampson, 1919 Paranthrene xanthopyga Hampson, 1919 Synanthedon erythromma Hampson, 1919 Tipulamima pyrosoma Hampson, 1919 Sphingidae Acanthosphinx guessfeldti (Dewitz, 1879) Acherontia atropos (Linnaeus, 1758) Agrius convolvuli (Linnaeus, 1758) Andriasa contraria Walker, 1856 Antinephele achlora Holland, 1893 Antinephele anomala (Butler, 1882) Antinephele camerounensis Clark, 1937 Antinephele marcida Holland, 1893 Atemnora westermannii (Boisduval, 1875) Basiothia aureata (Karsch, 1891) Basiothia charis (Boisduval, 1875) Callosphingia circe (Fawcett, 1915) Centroctena imitans (Butler, 1882) Centroctena rutherfordi (Druce, 1882) Ceridia mira Rothschild & Jordan, 1903 Chaerocina dohertyi Rothschild & Jordan, 1903 Chloroclanis virescens (Butler, 1882) Coelonia fulvinotata (Butler, 1875) Daphnis nerii (Linnaeus, 1758) Dovania poecila Rothschild & Jordan, 1903 Ellenbeckia monospila Rothschild & Jordan, 1903 Euchloron megaera (Linnaeus, 1758) Falcatula falcata (Rothschild & Jordan, 1903) Hippotion aporodes Rothschild & Jordan, 1912 Hippotion balsaminae (Walker, 1856) Hippotion chloris Rothschild & Jordan, 1907 Hippotion dexippus Fawcett, 1915 Hippotion eson (Cramer, 1779) Hippotion irregularis (Walker, 1856) Hippotion moorei Jordan, 1926 Hippotion osiris (Dalman, 1823) Hippotion rebeli Rothschild & Jordan, 1903 Hippotion rosae (Butler, 1882) Hippotion roseipennis (Butler, 1882) Hippotion socotrensis (Rebel, 1899) Hippotion stigma (Rothschild & Jordan, 1903) Leucostrophus alterhirundo d'Abrera, 1987 Likoma apicalis Rothschild & Jordan, 1903 Likoma crenata Rothschild & Jordan, 1907 Lophostethus dumolinii (Angas, 1849) Macroglossum trochilus (Hübner, 1823) Macropoliana ferax (Rothschild & Jordan, 1916) Macropoliana natalensis (Butler, 1875) Microclanis erlangeri (Rothschild & Jordan, 1903) Neoclanis basalis (Walker, 1866) Neopolyptychus compar (Rothschild & Jordan, 1903) Neopolyptychus serrator (Jordan, 1929) Nephele accentifera (Palisot de Beauvois, 1821) Nephele aequivalens (Walker, 1856) Nephele bipartita Butler, 1878 Nephele comma Hopffer, 1857 Nephele discifera Karsch, 1891 Nephele funebris (Fabricius, 1793) Nephele monostigma Clark, 1925 Nephele rosae Butler, 1875 Nephele xylina Rothschild & Jordan, 1910 Platysphinx constrigilis (Walker, 1869) Poliana buchholzi (Plötz, 1880) Poliana micra Rothschild & Jordan, 1903 Poliana wintgensi (Strand, 1910) Poliodes roseicornis Rothschild 
& Jordan, 1903 Polyptychoides digitatus (Karsch, 1891) Polyptychoides erosus (Jordan, 1923) Polyptychoides grayii (Walker, 1856) Polyptychus affinis Rothschild & Jordan, 1903 Praedora leucophaea Rothschild & Jordan, 1903 Praedora marshalli Rothschild & Jordan, 1903 Pseudoclanis kenyae Clark, 1928 Pseudoclanis postica (Walker, 1856) Rhodafra marshalli Rothschild & Jordan, 1903 Rufoclanis fulgurans (Rothschild & Jordan, 1903) Rufoclanis numosae (Wallengren, 1860) Temnora albilinea Rothschild, 1904 Temnora crenulata (Holland, 1893) Temnora curtula Rothschild & Jordan, 1908 Temnora eranga (Holland, 1889) Temnora iapygoides (Holland, 1889) Temnora mirabilis Talbot, 1932 Temnora plagiata Walker, 1856 Temnora pseudopylas (Rothschild, 1894) Temnora pylades Rothschild & Jordan, 1903 Temnora spiritus (Holland, 1893) Temnora subapicalis Rothschild & Jordan, 1903 Temnora zantus (Herrich-Schäffer, 1854) Theretra monteironis (Butler, 1882) Xanthopan morganii (Walker, 1856) Thyrididae Arniocera albiguttata Talbot, 1928 Arniocera amoena Jordan, 1907 Arniocera auriguttata Hopffer, 1857 Arniocera cyanoxantha (Mabille, 1893) Arniocera ericata Butler, 1898 Arniocera erythropyga (Wallengren, 1860) Arniocera imperialis Butler, 1898 Arniocera poecila Jordan, 1907 Arniocera sternecki Rogenhofer, 1891 Cecidothyris parobifera Whalley, 1971 Chrysotypus vittiferalis (Gaede, 1917) Dilophura caudata (Jordan, 1907) Dysodia fenestratella Warren, 1900 Dysodia fumida Whalley, 1968 Dysodia intermedia (Walker, 1865) Dysodia lutescens Whalley, 1968 Hapana carcealis Whalley, 1971 Hypolamprus quaesitus Whalley, 1971 Kuja carcassoni Whalley, 1971 Marmax vicaria (Walker, 1854) Nemea betousalis (Gaede, 1917) Netrocera basalis Jordan, 1907 Netrocera diffinis Jordan, 1907 Netrocera hemichrysa (Hampson, 1910) Netrocera setioides Felder, 1874 Striglina minutula (Saalmüller, 1880) Tineidae Acridotarsa melipecta (Meyrick, 1915) Archemitra iorrhoa Meyrick, 1920 Ceratophaga ethadopa (Meyrick, 1938) Ceratophaga vastellus (Zeller, 1852) Ceratophaga xanthastis (Meyrick, 1908) Cylicobathra argocoma (Meyrick, 1914) Cylicobathra chionarga Meyrick, 1920 Dinica aspirans (Meyrick, 1920) Edosa crassivalva (Gozmány, 1968) Edosa melanostoma (Meyrick, 1908) Erechthias pentatypa (Meyrick, 1920) Hapsifera glebata Meyrick, 1908 Hapsifera ignobilis Meyrick, 1919 Hapsifera lithocentra Meyrick, 1920 Hapsifera nidicola Meyrick, 1935 Hapsifera ochroptila Meyrick, 1908 Hapsifera pachypsaltis Gozmány, 1965 Hapsifera paraglareosa Gozmány, 1968 Hapsifera revoluta Meyrick, 1914 Hapsifera rhodoptila Meyrick, 1920 Hapsifera septica Meyrick, 1908 Leptozancla talaroscia Meyrick, 1920 Machaeropteris magnifica Gozmány, 1968 Mitrogona laevis Meyrick, 1920 Monopis liparota Meyrick, 1920 Monopis rutilicostella (Stainton, 1860) Myrmecozela isopsamma Meyrick, 1920 Opogona anisacta Meyrick, 1920 Opogona tanydora Meyrick, 1920 Pachypsaltis pachystoma (Meyrick, 1920) Pelecystola decorata Meyrick, 1920 Perissomastix breviberbis (Meyrick, 1933) Perissomastix catapulta Gozmány, 1968 Perissomastix marcescens (Meyrick, 1908) Perissomastix ruwenzorica Gozmány & Vári, 1973 Phalloscardia semiumbrata (Meyrick, 1920) Phthoropoea oenochares (Meyrick, 1920) Pitharcha marmorata Gozmány, 1968 Tiquadra lichenea Walsingham, 1897 Tracheloteina eccephala (Meyrick, 1914) Wegneria scaeozona (Meyrick, 1920) Tischeriidae Coptotriche kenyensis Mey, 2010 Tortricidae Accra plumbeana Razowski, 1966 Acleris kinangopana Razowski, 1964 Acleris thylacitis (Meyrick, 1920) Actihema hemiacta (Meyrick, 1920) 
Actihema msituni Aarvik, 2010 Actihema simpsonae Aarvik, 2010 Aethes illota (Meyrick, 1914) Afroploce karsholti Aarvik, 2004 Afroploce turiana Aarvik, 2004 Apotoforma kakamegae Razowski, 2012 Bactra sinassula Diakonoff, 1963 Bactra stagnicolana Zeller, 1852 Cnephasia galeotis Meyrick, 1920 Cnephasia incinerata Meyrick, 1920 Cnephasia melliflua Meyrick, 1914 Cnephasia taganista Meyrick, 1920 Cochylimorpha exoterica (Meyrick, 1924) Cornesia arabuco Razowski, 2012 Cornesia molytes Razowski, 1993 Cosmorrhyncha acrocosma (Meyrick, 1908) Cosmorrhyncha microcosma Aarvik, 2004 Crocidosema plebejana Zeller, 1847 Cryptaspasma caryothicta (Meyrick, 1920) Cryptaspasma phycitinana Aarvik, 2005 Cryptaspasma subtilis Diakonoff, 1959 Ctenopseustis haplodryas Meyrick, 1920 Cydia chrysocosma (Meyrick, 1920) Cydia leptogramma (Meyrick, 1913) Eccopsis aegidia (Meyrick, 1932) Eccopsis agassizi Aarvik, 2004 Eccopsis deprinsi Aarvik, 2004 Eccopsis incultana (Walker, 1863) Eccopsis nebulana Walsingham, 1891 Eccopsis praecedens Walsingham, 1897 Eccopsis tucki Aarvik, 2004 Eccopsis wahlbergiana Zeller, 1852 Epiblema riciniata (Meyrick, 1911) Epichorista benevola Meyrick, 1920 Epichorista mesosceptra Meyrick, 1920 Epichorista passaleuta Meyrick, 1920 Epichorista prodigiosa Meyrick, 1920 Epichorista psoropis Meyrick, 1920 Epichoristodes licmaea (Meyrick, 1920) Eucosma antirrhoa Meyrick, 1920 Eucosma cyphospila Meyrick, 1920 Eucosma inscita Meyrick, 1913 Eucosma metagypsa Meyrick, 1920 Eucosma pharangodes Meyrick, 1920 Eucosma superciliosa Meyrick, 1920 Eugnosta misella Razowski, 1993 Eugnosta percnoptila (Meyrick, 1933) Eupoecilia kruegeriana Razowski, 1993 Falseuncaria aberdarensis Aarvik, 2010 Fulcrifera halmyris (Meyrick, 1909) Fulcrifera periculosa (Meyrick, 1913) Gypsonoma paradelta (Meyrick, 1925) Leguminovora glycinivorella (Matsumura, 1898) Lobesia harmonia (Meyrick, 1908) Megalota archana Aarvik, 2004 Megalota purpurana Aarvik, 2004 Megalota rhopalitis (Meyrick, 1920) Metamesia elegans (Walsingham, 1881) Metendothenia balanacma (Meyrick, 1914) Multiquaestia agassizi Aarvik & Karisch, 2009 Multiquaestia dallastai Aarvik & Karisch, 2009 Olethreutes clavifera (Meyrick, 1920) Olethreutes nimbosa (Meyrick, 1920) Orilesa mediocris (Meyrick, 1914) Panegyra sokokana Razowski, 2012 Paraccra chorogiae Razowski, 2012 Paraeccopsis insellata (Meyrick, 1920) Phtheochroa aarviki Razowski & J. W. Brown, 2012 Phtheochroa kenyana Aarvik, 2010 Procrica intrepida (Meyrick, 1912) Procrica parva Razowski, 2002 Sycacantha nereidopa (Meyrick, 1927) Tortrix chalicodes Meyrick, 1920 Tortrix dinota Meyrick, 1918 Tortrix exedra Meyrick, 1920 Tortrix mitrota Meyrick, 1920 Tortrix poliochra Meyrick, 1920 Tortrix triadelpha Meyrick, 1920 Xenosocia elgonica Karisch, 2008 Uraniidae Dirades angulifera Warren, 1902 Epiplema carbo Warren, 1902 Epiplema dohertyi Warren, 1904 Epiplema negro Warren, 1901 Epiplema nymphaeata Warren, 1902 Epiplema perpulchra Warren, 1902 Epiplema semipicta Warren, 1904 Heteroplema dependens Warren, 1902 Leucoplema ansorgei (Warren, 1901) Leucoplema triumbrata (Warren, 1902) Urapteroides recurvata Warren, 1898 Xyloryctidae Scythris invisa Meyrick, 1920 Zygaenidae Astyloneura biplagata (Bethune-Baker, 1911) Astyloneura cupreitincta (Hampson, 1920) Astyloneura difformis (Jordan, 1907) Epiorna abessynica (Koch, 1865) Saliunca aenescens Hampson, 1920 Saliunca fulviceps Hampson, 1920 Saliunca kamilila Bethune-Baker, 1911 Saliunca meruana Aurivillius, 1910 References External links Moths Kenya Moths
2183996
https://en.wikipedia.org/wiki/Josh%20Fisher
Josh Fisher
Joseph A. "Josh" Fisher is an American and Spanish computer scientist noted for his work on VLIW architectures, compiling, and instruction-level parallelism, and for the founding of Multiflow Computer. He is a Hewlett-Packard Senior Fellow (Emeritus).

Biography
Fisher holds a BA (1968) in mathematics (with honors) from New York University and obtained a master's degree and a PhD (1979) in computer science from the Courant Institute of Mathematical Sciences at New York University. Fisher joined the Yale University Department of Computer Science in 1979 as an assistant professor and was promoted to associate professor in 1983. In 1984 Fisher left Yale to found Multiflow Computer with Yale colleagues John O'Donnell and John Ruttenberg. Fisher joined HP Labs upon the closing of Multiflow in 1990. He directed HP Labs in Cambridge, Massachusetts, from its founding in 1994, and became an HP Fellow (2000) and then Senior Fellow (2002) upon the inception of those titles at Hewlett-Packard. Fisher retired from HP Labs in 2006.
Fisher has been married to Elizabeth Fisher since 1967; they have a son, David Fisher, and a daughter, Dora Fisher. He holds Spanish citizenship due to his Sephardic heritage.

Work

Trace Scheduling
In his PhD dissertation, Fisher created the Trace Scheduling compiler algorithm and coined the term instruction-level parallelism to characterize VLIW, superscalar, dataflow and other architecture styles that involve fine-grained parallelism among simple machine-level instructions. Trace scheduling was the first practical algorithm to find large amounts of parallelism between instructions that occupy different basic blocks. This greatly increased the potential speed-up for instruction-level parallel architectures.

The VLIW architecture style
Because of the difficulty of applying trace scheduling to idiosyncratic systems (such as 1970s-era DSPs) that in theory should have been suitable targets for a trace scheduling compiler, Fisher put forward the VLIW architectural style. VLIWs are normal computers, designed to run compiled code and used like ordinary computers, but offering large amounts of instruction-level parallelism scheduled by a trace scheduling or similar compiler. VLIWs are now used extensively, especially in embedded systems; the most popular VLIW cores have sold in quantities of several billion processors.

Multiflow Computer
Multiflow was founded to commercialize trace scheduling and VLIW architectures, which were then widely thought to be impractical. Multiflow's technical success and the dissemination of its technology and people had a great effect on the future of computer science and the computer industry.

Awards and honors
1984: NSF Presidential Young Investigator's Award. (The award was meant to persuade promising faculty to stay at universities; the financial grant to Yale University was declined because Fisher left to start Multiflow.)
1987: Eli Whitney Connecticut Entrepreneur of the Year.
2003: Eckert–Mauchly Award, given by the IEEE Computer Society and the Association for Computing Machinery, in recognition of 25 years of seminal contributions to instruction-level parallelism, pioneering work on VLIW architectures, and the formulation of the Trace Scheduling compilation technique. The Eckert–Mauchly Award is regarded as the computer architecture community's highest award.
2012: B. Ramakrishna Rau Award, given by the IEEE Computer Society for the development of trace scheduling compilation and pioneering work in VLIW (Very Long Instruction Word) architectures.
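The Trace Scheduling and VLIW sections above describe compile-time instruction-level parallelism in prose; the short Python sketch below only illustrates the underlying idea of packing mutually independent operations into fixed-width VLIW issue slots. It is a toy, greedy, straight-line packer, not Fisher's trace scheduling algorithm (which selects likely execution traces across basic blocks); the operation list and function name are invented for the example.

# Illustrative sketch (not Fisher's actual algorithm): greedily pack
# independent operations into fixed-width "VLIW bundles", i.e. the kind of
# compile-time instruction-level parallelism described above.
# Each operation is (name, destination register, list of source registers).

def pack_into_bundles(ops, width=4):
    """Greedy packing: an op may join the current bundle only if none of
    its sources are written by an op already placed in that bundle."""
    bundles, current, written = [], [], set()
    for name, dst, srcs in ops:
        if len(current) == width or written & set(srcs):
            bundles.append(current)        # issue the bundle and start a new one
            current, written = [], set()
        current.append(name)
        written.add(dst)
    if current:
        bundles.append(current)
    return bundles

ops = [
    ("load r1", "r1", []),
    ("load r2", "r2", []),
    ("add r3",  "r3", ["r1", "r2"]),   # depends on the two loads
    ("mul r4",  "r4", ["r3"]),         # depends on the add
    ("load r5", "r5", []),             # independent, rides along with the multiply
]

print(pack_into_bundles(ops))
# [['load r1', 'load r2'], ['add r3'], ['mul r4', 'load r5']]

The point of the sketch is only that a compiler, not the hardware, decides which operations issue together; a real trace scheduler does this across branch boundaries and with machine resource constraints.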
Writings Joseph A. Fisher, Paolo Faraboschi and Cliff Young: Embedded Computing: A VLIW Approach to Architecture, Compilers and Tools. Elsevier/Morgan Kaufmann, 2004. Joseph A. Fisher: Trace Scheduling: A Technique for Global Microcode Compaction. IEEE Trans. Computers, 30(7):478-490, 1981. Joseph A. Fisher: Very Long Instruction Word architectures and the ELI-512. ISCA '83 Proceedings of the 10th annual international symposium on Computer architecture, Pages 140-150, ACM, New York, NY, USA. Retrospective, 25 Years of ISCA, ACM, 1998. Joseph A. Fisher, John R. Ellis, John C. Ruttenberg, Alexandru Nicolau: Parallel Processing: A Smart Compiler and a Dumb Machine. Symp. Compiler Construction, 1984: 37-47. Retrospective, Best of PLDI, ACM SIGPLAN Notices, 39(4):112, 2003. B. Ramakrishna Rau, Joseph A. Fisher: Instruction-level parallel processing: history, overview, and perspective. The Journal of Supercomputing - Special issue on instruction-level parallelism, Volume 7, Issue 1-2, May 1993. Also published by Kluwer Academic Publishers, Hingham, MA, USA. References External links Elizabeth Fisher: Multiflow Computer: A Startup Odyssey. CreateSpace, 2013. IEEE: The VLIW Architecture of Joseph A. Fisher, Part 1. Solid-State Circuits Magazine, IEEE, 2009, Volume: 1, Issue: 2. Also Part 2 American computer scientists Spanish computer scientists Computer designers People from the Bronx Living people 1946 births Yale University faculty Hewlett-Packard people Computer science writers Scientists from the Bronx
542347
https://en.wikipedia.org/wiki/Specialization%20%28pre%29order
Specialization (pre)order
In the branch of mathematics known as topology, the specialization (or canonical) preorder is a natural preorder on the set of the points of a topological space. For most spaces that are considered in practice, namely for all those that satisfy the T0 separation axiom, this preorder is even a partial order (called the specialization order). On the other hand, for T1 spaces the order becomes trivial and is of little interest. The specialization order is often considered in applications in computer science, where T0 spaces occur in denotational semantics. The specialization order is also important for identifying suitable topologies on partially ordered sets, as is done in order theory. Definition and motivation Consider any topological space X. The specialization preorder ≤ on X relates two points of X when one lies in the closure of the other. However, various authors disagree on which 'direction' the order should go. What is agreed is that if x is contained in cl{y} (where cl{y} denotes the closure of the singleton set {y}, i.e. the intersection of all closed sets containing {y}), we say that x is a specialization of y and that y is a generization of x; this is commonly written y ⤳ x. Unfortunately, the property "x is a specialization of y" is written as "x ≤ y" by some authors and as "y ≤ x" by others. Both definitions have intuitive justifications: in the case of the former, we have x ≤ y if and only if cl{x} ⊆ cl{y}. However, in the case where our space X is the prime spectrum Spec R of a commutative ring R (which is the motivational situation in applications related to algebraic geometry), then under our second definition of the order, we have y ≤ x if and only if y ⊆ x as prime ideals of the ring R. For the sake of consistency, for the remainder of this article we will take the first definition, writing "x is a specialization of y" as x ≤ y. We then see that x ≤ y if and only if x is contained in all closed sets that contain y, and x ≤ y if and only if y is contained in all open sets that contain x. These restatements help to explain why one speaks of a "specialization": y is more general than x, since it is contained in more open sets. This is particularly intuitive if one views closed sets as properties that a point x may or may not have. The more closed sets contain a point, the more properties the point has, and the more special it is. The usage is consistent with the classical logical notions of genus and species; and also with the traditional use of generic points in algebraic geometry, in which closed points are the most specific, while a generic point of a space is one contained in every nonempty open subset. Specialization as an idea is applied also in valuation theory. The intuition of upper elements being more specific is typically found in domain theory, a branch of order theory that has ample applications in computer science. Upper and lower sets Let X be a topological space and let ≤ be the specialization preorder on X. Every open set is an upper set with respect to ≤ and every closed set is a lower set. The converses are not generally true. In fact, a topological space is an Alexandrov-discrete space if and only if every upper set is also open (or equivalently every lower set is also closed). Let A be a subset of X. The smallest upper set containing A is denoted ↑A and the smallest lower set containing A is denoted ↓A. In case A = {x} is a singleton one uses the notation ↑x and ↓x.
For x ∈ X one has: ↑x = {y ∈ X : x ≤ y} = ∩{open sets containing x}. ↓x = {y ∈ X : y ≤ x} = ∩{closed sets containing x} = cl{x}. The lower set ↓x is always closed; however, the upper set ↑x need not be open or closed. The closed points of a topological space X are precisely the minimal elements of X with respect to ≤. Examples In the Sierpinski space {0,1} with open sets {∅, {1}, {0,1}} the specialization order is the natural one (0 ≤ 0, 0 ≤ 1, and 1 ≤ 1). If p, q are elements of Spec(R) (the spectrum of a commutative ring R) then p ≤ q if and only if q ⊆ p (as prime ideals). Thus the closed points of Spec(R) are precisely the maximal ideals. Important properties As suggested by the name, the specialization preorder is a preorder, i.e. it is reflexive and transitive. The equivalence relation determined by the specialization preorder is just that of topological indistinguishability. That is, x and y are topologically indistinguishable if and only if x ≤ y and y ≤ x. Therefore, the antisymmetry of ≤ is precisely the T0 separation axiom: if x and y are indistinguishable then x = y. In this case it is justified to speak of the specialization order. On the other hand, the symmetry of specialization preorder is equivalent to the R0 separation axiom: x ≤ y if and only if x and y are topologically indistinguishable. It follows that if the underlying topology is T1, then the specialization order is discrete, i.e. one has x ≤ y if and only if x = y. Hence, the specialization order is of little interest for T1 topologies, especially for all Hausdorff spaces. Any continuous function between two topological spaces is monotone with respect to the specialization preorders of these spaces. The converse, however, is not true in general. In the language of category theory, we then have a functor from the category of topological spaces to the category of preordered sets that assigns a topological space its specialization preorder. This functor has a left adjoint, which places the Alexandrov topology on a preordered set. There are spaces that are more specific than T0 spaces for which this order is interesting: the sober spaces. Their relationship to the specialization order is more subtle: For any sober space X with specialization order ≤, we have (X, ≤) is a directed complete partial order, i.e. every directed subset S of (X, ≤) has a supremum sup S, for every directed subset S of (X, ≤) and every open set O, if sup S is in O, then S and O have non-empty intersection. One may describe the second property by saying that open sets are inaccessible by directed suprema. A topology is order consistent with respect to a certain order ≤ if it induces ≤ as its specialization order and it has the above property of inaccessibility with respect to (existing) suprema of directed sets in ≤. Topologies on orders The specialization order yields a tool to obtain a preorder from every topology. It is natural to ask for the converse too: Is every preorder obtained as a specialization preorder of some topology? Indeed, the answer to this question is positive and there are in general many topologies on a set X that induce a given order ≤ as their specialization order. The Alexandroff topology of the order ≤ plays a special role: it is the finest topology that induces ≤. The other extreme, the coarsest topology that induces ≤, is the upper topology, the least topology within which all complements of sets ↓x (for some x in X) are open. There are also interesting topologies in between these two extremes. 
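To make the definition concrete, the specialization preorder of a finite space can be computed directly from the rule that x ≤ y exactly when x lies in cl{y}. The following Python sketch is illustrative only (the function and variable names are chosen for this example, not taken from any library); it reproduces the order of the Sierpinski space given in the Examples above.

from itertools import product

def specialization_preorder(points, opens):
    """Return all pairs (x, y) with x <= y, i.e. x in cl{y}, for a finite topology."""
    closed = [points - o for o in opens]            # closed sets are complements of open sets
    def cl(y):                                      # closure of the singleton {y}
        result = set(points)
        for c in closed:
            if y in c:
                result &= c                         # intersect all closed sets containing y
        return result
    return {(x, y) for x, y in product(points, repeat=2) if x in cl(y)}

# Sierpinski space: points {0, 1}, open sets {}, {1} and {0, 1}
pts = frozenset({0, 1})
opens = [frozenset(), frozenset({1}), frozenset({0, 1})]
print(sorted(specialization_preorder(pts, opens)))  # [(0, 0), (0, 1), (1, 1)]

The printed pairs match the order stated above: 0 ≤ 0, 0 ≤ 1 and 1 ≤ 1.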
The finest sober topology that is order consistent in the above sense for a given order ≤ is the Scott topology. The upper topology however is still the coarsest sober order-consistent topology. In fact, its open sets are even inaccessible by any suprema. Hence any sober space with specialization order ≤ is finer than the upper topology and coarser than the Scott topology. Yet, such a space may fail to exist, that is, there exist partial orders for which there is no sober order-consistent topology. Especially, the Scott topology is not necessarily sober. References M.M. Bonsangue, Topological Duality in Semantics, volume 8 of Electronic Notes in Theoretical Computer Science, 1998. Revised version of author's Ph.D. thesis. Available online, see especially Chapter 5, that explains the motivations from the viewpoint of denotational semantics in computer science. See also the author's homepage. Order theory Topology
1107299
https://en.wikipedia.org/wiki/Token%20bucket
Token bucket
The token bucket is an algorithm used in packet-switched computer networks and telecommunications networks. It can be used to check that data transmissions, in the form of packets, conform to defined limits on bandwidth and burstiness (a measure of the unevenness or variations in the traffic flow). It can also be used as a scheduling algorithm to determine the timing of transmissions that will comply with the limits set for the bandwidth and burstiness: see network scheduler. Overview The token bucket algorithm is based on an analogy of a fixed capacity bucket into which tokens, normally representing a unit of bytes or a single packet of predetermined size, are added at a fixed rate. When a packet is to be checked for conformance to the defined limits, the bucket is inspected to see if it contains sufficient tokens at that time. If so, the appropriate number of tokens, e.g. equivalent to the length of the packet in bytes, are removed ("cashed in"), and the packet is passed, e.g., for transmission. The packet does not conform if there are insufficient tokens in the bucket, and the contents of the bucket are not changed. Non-conformant packets can be treated in various ways: They may be dropped. They may be enqueued for subsequent transmission when sufficient tokens have accumulated in the bucket. They may be transmitted, but marked as being non-conformant, possibly to be dropped subsequently if the network is overloaded. A conforming flow can thus contain traffic with an average rate up to the rate at which tokens are added to the bucket, and have a burstiness determined by the depth of the bucket. This burstiness may be expressed in terms of either a jitter tolerance, i.e. how much sooner a packet might conform (e.g. arrive or be transmitted) than would be expected from the limit on the average rate, or a burst tolerance or maximum burst size, i.e. how much more than the average level of traffic might conform in some finite period. Algorithm The token bucket algorithm can be conceptually understood as follows: A token is added to the bucket every 1/r seconds, where r is the token rate. The bucket can hold at most b tokens. If a token arrives when the bucket is full, it is discarded. When a packet (network layer PDU) of n bytes arrives, if at least n tokens are in the bucket, n tokens are removed from the bucket, and the packet is sent to the network. If fewer than n tokens are available, no tokens are removed from the bucket, and the packet is considered to be non-conformant. Variations Implementers of this algorithm on platforms lacking the clock resolution necessary to add a single token to the bucket every 1/r seconds may want to consider an alternative formulation. Given the ability to update the token bucket every S milliseconds, the number of tokens to add every S milliseconds is r·S/1000. Properties Average rate Over the long run the output of conformant packets is limited by the token rate, r. Burst size Let M be the maximum possible transmission rate in bytes/second, with M greater than r. Then T_max = b/(M - r) is the maximum burst time, that is, the time for which the full rate M is utilized. The maximum burst size is thus B_max = T_max·M = b·M/(M - r). For example, if each token represents one byte and b = 1500 tokens, r = 1000 bytes/s and M = 4000 bytes/s, then T_max = 1500/3000 = 0.5 s and B_max = 2000 bytes. Uses The token bucket can be used in either traffic shaping or traffic policing. In traffic policing, nonconforming packets may be discarded (dropped) or may be reduced in priority (for downstream traffic management functions to drop if there is congestion). In traffic shaping, packets are delayed until they conform.
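The conformance test described under Algorithm above can be sketched in a few lines of Python. This is an illustrative sketch only (the class name, parameter names and numbers are chosen for this example rather than taken from any particular implementation), and it adds tokens lazily from a monotonic clock instead of on a fixed 1/r timer:

import time

class TokenBucket:
    """Illustrative token-bucket meter: rate r tokens per second, capacity b tokens."""
    def __init__(self, rate, capacity):
        self.rate = float(rate)          # r: tokens added per second
        self.capacity = float(capacity)  # b: maximum number of tokens the bucket can hold
        self.tokens = float(capacity)    # start with a full bucket
        self.last = time.monotonic()

    def conforms(self, n):
        """Return True and remove n tokens if a packet needing n tokens conforms."""
        now = time.monotonic()
        # Add tokens for the time elapsed since the last check, capped at the capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= n:
            self.tokens -= n             # conformant: "cash in" n tokens
            return True
        return False                     # non-conformant: bucket left unchanged

# Example: average rate 1000 tokens/s (e.g. bytes per second), bursts of up to 1500 tokens.
bucket = TokenBucket(rate=1000, capacity=1500)
print(bucket.conforms(1200))   # True: the full bucket covers a 1200-byte packet
print(bucket.conforms(1200))   # False: too few tokens remain immediately afterwards

Because tokens accumulate while the flow is idle, a source that stays at or below the rate r on average can still send occasional bursts bounded by the bucket depth b, as described in the Overview.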
Traffic policing and traffic shaping are commonly used to protect the network against excess or excessively bursty traffic; see bandwidth management and congestion avoidance. Traffic shaping is commonly used in the network interfaces in hosts to prevent transmissions being discarded by traffic management functions in the network. Comparison to leaky bucket The token bucket algorithm is directly comparable to one of the two versions of the leaky bucket algorithm described in the literature. This comparable version of the leaky bucket is described on the relevant Wikipedia page as the leaky bucket algorithm as a meter. This is a mirror image of the token bucket, in that conforming packets add fluid, equivalent to the tokens removed by a conforming packet in the token bucket algorithm, to a finite capacity bucket, from which this fluid then drains away at a constant rate, equivalent to the process in which tokens are added at a fixed rate. There is, however, another version of the leaky bucket algorithm, described on the relevant Wikipedia page as the leaky bucket algorithm as a queue. This is a special case of the leaky bucket as a meter, which can be described by the conforming packets passing through the bucket. The leaky bucket as a queue is therefore applicable only to traffic shaping, and does not, in general, allow the output packet stream to be bursty, i.e. it is jitter free. It is therefore significantly different from the token bucket algorithm. These two versions of the leaky bucket algorithm have both been described in the literature under the same name. This has led to considerable confusion over the properties of that algorithm and its comparison with the token bucket algorithm. Fundamentally, however, the token bucket and the leaky bucket as a meter are the same algorithm, and will, if implemented correctly and given the same parameters, see exactly the same packets as conforming and nonconforming. Hierarchical token bucket The hierarchical token bucket (HTB) is a faster replacement for the class-based queueing (CBQ) queuing discipline in Linux. It is useful for limiting a client's download or upload rate so that the limited client cannot saturate the total bandwidth. Conceptually, HTB is an arbitrary number of token buckets arranged in a hierarchy. The primary egress queuing discipline (qdisc) on any device is known as the root qdisc. The root qdisc will contain one class. This single HTB class will be set with two parameters, a rate and a ceil. These values should be the same for the top-level class, and will represent the total available bandwidth on the link. In HTB, rate means the guaranteed bandwidth available for a given class and ceil is short for ceiling, which indicates the maximum bandwidth that class is allowed to consume. Any bandwidth used between rate and ceil is borrowed from a parent class, hence the suggestion that rate and ceil be the same in the top-level class. Hierarchical Token Bucket implements a classful queuing mechanism for the Linux traffic control system, and provides rate and ceil to allow the user to control the absolute bandwidth given to particular classes of traffic as well as to indicate the ratio of distribution of bandwidth when extra bandwidth becomes available (up to the ceil). When choosing the bandwidth for a top-level class, note that traffic shaping only helps at the bottleneck between the LAN and the Internet. Typically, this is the case in home and office network environments, where an entire LAN is serviced by a DSL or T1 connection.
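The rate and ceil parameters described above can be illustrated with a deliberately simplified sketch. The Python class below is not the HTB scheduler used in the Linux kernel (the real qdisc also involves priorities, quanta and per-level scheduling); it is only a toy model, with invented names and numbers, of how a class is guaranteed its rate and may borrow spare parent capacity up to its ceil:

class HtbClassSketch:
    """Toy illustration of HTB's rate/ceil borrowing, not the real Linux qdisc."""
    def __init__(self, rate, ceil):
        assert rate <= ceil, "a class may never exceed its ceiling"
        self.rate = rate      # guaranteed share per interval (units are arbitrary here)
        self.ceil = ceil      # hard ceiling per interval, including borrowed capacity
        self.sent = 0         # amount already sent in the current interval

    def allowance(self, parent_spare):
        """Amount this class may still send: its remaining guarantee plus whatever it
        can borrow from the parent's spare capacity, never exceeding the ceil."""
        guaranteed = max(0, self.rate - self.sent)
        headroom = self.ceil - self.sent - guaranteed
        borrowed = max(0, min(parent_spare, headroom))
        return guaranteed + borrowed

# Two children sharing a parent link of 10,000,000 units: if the "bulk" class is
# idle, the "web" class may borrow its unused 4,000,000 units up to web's ceiling.
web = HtbClassSketch(rate=6_000_000, ceil=10_000_000)
bulk = HtbClassSketch(rate=4_000_000, ceil=8_000_000)
print(web.allowance(parent_spare=bulk.rate))   # 10000000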
See also Leaky bucket Rate limiting Traffic shaping Counting semaphores References Further reading Network performance Network scheduling algorithms
70087260
https://en.wikipedia.org/wiki/Ali%20Dehghantanha
Ali Dehghantanha
Ali Dehghantanha is an academic-entrepreneur in cybersecurity and cyber threat intelligence. He is an Associate Professor of Cybersecurity and a Canada Research Chair in Cybersecurity and Threat Intelligence. Dehghantanha is a pioneer in applying machine learning techniques toward cyber threat hunting, cyber threat intelligence, and enterprise risk management. His research is highly cited in both academic and industrial settings. He is the Founder and Director of Cyber Science Lab. Education After completing his Diploma in Mathematics at National Organization for Development of Exceptional Talents (NODET), Dehghantanha attended Islamic Azad University, Mashhad Branch, from which he graduated with a bachelor's degree in Software Engineering in 2005. He earned his Master's and Doctoral degrees in Security in Computing from University Putra Malaysia in 2008 and 2011, respectively. Career Dehghantanha started his academic career as Sr. Lecturer of Computer Science and Information Technology at the University Putra Malaysia in 2011, and later on joined the University of Salford as Marie Curie International Incoming Post-Doctoral Research Fellow in 2015. From 2017 to 2018, he held appointment as Sr. Lecturer (Associate Professor) in the Department of Computer Science at the University of Sheffield. Following this appointment, he joined the University of Guelph (UoG), Ontario, Canada, as an Associate Professor and Director of Master of Cybersecurity and Threat Intelligence program. He became a Tier 2 NSERC Canada Research Chair in Cybersecurity and Threat Intelligence at the University of Guelph (UoG) in 2020. He also holds a concurrent appointment as Adjunct Associate Professor in Schulich School of Engineering's Department of Electrical & Software Engineering at the University of Calgary since 2020. He has developed two Master's programs in cybersecurity, one in the University of Guelph – Canada, and another in the University of Salford. Research Dehghantanha is among highly cited researchers in cybersecurity. He is well-recognized for his research in cyber threat intelligence, and in several fields of cyber security including malware analysis, Internet of Things (IoT) security, and digital forensics. Application of AI in Cyber Threat Hunting and Attribution Dehghantanha was among the first to introduce some major security and forensics challenges within the Internet of Things (IoT) domain. He also reviewed previous studies published in this special issue targeting identified challenges. In 2016, he proposed a two-layer dimension reduction and two-tier classification model for anomaly-based intrusion detection in IoT backbone networks. He has influenced the IoT/ICS network defense field by creating an Intrusion Detection System (IDS) for IoT networks, a secret sharing method of encryption key exchange in vehicular IoT networks, and a method for secret key sharing and distribution between IoT devices. He conducted experiments using NSL-KDD dataset, and proved that his proposed model outperforms previous models designed to detect U2R and R2L attacks. His most notable contributions were made to building AI-based methods for cyber-attack identification and analysis in IoT. Moreover, he developed a Deep Recurrent Neural Network structure for in-depth analysis of IoT malware. 
Dehghantanha introduced ensemble-based multi-filter feature selection method for DDoS detection in cloud computing, and also discussed its applications in terms of detection rate and classification accuracy when compared to other classification techniques. While presenting a systematic literature review of blockchain cyber security, he conducted a systematic analysis of the most frequently adopted blockchain security applications. The systematic review also highlights the future directions of research, education and practices in the blockchain and cyber security space, such as security of blockchain in IoT, security of blockchain for AI data, and sidechain security. Furthermore, he focused his study on machine learning aided Android malware classification, and also presented two machine learning aided approaches for static analysis of Android malware. Frameworks for Cybersecurity Technology Adoption and Organizational Risk Assessment In 2019, Dehghantanha built a framework that models the impacts of adopting Privacy Enhancing Technologies (PETs) on the performance of SMEs in Canada. He has also created several frameworks for security analysis of cloud platforms, including CloudMe, OneDrive, Box, GoogleDrive, DropBox, MEGA, and SugarSync. He also works to create frameworks for breach coaching and exposure management. In 2016, he published a book entitled Contemporary Digital Forensic Investigations of Cloud and Mobile Applications, and explored the implications of cloud (storage) services and mobile applications on digital forensic investigations. Awards and honors 2016 - Senior Member, Institute of Electrical and Electronics Engineers (IEEE) 2016 - Fellowship, U.K. Higher Education Academy 2018 - Marie-Curie International Incoming Fellowship 2020 - Research Excellence Award, University of Guelph College of Engineering and Physical Sciences 2020 - Tier II Canada Research Chair in Cybersecurity and Threat Intelligence 2021 - Outstanding Leadership Award, IEEE Bibliography Books Contemporary Digital Forensic Investigations of Cloud and Mobile Applications 1st Ed. (2016) ISBN 9780128053034 Cyber Threat Intelligence (2018) ISBN 9783319739502 Handbook of Big Data and IoT Security (2019) ISBN 9783030105433 Blockchain Cybersecurity, Trust and Privacy (2020) ISBN 9783030381813 Handbook of Big Data Privacy (2020) ISBN 9783030385576 Handbook of Big Data Analytics and Forensics (2021) ISBN 9783030747527 Selected Articles Pajouh, H. H., Javidan, R., Khayami, R., Dehghantanha, A., & Choo, K. K. R. (2016). A two-layer dimension reduction and two-tier classification model for anomaly-based intrusion detection in IoT backbone networks. IEEE Transactions on Emerging Topics in Computing, 7(2), 314–323. Osanaiye, O., Cai, H., Choo, K. K. R., Dehghantanha, A., Xu, Z., & Dlodlo, M. (2016). Ensemble-based multi-filter feature selection method for DDoS detection in cloud computing. EURASIP Journal on Wireless Communications and Networking, 2016(1), 1–10. Milosevic, N., Dehghantanha, A., & Choo, K. K. R. (2017). Machine learning aided Android malware classification. Computers & Electrical Engineering, 61, 266–274. Conti, M., Dehghantanha, A., Franke, K., & Watson, S. (2018). Internet of Things security and forensics: Challenges and opportunities. Future Generation Computer Systems, 78, 544–546. Taylor, P. J., Dargahi, T., Dehghantanha, A., Parizi, R. M., & Choo, K. K. R. (2020). A systematic literature review of blockchain cyber security. Digital Communications and Networks, 6(2), 147–156. 
References Living people Iranian Canadian University of Guelph faculty 1982 births
43977660
https://en.wikipedia.org/wiki/JLIVECD
JLIVECD
JLIVECD is an open-source CLI (command-line interface) based live CD/DVD customization tool for Debian, Arch Linux, Ubuntu family distributions and Linux Mint and some of their derivatives. The host system need not be the same as the live CD/DVD system (e.g., an Arch Linux live CD can be modified on an Ubuntu host, an Ubuntu live CD can be modified on a Debian host, etc.). This tool is released under GPL-2 and primarily intended for non-commercial use. History It was developed with the help of the documentation found on LiveCDCustomization written by the Ubuntu community, Debian/Modify/CD from the Debian wiki and Remastering the install ISO from the Arch Linux wiki. Uses One needs to know the customization methods to make use of this tool, because it does no customization itself; it only prepares the environment for customization and automates the task of creating a customized live CD/DVD ISO image. A base ISO image is needed, on which the customization is applied. The resulting ISO image can then be burnt to a CD or DVD, or used to prepare a bootable live USB. Related documentation How to customize Linux Mint live CD/DVD LiveCDCustomization Debian/Modify/CD Remastering the install ISO See also Ubuntu Customization Kit Reconstructor Remastersys List of remastering software References & External Links README.md JLIVECD https://help.ubuntu.com/community/LiveCDCustomization Linux software Live CD
29648150
https://en.wikipedia.org/wiki/List%20of%20rogue%20security%20software
List of rogue security software
The following is a partial list of rogue security software, most of which can be grouped into families. These are functionally identical versions of the same program repackaged as successive new products by the same vendor. Windows Anti Breaking System ANG Antivirus a AntiVermins Antivirus 360 – Clone of MS Antivirus. Antivirus 2008 Antivirus 2009 Antivirus 2010 – Clone of MS Antivirus. Also known as Anti-virus-1, AntiVirus Gold or AntivirusGT – Developed by ICommerce Solutions. Mimics the name of AVG Antivirus. Antivirus Master Antivirus Pro 2009 Antivirus Pro 2010 Antivirus Pro 2017 Antivirus System PRO Antivirus XP 2008 Antivirus XP 2010 AV Antivirus Suite AVG Antivirus 2011 – Imitates AVG Antivirus and it is not affiliated with the legitimate AVG. Now discontinued. AV Security Essentials AV Security Suite Awola BestsellerAntivirus, Browser Defender ByteDefender also known as ByteDefender Security 2010 – Knock-off of the legitimate BitDefender Antivirus software Cleanator CleanThis Cloud Protection ContraVirus – Uses outated signature database. Discontinued. Control Center Cyber Security, Core Security Data Protection Defense Center – Discontinued. Defru Desktop Security 2017 Disc Antivirus Professional Disk Doctor Doctor Antivirus Dr Guard DriveCleaner EasySpywareCleaner, EasyFix Tools Eco AntiVirus Errorsafe, Error Expert E-Set Antivirus 2011 – Also known as ESet Antivirus 2011. exploits name ESET (should not be confused with the legitimate app of the same name) Essential Cleaner Flu Shot 4 – Probably the earliest well-known instance of rogue security software Green Antivirus 2009 Hard Drive Diagnostic HDD Fix HDD Plus HDD Rescue Home Security Solutions IEDefender InfeStop Internet Antivirus, InstallShield (aka Internet Antivirus Pro, distributed by plus4scan.com) Internet Antivirus 2011 Internet Defender 2011 Internet Security 2010, Internet Security 2011 Internet Security 2012 Internet Security Essentials Internet Security Guard Live PC Care Live Security Platinum Live Security Suite Mac Defender Mac Protector MacSweeper MalwareAlarm MalwareCore MalwareCrush Malware Defense Malware Protection Center Memory Fixer MS AntiSpyware 2009 – Exploits the name of the legitimate Microsoft Antispyware, now Windows Defender. Discontinued. MS Antivirus – Also known as Microsoft Anti Malware Mimics the name of Microsoft Antivirus or Microsoft Security Essentials. MS Removal Tool Microsoft Security Essentials – Masquerades as the legitimate program. Now discontinued. My Security Engine My Security Shield My Security Wall MxOne Antivirus Netcom3 Cleaner Paladin Antivirus PAL Spyware Remover PC Antispy PC Clean Pro PC Privacy Cleaner PC Optimizer Pro PCPrivacy Tools PCSecureSystem – Now discontinued. PerfectCleaner – Discontinued. Perfect Defender 2009, Perfect Optimizer PersonalAntiSpy Free Personal Antivirus Personal Internet Security 2011 Personal Security Personal Shield Pro PC Antispyware PC Defender Antivirus PCKeeper Privacy Center SAntivirus Security Shield Security Solution 2011 Security Suite Platinum Security Tool Security Tool Security Toolbar 7.1 Security Essentials 2010 (not to be confused with Microsoft Security Essentials) Smartpcfixer Segurazo SpyBouncer SpyContra SpyControl SpyCrush Spydawn – Discontinued. SpyEraser (Video demonstration) SpyGuarder SpyHeal (a.k.a. SpyHeals & VirusHeal) SpyLax – Previously known as SpyDoctor, masquerades as Spyware Doctor. Discontinued. Spylocked SpyMarshal SpyOfficer SpyRid SpyShelter SpySheriff (a.k.a. 
PestTrap, BraveSentry, SpyTrooper) SpySpotter SpywareBot – Imitates Spybot - Search & Destroy. Now discontinued. Spyware B1aster – Exploits the name of Javacool's SpywareBlaster and no trial version locatable online. Spyware Cleaner SpywareGuard 2008 – Mimics the name of SpywareGuard by Javacool Software SpywareNo – Clone of SpySheriff. Spyware Protect 2009 SpywareQuake SpywareSheriff – Confused clone of SpySheriff. SpywareStop – Previous version was SpywareBot. Spyware Stormer, Spyware X-terminator SpywareStrike Spyware Striker Pro SpyWiper Super AV SysGuard Sysinternals Antivirus System Antivirus 2008 Terminexor Rogue clone of Spybot Search & Destroy and violates software's privacy policy or end user license agreement. TheSpyBot – Spybot - Search & Destroy knockoff ThinkPoint Total Secure 2009 Total Secure 2009 Total Win 7 Security Total Win Vista Security Total Win XP Security UltimateCleaner Ultra Defragger VirusHeat VirusIsolator Virus Locker VirusMelt VirusProtectPro (a.k.a. AntiVirGear) Vista Antivirus 2008 Vista Home Security 2011 Vista Internet Security 2012 Vista Security 2011 Vista Security 2012 Vista Smart Security 2010 VirusBurst VirusBursters VirusGuard Volcano Security Suite Win7 Antispyware 2011 Win Antispyware Center Win 7 Home Security 2011 WinAntiVirus Pro 2006 WinFixer Win HDD WinHound Winwebsec Windows Police Pro Winpc Antivirus Winpc Defender – Imitates Windows Defender. Now discontinued. WinSpywareProtect WinWeb Security 2008 Wireshark Antivirus WorldAntiSpy Wolfram Antivirus XP AntiMalware XP AntiSpyware 2009 XP AntiSpyware 2010 XP AntiSpyware 2012 XP Antivirus 2010 XP Antivirus 2012 XP Antivirus Pro 2010 XP Defender Pro XP Home Security 2011 XP Internet Security 2010 Your PC Protector Total Antivirus 2020 GroffoAV References Scareware Social engineering (computer security) Antivirus software
20332795
https://en.wikipedia.org/wiki/Copyright%20aspects%20of%20hyperlinking%20and%20framing
Copyright aspects of hyperlinking and framing
In copyright law, the legal status of hyperlinking (also termed "linking") and that of framing concern how courts address two different but related Web technologies. In large part, the legal issues concern use of these technologies to create or facilitate public access to proprietary media content — such as portions of commercial Web sites. When hyperlinking and framing have the effect of distributing, and creating routes for the distribution of content (information) that does not come from the proprietors of the Web pages affected by these practices, the proprietors often seek the aid of courts to suppress the conduct, particularly when the effect of the conduct is to disrupt or circumvent the proprietors' mechanisms for receiving financial compensation. The issues about linking and framing have become so intertwined under copyright law that it is impractical to attempt to address them separately. As will appear, some decisions confuse them with one another, while other decisions involve and therefore address both. Framing involves the use of hyperlinking, so that any challenge of framing under copyright law is likely to involve a challenge of hyperlinking as well. Linking While hyperlinking occurs in other technologies, U.S. copyright litigation has centered on HTML. Accordingly, this article considers only such technology. Ordinary link The HTML code for a simple, ordinary hyperlink is shown below. A home page link would be written this way: <a href="https://www.uspto.gov/">USPTO</a> Deep link Most Web sites are organized hierarchically, with a home page at the top and deeper pages within the site, reached by links on the home page. Deep linking is the practice of using a hyperlink that takes a user directly to a page other than the top or home page. The link given below is a deep link. <a href="https://www.uspto.gov/patents-getting-started/general-information-concerning-patents">General information concerning patents</a> A typical Internet browser would render the foregoing HTML code as: When a user clicks on the underlined text, the browser jumps from the page on which the link is shown to a page of the Web site of the U.S. Patent and Trademark Office (USPTO) that has the URL (Web address) shown above: https://www.uspto.gov/patents-getting-started/general-information-concerning-patents. Several lawsuits have involved complaints by proprietors of Web pages against the use of deep links. Inline link Related issues arise from use of inline links (also called image-source or img-src links because the HTML code begins with "img src=") on Web pages. An inline link places material — usually an image such as a JPEG or GIF — from a distant Web site into the Web page being viewed. For example, the adjacent image is the seal of the USPTO, as shown on some of its pages at the USPTO Web site. The URL of one version of the USPTO seal image is https://www.uspto.gov/images/uspto_seal_200.gif, a version of which can be seen in context at https://www.uspto.gov/main/profiles/copyright.htm. The former of these becomes an inline or img-src link if img src= is inserted before the http, angle brackets enclose the whole expression, and the entire code fragment is inserted into the text of a page of HTML code. When an inline (img-src) link of an image is used on a Web page, it seems to be present as a part of the Web page. The presence of the image is only virtual, however, in the sense that the image file is not physically present at the server for the Web site being viewed. 
The actual location of the image file, if the image were that of the USPTO seal, would be at the USPTO server in Virginia. Use of inline linking has led to contentious litigation (discussed below). Hierarchy of links The image at right is a front view of the U.S. Supreme Court (SCOTUS). It can be found on the SCOTUS Web site as an element in the headings for various pages of that site: https://www.supremecourt.gov/about/biographies.aspx—Biographies of Current Justices of the Supreme Court; https://www.supremecourt.gov/about/about.aspx—About the Supreme Court; and https://www.supremecourt.gov/about/courtbuilding.aspx—The Supreme Court Building. The image can also be found in isolation: https://web.archive.org/web/20170618212426/https://www.supremecourt.gov/images/sectionbanner13.png. All these files are stored on the SCOTUS server and can be accessed by clicking on the respective hyperlinks (deep links). The link to the SCOTUS main page or portal is https://www.supremecourt.gov/default.aspx. Image links can be categorized in a hierarchy, based on the technological expedient used to effectuate the link, as explained below. Each further step corresponds to successively lower levels of risk of copyright infringement. The hierarchy operates as follows, using the picture of the SCOTUS building as an example for discussion purposes (selected because it is in the public domain and is not subject to copyright protection; 17 U.S.C. § 105 provides that copyright protection is not available for any work of the United States Government). An image may be placed on a Web page or made available for viewing by any of the following expedients: Copying the image file to the server hosting the page (as that of the Supreme Court has been copied to the Wikipedia server to present the image at the upper right of this section of text). Unless the image is in the public domain, that copying will create copyright infringement liability unless a defense, such as fair use or license, applies. The HTML code for embedding such an image in text is the ordinary form for an image in text (where the PNG file is in same directory as the text): <img src="Supreme_Court.png" align=right> Using an img-src link to the image at the proprietor's Web page, to make the page appear to contain the image. In an ordinary context, the page's creator would place text around the image, making the image appear as it does at the immediate right. The image from an img-src link looks like the image from a file copied to the page's server, even though the image (i.e., its code) is actually stored on the remote server of the other Web site and not on the page's own server. The U.S. Court of Appeals for the Ninth Circuit considered this fact of crucial significance in the Perfect 10 case, discussed subsequently in this article. The court held that, when Google provided links to images, Google did not violate the provisions of the copyright law prohibiting unauthorized reproduction and distribution of copies of a work: "Because Google's computers do not store the photographic images, Google does not have a copy of the images for purposes of the Copyright Act." (This fact about image storage is also true of all links that follow in this list.) This expedient has been challenged as copyright infringement. See the Arriba Soft and Perfect 10 cases (below). In the Perfect 10 case, Perfect 10 argued that Google's image pages caused viewers to believe they were seeing the images on Google's Web site. 
The court brushed this argument aside: "While in-line linking and framing may cause some computer users to believe they are viewing a single Google Web page, the Copyright Act, unlike the Trademark Act, does not protect a copyright holder against acts that cause consumer confusion." Using an ordinary (deep) hyperlink to the image at the remote server, so that users must click on a link on the hosting page to jump to the image. The HTML code would be https://www.supremecourt.gov/images/sectionbanner13.png. This has been protested because it bypasses everything at the other site but the image. Such protests have been largely ineffective. This argument on Kelly's behalf is articulated in the amicus curiae brief supporting Kelly by the American Society of Media Photographers:[I]t was the actual display of the full-size images of Kelly’s work stripped from the original context that was not fair use. Merely linking to Kelly's originating home page, on the other hand, without free-standing display of the full-size images, would not run afoul of the fair use limits established by the Panel. It is striking that nowhere in [the adversaries'] briefs do they explain why linking could not be constructed in this fashion. Using a deep link to the specific page on the image proprietor's Web site at which the image is located, thus presenting the image to the user along with the textual material with which the proprietor surrounded it (but avoiding the portal or home page). The HTML code is https://www.supremecourt.gov/about/biographies.aspx for Biographies of Current Justices of the Supreme Court. This has been protested because it does not require the viewer to look at the advertising or other material at the home and other earlier pages on the proprietor's site, although the user must look at what is on the same page as the image. Such protests have been largely ineffective. Linking to the home page of the image proprietor's Web site and explaining how to page down through his successive pages and all of his extraneous material to find the image. For example: https://www.supremecourt.gov/default.aspx. This will not create copyright infringement liability under any theory thus far advanced in U.S. litigation. Framing Framing is the juxtaposition of two separate Web pages within the same page, usually with a separate frame with navigational elements. Framing is a method of presentation in a Web page that breaks the screen up into multiple non-overlapping windows. Each window contains a display from a separate HTML file, for example, a Web page from a different Web site that is fetched by automatically hyperlinking to it. While the usage of frames as a common Web design element has been deprecated for several years (replaced by the usage of <div> elements), some sites, like Google Images and Google Translate, use frames as a way to help navigate non-Google pages from a framed Google interface. Incorporating copyrighted Web content by usage of framing has led to contentious litigation. Frames can be used for Web pages belonging to the original site or to load pages from other sites into a customized arrangement of frames that provide a generalized interface without actually requiring the viewer to browse the linked site from that site's URLs and interfaces. Proprietors of copyrighted content have at times contended that framing their Web pages constituted copyright infringement. Copyright provides exclusive rights to reproduce ((1)) or distribute ((3)) copies of the work. 
However, framing does not directly reproduce or distribute any copy of the original Web page. Rather, the accused infringer simply establishes a pointer that the user's browser follows to the proprietor's server and Web page. For a pedagogically exaggerated example of the kind of framing that has incensed proprietors of copyrighted content, which "frames" a page titled Is Framing Copyright Infringement?, see Framing the 'Framing' Page. On the theory that a picture is worth 1000 words, the viewer is invited to compare the referenced pages to understand what framing is and why it annoys proprietors of framed pages. History of copyright litigation in field In large part, linking and framing are not held to be copyright infringement under U.S. and German copyright law, even though the underlying Web pages are protected under copyright law. Because the copyright-protected content is stored on a server other than that of the linking or framing person (it is stored on the plaintiff's server), there is typically no infringing "copy" made by the defendant linking or framing person (as may be essential) on which to base liability. Some European countries take a more protective view, however, and hold unauthorized framing and so-called deep linking unlawful. European Union The European Court of Justice's binding ruling in 2014 was that embedding a work could not be a violation of copyright: In September 2016, the European Court of Justice ruled that knowingly linking to an unauthorized posting of a copyrighted work for commercial gain constituted infringement of the exclusive right to communicate the work to the public. The case surrounded GeenStijl and Sanoma; in 2011, photos were leaked from an upcoming issue of the Dutch version of Playboy (published by Sanoma) and hosted on a Web site known as FileFactory. GeenStijl covered the leak by displaying a thumbnail of one of the images and linking to the remainder of the unauthorized copies. The court ruled in favor of Sanoma, arguing that GeenStijl's authors knowingly reproduced and communicated a copyrighted work to the public without consent of its author, and that GeenStijl had profited from the unauthorized publication. Belgium Belgian Association of Newspaper Editors v. Google In September 2006 the Belgian Association of Newspaper Editors sued Google and obtained an injunctive order from the Belgian Court of First Instance that Google must stop deep linking to Belgian newspapers without paying royalties, or else pay a fine of €1 million daily. Many newspaper columns were critical of the Belgian newspapers' position. Denmark Danish Newspaper Publishers Association v. Newsbooster The Bailiff's Court of Copenhagen ruled in July 2002 against the Danish Web site Newsbooster, holding, in a suit brought by the Danish Newspaper Publishers Association (DNPA), that Newsbooster had violated Danish copyright law by deep linking to newspaper articles on Danish newspapers' Internet sites. Newsbooster's service allows users to enter keywords to search for news stories, and then deep links to the stories are provided. The DNPA said that this conduct was "tantamount to theft." The court ruled in favor of the DNPA, not because of the mere act of linking but because Newsbooster used the links to gain commercial advantage over the DNPA, which was unlawful under the Danish Marketing Act. The court enjoined Newsbooster's service. home A/S v. 
Ofir A-S The Maritime and Commercial Court in Copenhagen took a somewhat different view in 2005 in a suit that home A/S, a real estate chain, brought against Ofir A-S, an Internet portal (OFiR), which maintains an Internet search engine. home A/S maintains an Internet Web site that has a searchable database of its current realty listings. Ofir copied some database information, which the court held unprotected under Danish law, and also Ofir's search engine provided deep links to the advertisements for individual properties that home A/S listed, thus by-passing the home page and search engine of home A/S. The court held that the deep linking did not create infringement liability. The Court found that search engines are desirable as well as necessary to the function of the Internet; that it is usual that search engines provide deep links; and that businesses that offer their services on the Internet must expect that deep links will be provided to their Web sites. Ofir's site did not use banner advertising and its search engine allowed users, if they so chose, to go to a home page rather than directly to the advertisement of an individual property. The opinion does not appear to distinguish or explain away the difference in result from that of the Newsbooster case. Germany Holtzbrinck v. Paperboy In July 2003 a German Federal Superior Court held that the Paperboy search engine could lawfully deep link to news stories. An appellate court then overturned the ruling, but the German Federal Supreme Court reversed in favor of Paperboy. "A sensible use of the immense wealth of information offered by the World Wide Web is practically impossible without drawing on the search engines and their hyperlink services (especially deep links)," the German court said. Decision I-20 U 42/11 Dusseldorf Court of Appeal 8 October 2011 In Germany making content available to the public on a Web site by embedding the content with inline links now appears to be copyright infringement. This applies even though a copy has never been taken and kept of an image and even though the image is never "physically" part of the Web site. The Düsseldorf appeal court overruled the lower Court of First Instance in this case. The Defendant had included links on his blog to two photographs that appeared on the Claimant’s Web site. No prior permission had been sought or obtained. Scotland Shetland Times Ltd. v. Wills The first suit of prominence in the field was Shetland Times Ltd. v. Wills, Scot. Ct. of Session (Edinburgh, 24 Oct 1996). The Shetland Times challenged use by Wills of deep linking to pages of the newspaper on which selected articles of interest appeared. The objection was that defendant Wills thus by-passed the front and intervening pages on which advertising and other material appeared in which the plaintiff had an interest but defendant did not. The Times obtained an interim interdict (Scottish for preliminary injunction) and the case then settled. United States Washington Post v. Total News In February 1997 the Washington Post, CNN, the Los Angeles Times, Dow Jones (Wall Street Journal), and Reuters sued Total News Inc. for framing their news stories on the Total News Web page. The case was settled in June 1997, on the basis that linking without framing would be used in the future. Ticketmaster v. Microsoft In April 1997 Ticketmaster sued Microsoft in Los Angeles federal district court for deep linking. 
Ticketmaster objected to Microsoft's bypassing the home and intermediate pages on Ticketmaster's site, claiming that Microsoft had "pilfered" its content and diluted its value. Microsoft's Answer raised a number of defenses explained in detail in its pleadings, including implied license, contributory negligence, and voluntary assumption of the risk. Microsoft also argued that Ticketmaster had breached an unwritten Internet code, under which any Web site operator has the right to link to anyone else's site. A number of articles in the trade press derided Ticketmaster's suit. The case was settled in February 1999, on confidential terms; Microsoft stopped the deep linking and instead used a link to Ticketmaster's home page. A later case, Ticketmaster Corp. v. Tickets.com, Inc. (2000), yielded a ruling in favour of deep linking. Kelly v. Arriba Soft The first important U.S. decision in this field was that of the Ninth Circuit in Kelly v. Arriba Soft Corp. Kelly complained, among other things, that Arriba's search engine used thumbnails to deep link to images on his Web page. The court found that Arriba's use was highly transformative, in that it made available to Internet users a functionality not previously available, and that was not otherwise readily provided — an improved way to search for images (by using visual cues instead of verbal cues). This factor, combined with the relatively slight economic harm to Kelly, tipped the fair use balance decisively in Arriba's favour. As in other cases, Kelly objected to linking because it caused users to bypass his home page and intervening pages. He was unable, however, to show substantial economic harm. Kelly argued largely that the part of the copyright statute violated was the public display right ((5)). He was aware of the difficulties under the reproduction and distribution provisions (17 U.S.C. §§ 106(1) and (3)), which require proof that the accused infringer trafficked in copies of the protected work. The court focused on the fair use defense, however, under which it ruled in Arriba's favour. Perfect 10 v. Amazon In Perfect 10, Inc. v. Amazon.com, Inc., the Ninth Circuit again considered whether an image search engine's use of thumbnail was a fair use. Although the facts were somewhat closer than in the Arriba Soft case, the court nonetheless found the accused infringer's use fair because it was "highly transformative." The court explained: We conclude that the significantly transformative nature of Google's search engine, particularly in light of its public benefit, outweighs Google's superseding and commercial uses of the thumbnails in this case. … We are also mindful of the Supreme Court's direction that "the more transformative the new work, the less will be the significance of other factors, like commercialism, that may weigh against a finding of fair use." In addition, the court specifically addressed the copyright status of linking, in the first U.S. appellate decision to do so: Google does not … display a copy of full-size infringing photographic images for purposes of the Copyright Act when Google frames in-line linked images that appear on a user's computer screen. Because Google's computers do not store the photographic images, Google does not have a copy of the images for purposes of the Copyright Act. In other words, Google does not have any "material objects … in which a work is fixed … and from which the work can be perceived, reproduced, or otherwise communicated" and thus cannot communicate a copy. 
Instead of communicating a copy of the image, Google provides HTML instructions that direct a user's browser to a Web site publisher's computer that stores the full-size photographic image. Providing these HTML instructions is not equivalent to showing a copy. First, the HTML instructions are lines of text, not a photographic image. Second, HTML instructions do not themselves cause infringing images to appear on the user's computer screen. The HTML merely gives the address of the image to the user's browser. The browser then interacts with the computer that stores the infringing image. It is this interaction that causes an infringing image to appear on the user's computer screen. Google may facilitate the user's access to infringing images. However, such assistance raised only contributory liability issues and does not constitute direct infringement of the copyright owner's display rights. … While in-line linking and framing may cause some computer users to believe they are viewing a single Google Web page, the Copyright Act, unlike the Trademark Act, does not protect a copyright holder against acts that cause consumer confusion. State of U.S. law after Arriba Soft and Perfect 10 The Arriba Soft case stood for the proposition that deep linking and actual reproduction in reduced-size copies (or preparation of reduced-size derivative works) were both excusable as fair use because the defendant's use of the work did not actually or potentially divert trade in the marketplace from the first work; and also it provided the public with a previously unavailable, very useful function of the kind that copyright law exists to promote (finding desired information on the Web). The Perfect 10 case involved similar considerations, but more of a balancing of interests was involved. The conduct was excused because the value to the public of the otherwise unavailable, useful function outweighed the impact on Perfect 10 of Google's possibly superseding use. Moreover, in Perfect 10, the court laid down a far-reaching precedent in favour of linking and framing, which the court gave a complete pass under copyright. It concluded that "in-line linking and framing may cause some computer users to believe they are viewing a single Google Web page, [but] the Copyright Act . . . does not protect a copyright holder against acts that cause consumer confusion." A February 2018 summary judgement from the District Court for the Southern District of New York created a new challenge to the established cases. In Goldman v. Breitbart, Justin Goldman, a photographer, posted his on-the-street photograph of Tom Brady with Boston Celtics GM Danny Ainge and others to Snapchat, which became popular over social media such as Twitter on rumors that Brady was helping with the Celtics' recruitment. Several news organizations subsequently published stories embedding the tweets with Goldman's photograph in the stories. Goldman took legal action against nine of these news agencies, claiming they violated his copyright. Judge Katherine Forrest decided in favour of Goldman and asserting the news sites violated his copyright, rejecting elements of the Perfect 10 ruling. Forrest said that as the news agencies had to take specific steps to embed the tweets with the photograph in their stories, wrote the stories to highlight those, and otherwise was not providing an automated service like Google's search engine. References Computer law Copyright law Hypertext
52949872
https://en.wikipedia.org/wiki/Kalyna%20%28cipher%29
Kalyna (cipher)
Kalyna (Ukrainian: Калина, Viburnum opulus) is a symmetric block cipher. It supports block sizes of 128, 256 or 512 bits; the key length is either equal to or double the block size. Kalyna was adopted as the national encryption standard of Ukraine in 2015 (standard DSTU 7624:2014) after a Ukrainian national cryptographic competition. Kalyna is a substitution–permutation network whose design is based on the Rijndael (AES) encryption function, but with a quite different key schedule, a set of four different S-boxes and an increased MDS matrix size. Kalyna has 10 rounds for 128-bit keys, 14 rounds for 256-bit keys and 18 rounds for 512-bit keys. Independent researchers have proposed some attacks on reduced-round variants of Kalyna, but all of them have a very high complexity and none of them are practical. References Roman Oliynykov, Ivan Gorbenko, Oleksandr Kazymyrov, Victor Ruzhentsev, Oleksandr Kuznetsov, Yurii Gorbenko, Oleksandr Dyrda, Viktor Dolgov, Andrii Pushkaryov, Ruslan Mordvinov, Dmytro Kaidalov. A New Encryption Standard of Ukraine: The Kalyna Block Cipher. IACR Cryptology ePrint Archive, p650 (2015) https://eprint.iacr.org/2015/650 Roman Oliynykov, Ivan Gorbenko, Viktor Dolgov and Viktor Ruzhentsev. Results of Ukrainian national public cryptographic competition. Tatra Mt. Math. Publ. 47 (2010), 99–113. DOI: 10.2478/v10127-010-0033-6 https://www.degruyter.com/view/j/tmmp.2010.47.issue-1/v10127-010-0033-6/v10127-010-0033-6.xml Roman Oliynykov. Kalyna block cipher presentation (in English) http://www.slideshare.net/oliynykov/kalyna-english Akshima, Donghoon Chang, Mohona Ghosh, Aarushi Goel, Somitra Kumar Sanadhya. Single Key Recovery Attacks on 9-Round Kalyna-128/256 and Kalyna-256/512. Volume 9558 of the series Lecture Notes in Computer Science, pp. 119–135. https://link.springer.com/chapter/10.1007/978-3-319-30840-1_8 Riham Altawy, Ahmed Abdelkhalek, Amr M. Youssef. A Meet-in-the-Middle Attack on Reduced-Round Kalyna-b/2b. IEICE Transactions on Information and Systems, Vol. E99-D, No. 4, pp. 1246–1250. http://search.ieice.org/bin/summary.php?id=e99-d_4_1246 External links Reference implementation of the Kalyna block cipher (DSTU 7624:2014) K
26042874
https://en.wikipedia.org/wiki/Back%20in%20Luv%27
Back in Luv'
Back In Luv' is a DVD compilation by Dutch girl group Luv'. It includes TV performances of the original line-up recorded during their heyday (1977–1980). It was released by Princess Entertainment in 2006, when Luv's schedule had become hectic again after the pop formation reunited. Background When the double CD compilation 25 Jaar Na Waldolala came out in late 2003, a DVD was supposed to be released simultaneously. However, Luv' singer Patty Brard vetoed this video collection because of her right of publicity and her conflict with another member of the group (José Hoebee). After the reconciliation between Brard and Hoebee, which led to Luv's unexpected performance at the 60th birthday of Hans van Hemert (who formed the girl group) in April 2005, a comeback was planned. 2006 saw a profusion of projects built on nostalgia for Luv' in the Netherlands and Belgium: the broadcast of a reality TV documentary about their reunion on RTL 5 and vtm, their participation in three mega concerts of De Toppers at the Amsterdam ArenA, many live shows, as well as the release of a compilation "Het Mooiste Van Luv'" and a box set "Completely In Luv'". Furthermore, nothing could now stop the distribution of a DVD, as Luv' was a visual act. The Dutch subsidiary of Universal Music (which held the rights to the trio's back catalogue) was expected to release this long-awaited DVD, and it was rumoured that BR Music (the leading oldies specialist in the Benelux countries) would distribute it. Instead, the Entertainment division of "Princess Household Appliances" was involved in this video project. Back In Luv' features the girls' performances on Dutch TV in the late 1970s and their appearance on the German TV programme Musikladen, where they performed their debut single "My Man" in August 1977. At a time when MTV did not exist, Luv' took advantage of television to become a household name in the Benelux, German-speaking countries, and Denmark. In addition to popular shows like AVRO's TopPop and Showbizzquizz, the main highlights of the DVD are three TV specials: All You Need Is Luv, broadcast on TROS in November 1978. The aim of this program was to promote the With Luv' album. Lots of LUV, aired on TROS in July 1979 and produced by media tycoon John de Mol. It was named after Luv's second studio LP. This Is True LUV, also produced by De Mol, who had an affair with Luv' singer Marga Scheide, and broadcast in early 1980 on NCRV to promote the trio's third opus. Track listing My Man - 3:04 Dream, Dream - 3:04 U.O.Me - 2:55 You're the Greatest Lover - 2:50 Trojan Horse - 3:24 Special: All You Need Is Luv You're the Greatest Lover - 2:50 Who Do You Wanna Be - 3:43 Trojan Horse - 3:24 Louis Je T'adore - 3:40 U.O.Me - 2:55 Oh, Get Ready - 3:16 Casanova - 3:48 Eeny Meeny Miny Moe - 2:50 Special: Lots Of Luv Medley: U.O.Me/Eres Mi Major Amante/Trojan Horse D.J. - 3:20 Casanova - 3:48 Marcellino - 3:14 Shoes Off (Boots On) - 3:07 Eeny Meeny Miny Moe - 2:46 I.M.U.R. - 2:46 Ooh, Yes I Do - 2:58 Ann-Maria - 4:38 One More Little Kissy - 3:49 Special: True Luv Wine, Women And Song - 3:45 Boys Goodnight - 2:40 Ooh, Yes I Do - 2:58 Rhythm 'n Shoes - 3:07 Ann-Maria - 4:38 My Guy - 3:49 Getaway - 3:03 Let There Be Love - 2:39 References External links Review about Back In Luv' 2006 video albums Luv' video albums
1762176
https://en.wikipedia.org/wiki/Applicant%20tracking%20system
Applicant tracking system
An applicant tracking system (ATS) is a software application that enables the electronic handling of recruitment and hiring needs. An ATS can be implemented or accessed online at enterprise- or small-business levels, depending on the needs of the organization; free and open-source ATS software is also available. An ATS is very similar to a customer relationship management (CRM) system, but is designed for recruitment tracking. In many cases an ATS filters applications automatically based on given criteria such as keywords, skills, former employers, years of experience and schools attended. This has led many applicants to adopt résumé optimization techniques, similar to those used in search engine optimization, when creating and formatting their résumés. Principle A dedicated ATS is not uncommon for recruitment-specific needs. On the enterprise level it may be offered as a module or functional addition to a human resources suite or human resource information system (HRIS). The ATS is expanding into small and medium enterprises through open-source or software-as-a-service (SaaS) offerings. The principal function of an ATS is to provide a central location and database for a company's recruitment efforts. ATSs are built to better assist the management of résumés and applicant information. Data is either collected from internal applications via the ATS front end located on the company website, or extracted from applicants on job boards. Most job and résumé boards (Reed Online, LinkedIn.com, Monster.com, Hotjobs, CareerBuilder, Indeed.com) have partnerships with ATS software providers to provide parsing support and easy data migration from one system to another. Newer applicant tracking systems (often described as next-generation) are platforms as a service, in which the core software exposes integration points that allow providers of other recruiting technology to plug in seamlessly. These next-generation ATS solutions allow jobs to be posted wherever the candidate is, not just on job boards, an ability referred to as omnichannel talent acquisition. Recent changes include the use of artificial intelligence (AI) tools and natural language processing to provide guided semantic search capabilities through cloud-based platforms, allowing companies to score and sort résumés for better alignment with job requirements and descriptions. With the advent of the ATS, applicants often use résumé optimization techniques and online tools to increase their chances of landing an interview call. References Business software Human resource management E-recruitment
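To make the automatic keyword filtering described above concrete, here is a small, hedged sketch of how an ATS-style screen might score a résumé against a list of required keywords. It is illustrative only: real systems parse structured fields and use many more signals, and the keywords, threshold and function names below are invented.

# Toy keyword screen in the spirit of the filtering described above.
# Illustrative only; the keywords, threshold and names are hypothetical.
import re

def score_resume(resume_text: str, required_keywords: list[str]) -> float:
    """Return the fraction of required keywords found in the résumé text."""
    words = set(re.findall(r"[a-z0-9+#]+", resume_text.lower()))
    hits = sum(1 for kw in required_keywords if kw.lower() in words)
    return hits / len(required_keywords) if required_keywords else 0.0

def passes_screen(resume_text: str, required_keywords: list[str],
                  threshold: float = 0.5) -> bool:
    """A candidate passes if at least `threshold` of the keywords appear."""
    return score_resume(resume_text, required_keywords) >= threshold

# Example with made-up data: three of the four keywords are present.
resume = "Five years of Python and SQL experience; built recruiting dashboards."
print(passes_screen(resume, ["python", "sql", "tableau", "recruiting"]))  # True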
54516386
https://en.wikipedia.org/wiki/2017%20Cotton%20Bowl%20Classic%20%28December%29
2017 Cotton Bowl Classic (December)
The 2017 Cotton Bowl Classic was a college football bowl game played on December 29, 2017, at AT&T Stadium in Arlington, Texas. It featured the Ohio State Buckeyes from the Big Ten Conference and the USC Trojans from the Pac-12 Conference. The 82nd Cotton Bowl Classic was one of the 2017–18 bowl games that concluded the 2017 FBS football season. The game was broadcast on ESPN, ESPN Deportes, ESPN Radio and XM Satellite Radio. It was sponsored by the Goodyear Tire and Rubber Company and was officially known as the Goodyear Cotton Bowl Classic. Teams The Cotton Bowl was played between the Pac-12 Conference champion USC Trojans and the Big Ten Conference champion Ohio State Buckeyes. These teams were chosen by the CFP Selection Committee. Traditionally, the Pac-12 and Big Ten champions meet in the Rose Bowl; however, in the 2017 season that game was used as one of the two College Football Playoff (CFP) semifinal games. That year's Rose Bowl instead featured SEC champion Georgia and Big 12 champion Oklahoma, a matchup which traditionally occurs in the Sugar Bowl, which in turn served as the other CFP semifinal. Prior to kickoff, the Trojans led the all-time series 13–9–1; the most recent game was on September 12, 2009, when the Trojans defeated the Buckeyes 18–15, scoring with 1:05 remaining to take the lead for good. USC had won the previous seven games in the series, with Ohio State's last win coming in the 1974 Rose Bowl. This was the eighth time that the schools had met in a bowl game. The previous seven bowl meetings were all in the Rose Bowl, most recently in 1985, a game the Trojans won 20–17. The Buckeyes defeated the Trojans 24–7. With the victory, the all-time series between the schools stood at 13–10–1 in favor of USC. Game summary Scoring summary Statistics References Cotton Bowl Classic Cotton Bowl Classic Ohio State Buckeyes football bowl games USC Trojans football bowl games 2017 in sports in Texas 21st century in Arlington, Texas Cotton Bowl Classic
1285118
https://en.wikipedia.org/wiki/Empeg%20Car
Empeg Car
The Empeg Car was the first in-dash MP3 player to be developed. In 1998 a British company called Empeg was formed to build the unit, which shipped the following year. The Empeg Car was a Linux-based unit that transferred MP3 tracks from the user's computer to the player via USB, Ethernet, or a serial port connection. Prices started at US$1,100 for the 4 GB version and ranged up to $2,400 for a 28 GB unit that used two laptop drives (considered a very large capacity at the time). The Empeg Car garnered quite a following and became beloved among the small group of users who bought one. Initial production was fewer than 400 units, after which the player was redesigned to make production cheaper and easier. SONICblue (the former S3, which had already acquired both the Rio line of MP3 portables by purchasing Diamond and the Rave-MP line by purchasing Sensory Science) took notice of the unit and decided the Empeg Car would fit into its plans. On November 1, 2000, Empeg was acquired by SONICblue Incorporated and the unit was renamed the Rio Car. The original British development team was rolled into the company and eventually took responsibility for all audio software development at SONICblue. SONICblue, however, did not have a clear plan for promoting the Rio Car; Rio did little to market it and soon left it to languish. Despite their owners' strong devotion to the product, sales of new units were modest, and on September 24, 2001, SONICblue discontinued the line. Fewer than 6,000 players were ever produced. Most of the resources behind the Empeg Car, including people, code, and design work, went into the following products: Rio Receiver - a network-enabled client for streaming music from a computer to anywhere in the home Rio Central - a home stereo component that ripped CDs to MP3s and stored them on an internal hard drive; supported Rio Receivers as clients Rio Karma - a portable 20 GB music player SONICblue went bankrupt in 2003, but the Rio division was purchased by D&M Holdings. The former Empeg employees still with Rio went on to produce the Rio Karma, the Rio Carbon, and several later flash memory players. In 2005, D&M sold all of their audio player technology to SigmaTel, including all of the Empeg technology and all of the patents, source code, and designs related to the Rio audio players. The former Empeg employees, as well as other Rio technical employees, went on to work for SigmaTel at its Austin, Texas offices. References External links Diamond Rio Buys Car MP3 Player Company - October 2000 MP3 Newswire article empeg home page unofficial empeg discussion boards Riocar.org - Home of the empeg FAQ, and other empeg information MP3Mobile - Hugo Fiennes's first in car MP3 player Hugo Fiennes home page - (rather out of date) Three Years in Embedded Linux: A Talk with Hugo Fiennes and Marc Merlin - About the empeg and Rio Car (from 2001) From Warwick University to Apple Inc California - Article about Hugo Fiennes's Career (from 2009) Digital audio players Linux-based devices
1323027
https://en.wikipedia.org/wiki/1984%20in%20video%20games
1984 in video games
1984 saw many sequels and prequels along with new titles such as 1942, Boulder Dash, Cobra Command, Jet Set Willy, Karate Champ, Kung-Fu Master, Tetris, Yie Ar Kung-Fu and Punch-Out!! The year's highest-grossing arcade games were Pole Position in the United States, for the second year in a row, and Track & Field in the United Kingdom. The year's best-selling home system was Nintendo's Family Computer (Famicom), which was only sold in Japan at the time. Financial performance In the United States, home video game sales fall. Highest-grossing arcade games Japan In Japan, the following titles were the top-grossing arcade video games of each month on the Game Machine charts in 1984. United Kingdom and United States The following titles were the highest-grossing arcade games of 1984 in the United Kingdom and United States. Best-selling home systems Best-selling home video games in the United Kingdom In the United Kingdom, the following titles were the top ten best-selling home computer games of 1984, according to N.O.P. Market Research. Major awards The fifth Arcade Awards are held, for games released during 1982–1983. Pole Position wins Coin-Op Game of the Year, Ms. Pac-Man and Lady Bug win console Videogames of the Year, Lode Runner wins Computer Game of the Year, and Q*bert wins dedicated Stand-Alone Game of the Year. In the second Golden Joystick Awards (held in 1985) for best home computer games, Knight Lore takes Game of the Year. Business New companies: Accolade, Elite Systems, Gremlin Graphics, Kemco, New World Computing, Novagen, Ocean, Psygnosis, Sculptured Software Defunct companies: Astrocade, Human Engineered Software, Imagine, Sirius, Starpath. Hasbro, Inc. acquires Milton Bradley Company. Management Sciences America acquires Edu-Ware Services. Broderbund acquires 8-bit gaming competitor Synapse Software. Atari shuts down the Atari Program Exchange, which sold notable "user written" games such as Eastern Front (1941) and Dandy. Warner Communications Inc. sells Atari arcade video game, home video game, and home computer intellectual properties including the Atari logo and trademark, inventories of Atari home video game and home computer hardware and software, as well as certain Atari international subsidiaries to Tramel Technology. Warner Communications effectively closes its domestic home video game and computer divisions but retains the arcade games division and renames Atari Inc. to Atari Games, with permission from Tramel Technology. Tramel Technology is renamed Atari Corporation. Sega and CSK merge to form Sega Enterprises Ltd. Mattel sells video game assets including M Network and Intellivision hardware and software intellectual property to a group led by a former Mattel Electronics executive that becomes INTV Corporation. Mattel Electronics closes its games development offices in California and Taiwan. The games development office in France is sold to investors and renamed Nice Ideas. Births May May 17 – Alejandro Edda: Mexican-American actor Notable releases Games Arcade February 17 – Nintendo launches the initial version of the boxing game Punch-Out!!. April – Namco releases Gaplus, the sequel to Galaga. July – Data East releases Technōs Japan's Karate Champ, laying the foundations for the one-on-one fighting game genre. July 20 – Namco releases action role-playing game Tower of Druaga. October – Namco releases Pac-Land and lays the foundations for horizontally scrolling platform games. November 1 – Namco releases Grobda, a spin-off from Xevious. 
December – Namco releases Super Xevious and Dragon Buster, the latter of which is one of the first games to feature a life bar. December – Capcom releases 1942. December – Irem releases Kung-Fu Master and lays the foundations for the beat 'em up genre. December – Atari Games releases Marble Madness, their first game written in the C programming language and their first to use a 68000-family microprocessor. Bally Midway releases Demolition Derby, which features a damage bar and the ability to join a game in progress. Computer January – Bullet-Proof Software releases The Black Onyx on the PC-8801, which helps popularize turn-based role-playing games in Japan. June – Ultimate Play the Game releases Sabre Wulf on the ZX Spectrum. June 6 – Alexei Pajitnov creates Tetris for the Electronika 60 in the Soviet Union. September 20 – Elite, an influential wireframe 3D space trading game offering a then-unique open-ended design, is published by Acornsoft. October – Nihon Falcom releases action role-playing game Dragon Slayer. October – Automata UK releases Deus Ex Machina on the ZX Spectrum. December – T&E Soft releases Hydlide, an early action role-playing game that features a health regeneration mechanic and anticipates elements of The Legend of Zelda and Ys series. December 7 – Knight Lore by Ultimate Play the Game is released for the ZX Spectrum (and later ported to the BBC Micro, Amstrad CPC, MSX, and Famicom Disk System). It is the third title in the Sabreman series, but the first to use the isometric Filmation engine. Broderbund releases The Ancient Art of War by Dave and Barry Murry. It is a real-time tactics game and a precursor to the real-time strategy genre. Broderbund releases Karateka for the Apple II. The Lords of Midnight, a strategy adventure game by Mike Singleton, is released. Infocom releases The Hitchhiker's Guide to the Galaxy, Sorcerer, Cutthroats, and Seastalker. First Star releases Boulder Dash, which inspired enough clones to create the rocks-and-diamonds genre. Epyx releases Impossible Mission for the Commodore 64. Electronic Arts releases Adventure Construction Set. Sierra On-Line releases King's Quest I for the PCjr. Synapse releases the Atari 8-bit game Dimension X, more than nine months after running magazine ads showing features that were not present in the final game. Software Projects releases the seminal platformer Jet Set Willy on the ZX Spectrum. First Star Software releases Spy vs. Spy for the Commodore 64. Console March 30 – Activision releases H.E.R.O. for the Atari 2600. June 4 – Nintendo releases a conversion of its own Donkey Kong 3 for the Famicom. October 5 – Nintendo releases Devil World on the Famicom in Japan. December 17 – Nintendo releases Ice Climber and Balloon Fight for the Famicom. Activision releases Pitfall II: Lost Caverns, one of the last major titles for the Atari 2600. Each cartridge contains a custom chip allowing improved visuals and 4-voice sound. Hardware January 24 – Apple Computer announces the original 128K, floppy-disk-only Macintosh. March – IBM releases the IBM PCjr in an attempt to enter the home computer market. It has improved sound and graphics over the original, business-oriented IBM PC, but is a commercial failure. Atari, Inc. announces the Atari 7800, a next-generation console that is compatible with Atari 2600 cartridges but capable of greatly improved visuals. It is shelved until 1986 due to the sale of the company and legal issues. Discontinued systems: Atari 5200, Magnavox Odyssey², Vectrex References Video games Video games by year