Amazo
Amazo is a fictional character appearing in American comic books published by DC Comics. The character was created by Gardner Fox and Mike Sekowsky and first appeared in The Brave and the Bold #30 (June 1960) as an adversary of the Justice League of America. Since debuting during the Silver Age of Comic Books, the character has appeared in comic books and other DC Comics-related products, including animated television series, trading cards and video games. Traditionally, Amazo is an android created by the villainous scientist Professor Ivo and gifted with technology that allows him to mimic the abilities and powers of superheroes he fights (usually the Justice League), as well as make copies of their weapons (though these copies are less powerful than the originals). His default powers are often those of Flash, Aquaman, Martian Manhunter, Wonder Woman, and Green Lantern (the Justice League founding members that he first fought). He is similar to, and often compared with, the later-created Marvel android villain Super-Adaptoid.
In the New 52 timeline of DC Comics, Amazo begins as the A-Maze Operating System and then becomes an android capable of duplicating superhuman powers. Later on, a sentient Amazo Virus infects research scientist Armen Ikarus and takes over his mind. With Ikarus as a host, the Amazo Virus infects other people, granting them super-powers and controlling their minds before they die within 24 hours.
In live-action media, multiple Amazo robots appeared in the Arrowverse crossover event Elseworlds.
Publication history
Amazo first appeared in a one-off story in The Brave and the Bold #30 (June 1960) and returned as an opponent of the Justice League of America in Justice League of America #27 (May 1964) and #112 (August 1974), plus a brief appearance in #65 when another antagonist weaponized Amazo and other items from the JLA trophy room. Other significant appearances include an encounter with a depowered Superman in Action Comics #480-483 (February – May 1978) and stories in Justice League of America #191 (June 1981) and #241-243 (August – October 1985). Amazo also battles a fully powered Superman in Superman Special #3 (1985).
A different Amazo model featured in Justice League Quarterly #12 (Fall 1993) and battled the hero Aztek in Aztek: The Ultimate Man #10 (May 1997) before being destroyed in Resurrection Man #2 (June 1997). An advanced version debuted in a one-off story in JLA #27 (March 1999), while another appeared in the limited series Hourman, specifically issues #1, #5-7, #17, and #19-21 (April 1999 – December 2000).
Amazo's origin is revealed in Secret Origins of Super-Villains 80-Page Giant #1 (December 1999). Another version is discovered to be part of a weapons shipment in Batman #636-637 (March – April 2005) and during the Villains United storyline in Firestorm vol. 2 #14-16 (August – October 2005), Villains United #5-6 (November – December 2005), and the Villains United: Infinite Crisis Special (June 2006).
Amazo's consciousness returned in Justice League of America #1-5 (October 2006 – March 2007), planted in the body of fellow android the Red Tornado. Ivo also created Amazo's "offspring" in JLA Classified #37-41 (June – October 2007). A story continuing the first Red Tornado storyline featured in Justice League of America vol. 2 #21-23 (July – September 2008).
Writer Mike Conroy noted: "Amazo was a persistent thorn in the JLA's side... although his programming and own sentience have displayed no ambition towards world conquest... His very existence is a hazard to all of humanity".
Fictional character biography
The android Amazo was created by Professor Anthony Ivo, a scientist with expertise in multiple fields who is obsessed with immortality. The original Justice League of America (Green Lantern, Flash, Aquaman, Wonder Woman, and the Martian Manhunter) discover their powers are being drained and somehow used by a thief who is after animals known to have long lifespans. While attempting to discover the perpetrator, the League is confronted and defeated by Amazo, who has duplicated their powers thanks to "absorption cell" technology created by Ivo. Amazo brings the team to Ivo, who reveals he has created a means of extending his life span courtesy of the data obtained from studying the creatures Amazo captured. The League then defeats Ivo and the android. Ivo's immortality results in his body becoming monstrous in form, and the android is stored in the League trophy room.
The android is temporarily re-activated twice to assist the League in regaining lost abilities. In these and subsequent stories, the android's duplication of powers does not cause others to feel their powers being drained. When red sun radiation reaches Earth, Amazo reactivates and engages in an extensive battle with Superman involving time-travel, only to be defeated before it can murder Ivo and the League. Later, the super-villain called the Key, having been shrunken in size, re-activates the android in a failed bid to restore his original height. The League defeats Amazo, and new team member Zatanna then restores the Key to his former state.
After the Justice League of America disbands and reforms as a small team of mostly new heroes based in Detroit, Ivo reactivates Amazo to attack this less experienced, "weaker" League. The android defeats all the new members but is finally stopped by Justice League founding members the Martian Manhunter and Aquaman.
A different Amazo model is later activated and battles the superhero team the Conglomerate. This updated Amazo searches for Ivo and encounters the hero Aztek, who succeeds in reasoning with the android rather than overpowering it. This Amazo model also briefly battles the Resurrection Man before finally being destroyed. Before his destruction, the second model of Amazo is summoned into the future by the android hero Hourman, who wishes to meet his "ancestor". This Amazo copies Hourman's time-warping "Worlogog" artifact, becoming "Timazo" in the process. Timazo wreaks havoc with his new ability to manipulate time, but is defeated by Hourman and returned to his place in the past so his history can run its course. Another, similar model of Amazo later has several more encounters with Hourman.
Another model of Amazo is activated that can wield multiple powers at once and is programmed to automatically upgrade its abilities to match those of all active Justice League members. Initially not understanding this upgrade, the Justice League calls in reserve members to help defeat Amazo, which only results in its power increasing. On the Atom's advice, Superman (active team chairman at the time) announces the League is officially disbanded. Programmed only to mimic the powers of active members, this Amazo is suddenly depowered and easily deactivated. Years later, Batman and Nightwing discover a partially built Amazo android in a weapons shipment and destroy it.
Another Amazo participates in a massive attack by a group of villains on the city of Metropolis, but is destroyed by Black Adam.
It is eventually revealed that after perfecting Amazo's absorption cells, Ivo combined this technology with human ova and DNA to create a "son" of Amazo who grows up as Frank Halloran, unaware of his heritage. Years later, Frank is a philosophy student dating a young woman named Sara when his powers are awakened prematurely. Rather than emulate his villainous "father", Frank hopes to be a hero called "Kid Amazo". Slowly becoming mentally unstable, Kid Amazo discovers Sara is Ivo's daughter and was instructed to monitor Frank by posing as a girlfriend. Kid Amazo goes on a rampage. Batman deduces Kid Amazo has not only the powers of the Leaguers but also their contrasting personality traits. This is later used to cause greater internal instability, destroying Kid Amazo.
Later, Ivo downloads Amazo's programming into the body of the Red Tornado, the android villain-turned-hero created by Professor T.O. Morrow, another enemy of the Justice League. The League battles an army of Red Tornado androids before discovering that the villain Solomon Grundy intends to transfer his mind into the original android Tornado's body. Although this plan is defeated, the Amazo programming asserts itself and attacks the League until member Vixen destroys it. A new body is created to house Red Tornado's consciousness, but the Amazo programming inhabits it instead, battling the Justice League before being defeated by teleportation into the gravity well of the red star Antares.
The New 52
As part of The New 52, the new origin story of the Justice League references the "A-Maze Operating System" and "B-Maze Operating System" designed by Anthony Ivo. The League later battles an android equipped with a corrupt version of this operating system.
During the Forever Evil storyline, the New 52 Amazo appears as a member of the Secret Society of Super Villains.
During the "Amazo Virus" storyline, a biotech pathogen is created based on the android's absorption cells. The first person to be infected by this virus is former Lexcorp research scientist Armen Ikarus, whose mind becomes corrupted in the process and replaced by the virus's will. Now possessing power and driven to infect others, Ikarus's personality is replaced by the new Amazo. The Ikarus Amazo infects others, granting them super-powers based on desires and personality traits, but killing them within 24 hours. The Ikarus Amazo, able to enhance infected humans and control them through a "hive-mind" connection, is defeated by the Justice League. Young Reggie Meyer and his family are also affected. Influenced by technology from the original Amazo android, Reggie becomes Kid-Amazo.
DC Rebirth
In the storyline "Outbreak", Amazo is one of the villains recruited by an A.I. named Genie, created by the daughter of computer technician James Palmer. His technology cells are later hacked and he briefly joins the Justice League's side.
Amazo later appeared as a member of the Cabal.
Powers and abilities
Amazo (Android)
Amazo is an advanced android built using Professor Anthony Ivo's "absorption cell" technology. This technology (later indicated to involve nanites) allows Amazo's cells to mimic the physical structure and energy output of organic beings he encounters, empowering him to mimic physical and energy-based abilities (such as the strength of Superman, the speed of the Flash, or the fighting skill of Batman). Amazo's internal energy source provides power for these abilities, so it does not matter what fuels the superhuman he is mimicking (for example, Wonder Woman's speed derives from magical empowerment while Superman's is a result of Kryptonian cells fueled by solar radiation). After his first story, many Amazo models retain the powers of the first five founding Leaguers he met as a default power set, absorbing new abilities from other Leaguers they encounter. The models are usually only able to access a single target's unique attributes at a time. Some models store the powers of many Justice League members, not just founding members, in an internal database and can summon them at will, but again only one person's powers at a time. Some later versions are upgraded to mimic and use multiple powers at once from any superhuman they come into contact with, or from anyone they identify as a Justice League member. Several Amazo models can also create duplicates of weapons, such as Green Lantern's power ring, Hawkgirl's metal mace, or Wonder Woman's lasso. These copied weapons are more limited in power than the originals.
At times, Amazo is a simple-minded android, capable of basic strategies and possessing average intelligence but with narrow focus. Some models of Amazo have demonstrated advanced analysis and tactics in battle, helping them maneuver to apply their stolen powers effectively and defeat opponents. In most incarnations, Amazo simultaneously takes on a person's weaknesses when mimicking their powers (for example, becoming vulnerable to kryptonite radiation while using Superman's abilities). Multiple stories have also indicated that his android body, designed to emulate the form and function of a human being, possesses the same pressure points and stress spots as the average human body.
Amazo (Ikarus)
Armen Ikarus is a former scientist and researcher at Lexcorp who is the first to be exposed to the Amazo Virus outbreak. This version of Amazo is driven to infect others with the Amazo Virus, causing them to develop psychoactive superhuman abilities based on their inherent desires and characteristics before dying within 24 hours. The Ikarus Amazo can emulate technology and super-powers he encounters by crudely modifying his genetic and biological structure. He can also remotely augment the physical abilities of anyone infected with the Amazo Virus and influence their behavior by establishing a mental "hive-mind" connection. Initially, Ikarus's body seemed to degenerate from the strain of the virus altering his biology, but his form later stabilized and evolved to resemble the classic Amazo android.
Other versions
Captain Carrot and His Amazing Zoo Crew!
Captain Carrot and His Amazing Zoo Crew! #14-15 featured the parallel Earth of "Earth-C-Minus", a world populated by talking animal superheroes that paralleled the mainstream DC Universe. On Earth-C-Minus a counterpart of the character Amazo existed called "Amazoo": a robotic composite of a dozen different animal body parts and abilities.
DC Super Friends
Amazo appears in the DC Super Friends comics.
Injustice 2
Amazo appears in the Injustice 2 comics based on the Injustice reality. After the fall of Chancellor Superman's Regime, Ivo sold off Amazo to a terrorist initiative under the control of Ra's al Ghul and Solovar.
In other media
Television
Live-action
Amazo is featured in the TV shows taking place in the Arrowverse:
In Arrow, Dr. Ivo inhabited a ship called the Amazo, where he conducted experiments with the Mirakuru serum and tortured his captives, including Anatoly Knyazev. The ship was eventually taken over by Slade Wilson (who was going insane due to the Mirakuru serum) after he killed its former captain, "The Butcher", and cut off Ivo's hand. Following Slade's defeat by Oliver and the deaths of Ivo and his crew, the ship was wrecked on the shores of Lian Yu.
The android Amazo, or A.M.A.Z.O. (Anti-Metahuman Adaptive Zootomic Organism), debuts in the Elseworlds crossover. Built by Ivo Laboratories for A.R.G.U.S., it is capable of replicating all natural skills and special abilities of an individual such as a metahuman, extraterrestrial, or ordinary human. In "Elseworlds, Part 1", A.M.A.Z.O. is awakened when Oliver (who switched lives with Barry under mysterious circumstances and became the Flash) accidentally throws a stray lightning bolt at it. In a climactic battle, it is defeated by a computer virus made by Cisco Ramon with the help of Superman, Supergirl, Flash, and Green Arrow (who, due to the switch, is Barry Allen). A.M.A.Z.O. is brought back by John Deegan as he rewrites reality during the crossover's final battle in "Elseworlds, Part 3", only to be confronted and defeated by Brainiac 5 of Earth-38.
Animated
Amazo made multiple appearances in the DCAU, voiced by Robert Picardo:
Amazo appears in the Justice League two-part episode "Tabula Rasa". Here he is depicted as a grey, blank-looking humanoid, referred to only as "the Android" (although a schematic showing his construction is labeled with the acronym "A.M.A.Z.O."). Unlike other incarnations, this Amazo is able to access any and all replicated abilities simultaneously. While he initially takes on a target's weaknesses and character flaws as well (e.g. Flash's unfocused, flirtatious nature or Superman's vulnerability to kryptonite), he can eventually adapt away these flaws. Amazo was found by Luthor, who was seeking Dr. Ivo after a battle with the League destroyed part of his battle suit. Upon discovering Ivo's death and Amazo's abilities, Luthor uses the android to copy the League's powers and steal parts needed to fix his suit. However, after absorbing Martian Manhunter's abilities, Amazo learns the truth, transforms into a gold form, and sets off for space, intent on finding meaning in his existence.
Amazo appears in the Justice League Unlimited episodes "The Return" and "Wake the Dead", voiced once again by Robert Picardo. Amazo had obtained godhood, having demonstrated the ability to teleport Oa away, giving the appearance that he had destroyed it. His intent was to kill Luthor for using him, going through every member of the Justice League in the process. Ultimately, he gives up his quest for revenge due to the words of Doctor Fate, returns Oa, and is given sanctuary in the Tower of Fate to find his purpose. Amazo returns in "Wake the Dead", trying to stop the newly empowered and resurrected Solomon Grundy. However, not only do Amazo's attacks do nothing to harm the zombie, but the already dangerous monster is able to drain some of Amazo's immense power. Amazo realizes his presence puts the League at risk and teleports away. Amazo is briefly mentioned during part 3 of the season 2 finale, "Panic in the Sky". Luthor reveals he set up the League so Project Cadmus could destroy them; this distraction would allow Lex to transfer his mind into an Amazo-like body, which he crafted from stolen Cadmus tech. His plans are foiled, and Amanda Waller destroys the body.
Amazo appears in the Young Justice episode "Schooled", voiced by Peter MacNicol. After the android replicates the powers of Black Canary, Captain Atom, Flash, Martian Manhunter, Red Tornado, and Superman, the Justice League barely manages to defeat and dismantle it. While the parts are being transported to a safe location, Professor Ivo intercepts the trucks and reassembles his android. The Justice League's sidekicks, now called the Team, pursue Ivo and Amazo. Thanks to timely aid from Artemis, the Team is able to exploit Amazo's inability to use more than one target's replicated powers at a time and cause it to explode.
Amazo appears in the Batman: The Brave and the Bold episode "Triumvirate of Terror", voiced by Roger Rose.
Amazo was mentioned in the DC Nation short Enter Extremo.
Amazo appears in the Justice League Action episode "Boo-ray for Bizarro", voiced by Thomas Lennon. In addition to replicating a target's skills, powers, and personal tools, he can also replicate an individual's mental prowess, as he obtains Batman's keen intuition. He arrives at the Justice League Watchtower intending to replicate the powers of every member of the Justice League and capture them before taking over the world. Amazo makes a reference to Professor Ivo, who is dead. After capturing Batman, Flash, Green Lantern, Martian Manhunter, Superman, and Wonder Woman, he sends out a distress signal for all other League members to return to base. Before his trap can be sprung, Bizarro arrives. Wanting to join the League, Bizarro decides to help stop Amazo by bringing the one he claims is the "Smartest Man in the Galaxy", who turns out to be Space Cabbie. As Amazo fights Bizarro while replicating his powers, the clone's backwards mentality conflicts with Amazo's processing, ultimately overloading him and leaving Amazo catatonic.
Film
Amazo appears in Batman: Under the Red Hood, voiced by Fred Tatasciore. This version is described by Batman as an android "with the ability to absorb the powers of super-humans". However, Amazo does not display any powers other than super-strength, flight and heat vision, implying that it has at least absorbed Superman's abilities. It has the same weak points as a human being, as its head is easily destroyed by C-4 charges.
Amazo appears in the animated film Injustice. This version was built by Ra's al Ghul, supposedly to help Superman enforce global peace; in reality, its main purpose was to kill Superman, whose powers it is able to replicate. After Amazo becomes extremely violent in its efforts to maintain order, Superman fights the android with the help of his allies, as well as Batman and his resistance. After killing Hawkman and Cyborg, Amazo is destroyed by the combined efforts of the heroes, who manage to restrain it long enough for Plastic Man to blow it up from the inside.
Video games
Amazo appears in Justice League: Chronicles.
Amazo is mentioned as "The Amazo Project" in Lego DC Super-Villains, which is the project that gives the player's custom character (nicknamed "the Rookie") his/her powers.
See also
Kid Amazo
Qatar diplomatic crisis
The Qatar diplomatic crisis was a diplomatic incident that began on 5 June 2017 when Saudi Arabia, the United Arab Emirates, Bahrain and Egypt severed diplomatic relations with Qatar and banned Qatar-registered planes and ships from utilising their airspace and sea routes, along with Saudi Arabia blocking Qatar's only land crossing. They were later joined by Jordan and were supported by the Maldives, Mauritania, Senegal, Djibouti, the Comoros, and the Tobruk-based government in Libya.
The Saudi-led coalition cited Qatar's alleged support for terrorism as the main reason for their actions, alleging that Qatar had violated a 2014 agreement with the members of the Gulf Cooperation Council (GCC), of which Qatar is a member. Saudi Arabia and other countries have criticized Al Jazeera and Qatar's relations with Iran. Qatar acknowledged that it had provided assistance to some Islamist groups (such as the Muslim Brotherhood), but denied aiding militant groups linked to al-Qaeda or the Islamic State of Iraq and the Levant (ISIL). Qatar also claimed that it had assisted the United States in the War on Terror and the ongoing military intervention against ISIL.
Initial supply disruptions were minimised by additional imports from Iran and Turkey, and Qatar did not agree to any of the Saudi-led coalition's demands. The demands included reducing diplomatic relations with Iran, stopping military coordination with Turkey, and closing Al-Jazeera.
On 27 July 2017, Qatari foreign minister Mohammed bin Abdulrahman Al Thani told reporters that Egypt, Saudi Arabia, United Arab Emirates and Bahrain were showing "stubbornness" to Qatar and had not taken any steps to solve the crisis. Al Thani added that the Security Council, the General Assembly and "all the United Nations mechanisms" could play a role in resolving the situation. On 24 August 2017, Qatar announced that it would restore full diplomatic relations with Iran.
On 4 January 2021, Qatar and Saudi Arabia agreed to a resolution of the crisis brokered by Kuwait and the United States, under which Saudi Arabia would reopen its border with Qatar and begin the process of reconciliation. An agreement and final communiqué signed on 5 January 2021, following a GCC summit at Al-'Ula, marked the resolution of the crisis, with precise details to be released later. On 7 January 2021, Al Jazeera provided an "English translation of closing statement of the summit in full".
Foreign Policy viewed the crisis as a failure for Saudi Arabia, the UAE, Bahrain, and Egypt, as Qatar developed closer ties to Iran and Turkey and became economically and militarily stronger and more autonomous.
Background
Since he took power in 1995, Hamad bin Khalifa al-Thani believed Qatar could find security only by transforming itself from a Saudi appendage to a rival of Saudi Arabia. Saudi Arabia withdrew its ambassador to Doha from 2002 to 2008 to try to pressure Qatar to curb its individualistic tendencies. This approach broadly failed. The Arab Spring left a power vacuum which both Saudi Arabia and Qatar sought to fill, with Qatar being supportive of the revolutionary wave and Saudi Arabia opposing it; since both states are allies of the United States, they avoid direct conflict with one another. Qatar has had differences with other Arab governments on a number of issues: it broadcasts Al Jazeera; it is accused of maintaining good relations with Iran; and it has supported the Muslim Brotherhood in the past. Qatar has been accused of sponsoring terrorism. Some countries have faulted Qatar for funding rebel groups in Syria, including al-Qaeda’s affiliate in Syria, the al-Nusra Front, although Saudi Arabia has done the same. Qatar has allowed the Afghan Taliban to set up a political office inside the country. Qatar is a close ally of the United States, hosting the largest American base in the Middle East, Al Udeid Air Base.
In March 2014, Saudi Arabia, the United Arab Emirates, Bahrain and Egypt withdrew their ambassadors from Qatar. This severing of relations was the first of its kind since the establishment of the Gulf Cooperation Council (GCC). The crisis affected the GCC negatively at first – raising questions among member states, revealing shifts in their political agendas, and changing the balance of power in the region to some extent.
The exact reasons for the break in diplomatic relations in 2017 are unclear, but contemporary news coverage primarily attributed this to several events in April and May 2017:
April 2017 hostage negotiations
In April 2017, Qatar was involved in a deal with both Sunni and Shi'ite militants in Iraq and Syria. The deal had two goals. The immediate goal was to secure the return of 26 Qatari hostages (including Qatari royals) who had been kidnapped by Shi'ite militants while falcon hunting in southern Iraq and kept in captivity for more than 16 months. The second goal was to get both Sunni and Shi'ite militants in Syria to allow humanitarian aid to pass through and allow the safe evacuation of civilians. According to the New York Times, this deal allowed the evacuation of at least 2,000 civilians from the Syrian village of Madaya alone. What outraged Saudi Arabia and the UAE was the amount of money Qatar paid to secure the deal. According to the Financial Times, Qatar paid $700 million to multiple Iranian-backed Shi'a militias in Iraq, $120–140 million to Tahrir al-Sham, and $80 million to Ahrar al-Sham.
Riyadh Summit 2017
As part of the Riyadh Summit in late May 2017, many world leaders, including US President Donald Trump visited the region. Trump gave strong support for Saudi Arabia's efforts in fighting against states and groups allied with Iran and the Muslim Brotherhood, leading to an arms deal between the countries. The Business Insider reported that "Elliott Broidy a top fundraiser for President Donald Trump; and George Nader, Broidy's business partner ... pushed for anti-Qatar policies at the highest levels of government, and expected large consulting contracts from Saudi Arabia and the UAE." Trump's support may have induced other Sunni states to fall in line with Saudi Arabia to take a stance against Qatar. Trump's public support for Saudi Arabia, according to The New York Times, emboldened the kingdom and sent a chill through other Gulf states, including Oman and Kuwait, that fear that any country that defies the Saudis or the United Arab Emirates could face ostracism as Qatar had. The Saudi-led move was at once an opportunity for the GCC partners and Egypt to punish their adversaries in Doha, please their allies in Washington, and remove attention from their own shortcomings and challenges.
Hacking operations against Qatar
According to Qatar-based Al Jazeera and the American FBI, the Qatar News Agency website and other government media platforms were hacked in May 2017, where hackers posted fake remarks on the official Qatar News Agency attributed to the Emir of Qatar, Sheikh Tamim bin Hamad Al Thani, that expressed support for Iran, Hamas, Hezbollah, and Israel. The emir was quoted as saying: "Iran represents a regional and Islamic power that cannot be ignored and it is unwise to face up against it. It is a big power in the stabilization of the region." Qatar reported that the statements were false and did not know their origin. Despite this, the remarks were widely publicized in the various non-Qatari Arab news media, including UAE-based Sky News Arabia and Saudi-owned Al Arabiya. On 3 June 2017, the Twitter account of Bahraini foreign minister Khalid bin Ahmed Al Khalifa was hacked.
US security agencies initially alleged that intelligence indicated Russian hackers were behind the intrusion first reported by the Qataris. However, a US official briefed on the inquiry told the New York Times that it "was unclear whether the hackers were state-sponsored" and The Guardian diplomatic editor Patrick Wintour reported that "it is believed that the Russian government was not involved in the hacks; instead, freelance hackers were paid to undertake the work on behalf of some other state or individual." A US diplomat said that Russia and its ally Iran stood to benefit from sowing discord among US allies in the region, "particularly if they made it more difficult for the United States to use Qatar as a major base." The FBI sent a team of investigators to Doha to help the Qatari government investigate the hacking incident. Later, the New York Times reported that the hacking incidents may be part of a long-running cyberwar between Qatar and other Gulf countries that was only revealed to the public during the recent incidents, and noted how Saudi and UAE media picked up the statement made by the hacked media in less than 20 minutes and began interviewing many well-prepared commentators against Qatar.
US intelligence agencies believe that the hacking was done by the United Arab Emirates, according to a Washington Post report published on 16 July. The intelligence officials stated that the hacking was discussed among Emirati officials on 23 May, one day before the operation took place. The UAE denied any involvement in the hacking. It was announced on 26 August 2017, that five individuals allegedly involved in the hacking were arrested in Turkey.
David Evenden, a former employee of the US National Security Agency (NSA), was hired to work for the UAE at the cyberespionage firm CyberPoint. The UAE gave Evenden and his team a green light to run hacking operations against Qatar in order to gather material suggesting that Qatar had at some point funded the Muslim Brotherhood. As part of this campaign against Qatar, the UAE allowed Evenden and a network of other former NSA employees to set aside justifications and collect confidential data. The CyberPoint team carried out several global hacking attempts, including against Qatari royals, officials at FIFA, and even Internet critics of the UAE. In 2015, the Emirates, using these NSA veterans, hacked the emails of Michelle Obama before her scheduled visit to an event in Qatar, to which she had been invited by Qatar's Sheikha Moza bint Nasser. The email hack gave the UAE access to every piece of information exchanged between the two women and their staff.
Hacking of UAE ambassador's email
In May 2017, the email account of the UAE's ambassador to the US, Yousef Al-Otaiba, was hacked. The Huffington Post saw the leaked emails as an attempt "to embarrass" Al-Otaiba because they showed links between the UAE and the US-based pro-Israel Foundation for Defense of Democracies. The hack was seen as a move to benefit Qatar, and it deepened the rift between the two sides even further. According to The Intercept, Yousef Al-Otaiba has reportedly been linked to efforts to buy influence in the White House for UAE-led campaigns.
Severance of diplomatic and economic ties
Between 5 and 6 June 2017, Saudi Arabia, the UAE, Yemen, Egypt, the Maldives, and Bahrain all separately announced that they were cutting diplomatic ties with Qatar; among these Bahrain was the first to announce the severing of ties at 02:50 GMT in the early morning of 5 June.
A variety of diplomatic actions were taken. Saudi Arabia and the UAE notified ports and shipping agents not to receive Qatari vessels or ships owned by Qatari companies or individuals. Saudi Arabia closed the border with Qatar and closed its airspace to Qatar Airways, forcing Qatar to reroute flights to Africa and Europe through Iranian airspace. Saudi Arabia's central bank advised banks not to trade with Qatari banks in Qatari riyals.
Qatar criticized the ban. The Foreign Ministry of Qatar argued that it undermined Qatar's sovereignty. The foreign minister of Qatar, Mohammed bin Abdulrahman Al-Thani, said that Saudi statements regarding Qatar were contradictory: on the one hand, Saudi Arabia claimed Qatar was supporting Iran; on the other, it claimed Qatar was funding Sunni extremists fighting against Iran.
Saudi Arabia's move was welcomed by United States president Donald Trump despite a large US presence at the Al Udeid Air Base, the primary base of US air operations against the Islamic State of Iraq and the Levant. However, Secretary of State Rex W. Tillerson and Defense Secretary James Mattis worked on de-escalating the situation. Tillerson, as the CEO of ExxonMobil, was acquainted with the current and previous emirs of Qatar. A number of countries in the region, including Turkey, Russia and Iran, called for the crisis to be resolved through peaceful negotiations.
All GCC countries involved in the announcement ordered their citizens out of Qatar. Three Gulf states (Saudi Arabia, UAE, Bahrain) gave Qatari visitors and residents two weeks to leave their countries. The foreign ministries of Bahrain and Egypt gave Qatari diplomats 48 hours to leave their countries. Qatar was expelled from the Saudi Arabian-led intervention in Yemen and Yemen's government itself has cut ties. Kuwait and Oman remained neutral.
The Tobruk-based government of Libya claimed to have cut diplomatic ties with Qatar despite having no diplomatic representation in that country.
The semi-autonomous Somali regions of Puntland, Hirshabelle, and Galmudug each issued statements cutting ties with Qatar, in opposition to the neutral stance of the federal government of Somalia.
Other countries made statements condemning Qatar, including Gabon and Eritrea.
Demands on Qatar and responses
On 22 June 2017, Saudi Arabia, the United Arab Emirates (UAE), Egypt and Bahrain issued Qatar a list of 13 demands through Kuwait, which was acting as a mediator, and demanded that Qatar agree to them in full within 10 days; the deadline expired on 2 July 2017. According to reports on 23 June, these demands included:
Closing Al Jazeera and its affiliate stations.
Closing other news outlets that Qatar funds, directly and indirectly, including Arabi21, Rassd, Al-Araby Al-Jadeed and Middle East Eye.
Closing the Turkish military base in Qatar, and terminating the Turkish military presence and any joint military cooperation with Turkey inside Qatar.
Reducing diplomatic relations with Iran. Only trade and commerce with Iran that complies with US and international sanctions will be permitted.
Expelling any members of the Islamic Revolutionary Guard Corps (IRGC) and cutting off military and intelligence cooperation with Iran.
"Qatar must announce it is severing ties with terrorist, ideological and sectarian organizations including the Muslim Brotherhood, Hamas, the Islamic State of Iraq and the Levant (ISIL), Al-Qaeda, Hezbollah, and Jabhat Fateh al Sham, formerly al Qaeda's branch in Syria" according to one Arab official.
Surrendering all designated terrorists in Qatar, and stopping all means of funding for individuals, groups or organisations that have been designated as terrorists.
Ending interference in the four countries' domestic and foreign affairs and having contact with their political oppositions.
Stopping granting citizenship to wanted nationals from Saudi Arabia, the United Arab Emirates, Egypt and Bahrain.
Revoking Qatari citizenship for existing nationals where such citizenship violates those countries' laws.
The payment of reparations for years of alleged wrongs.
Monitoring for 10 years.
Aligning itself with the other Gulf and Arab countries militarily, politically, socially and economically, in line with an agreement reached with Saudi Arabia in 2014.
According to a report by the Qatar-funded Al Jazeera, "Qatari officials immediately dismissed the document as neither reasonable nor actionable." Iran denounced the blockade. US Secretary of State Rex Tillerson said that some of the demands would be very hard to meet but encouraged further dialogue.
On 3 July, Saudi Arabia accepted a Kuwaiti request for the deadline to be extended by 48 hours.
On 5 July, foreign ministers from Egypt, Saudi Arabia, the United Arab Emirates and Bahrain met in Cairo after receiving Qatar's response to their list of demands. The meeting, aimed at resolving the dispute, ended in a stalemate when Saudi foreign minister Adel al-Jubeir said that the political and economic boycott of Qatar would remain until it changed its policies. On the same day, the Saudi-led bloc said it was no longer insisting that Qatar comply with the 13 specific demands tabled the previous month. Instead, it asked Qatar to accept six broad principles, including commitments to combat terrorism and extremism and to end acts of provocation and incitement.
However, by 30 July 2017, the 13 demands had been reinstated. In the meantime, a joint statement was made in Cairo in order to restart the negotiation process with Qatar, which included six principles:
Commitment to combat extremism and terrorism in all its forms and to prevent their financing or the provision of safe havens.
Prohibiting all acts of incitement and all forms of expression which spread, incite, promote or justify hatred and violence.
Full commitment to Riyadh Agreement 2013 and the supplementary agreement and its executive mechanism for 2014 within the framework of the Gulf Cooperation Council (GCC) for Arab States.
Commitment to all the outcomes of the Arab-Islamic-US Summit held in Riyadh in May 2017.
To refrain from interfering in the internal affairs of States and from supporting illegal entities.
The responsibility of all States of the international community to confront all forms of extremism and terrorism as a threat to international peace and security.
Global reactions
United States
United States President Donald Trump claimed credit for engineering the diplomatic crisis in a series of tweets. On 6 June, Trump began by tweeting: "During my recent trip to the Middle East I stated that there can no longer be funding of Radical Ideology. Leaders pointed to Qatar – look!" An hour and a half later, he remarked on Twitter that it was "good to see the Saudi Arabia visit with the King and 50 countries already paying off. They said they would take a hard line on funding extremism, and all reference was pointing to Qatar. Perhaps this will be the beginning of the end to the horror of terrorism!" This was in contrast to attempts by the Pentagon and the Department of State to remain neutral. The Pentagon praised Qatar for hosting the Al Udeid Air Base and for its "enduring commitment to regional security." US Ambassador to Qatar, Dana Shell Smith, sent a similar message. Earlier, the US Secretary of State had taken a neutral stance and called for dialogue. On the same day, Trump also had a phone call with Saudi King Salman and rejected a Saudi proposal to invade Qatar. Instead, the United States requested Kuwaiti mediation with the goal of resolving the conflict.
On 8 June, President Donald Trump, during a phone call with the Emir of Qatar Tamim bin Hamad Al Thani, offered to act as a mediator in the conflict with a White House meeting between the parties if necessary. The offer was declined, and a Qatari official stated, "The emir has no plans to leave Qatar while the country is under a blockade." On 9 June, Trump once again put the blame on Qatar, calling the blockade "hard but necessary" while claiming that Qatar had been funding terrorism at a "very high level" and describing the country as having an "extremist ideology in terms of funding." This statement was in conflict with Secretary of State Tillerson's comments on the same day, which called on Gulf states to ease the blockade. On 13 June 2017, after meeting with Tillerson in Washington, Saudi Foreign Minister Adel al-Jubeir stated that there was "no blockade" and "what we have done is we have denied them use of our airspace, and this is our sovereign right," and that the King Salman Centre for Humanitarian Aid and Relief would send food or medical aid to Qatar if needed. The following day, Trump authorized the sale of $12 billion of U.S. weapons to Qatar. According to The Intercept, Saudi Arabia and the UAE lobbied Trump to fire Rex Tillerson because he "intervened to stop a secret Saudi-led, UAE-backed plan to invade and essentially conquer Qatar."
On 21 June 2017, Trump told a crowd in Iowa that "We cannot let these incredibly rich nations fund radical Islamic terror or terrorism of any kind", noting that after his visit to Riyadh in May 2017 to meet with Saudi King Salman and urge an end to terror funding, "He has taken it to heart. And now they're fighting with other countries that have been funding terrorism. And I think we had a huge impact."
Other countries
Multiple countries, the European Union and the United Nations called for resolution of the diplomatic crisis through dialogue:
Israel's former defense minister, Avigdor Lieberman, described the situation as an "opportunity" for Israel, stating, "Some [Arab countries'] interests overlap with Israeli interests, including the issue with al-Jazeera." He went on to describe Al Jazeera Media Network as an "incitement machine" and "pure propaganda." Israeli Prime Minister Benjamin Netanyahu has demanded AJMN shut down its offices in Israel.
Reports that Mauritius had cut ties with Qatar were refuted by the Mauritian government. A report in the Saudi Gazette incorrectly stated that Mauritius had broken off ties with Qatar and that Mauritius' Vice Prime-Minister had issued a communiqué pledging his country's support for Saudi Arabia. This prompted further erroneous reports by other outlets. However, Mauritian Vice Prime Minister Showkutally Soodhun in an interview with Le Défi Media Group of Mauritius refuted claims that he had issued any such communiqué, and Mauritius' Ministry of Foreign Affairs issued a statement that Mauritius continued to maintain diplomatic relations with Qatar.
Pakistan stated that it had no plans to cut diplomatic relations with Qatar. The National Assembly passed a resolution urging all countries to "show restraint and resolve their differences through dialogue." The Pakistani Federal Minister for Petroleum and Natural Resources said that "Pakistan will continue to import liquefied natural gas (LNG) from Qatar." A six-member Qatari delegation headed by a special envoy of the Qatari Emir visited Pakistan and asked Pakistan to play a positive role in resolving the diplomatic crisis, and the Prime Minister of Pakistan, Nawaz Sharif, was quoted as saying that "Pakistan would do 'all it can' to help resolve the crisis," as well as calling on the Muslim world to play a role in ending hostilities. A TRT report stated that Pakistan would deploy 20,000 troops in Qatar, which the Pakistani foreign ministry denied.
The Philippines suspended the deployment of migrant workers to Qatar on 6 June 2017. However, the next day, they allowed the deployment of returning workers and those with an Overseas Employment Certificate, but still maintained the suspension of the deployment of new workers. The suspension was later fully lifted on 15 June.
On 8 June 2017, Egypt's deputy UN Ambassador Ihab Moustafa called for the United Nations Security Council to launch an investigation into accusations that Qatar "paid up to $1 billion to a terrorist group active in Iraq" to free 26 Qatari hostages, including members of its royal family, which payment would violate UN resolutions. The Qataris were kidnapped on 16 December 2015 from a desert camp for falcon hunters in southern Iraq. The hostages were released eighteen months later in April 2017. Qatari diplomats responded to the Egyptian calls for an investigation by reaffirming their commitment to the UN resolutions towards eliminating the financing of terrorism.
In June 2017, the government of Qatar hired American attorney and politician John Ashcroft to lobby on its behalf and help the state deny international allegations of supporting terror.
On 24 November 2017, Dubai Police deputy chief Lieutenant General Dhahi Khalfan blamed the 2017 Sinai attack on Al Jazeera's reporting and called for the Saudi-led coalition to bomb Al Jazeera's headquarters.
Israel has not had direct diplomatic relations with Qatar since 2012, yet over the past few years the country has emerged as an unlikely peacemaker in the Middle East, extending a helping hand to Doha while strengthening ties with Qatar's rival, the UAE. In June 2020, Israel offered to work with Doha on Gaza's reconstruction, reframing Washington's approach to Qatar's relationship with Hamas as one focused on getting all parties to cooperate in support of the peace plan initiated by the Trump administration. In 2017, Israel condemned legislation introduced in the US Congress that would have designated Qatar a state sponsor of terrorism for its ties with Hamas; the legislation was filed by Ed Royce, a Republican and the then-Chairman of the House Foreign Affairs Committee. In the meantime, Israel also strengthened its partnership with the UAE by holding a meeting in Washington between Prime Minister Benjamin Netanyahu and the ambassadors of the UAE and Bahrain. Israel maintained a balance between strengthening its relations with Abu Dhabi and simultaneously extending a helping hand to Doha.
The Syrian Coalition of Revolutionary and Opposition Forces welcomed the Al-Ula agreement by stating that “Syria is awaiting a more active Arab role that will contribute to putting an end to the nearly 10-year suffering of the Syrian people; help the Syrian people achieve their aspirations for freedom and independence; getting rid of the murderous Assad regime and Iranian sectarian militias; and putting an end the Iranian subversive project in Syria and the region.”
United Nations
In November 2020, United Nations special rapporteur Alena Douhan published a preliminary report condemning the Saudi-led blockade of the State of Qatar and urging that the ban be lifted immediately because of its human rights impact on the people of Qatar. The special rapporteur concluded that the blockade of Qatar led by the UAE, Saudi Arabia, Bahrain and Egypt was illegal. Douhan stated that she would present a final report to the United Nations Human Rights Council in September 2021.
Reportedly, the UN Charter prohibits imposing unilateral coercive measures against member states unless they are authorized by the organization's relevant organs or are consistent with the principles of the Charter.
Impact
Logistical implications
On 6 June 2017, Emirates Post of UAE halted postal services to Qatar.
Nearly 80 percent of Qatar's food requirements come from Persian Gulf Arab neighbors, with only 1 percent being produced domestically and even imports from outside the Gulf states usually crossing the now closed land border with Saudi Arabia. Immediately after the cutting of relations, local reports indicated residents swarmed grocery stores in hopes of stockpiling food. Many food delivery trucks were idled along the Saudi-Qatari border. On 8 June 2017, Qatari Foreign Minister Sheikh Mohammed bin Abdulrahman al-Thani said, "We're not worried about a food shortage, we're fine. We can live forever like this, we are well prepared." Qatar has been in talks with both Turkey and Iran to secure supply of food. On 11 June 2017, Iran sent four cargo planes with fruit and vegetables and promised to continue the supply. Turkey has pledged food and water supplies to go along with their troop deployment at their Turkish military base in Qatar.
As part of the Qatari government's response to lost food imports, it provided support to domestic agricultural company Baladna, which built a new dairy farm with imported cattle that was planned to produce enough milk to fulfill domestic demand for dairy products by June 2018.
Air travel
All airlines based in these countries, including Emirates, suspended flights to and from Qatar. Gulf Air, EgyptAir, flydubai, Air Arabia, Saudi Arabian Airlines and Etihad Airways suspended their flights to and from Qatar. Bahrain, Egypt, Saudi Arabia, and the United Arab Emirates are also banning overflights by aircraft registered in Qatar (A7). Instead Qatar has rerouted flights to Africa and Europe via Iran, paying a “hefty” overflight fee for each such flight.
Qatar Airways in response also suspended its flight operations to Saudi Arabia, the UAE, Egypt, and Bahrain.
Pakistan International Airlines sent special flights to bring back over 200 Pakistani pilgrims stuck at Doha airport. Over 550 Pakistani pilgrims in Doha were subsequently flown to Muscat.
Due to the blockade of Qatar Airways from the airspace of Saudi Arabia, the UAE, Bahrain and Egypt, Oman Air has taken up a significant role transporting travelers from and to Doha, mostly through Iranian airspace, while still allowing Qatari passport holders to book flights. The travel embargo has had a significant impact on foreign nationals living and working in Qatar, with about 100,000 Egyptians and citizens from other countries stranded there, unable to book direct flights or obtain travel documents for their return. At Qatar's request, the blockade was put under review by the International Civil Aviation Organization (ICAO), a UN agency seeking a "consensus-based solution" for the resolution of the crisis.
On 31 July 2017, the agency asserted its neutrality in the conflict and announced that Qatar Airways would have access to three contingency routes over international waters in early August, based on a preliminary agreement reached with the Saudi aviation authority (GACA) earlier that month. The ICAO, based in Montreal, also reminded all member countries to comply with the 1944 Chicago Convention on International Civil Aviation and its agenda.
In December 2020, Qatar’s ambassador to the United Nations sent a letter to the UN Secretary-General António Guterres and the Security Council members, reporting the airspace offenses by four Bahraini fighter jets. He said that Bahrain’s military aircraft violated Qatar’s airspace on December 9 by flying over the country’s territorial waters.
Shipping
The United Arab Emirates banned Qatar-flagged ships from calling at Fujairah. It also banned vessels from Qatar from the port and vessels at the port from sailing directly to Qatar. Similar restrictions were put in place at Jebel Ali, which before the boycott handled over 85% of Qatar's shipborne cargo. Bahrain, Egypt and Saudi Arabia also banned Qatar-flagged ships from their ports.
As of 8 June 2017, shipping giant Maersk was entirely unable to transport goods in or out of Qatar. Due to Qatar's shallow ports, large cargo ships are required to dock at Jebel Ali or other nearby ports, where a feeder service transports the goods into Qatar. In response, Maersk and Swiss-based MSC vessels bound for Qatar were rerouted to Salalah and Sohar in Oman. Smaller shipments of perishable and frozen foods in particular have taken that route.
On 12 June 2017, Chinese shipping company COSCO announced the suspension of services to and from Qatar. Taiwan's Evergreen Marine and Hong Kong's Orient Overseas Container Line had already suspended services.
Media ban
Hamad Saif al-Shamsi, the Attorney-General of the United Arab Emirates, announced on 7 June that publishing expressions of sympathy towards Qatar through social media, or in any written, visual or verbal form, is considered illegal under the UAE's Federal Penal Code and the Federal law on Combating Information Technology Crimes. Violators face between 3 and 15 years of imprisonment, a fine of up to 500,000 Emirati dirhams ($136,000), or both. Bahrain issued a similar statement with a penalty of up to 5 years of imprisonment and a fine.
Saudi Arabia, Egypt, Bahrain, and the UAE all blocked access to Qatari news agencies, including the controversial Qatar-based Al Jazeera. Saudi Arabia shut down the local office of Al Jazeera Media Network. The BBC speculated that changes to Al-Jazeera would be a necessary part of any peaceful resolution.
The Qatar-based beIN Sports channels (a spin-off of Al Jazeera) were also initially banned in June in the UAE. The following month, the UAE restored normal access to beIN Sports channels via its local telecom providers. In Saudi Arabia, the channels remain banned, while a large-scale pirate decryption operation known as "beoutQ" has made the channels' content available. In 2018, in parallel with the diplomatic crisis and piracy issues, Saudi officials also began to accuse beIN Sports of holding a monopoly position in sports broadcasting in the region, revoking its broadcast licenses based on accusations of anti-competitive behaviour and pulling its rights to broadcast Asian Football Confederation matches in the Kingdom in 2019. beIN Sports has considered the moves to be politically motivated, and has accused beoutQ of operating from Saudi Arabia.
In May 2020, The Guardian revealed that the World Trade Organization (WTO) had identified Saudi Arabia as the main offender behind beoutQ's piracy of beIN's copyright-protected content. It also disclosed that the Premier League had received the WTO's 130-page final report and made submissions against the Gulf nation as part of the legal process. In June 2020, UEFA welcomed the WTO report on Saudi piracy and named beoutQ as the culprit behind the illegal streaming of copyright-protected content. "What is clear is that beoutQ's broadcasts constitute piracy of UEFA's matches and as such, are illegal. BeoutQ was hosted on frequencies transmitted by Arabsat and was promoted and carried out by individuals and entities subject to Saudi Arabia's territorial jurisdiction," the union said.
Finances
At the start of the crisis, Standard & Poor's downgraded Qatar's debt by one notch from AA to AA-. Qatar's stock market dropped 7.3% on the first day of the crisis, and reached a 9.7% drop by 8 June 2017. Additionally, in the first months following the crisis the government of Qatar injected $38.5 billion, which was equivalent to 23% of the country's GDP, to support the country's economy and its banking sector.
As per S&P Global Ratings, banks in Qatar are strong enough to survive a withdrawal of all Gulf country deposits.
Despite the ongoing diplomatic blockade led by Saudi Arabia, international banks such as HSBC and Goldman Sachs sought to repair their ties with Qatar by building stronger financial and business relations. Saudi Arabia and the United Arab Emirates informally warned the bankers not to develop close relations with Doha or else face consequences.
On 20 January 2019, Sheikh Tamim bin Hamad Al Thani attended the opening session of the Arab Economic Summit in Beirut, Lebanon. This helped Qatar increase its influence and soft power in the region. Sheikh Tamim Bin Hamad Al Thani and Mauritanian President Mohamed Ould Abdel Aziz were the only two Arab leaders to attend the summit. Since Sheikh Tamim was the only GCC leader to attend, he received praise from the President of Lebanon himself, Michel Aoun. Hilal Khashan, a professor of political science at the American University of Beirut, said “He became the star of the summit."
Energy
Qatar is a global leader in liquefied natural gas production. Despite the severing of ties, Qatari natural gas continues to flow to the UAE and Oman through Abu Dhabi based Dolphin Energy's pipeline. The pipeline meets about 30–40 percent of UAE's energy needs. Shipping constraints from the crisis have also rerouted multiple shipments of oil and gas to and from the Gulf, which has caused reverberations in many local energy markets.
On 8 June 2017, gas futures spiked nearly 4 percent in the United Kingdom, which had nearly a third of all its imported gas arriving from Qatar. A secondary effect of the dispute has been on worldwide supplies of helium, which is often extracted from natural gas. Qatar is the world's second largest supplier of helium (the US ranks first).
In March 2019, Qatar lodged a complaint with the International Atomic Energy Agency regarding the United Arab Emirates' Barakah nuclear power plant, stating that it poses a serious threat to regional stability and the environment. The UAE denied that there are safety issues with the plant, which is being built by the Korea Electric Power Corporation (KEPCO) with operation by the French utility Électricité de France, and stated: "The United Arab Emirates ... adheres to its commitment to the highest standards of nuclear safety, security and non-proliferation."
23rd Gulf Cup
The 23rd Arabian Gulf Cup was scheduled to be hosted by Qatar. In November 2017, Saudi Arabia, the United Arab Emirates, and Bahrain pulled out of the tournament due to the boycott of Qatar. On 7 December 2017, it was announced that Kuwait would host the tournament instead.
Qatari military relations
On 7 June 2017, the Turkish parliament passed, with 240 votes in favour and 98 against, a legislative act first drafted in May allowing Turkish troops to be deployed to a Turkish military base in Qatar. During a speech on 13 June 2017, the President of Turkey, Recep Tayyip Erdoğan, condemned the boycott of Qatar as "inhumane and against Islamic values" and stated that "victimising Qatar through smear campaigns serves no purpose". On 23 June 2017, Turkey rejected demands to shut down its military base in Qatar.
Qatar hosts about 10,000 US troops at Al Udeid Air Base, which houses the forward operating base of United States Central Command that plays a commanding role in US airstrikes in Syria, Iraq, and Afghanistan. A Pentagon spokesperson claimed the diplomatic crisis would not affect the US military posture in Qatar.
On 30 January 2018 an inaugural United States-Qatar Strategic Dialogue meeting was held, co-chaired by U.S. Secretary of State Rex Tillerson, U.S. Secretary of Defense Jim Mattis, Qatari Deputy Prime Minister and Minister of State for Defence Affairs Khalid al-Attiyah and Qatari Deputy Prime Minister and Foreign Minister Mohammed bin Abdulrahman Al Thani. The meeting expressed the need for an immediate resolution of the crisis which respects Qatar's sovereignty. In a Joint Declaration on Security Cooperation, the United States expressed its readiness to deter and confront any external threat to Qatar's territorial integrity. Qatar offered to help fund the expansion of facilities at US bases in Qatar.
On 25 March 2018, the United States Central Command (CENTCOM) officially denied rumours that the Incirlik Air Base in Turkey and the Al Udeid Air Base in Qatar will be closed due to the ongoing regional conflict.
In January 2018, Qatar's ambassador was in talks with Russia with the intent to purchase S-400 surface-to-air missiles. Both countries signed an agreement on military and technical cooperation in 2017. In May 2018, the French daily newspaper Le Monde reported that King Salman of Saudi Arabia would take military action if Qatar installed the Russian air defence system. However, a senior Russian official remarked the system would still be delivered even against the will of Saudi Arabia. The Saudis were themselves approaching Russia to improve economic and military ties in 2017, but talks relating to the arms deal were hindered by concerns the United States and Saudi Arabia had with regard to the Russian position towards Iran's military and strategic involvement in the Middle East.
In June 2018, Qatar expressed its wish to join NATO. However, NATO declined the request, stating that under Article 10 of its founding treaty only additional European countries can join. Qatar and NATO had previously signed a security agreement in January 2018.
Arab League Council 2017
During the 148th session of the Arab League Council at the level of foreign ministers, held in Cairo, Qatar's State Minister for Foreign Affairs, Sultan Al Muraikhi, got into a heated argument with the Saudi delegate Ahmad Al Qattan. The argument broke out after Al Muraikhi gave a short speech explaining that Qatar had resumed its support for Iran, having recalled its ambassador from Tehran in 2016 in solidarity with Saudi Arabia.
2019 Asian Cup
During the semi-final match between the Qatar national football team and the tournament hosts the United Arab Emirates, the UAE supporters threw shoes and bottles onto the pitch. This conduct was preceded by booing the Qatari national anthem. Qatar won 4–0, paving the way to their first Asian Cup final and eventual title.
A British-Sudanese football fan was allegedly beaten by other fans for wearing a Qatar football shirt to a match in which Qatar was playing, and was then, following an investigation by the UAE police, arrested for wasting police time and making false statements of being assaulted. According to The Guardian, the fan was arrested for wearing the Qatar shirt; UAE authorities denied this, stating that he was arrested for wasting police time and making false assault claims, and that he had admitted to making false statements. A UAE official in London stated, "He was categorically not arrested for wearing a Qatar football shirt. This is instead an instance of a person seeking media attention and wasting police time." According to photos published by The National, fans were seen waving the Qatari flag and wearing Qatari football shirts at the final without any arrests. According to The New York Times, the UAE also accused Qatar of fielding ineligible players on the grounds that they were not originally Qatari; the accused players were the competition's top scorer Almoez Ali, a Sudan-born striker, and Bassam Al-Rawi, an Iraq-born defender.
2022 FIFA World Cup
Saudi Arabia, Yemen, Mauritania, the United Arab Emirates, Bahrain and Egypt asked FIFA in a letter to replace Qatar as the 2022 FIFA World Cup host, calling the country a "base of terrorism". In reaction to the diplomatic crisis over the alleged support of terrorism by the Qatari government, the president of the German Football Association, Reinhard Grindel, stated in June 2017 that "the football associations of the world should conclude that major tournaments cannot be held in countries which actively support terrorism", and that the German Football Association would talk with UEFA and the German government in order to evaluate whether to boycott the tournament in Qatar in 2022. Hassan Al Thawadi, general secretary of Qatar's FIFA World Cup organizing committee, stated that "Qatar does not support terrorism. Qatar is at the forefront of the fight against terrorism on the ground. It's one of the main partners in the coalition fighting ISIS (the Islamic State group)." Al Thawadi added: "Our projects are going ahead as scheduled. This (blockade) is no risk in relation to the hosting of the World Cup."
In October 2017, Lieutenant General Dhahi Khalfan Tamim, deputy chief of the Dubai Police, wrote about the Qatar diplomatic crisis on Twitter in Arabic: "If the World Cup leaves Qatar, Qatar's crisis will be over … because the crisis is created to get away from it". According to observers, the message appeared to imply that the blockade was enacted because of Qatar hosting the world's biggest football event. In reaction to media coverage of his tweet, Dhahi Khalfan tweeted: "I said Qatar is faking a crisis and claims it's besieged so it could get away from the burdens of building expensive sports facilities for the World Cup". UAE Minister of State for Foreign Affairs Anwar Gargash said the official had been misunderstood in media coverage, and stated that Qatar's hosting of the 2022 World Cup "should include a repudiation of policies supporting extremism & terrorism."
Hassan al-Thawadi, secretary general of the Qatar World Cup supreme committee, stated that projects were progressing as scheduled for the 2022 FIFA World Cup, even though Qatar's only land border and its air and sea routes had been cut off by Bahrain, Egypt, Saudi Arabia and the United Arab Emirates (UAE). As the blockade came into effect, World Cup organizers were asked about a "Plan B"; FIFA, however, expressed confidence that an alternative 2022 host would not need to be explored. According to al-Thawadi, the logistical obstacles were being overcome and construction was continuing with only minimal cost increases, in preparation for the first World Cup in the Middle East.
Resolution
An article published by the Financial Times stated that, according to people briefed on the matter, the Kingdom of Saudi Arabia moved towards ending the 2017 blockade of the State of Qatar after the victory of President-elect Joe Biden in 2020. An advisor to Saudi Arabia and the United Arab Emirates was quoted as calling the shift in the Kingdom's policy towards Qatar "a gift for Biden", signaling that Crown Prince Mohammed bin Salman had shown willingness "to take steps" towards resolving differences with Qatar. The crown prince was said to be apprehensive about the incoming administration, as the outgoing administration had reportedly backed the Riyadh government through controversies such as casualties in the Yemen war, the detention of activists and the murder of Jamal Khashoggi. The UAE's ambassador to the US, Yousef al-Otaiba, by contrast, was quoted as stating that ending the dispute was not a priority, referring to the ongoing differences with the blockaded nation.
On 4 January 2021, Kuwait, Saudi Arabia's neighbour and a fellow GCC member, along with the United States, jointly brokered a deal resolving the crisis, under which Saudi Arabia would end its blockade of its Gulf neighbour and reopen its border.
On 5 January 2021, Qatar's Emir Sheikh Tamim bin Hamad Al Thani arrived in Saudi Arabia for a GCC summit in the old town of Al-'Ula, where the leaders signed the Al-'Ula statement. Before the signing, Bin Salman said that Kuwaiti and United States support had resulted in "the Al-'Ula declaration agreement that will be signed at this blessed summit, in which the Gulf, Arab and Islamic solidarity and stability were emphasized." In addition to the statement, a final communique was signed, although its contents were not made public; Saudi Foreign Minister Faisal bin Farhan al-Saud said that Saudi Arabia and its allies had agreed to restore full ties with Doha, including the resumption of flights. Qatar did not appear to have fulfilled any of the 13 demands, with analysts saying that the Gulf states had agreed instead to a joint security declaration.
The United Nations Secretary-General, António Guterres, welcomed the resolution of the crisis and the opening of the airspace, land, and sea borders between Saudi Arabia, the United Arab Emirates, Bahrain, Egypt, and Qatar. In a statement issued on 5 January 2021, he expressed hope that "all countries concerned will continue to act in a positive spirit to strengthen their relations". He also recognized the roles of the late Emir of Kuwait and the late Sultan of Oman, who had worked tirelessly towards resolving the Gulf rift. One Middle East policy analyst believed that the secret pact among the Gulf leaders was likely to have been multi-level, comprising several bilateral agreements between individual states rather than a unitary document.
three sovereign governments had not restored diplomatic ties with Qatar.
two countries still had downgraded diplomatic ties with Qatar without fully cutting relations.
See also
International propagation of Salafism and Wahhabism
OPEC
Qatar–Saudi Arabia diplomatic conflict
Iran–Saudi Arabia proxy conflict
Iran–Israel proxy conflict
2017 Lebanon–Saudi Arabia dispute
2019–2021 Persian Gulf crisis
International Maritime Security Construct
Russia–Syria–Iran–Iraq coalition
Shia–Sunni relations
Middle Eastern Cold War (disambiguation)
Arab Cold War
Axis of Resistance
References
2010s conflicts
2020s conflicts
Conflicts in 2017
Conflicts in 2018
Conflicts in 2019
Conflicts in 2020
Conflicts in 2021
2010s in Qatar
2020s in Qatar
2017 in Qatar
2018 in Qatar
2019 in Qatar
2020 in Qatar
2021 in Qatar
June 2017 events in Asia
2017 in international relations
2018 in international relations
2019 in international relations
2020 in international relations
2021 in international relations
Iran–Saudi Arabia proxy conflict
Arab Winter
Shia–Sunni sectarian violence
Geopolitical rivalry
Gulf Cooperation Council
Politics of Qatar
Politics of the Arab world
Politics of the Middle East
Bahrain–Iran relations
Bahrain–Qatar relations
Bahrain–Turkey relations
Chad–Iran relations
Chad–Qatar relations
Chad–Turkey relations
Comoros–Iran relations
Comoros–Qatar relations
Comoros–Turkey relations
Djibouti–Iran relations
Djibouti–Qatar relations
Djibouti–Turkey relations
Egypt–Iran relations
Egypt–Qatar relations
Egypt–Turkey relations
Iran–Jordan relations
Iran–Maldives relations
Iran–Mauritania relations
Iran–Niger relations
Iran–United Arab Emirates relations
Iran–Saudi Arabia relations
Iran–Senegal relations
Iran–Qatar relations
Iran–Turkey relations
Iran–Yemen relations
Gabon–Iran relations
Gabon–Qatar relations
Gabon–Turkey relations
Jordan–Qatar relations
Jordan–Turkey relations
Maldives–Qatar relations
Maldives–Turkey relations
Mauritania–Qatar relations
Mauritania–Turkey relations
Niger–Qatar relations
Niger–Turkey relations
Qatar–Saudi Arabia relations
Qatar–Senegal relations
Qatar–United Arab Emirates relations
Qatar–Turkey relations
Qatar–Yemen relations
Saudi Arabia–Turkey relations
Senegal–Turkey relations
Turkey–United Arab Emirates relations
Turkey–Yemen relations
Censorship in the United Arab Emirates
Censorship in Saudi Arabia
Censorship in Bahrain
Censorship in Egypt |
2583493 | https://en.wikipedia.org/wiki/Device%20mapper | Device mapper | The device mapper is a framework provided by the Linux kernel for mapping physical block devices onto higher-level virtual block devices. It forms the foundation of the logical volume manager (LVM), software RAIDs and dm-crypt disk encryption, and offers additional features such as file system snapshots.
Device mapper works by passing data from a virtual block device, which is provided by the device mapper itself, to another block device. Data can also be modified in transit, as happens, for example, when the device mapper provides disk encryption or simulates unreliable hardware behavior.
This article focuses on the device mapper implementation in the Linux kernel, but the device mapper functionality is also available in both NetBSD and DragonFly BSD.
Usage
Applications (like LVM2 and Enterprise Volume Management System (EVMS)) that need to create new mapped devices talk to the device mapper via the libdevmapper.so shared library, which in turn issues ioctls to the /dev/mapper/control device node. Configuration of the device mapper can also be examined and changed interactively, or from shell scripts, by using the dmsetup utility.
Both of these userspace components have their source code maintained alongside the LVM2 source.
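As an illustration only, the following Python sketch shells out to the dmsetup utility to enumerate the mapped devices on a system and print each one's mapping table; it assumes a Linux host with dmsetup installed and sufficient privileges, and since the exact output of dmsetup ls can vary between versions, the parsing here is deliberately naive.
import subprocess
def dmsetup(*args):
    # Run dmsetup with the given arguments and return its output as text.
    return subprocess.run(("dmsetup",) + args, check=True,
                          capture_output=True, text=True).stdout
# "dmsetup ls" lists the names of all mapped devices known to the device mapper.
for line in dmsetup("ls").splitlines():
    name = line.split()[0]
    if name == "No":  # "No devices found" is printed when nothing is mapped
        break
    # "dmsetup table <name>" prints each mapping row: start, length, target, arguments.
    print(name, "->", dmsetup("table", name).strip())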
Features
Functions provided by the device mapper include linear, striped and error mappings, as well as crypt and multipath targets. For example, two disks may be concatenated into one logical volume with a pair of linear mappings, one for each disk. As another example, crypt target encrypts the data passing through the specified device, by using the Linux kernel's Crypto API.
The following mapping targets are available:
cache allows creation of hybrid volumes, by using solid-state drives (SSDs) as caches for hard disk drives (HDDs)
clone permits usage of a destination device before a background copy from a source device is complete
crypt provides data encryption, by using the Linux kernel's Crypto API
delay delays reads and/or writes to different devices (used for testing)
era behaves in a way similar to the linear target, while it keeps track of blocks that were written to within a user-defined period of time
error simulates I/O errors for all mapped blocks (used for testing)
flakey simulates periodic unreliable behaviour (used for testing)
linear maps a continuous range of blocks onto another block device
mirror maps a mirrored logical device, while providing data redundancy
multipath supports the mapping of multipathed devices, through usage of their path groups
raid offers an interface to the Linux kernel's software RAID driver (md)
snapshot and snapshot-origin used for creation of LVM snapshots, as part of the underlying copy-on-write scheme
striped stripes the data across physical devices, with the number of stripes and the striping chunk size as parameters
thin allows creation of devices larger than the underlying physical device; physical space is allocated only when written to
zero acts as an equivalent of /dev/zero: all reads return blocks of zeros, and writes are discarded
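To illustrate the linear target described above, the following hedged Python sketch builds a two-row mapping table that concatenates two block devices into a single logical device and passes it to dmsetup create on standard input; the device paths /dev/sdX1 and /dev/sdY1 and the name "combined" are placeholders, and running it requires root privileges on a Linux system with dmsetup and blockdev available.
import subprocess
def sectors(device):
    # Size of a block device in 512-byte sectors, as reported by blockdev --getsz.
    out = subprocess.run(["blockdev", "--getsz", device],
                         check=True, capture_output=True, text=True).stdout
    return int(out.strip())
first, second = "/dev/sdX1", "/dev/sdY1"   # placeholder devices to concatenate
len1, len2 = sectors(first), sectors(second)
# Each table row is: logical_start_sector num_sectors target_type target_args
table = (f"0 {len1} linear {first} 0\n"
         f"{len1} {len2} linear {second} 0\n")
# Create /dev/mapper/combined from the table (requires root).
subprocess.run(["dmsetup", "create", "combined"], input=table, text=True, check=True)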
Applications
The following Linux kernel features and projects are built on top of the device mapper:
cryptsetup utility used to conveniently setup disk encryption based on dm-crypt
dm-crypt/LUKS mapping target that provides volume encryption
dm-cache mapping target that allows creation of hybrid volumes
dm-integrity mapping target that provides data integrity, either using checksumming or cryptographic verification, also used with LUKS
dm-log-writes mapping target that uses two devices, passing through the first device and logging the write operations performed to it on the second device
dm-verity validates the data blocks contained in a file system against a list of cryptographic hash values, developed as part of the Chromium OS project
dmraid provides access to "fake" RAID configurations via the device mapper
DM Multipath provides I/O failover and load-balancing of block devices within the Linux kernel
Docker uses device mapper to create copy-on-write storage for software containers
DRBD (Distributed Replicated Block Device)
EVMS (deprecated)
utility called from hotplug upon device maps creation and deletion
LVM2 logical volume manager for the Linux kernel
Linux version of TrueCrypt
VDO - Virtual Data Optimizer
References
External links
Device mapper home at Red Hat
an article illustrating the use of various device mapper targets
userspace tool to set up software RAID using various RAID metadata formats
Multipath support in the device mapper, LWN.net, February 23, 2005, by Jonathan Corbet
Red Hat software |
67953981 | https://en.wikipedia.org/wiki/Windows%2011 | Windows 11 | Windows 11 is the latest major release of Microsoft's Windows NT operating system that was announced on June 24, 2021, and is the successor to Windows 10, which was released in 2015. Windows 11 was released to the public on October 5, 2021, as a free upgrade via Windows Update and Windows 11 Installation Assistant on eligible devices running Windows 10.
Windows 11 features major changes to the Windows shell influenced by the canceled Windows 10X, including a redesigned Start menu, the replacement of its "live tiles" with a separate "Widgets" panel on the taskbar, the ability to create tiled sets of windows that can be minimized and restored from the taskbar as a group, and new gaming technologies inherited from Xbox Series X and Series S such as Auto HDR and DirectStorage on compatible hardware. Internet Explorer (IE) has been replaced by the Chromium-based Microsoft Edge as the default web browser like its predecessor, Windows 10, and Microsoft Teams is integrated into the Windows shell. Microsoft also announced plans to allow more flexibility in software that can be distributed via Microsoft Store, and to support Android apps on Windows 11 (including a partnership with Amazon to make its app store available for the function).
Citing security considerations, the system requirements for Windows 11 were increased over Windows 10. Microsoft only officially supports the operating system on devices using an eighth-generation Intel Core CPU or newer (with some minor exceptions), AMD Ryzen CPU based on Zen+ microarchitecture or newer, or a Qualcomm Snapdragon 850 ARM system-on-chip or newer, with UEFI secure boot and Trusted Platform Module (TPM) 2.0 supported and enabled (although Microsoft may provide exceptions to the TPM 2.0 requirement for OEMs). While the OS can be installed on unsupported processors, Microsoft does not guarantee the availability of updates. Windows 11 removed support for 32-bit x86 CPUs and devices which use BIOS firmware.
Windows 11 has received a mixed to positive reception; pre-release coverage of the operating system focused on its stricter hardware requirements, with discussions over whether they were legitimately intended to improve the security of Windows or a ploy to upsell users to newer devices, and over e-waste associated with the changes. Upon its release, Windows 11 received positive reviews for its improved visual design, window management, and a stronger focus on security, but was criticized for regressions and modifications to aspects of its user interface.
Development
At the 2015 Ignite conference, Microsoft employee Jerry Nixon stated that Windows 10 would be the "last version of Windows", a statement that Microsoft confirmed was "reflective" of its view. The operating system was considered to be a service, with new builds and updates to be released over time.
In October 2019, Microsoft announced "Windows 10X", a future edition of Windows 10 designed exclusively for dual-touchscreen devices such as the then-upcoming Surface Neo. It featured a modified user interface designed around context-sensitive "postures" for different screen configurations and usage scenarios, and changes such as a centered taskbar and updated Start menu without Windows 10's "live tiles". Legacy Windows applications would also be required to run in "containers" to ensure performance and power optimization. Microsoft stated that it planned to release Windows 10X devices by the end of 2020.
In May 2020, amid the COVID-19 pandemic, chief product officer for Microsoft Windows and Office Panos Panay stated that "as we continue to put customers' needs at the forefront, we need to focus on meeting customers where they are now", and therefore announced that Windows 10X would only launch on single-screen devices at first, and that Microsoft would "continue to look for the right moment, in conjunction with our OEM partners, to bring dual-screen devices to market".
In January 2021, it was reported that a job listing referring to a "sweeping visual rejuvenation of Windows" had been posted by Microsoft. A visual refresh for Windows, developed under the codename "Sun Valley", was reportedly set to re-design the system's user interface. Microsoft began to implement and announce some of these visual changes and other new features on Windows 10 Insider Preview builds, such as new system icons (which also included the replacement of shell resources dating back as far as Windows 95), improvements to Task View to allow changing the wallpaper on each virtual desktop, emulation of x64 applications on ARM, and adding the Auto HDR feature from Xbox Series X.
On May 18, 2021, Head of Windows Servicing and Delivery John Cable stated that Windows 10X had been canceled and that Microsoft would be "accelerating the integration of key foundational 10X technology into other parts of Windows and products at the company".
Announcement
At the Microsoft Build 2021 developer conference, CEO and chairman Satya Nadella teased about the existence of the next generation of Windows during his keynote speech. According to Nadella, he had been self-hosting it for several months. He also teased that an official announcement would come very soon. Just a week after Nadella's keynote, Microsoft started sending invitations for a dedicated Windows media event at 11 am ET on June 24, 2021. Microsoft also posted an 11-minute video of Windows start-up sounds to YouTube on June 10, 2021, with many people speculating both the time of the Microsoft event and the duration of the Windows start-up sound video to be a reference to the name of the operating system as Windows 11.
On June 24, 2021, Windows 11 was officially announced at a virtual event hosted by Chief Product Officer Panos Panay. According to Nadella, Windows 11 is "a re-imagining of the operating system". Further details for developers such as updates to the Microsoft Store, the new Windows App SDK (code-named "Project Reunion"), new Fluent Design guidelines, and more were discussed during another developer-focused event on the same day.
Release
The Windows 11 name was accidentally released in an official Microsoft support document in June 2021. Leaked images of a purported beta build of Windows 11's desktop surfaced online later on June 15, 2021, which were followed by a leak of the aforementioned build on the same day. The screenshots and leaked build show an interface resembling that of the canceled Windows 10X, alongside a redesigned out-of-box experience (OOBE) and Windows 11 branding. Microsoft would later confirm the authenticity of the leaked beta, with Panay stating that it was an "early weird build".
At the June 24 media event, Microsoft also announced that Windows 11 would be released in "Holiday 2021". Its release will be accompanied by a free upgrade for compatible Windows 10 devices through Windows Update. On June 28, Microsoft announced the release of the first preview build and SDK of Windows 11 to Windows Insiders.
On August 31, 2021, Microsoft announced that Windows 11 was to be released on October 5, 2021. The release would be phased, with newer eligible devices to be offered the upgrade first. Since its predecessor Windows 10 was released on July 29, 2015, more than six years earlier, this is the longest time span between successive releases of Microsoft Windows operating systems, beating the time between Windows XP (released on October 25, 2001) and Windows Vista (released on January 30, 2007).
Microsoft officially released Windows 11 on October 4, 2021, at 2:00 p.m. PT, which was October 5 in parts of the world. It can be obtained as an in-place upgrade via either the Windows 11 Installation Assistant application (the successor to the Media Creation Tool from Windows 10, which can also generate an ISO image or USB install media), or via Windows Update on eligible devices.
Upgrades through Windows Update are a phased rollout, and are distributed on an opt-in basis: Microsoft stated that they "expect all eligible Windows 10 devices to be offered the upgrade to Windows 11 by mid-2022." Eligible devices also may present an option to download Windows 11 during the Windows 10 out-of-box experience (OOBE) on a new installation.
For the launch, Microsoft partnered with Mikey Likes It Ice Cream in New York City to make a flavor named "Bloomberry", based on the default "Bloom" wallpaper, to promote Windows 11. The ice cream is blueberry with blueberry pie filling, pound cake and candy chocolate pieces. The Burj Khalifa was also lit up to promote the operating system. The promotion recalled earlier launch tie-ins, such as the seven-patty Whopper sold for Windows 7 and the "10-scoop upgrade" offered by an eight-scoop ice cream parlor for Windows 10.
Features
Windows 11, the first major Windows release since 2015, builds upon its predecessor by revamping the user interface to follow Microsoft's new Fluent Design guidelines. The redesign, which focuses on ease of use and flexibility, comes alongside new productivity and social features and updates to security and accessibility, addressing some of the deficiencies of Windows 10.
The Microsoft Store, which serves as a unified storefront for apps and other content, is also redesigned in Windows 11. Microsoft now allows developers to distribute Win32, progressive web applications, and other packaging technologies in the Microsoft Store, alongside Universal Windows Platform apps. Microsoft also announced plans to allow third-party application stores (such as Epic Games Store) to distribute their clients on Microsoft Store. Windows 11 supports x86-64 software emulation on ARM-based platforms.
The collaboration platform Microsoft Teams is integrated into the Windows 11 user interface, and is accessible via the taskbar. Skype will no longer be bundled with the OS by default.
Microsoft claims performance improvements such as smaller update sizes, faster web browsing in "any browser", faster wake time from sleep mode, and faster Windows Hello authentication.
Windows 11 ships with the Chromium-based Microsoft Edge web browser (for compatibility with Google Chrome web browser), and does not include or support Internet Explorer. Its rendering engine MSHTML (Trident) is still included with the operating system for backwards compatibility reasons, and Edge can be configured with Group Policy to render whitelisted websites in "IE Mode" (which still uses IE's rendering engine MSHTML, instead of Blink layout engine). Windows 11 is the first version of Windows since the original retail release of Windows 95 to not ship with Internet Explorer.
The updated Xbox app, along with the Auto HDR and DirectStorage technologies introduced by the Xbox Series X and Series S, will be integrated into Windows 11; the latter requiring a graphics card supporting DirectX 12 and an NVMe solid-state drive.
User interface
A redesigned user interface is present frequently throughout the operating system, building upon Fluent Design System; translucency, shadows, a new color palette, and rounded geometry are prevalent throughout the UI. A prevalent aspect of the design is an appearance known as "Mica", described as an "opaque, dynamic material that incorporates theme and desktop wallpaper to paint the background of long-lived windows such as apps and settings". Much of the interface and start menu takes heavy inspiration from the now-canceled Windows 10X. The Segoe UI font used since Windows Vista has been updated to a variable version, improving its ability to scale between different display resolutions.
The taskbar's buttons are center-aligned by default, and it is permanently pinned to the bottom edge of the screen; it cannot be moved to the top, left, or right edges of the screen as in previous versions of Windows without manual changes to the registry. The notifications sidebar is now accessed by clicking the date and time, with other Quick Actions toggles, as well as volume, brightness, and media playback controls, moved to a new settings pop-up displayed by clicking on the system tray. The "Widgets" button on the taskbar displays a panel with Microsoft Start, a news aggregator with personalized stories and content (expanding upon the "news and interests" panel introduced in later builds of Windows 10). Microsoft Teams is similarly integrated with the taskbar, with a pop-up showing a list of recent conversations.
The Start menu has been significantly redesigned, replacing the "live tiles" used by Windows 8.x and 10 with a grid of "pinned" applications, and a list of recent applications and documents. File Explorer was updated to replace its ribbon toolbar with a more traditional toolbar, while its context menus have been redesigned to move some tasks (such as copy and paste) to a toolbar along the top of the menu, and hide other operations under an overflow menu.
Task View, a feature introduced in Windows 10, features a refreshed design, and supports giving separate wallpapers to each virtual desktop. The window snapping functionality has been enhanced with two additional features; hovering over a window's maximize button displays pre-determined "Snap Layouts" for tiling multiple windows onto a display, and tiled arrangement of windows can be minimized and restored from the taskbar as a "snap group". When a display is disconnected in a multi-monitor configuration, the windows that were previously on that display will be minimized rather than automatically moved to the main display. If the same display is reconnected, the windows are restored to their prior location.
Windows Subsystem for Android
On October 21, 2021, Windows Subsystem for Android (WSA) became available to Beta channel builds of Windows 11 for users in the United States, which allows users to install and run Android apps on their devices. Users can install Android apps through any source using the APK file format. An Amazon Appstore client for Microsoft Store will also be available.
WSA is based on the Intel Bridge runtime compiler; Intel stated that the technology is not dependent on its CPUs, and will also be supported on x86-64 and ARM CPUs from other vendors.
System security
As part of the minimum system requirements, Windows 11 only runs on devices with a Trusted Platform Module 2.0 security coprocessor. According to Microsoft, the TPM 2.0 coprocessor is a "critical building block" for protection against firmware and hardware attacks. In addition, Microsoft now requires devices with Windows 11 to include virtualization-based security (VBS), hypervisor-protected code integrity (HVCI), and Secure Boot built-in and enabled by default. The operating system also features hardware-enforced stack protection for supported Intel and AMD processors for protection against zero-day exploits.
Like its predecessor, Windows 11 also supports multi-factor authentication and biometric authentication through Windows Hello.
Versions
Windows 11 is available in two main editions; the Home edition, which is intended for consumer users, and the Pro edition, which contains additional networking and security features (such as BitLocker), as well as the ability to join a domain. Windows 11 Home may be restricted by default to verified software obtained from Microsoft Store ("S Mode"). Windows 11 Home requires an internet connection and a Microsoft account in order to complete first-time setup. In February 2022, it was announced that this restriction will also apply to Windows 11 Pro in the future.
Windows 11 SE was announced on November 9, 2021, as an edition exclusively for low-end devices sold in the education market, and a successor to Windows 10 S. It is designed to be managed via Microsoft Intune, and includes changes based on feedback from educators that simplify the user interface and reduce "distractions", such as Snap Layouts not containing layouts for more than two applications at once, all applications opening maximized by default, Widgets being completely removed, and Microsoft Edge being configured by default to allow extensions from the Chrome Web Store (primarily to target those migrating from Chrome OS). It is bundled with applications such as Microsoft Office for Microsoft 365, Minecraft Education Edition, and Flipgrid, while OneDrive is used to save files by default. Windows 11 SE does not include Microsoft Store; third-party software is provisioned or installed by administrators.
The Windows Insider program carries over from Windows 10, with pre-release builds divided into "Dev" (unstable builds used to test features for future feature updates), "Beta" (test builds for the next feature update; relatively stable in comparison to Dev channel), and "Release Preview" (pre-release builds for final testing of upcoming feature updates) channels.
Supported languages
Before the launch of Windows 11, OEMs (as well as mobile operators) and businesses were offered two options for device imaging: Component-Based Servicing lp.cab files (for the languages to be preloaded on the first boot) and Local Experience Pack .appx files (for the languages available for download on supported PCs). The 38 fully-localized Language Pack (LP) languages were available as both lp.cab and .appx packages, while the remaining 72 partially-localized Language Interface Pack (LIP) languages were only available as .appx packages.
With Windows 11, that process has changed. Five new LP languages were added — Catalan, Basque, Galician, Indonesian, and Vietnamese — bringing the total number of LP languages to 43. Furthermore, these 43 languages can only be imaged using lp.cab packages. This is to ensure a fully supported language-imaging and cumulative update experience.
The remaining 67 LIP languages that are LXP-based will move to a self-service model, and can only be added by Windows users themselves via the Microsoft Store and Windows Settings apps, not during the Windows imaging process. Any user, not just admins, can now add both the display language and its features, which can help users in business environments, but these exact options for languages (both LP and LIP) still depend on the OEM and mobile operator.
Available languages
These languages are either preloaded or available for download, depending on the OEM, region of purchase, and mobile operator.
For each of the manufacturers listed, Yes is displayed if the language is supported or available for download in at least one region, and No is displayed if it is not supported in any region.
Notes
Surface
The following languages are available for download on all 2021 and newer Surface devices regardless of the region:
Danish
German
English (Australia)
English (Canada)
English (United Kingdom)
English (United States)
Spanish (Spain)
Spanish (Mexico)
French (Canada)
French (France)
Italian
Dutch
Norwegian
Polish
Finnish
Swedish
Japanese
These additional languages are available for download exclusively in their respective markets, in addition to the above languages:
Americas: Portuguese (Brazil)
EMEA: Czech, Estonian, Croatian, Latvian, Lithuanian, Hungarian, Portuguese (Portugal), Romanian, Slovak, Slovenian, Turkish, Greek, Bulgarian, Arabic
Asia Pacific: Thai, Korean, Chinese (Simplified), Chinese (Traditional), Chinese (Hong Kong)
System requirements
The basic system requirements of Windows 11 differ significantly from Windows 10. Windows 11 only supports 64-bit systems such as those using an x86-64 or ARM64 processor; IA-32 processors are no longer supported. Thus, Windows 11 is the first consumer version of Windows not to support 32-bit processors (although Windows Server 2008 R2 was the first version of Windows NT to not support them). The minimum RAM and storage requirements were also increased; Windows 11 now requires at least 4GB of RAM and 64GB of storage. S mode is only supported for the Home edition of Windows 11. As of August 2021, the officially supported list of processors includes Intel Core 8th generation and later, AMD Zen+ and later (which include the "AF" revisions of Ryzen 1000 CPUs, which are underclocked versions of Zen+-based Ryzen 2000 parts that supplant Ryzen 1000 parts that could no longer be manufactured due to a change in process), and Qualcomm Snapdragon 850 and later. The compatibility list includes the Intel Core i7-7820HQ, a seventh-generation processor used by the Surface Studio 2, although only on devices that shipped with DCH-based drivers.
Legacy BIOS is no longer supported; a UEFI system with Secure Boot and a Trusted Platform Module (TPM) 2.0 security coprocessor is now required. The TPM requirement in particular has led to confusion, as many motherboards do not have TPM support, or require a compatible TPM to be physically installed onto the motherboard. Many newer CPUs also include a TPM implemented at the CPU level (with AMD referring to this as "fTPM", and Intel referring to it as "Platform Trust Technology" [PTT]), which might be disabled by default and require changing settings in the computer's UEFI firmware, or a UEFI firmware update that is configured to automatically enable the firmware TPM upon installation.
Original equipment manufacturers can still ship computers without a TPM 2.0 coprocessor upon Microsoft's approval. Devices with unsupported processors are not blocked from installing or running Windows 11; however, a clean install or upgrade using ISO installation media must be performed, as Windows Update will not offer an upgrade from Windows 10. Additionally, users must accept an on-screen disclaimer stating that they will not be entitled to receive updates, and that damage caused by using Windows 11 on an unsupported configuration is not covered by the manufacturer's warranty. Some third-party software may refuse to run on "unsupported" configurations of Windows 11.
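As a rough, unofficial illustration of how parts of these requirements can be checked, the Python sketch below calls two standard PowerShell cmdlets, Get-Tpm and Confirm-SecureBootUEFI, from Windows; it only reports TPM presence and Secure Boot state, does not verify the TPM specification version or the processor list, generally needs an elevated prompt, and is not a substitute for Microsoft's PC Health Check app.
import subprocess
def powershell(command):
    # Run a PowerShell command and return its textual output; the TPM and
    # Secure Boot queries below generally require an administrator prompt.
    result = subprocess.run(["powershell", "-NoProfile", "-Command", command],
                            capture_output=True, text=True)
    return (result.stdout or result.stderr).strip()
# Get-Tpm reports whether a TPM is present and ready for use.
print("TPM status:")
print(powershell("Get-Tpm | Select-Object TpmPresent, TpmReady | Format-List"))
# Confirm-SecureBootUEFI returns True when UEFI Secure Boot is enabled; it raises
# an error on legacy BIOS systems, which Windows 11 no longer supports.
print("Secure Boot enabled:", powershell("Confirm-SecureBootUEFI"))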
Reception
Pre-release
Reception of Windows 11 upon its reveal was positive, with critics praising the new design and productivity features. However, Microsoft was criticized for creating confusion over the minimum system requirements for Windows 11. The increased system requirements (compared to those of Windows 10) initially published by Microsoft meant that up to 60 percent of existing Windows 10 PCs were unable to upgrade to Windows 11, prompting concerns that the change would turn those devices into electronic waste.
It has been theorized that these system requirements were a measure intended to encourage the purchase of new PCs, especially amid a downturn in PC sales and increased prices due to the global chip shortage. While Microsoft has not specifically acknowledged this when discussing the cutoff, it was also acknowledged that the sixth and seventh generations of Intel Core processors were prominently afflicted by CPU-level security vulnerabilities such as Meltdown and Spectre, and that newer CPUs manufactured since then had increased mitigations against the flaws. Research Vice President of Gartner Stephen Kleynhans felt that Microsoft was "looking at the entire stack from the hardware up through the applications and the user experience and trying to make the entire stack work better and more securely".
Launch
Andrew Cunningham of Ars Technica praised the improvements to its visual design (describing the new "Mica" appearance as reminiscent of the visual appearance of iOS and macOS, and arguing that Microsoft had "[made] a serious effort" at making the user-facing aspects of Windows 11 more consistent visually), window management, performance (assessed as being equivalent to if not better than Windows 10), other "beneficial tweaks", and its system requirements having brought greater public attention to hardware security features present on modern PCs. Criticism was raised towards Widgets' lack of support for third-party content (thus limiting it to Microsoft services only), regressions in taskbar functionality and customization, the inability to easily select default applications for common tasks such as web browsing (now requiring the user to select the browser application for each file type individually), and Microsoft's unclear justification for its processor compatibility criteria. Cunningham concluded that "as I've dug into [Windows 11] and learned its ins and outs for this review, I've warmed to it more", but argued that the OS was facing similar "public perception" issues to Windows Vista and Windows 8. However, he noted that 11 did not have as many performance issues or bugs as Vista had upon its release, nor was as "disjointed" as 8, and recommended that users who were unsure about the upgrade should stay on Windows 10 in anticipation of future updates to 11.
Tom Warren of The Verge described Windows 11 as being akin to a house in the middle of renovations, but that "actually using Windows 11 for the past few months hasn't felt as controversial as I had expected"—praising its updated user interface as being more modern and reminiscent of iOS and Chrome OS, the new start menu for feeling less cluttered than the Windows 10 iteration, updates to some of its stock applications, and Snap Assist. Warren noted that he rarely used the Widgets panel or Microsoft Teams, citing that he preferred the weather display that later versions of Windows 10 offered, and didn't use Teams to communicate with his friends and family. He also acknowledged the expansion of Microsoft Store to include more "traditional" desktop applications. However, he felt that Windows 11 still felt like a work in progress, noting UI inconsistencies (such as dark mode and new context menu designs not being uniform across all dialogues and applications, and the modern Settings app still falling back upon legacy Control Panel applets for certain settings), regressions to the taskbar (including the inability to move it, drag files onto taskbar buttons to focus the corresponding application, and the clock only shown on the primary display in multi-monitor configurations), and promised features (such as dynamic refresh rate support and a universal microphone mute button) not being present on the initial release. Overall, he concluded that "I wouldn't rush out to upgrade to Windows 11, but I also wouldn't avoid it. After all, Windows 11 still feels familiar and underneath all the UI changes, it's the same Windows we've had for decades."
PC World was more critical, arguing that Windows 11 "sacrifices productivity for personality, but without cohesion", commenting upon changes such as the inability to use local "offline" accounts on Windows 11 Home, regressions to the taskbar, a "functionally worse" start menu, Microsoft Teams integration having privacy implications and being a ploy to coerce users into switching to the service, File Explorer obscuring common functions under unclear icons, using "terribly sleazy" behaviors to discourage changing the default web browser from Microsoft Edge, and that the OS "anecdotally feels less responsive, slower, and heavier than Windows 10." It was concluded that Windows 11 "feels practical and productive, but less so than its predecessor in many aspects", while its best features were either "hidden deeper within", required specific hardware (DirectStorage, Auto HDR) or were not available on launch (Android app support).
See also
List of operating systems
References
External links
Windows 11 release information from Microsoft
11
2021 software
Android (operating system)
ARM operating systems
Computer-related introductions in 2021
Proprietary operating systems
Tablet operating systems
X86-64 operating systems |
239098 | https://en.wikipedia.org/wiki/BitTorrent | BitTorrent | BitTorrent is a communication protocol for peer-to-peer file sharing (P2P), which enables users to distribute data and electronic files over the Internet in a decentralized manner.
To send or receive files, a person uses a BitTorrent client on their Internet-connected computer. A BitTorrent client is a computer program that implements the BitTorrent protocol. BitTorrent clients are available for a variety of computing platforms and operating systems, including an official client released by BitTorrent, Inc. Popular clients include μTorrent, Xunlei Thunder, Transmission, qBittorrent, Vuze, Deluge, BitComet and Tixati. BitTorrent trackers provide a list of files available for transfer and allow the client to find peer users, known as "seeds", who may transfer the files.
Programmer Bram Cohen designed the protocol in April 2001, and released the first available version on 2 July 2001. On 15 May 2017, BitTorrent, Inc. (later renamed Rainberry, Inc.) released BitTorrent v2 protocol specification. libtorrent was updated to support the new version on 6 September 2020.
BitTorrent is one of the most common protocols for transferring large files, such as digital video files containing TV shows and video clips, or digital audio files containing songs. At one point BitTorrent was responsible for 3.35% of all worldwide bandwidth, more than half of the 6% of total bandwidth dedicated to file sharing. In 2019, BitTorrent was a dominant file sharing protocol and generated a substantial amount of Internet traffic, with 2.46% of downstream and 27.58% of upstream traffic. BitTorrent has been estimated to have 15–27 million concurrent users at any time and to be utilized by 150 million active users; based on the latter figure, the total number of monthly users may be estimated at more than a quarter of a billion (≈ 250 million).
The use of BitTorrent may sometimes be limited by Internet Service Providers (ISPs), on legal or copyright grounds. Users may choose to run seedboxes or Virtual Private Networks (VPNs) to circumvent these restrictions.
History
Programmer Bram Cohen, a University at Buffalo alumnus, designed the protocol in April 2001, and released the first available version on 2 July 2001.
The first release of the BitTorrent client had no search engine and no peer exchange. Up until 2005, the only way to share files was by creating a small text file called a "torrent", which the uploader would post to a torrent index site. The first uploader acted as a seed, and downloaders would initially connect as peers. Those who wished to download the file would download the torrent, which their client would use to connect to a tracker that had a list of the IP addresses of other seeds and peers in the swarm. Once a peer completed a download of the complete file, it could in turn function as a seed. These torrent files contain metadata about the files to be shared and about the trackers which keep track of the other seeds and peers.
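As a hedged illustration of the metadata such torrents carry, the following minimal Python bencode decoder reads a .torrent file and prints its announce URL and name; the file name example.torrent is a placeholder, the decoder handles only well-formed input, and real clients use far more robust parsers.
def bdecode(data, i=0):
    # Minimal bencode decoder: integers i...e, byte strings <len>:<bytes>,
    # lists l...e and dictionaries d...e, as used by .torrent files.
    c = data[i:i + 1]
    if c == b"i":
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c in (b"l", b"d"):
        items, i = [], i + 1
        while data[i:i + 1] != b"e":
            value, i = bdecode(data, i)
            items.append(value)
        if c == b"l":
            return items, i + 1
        return dict(zip(items[0::2], items[1::2])), i + 1
    colon = data.index(b":", i)
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length
with open("example.torrent", "rb") as f:   # placeholder torrent file
    meta, _ = bdecode(f.read())
print(meta.get(b"announce"), meta[b"info"][b"name"])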
In 2005, first Vuze and then the BitTorrent client introduced distributed tracking using distributed hash tables which allowed clients to exchange data on swarms directly without the need for a torrent file.
In 2006, peer exchange functionality was added allowing clients to add peers based on the data found on connected nodes.
BitTorrent v2 is intended to work seamlessly with previous versions of the BitTorrent protocol. The main reason for the update was that the old cryptographic hash function, SHA-1, is no longer considered safe from malicious attacks by the developers, and as such, v2 uses SHA-256. To ensure backwards compatibility, the v2 .torrent file format supports a hybrid mode where the torrents are hashed through both the new method and the old method, with the intent that the files will be shared with peers on both v1 and v2 swarms. Another update to the specification is adding a hash tree to speed up the time from adding a torrent to downloading files, and to allow more granular checks for file corruption. In addition, each file is now hashed individually, enabling files in the swarm to be deduplicated, so that if multiple torrents include the same files but seeders are only seeding some of those torrents, downloaders of the other torrents can still download the file. Magnet links for v2 also support a hybrid mode to ensure support for legacy clients.
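The following Python sketch gives a rough idea of such per-file hashing: it computes a SHA-256 Merkle root over 16 KiB blocks of a file, which is a simplification of the v2 scheme rather than a faithful implementation (the real specification defines exact padding rules and per-piece layers that are omitted here), and the file name example.bin is a placeholder.
import hashlib
BLOCK = 16 * 1024  # leaf block size used here for illustration
def merkle_root(data: bytes) -> bytes:
    # Hash fixed-size leaf blocks with SHA-256, then repeatedly hash adjacent
    # pairs of nodes together until a single root digest remains.
    layer = [hashlib.sha256(data[i:i + BLOCK]).digest()
             for i in range(0, len(data), BLOCK)] or [hashlib.sha256(b"").digest()]
    while len(layer) > 1:
        if len(layer) % 2:
            layer.append(bytes(32))  # simplistic padding, not the exact v2 rule
        layer = [hashlib.sha256(layer[i] + layer[i + 1]).digest()
                 for i in range(0, len(layer), 2)]
    return layer[0]
with open("example.bin", "rb") as f:  # placeholder file
    print(merkle_root(f.read()).hex())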
Design
The BitTorrent protocol can be used to reduce the server and network impact of distributing large files. Rather than downloading a file from a single source server, the BitTorrent protocol allows users to join a "swarm" of hosts to upload and download from each other simultaneously. The protocol is an alternative to the older single source, multiple mirror sources technique for distributing data, and can work effectively over networks with lower bandwidth. Using the BitTorrent protocol, several basic computers, such as home computers, can replace large servers while efficiently distributing files to many recipients. This lower bandwidth usage also helps prevent large spikes in internet traffic in a given area, keeping internet speeds higher for all users in general, regardless of whether or not they use the BitTorrent protocol.
The file being distributed is divided into segments called pieces. As each peer receives a new piece of the file, it becomes a source (of that piece) for other peers, relieving the original seed from having to send that piece to every computer or user wishing a copy. With BitTorrent, the task of distributing the file is shared by those who want it; it is entirely possible for the seed to send only a single copy of the file itself and eventually distribute to an unlimited number of peers. Each piece is protected by a cryptographic hash contained in the torrent descriptor. This ensures that any modification of the piece can be reliably detected, and thus prevents both accidental and malicious modifications of any of the pieces received at other nodes. If a node starts with an authentic copy of the torrent descriptor, it can verify the authenticity of the entire file it receives.
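A hedged sketch of this per-piece verification in Python is shown below: it checks a downloaded file against a concatenated string of 20-byte SHA-1 piece hashes of the kind stored in a v1 torrent descriptor; the piece length, hash string and file path are placeholder inputs supplied by the caller rather than values parsed from a real .torrent file.
import hashlib
def verify_pieces(path, piece_length, pieces):
    # 'pieces' is the concatenation of 20-byte SHA-1 digests, one per piece,
    # as found in the "pieces" field of a v1 torrent's info dictionary.
    expected = [pieces[i:i + 20] for i in range(0, len(pieces), 20)]
    bad = []
    with open(path, "rb") as f:
        for index, digest in enumerate(expected):
            piece = f.read(piece_length)
            if hashlib.sha1(piece).digest() != digest:
                bad.append(index)  # corrupted or incomplete piece
    return bad
# Example call with placeholder values:
# corrupt_pieces = verify_pieces("example.bin", 262144, pieces_bytes)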
Pieces are typically downloaded non-sequentially, and are rearranged into the correct order by the BitTorrent client, which monitors which pieces it needs, and which pieces it has and can upload to other peers. Pieces are of the same size throughout a single download (for example, a 10 MB file may be transmitted as ten 1 MB pieces or as forty 256 KB pieces).
Due to the nature of this approach, the download of any file can be halted at any time and be resumed at a later date, without the loss of previously downloaded information, which in turn makes BitTorrent particularly useful in the transfer of larger files. This also enables the client to seek out readily available pieces and download them immediately, rather than halting the download and waiting for the next (and possibly unavailable) piece in line, which typically reduces the overall time of the download. As peers finish downloading they become seeders, and this eventual transition from peers to seeders determines the overall "health" of the file (as determined by the number of times a file is available in its complete form).
The distributed nature of BitTorrent can lead to a flood-like spreading of a file throughout many peer computer nodes. As more peers join the swarm, the likelihood of a successful download by any particular node increases. Relative to traditional Internet distribution schemes, this permits a significant reduction in the original distributor's hardware and bandwidth resource costs. Distributed downloading protocols in general provide redundancy against system problems, reduce dependence on the original distributor, and provide sources for the file which are generally transient and therefore there is no single point of failure as in one way server-client transfers.
Though both ultimately transfer files over a network, a BitTorrent download differs from a one way server-client download (as is typical with an HTTP or FTP request, for example) in several fundamental ways:
BitTorrent makes many small data requests over different IP connections to different machines, while server-client downloading is typically made via a single TCP connection to a single machine.
BitTorrent downloads pieces in a random order or in a "rarest-first" approach that ensures high availability, while classic downloads are sequential (a simplified sketch of rarest-first selection is shown below).
Taken together, these differences allow BitTorrent to achieve much lower cost to the content provider, much higher redundancy, and much greater resistance to abuse or to "flash crowds" than regular server software. However, this protection, theoretically, comes at a cost: downloads can take time to rise to full speed because it may take time for enough peer connections to be established, and it may take time for a node to receive sufficient data to become an effective uploader. This contrasts with regular downloads (such as from an HTTP server, for example) that, while more vulnerable to overload and abuse, rise to full speed very quickly, and maintain this speed throughout. In the beginning, BitTorrent's non-contiguous download methods made it harder to support "streaming playback". In 2014, the client Popcorn Time allowed for streaming of BitTorrent video files. Since then, more and more clients are offering streaming options.
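A simplified sketch of the "rarest-first" selection mentioned above is given below in Python; real clients combine it with additional policies (such as a random first piece and an endgame mode) that this illustration omits, and the peer bitfields here are plain Python sets rather than the wire-format bitfields actually exchanged.
import random
from collections import Counter
def pick_rarest_piece(wanted, peer_bitfields):
    # Count how many connected peers advertise each piece we still need,
    # then choose among the pieces with the fewest providers ("rarest first").
    availability = Counter()
    for bitfield in peer_bitfields:          # one set of piece indices per peer
        for index in bitfield:
            if index in wanted:
                availability[index] += 1
    if not availability:
        return None                          # no peer has anything we need yet
    rarest = min(availability.values())
    candidates = [i for i, n in availability.items() if n == rarest]
    return random.choice(candidates)         # break ties randomly
# Example: we still need pieces 0-4; three peers advertise different subsets.
print(pick_rarest_piece({0, 1, 2, 3, 4}, [{0, 1, 2}, {1, 2, 3}, {2, 3, 4}]))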
Searching
The BitTorrent protocol provides no way to index torrent files. As a result, a comparatively small number of websites have hosted a large majority of torrents, many linking to copyrighted works without the authorization of copyright holders, rendering those sites especially vulnerable to lawsuits. A BitTorrent index is a "list of .torrent files, which typically includes descriptions" and information about the torrent's content. Several types of websites support the discovery and distribution of data on the BitTorrent network. Public torrent-hosting sites such as The Pirate Bay allow users to search and download from their collection of torrent files. Users can typically also upload torrent files for content they wish to distribute. Often, these sites also run BitTorrent trackers for their hosted torrent files, but these two functions are not mutually dependent: a torrent file could be hosted on one site and tracked by another unrelated site. Private host/tracker sites operate like public ones except that they may restrict access to registered users and may also keep track of the amount of data each user uploads and downloads, in an attempt to reduce "leeching".
Web search engines allow the discovery of torrent files that are hosted and tracked on other sites; examples include The Pirate Bay and BTDigg. These sites allow the user to ask for content meeting specific criteria (such as containing a given word or phrase) and retrieve a list of links to torrent files matching those criteria. This list can often be sorted with respect to several criteria, relevance (seeders-leechers ratio) being one of the most popular and useful (due to the way the protocol behaves, the download bandwidth achievable is very sensitive to this value). Metasearch engines allow one to search several BitTorrent indices and search engines at once.
The Tribler BitTorrent client was among the first to incorporate built-in search capabilities. With Tribler, users can find .torrent files held by random peers and taste buddies. It adds such an ability to the BitTorrent protocol using a gossip protocol, somewhat similar to the eXeem network which was shut down in 2005. The software includes the ability to recommend content as well. After a dozen downloads, the Tribler software can roughly estimate the download taste of the user, and recommend additional content.
In May 2007, researchers at Cornell University published a paper proposing a new approach to searching a peer-to-peer network for inexact strings, which could replace the functionality of a central indexing site. A year later, the same team implemented the system as a plugin for Vuze called Cubit and published a follow-up paper reporting its success.
A somewhat similar facility but with a slightly different approach is provided by the BitComet client through its "Torrent Exchange" feature. Whenever two peers using BitComet (with Torrent Exchange enabled) connect to each other they exchange lists of all the torrents (name and info-hash) they have in the Torrent Share storage (torrent files which were previously downloaded and for which the user chose to enable sharing by Torrent Exchange). Thus each client builds up a list of all the torrents shared by the peers it connected to in the current session (or it can even maintain the list between sessions if instructed).
At any time the user can search the Torrent Collection list for a certain torrent and sort the list by categories. When the user chooses to download a torrent from that list, the .torrent file is automatically searched for (by info-hash value) in the DHT network; once found, it is downloaded by the querying client, which can then create and start a download task.
Downloading and sharing
Users find a torrent of interest on a torrent index site or by using a search engine built into the client, download it, and open it with a BitTorrent client. The client connects to the tracker(s) or seeds specified in the torrent file, from which it receives a list of seeds and peers currently transferring pieces of the file(s). The client connects to those peers to obtain the various pieces. If the swarm contains only the initial seeder, the client connects directly to it, and begins to request pieces. Clients incorporate mechanisms to optimize their download and upload rates.
The effectiveness of this data exchange depends largely on the policies that clients use to determine to whom to send data. Clients may prefer to send data to peers that send data back to them (a "tit for tat" exchange scheme), which encourages fair trading. But strict policies often result in suboptimal situations, such as when newly joined peers are unable to receive any data because they don't have any pieces yet to trade themselves or when two peers with a good connection between them do not exchange data simply because neither of them takes the initiative. To counter these effects, the official BitTorrent client program uses a mechanism called "optimistic unchoking", whereby the client reserves a portion of its available bandwidth for sending pieces to random peers (not necessarily known good partners, so called preferred peers) in hopes of discovering even better partners and to ensure that newcomers get a chance to join the swarm.
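A highly simplified sketch of such a policy in Python (the peer records and transfer rates are hypothetical): the client keeps a few upload slots for the peers that have recently sent it the most data, and spends one extra slot on a randomly chosen interested peer as the optimistic unchoke, which is how newcomers with nothing to trade can still receive their first pieces.

import random

def choose_unchoked_peers(peers, regular_slots=3):
    # Each peer record: {"id": ..., "rate_to_us": bytes/sec received, "interested": bool}.
    interested = [p for p in peers if p["interested"]]
    # "Tit for tat": reward the peers that have been uploading the most to us.
    by_rate = sorted(interested, key=lambda p: p["rate_to_us"], reverse=True)
    unchoked = by_rate[:regular_slots]
    # Optimistic unchoke: give one other interested peer a chance at random.
    remaining = [p for p in interested if p not in unchoked]
    if remaining:
        unchoked.append(random.choice(remaining))
    return [p["id"] for p in unchoked]

peers = [
    {"id": "A", "rate_to_us": 120000, "interested": True},
    {"id": "B", "rate_to_us": 80000, "interested": True},
    {"id": "C", "rate_to_us": 0, "interested": True},   # a newcomer with no pieces yet
    {"id": "D", "rate_to_us": 50000, "interested": False},
]
print(choose_unchoked_peers(peers, regular_slots=2))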
Although "swarming" scales well to tolerate "flash crowds" for popular content, it is less useful for unpopular or niche market content. Peers arriving after the initial rush might find the content unavailable and need to wait for the arrival of a "seed" in order to complete their downloads. The seed arrival, in turn, may take a long time to happen (this is termed the "seeder promotion problem"). Since maintaining seeds for unpopular content entails high bandwidth and administrative costs, this runs counter to the goals of publishers that value BitTorrent as a cheap alternative to a client-server approach. This occurs on a huge scale; measurements have shown that 38% of all new torrents become unavailable within the first month. A strategy adopted by many publishers which significantly increases availability of unpopular content consists of bundling multiple files in a single swarm. More sophisticated solutions have also been proposed; generally, these use cross-torrent mechanisms through which multiple torrents can cooperate to better share content.
Creating and publishing
The peer distributing a data file treats the file as a number of identically sized pieces, usually with byte sizes of a power of 2, and typically between 32 kB and 16 MB each. The peer creates a hash for each piece, using the SHA-1 hash function, and records it in the torrent file. Piece sizes greater than 512 kB reduce the size of a torrent file for a very large payload, but are claimed to reduce the efficiency of the protocol. When another peer later receives a particular piece, the hash of the piece is compared to the recorded hash to test that the piece is error-free. Peers that provide a complete file are called seeders, and the peer providing the initial copy is called the initial seeder. The exact information contained in the torrent file depends on the version of the BitTorrent protocol.
By convention, the name of a torrent file has the suffix .torrent. Torrent files have an "announce" section, which specifies the URL of the tracker, and an "info" section, containing (suggested) names for the files, their lengths, the piece length used, and a SHA-1 hash code for each piece, all of which are used by clients to verify the integrity of the data they receive. Though SHA-1 has shown signs of cryptographic weakness, Bram Cohen did not initially consider the risk big enough for a backward incompatible change to, for example, SHA-3. As of BitTorrent v2 the hash function has been updated to SHA-256.
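The following Python sketch (standard library only; the bencoding that would wrap the final dictionary is omitted) shows the essence of what goes into the "info" section: the payload is split into fixed-size pieces, each piece is hashed with SHA-1, and the concatenated hashes are what later let a downloader verify every piece it receives.

import hashlib
import os

PIECE_LENGTH = 256 * 1024  # an illustrative 256 kB; real torrents range from 32 kB to 16 MB

def build_info_dict(path):
    # Split the payload into fixed-size pieces and record a SHA-1 hash for each.
    piece_hashes = []
    with open(path, "rb") as f:
        while True:
            piece = f.read(PIECE_LENGTH)
            if not piece:
                break
            piece_hashes.append(hashlib.sha1(piece).digest())
    return {
        "name": os.path.basename(path),
        "length": os.path.getsize(path),
        "piece length": PIECE_LENGTH,
        "pieces": b"".join(piece_hashes),  # 20 bytes of SHA-1 per piece
    }

def piece_is_valid(info, index, data):
    # A downloader recomputes the hash of a received piece and compares it
    # with the hash recorded in the torrent file.
    expected = info["pieces"][index * 20:(index + 1) * 20]
    return hashlib.sha1(data).digest() == expected

In an actual torrent file this dictionary would be bencoded and stored alongside the "announce" URL.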
In the early days, torrent files were typically published to torrent index websites, and registered with at least one tracker. The tracker maintained lists of the clients currently connected to the swarm. Alternatively, in a trackerless system (decentralized tracking) every peer acts as a tracker. Azureus was the first BitTorrent client to implement such a system through the distributed hash table (DHT) method. An alternative and incompatible DHT system, known as Mainline DHT, was released in the Mainline BitTorrent client three weeks later (though it had been in development since 2002) and subsequently adopted by the μTorrent, Transmission, rTorrent, KTorrent, BitComet, and Deluge clients.
After the DHT was adopted, a "private" flag – analogous to the broadcast flag – was unofficially introduced, telling clients to restrict the use of decentralized tracking regardless of the user's desires. The flag is intentionally placed in the info section of the torrent so that it cannot be disabled or removed without changing the identity of the torrent. The purpose of the flag is to prevent torrents from being shared with clients that do not have access to the tracker. The flag was requested for inclusion in the official specification in August 2008, but has not been accepted yet. Clients that have ignored the private flag were banned by many trackers, discouraging the practice.
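To see why stripping the flag changes the torrent's identity, the sketch below computes an infohash over the "info" dictionary; the tiny bencoder and the example dictionary are included purely for illustration and are not the reference implementation.

import hashlib

def bencode(value):
    # Minimal bencoder, just enough for this example.
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, str):
        return bencode(value.encode())
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        out = b"d"
        for key in sorted(value):  # bencoded dictionary keys must be sorted
            out += bencode(key) + bencode(value[key])
        return out + b"e"
    raise TypeError(value)

def infohash(info):
    return hashlib.sha1(bencode(info)).hexdigest()

info = {"name": "example.iso", "piece length": 262144, "length": 1,
        "pieces": b"\x00" * 20, "private": 1}
public = {k: v for k, v in info.items() if k != "private"}
print(infohash(info))
print(infohash(public))  # different hash, so it is effectively a different torrent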
Anonymity
BitTorrent does not, on its own, offer its users anonymity. One can usually see the IP addresses of all peers in a swarm in one's own client or firewall program. This may expose users with insecure systems to attacks. In some countries, copyright organizations scrape lists of peers, and send takedown notices to the internet service provider of users participating in the swarms of files that are under copyright. In some jurisdictions, copyright holders may launch lawsuits against uploaders or downloaders for infringement, and police may arrest suspects in such cases.
Various means have been used to promote anonymity. For example, the BitTorrent client Tribler makes available a Tor-like onion network, optionally routing transfers through other peers to obscure which client has requested the data. The exit node would be visible to peers in a swarm, but the Tribler organization provides exit nodes. One advantage of Tribler is that clearnet torrents can be downloaded with only a small decrease in download speed from one "hop" of routing.
i2p provides a similar anonymity layer although in that case, one can only download torrents that have been uploaded to the i2p network. The BitTorrent client Vuze allows users who are not concerned about anonymity to take clearnet torrents, and make them available on the i2p network.
Most BitTorrent clients are not designed to provide anonymity when used over Tor, and there is some debate as to whether torrenting over Tor acts as a drag on the network.
Private torrent trackers are usually invitation only, and require members to participate in uploading, but have the downside of a single centralized point of failure. Oink's Pink Palace and What.cd are examples of private trackers which have been shut down.
Seedbox services download the torrent files first to the company's servers, allowing the user to download the file directly from there. One's IP address would be visible to the seedbox provider, but not to third parties.
Virtual private networks encrypt transfers, and substitute a different IP address for the user's, so that anyone monitoring a torrent swarm will only see that address.
Associated technologies
Distributed trackers
On 2 May 2005, Azureus 2.3.0.0 (now known as Vuze) was released, introducing support for "trackerless" torrents through a system called the "distributed database." This system is a distributed hash table (DHT) implementation which allows the client to use torrents that do not have a working BitTorrent tracker; instead, only a bootstrapping server is needed (router.bittorrent.com, dht.transmissionbt.com or router.utorrent.com). The following month, BitTorrent, Inc. released version 4.2.0 of the Mainline BitTorrent client, which supported an alternative DHT implementation (popularly known as "Mainline DHT", outlined in a draft on their website) that is incompatible with that of Azureus. In 2014, measurement showed concurrent users of Mainline DHT to be from 10 million to 25 million, with a daily churn of at least 10 million.
Current versions of the official BitTorrent client, μTorrent, BitComet, Transmission and BitSpirit all share compatibility with Mainline DHT. Both DHT implementations are based on Kademlia. As of version 3.0.5.0, Azureus also supports Mainline DHT in addition to its own distributed database through use of an optional application plugin. This potentially allows the Azureus/Vuze client to reach a bigger swarm.
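Both implementations therefore share Kademlia's basic idea: node IDs and infohashes live in the same 160-bit identifier space, and "closeness" between two identifiers is their XOR. A rough Python illustration (the node names are made up):

import hashlib

def node_id(seed):
    # IDs occupy the same 160-bit space as SHA-1 infohashes.
    return int.from_bytes(hashlib.sha1(seed.encode()).digest(), "big")

def xor_distance(a, b):
    return a ^ b

target = node_id("some-infohash")
nodes = [node_id("node-%d" % i) for i in range(8)]
# A real lookup iteratively queries the closest known nodes for even closer ones;
# here we simply sort the nodes we already know by XOR distance to the target.
closest = sorted(nodes, key=lambda n: xor_distance(n, target))[:3]
print([hex(n)[:12] for n in closest])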
Another idea that has surfaced in Vuze is that of virtual torrents. This idea is based on the distributed tracker approach and is used to describe some web resource. Currently, it is used for instant messaging. It is implemented using a special messaging protocol and requires an appropriate plugin. Anatomic P2P is another approach, which uses a decentralized network of nodes that route traffic to dynamic trackers. Most BitTorrent clients also use Peer exchange (PEX) to gather peers in addition to trackers and DHT. Peer exchange checks with known peers to see if they know of any other peers. With the 3.0.5.0 release of Vuze, all major BitTorrent clients now have compatible peer exchange.
Web seeding
Web "seeding" was implemented in 2006 as the ability of BitTorrent clients to download torrent pieces from an HTTP source in addition to the "swarm". The advantage of this feature is that a website may distribute a torrent for a particular file or batch of files and make those files available for download from that same web server; this can simplify long-term seeding and load balancing through the use of existing, cheap, web hosting setups. In theory, this would make using BitTorrent almost as easy for a web publisher as creating a direct HTTP download. In addition, it would allow the "web seed" to be disabled if the swarm becomes too popular while still allowing the file to be readily available. This feature has two distinct specifications, both of which are supported by Libtorrent and the 26+ clients that use it.
The first was created by John "TheSHAD0W" Hoffman, who created BitTornado. This first specification requires running a web service that serves content by info-hash and piece number, rather than filename.
The other specification was created by the GetRight authors and can rely on a basic HTTP download space (using byte serving).
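Because that approach relies on ordinary HTTP byte serving, a client can fetch an individual piece from a web seed with a plain Range request; a sketch using only the Python standard library (the URL and offsets are placeholders):

import urllib.request

def fetch_piece_from_webseed(url, piece_index, piece_length):
    start = piece_index * piece_length
    end = start + piece_length - 1
    request = urllib.request.Request(url, headers={"Range": "bytes=%d-%d" % (start, end)})
    # Any web server that honours Range requests can act as a web seed this way.
    with urllib.request.urlopen(request) as response:
        return response.read()

# Hypothetical usage: fetch piece 5 of a payload that uses 256 kB pieces.
# data = fetch_piece_from_webseed("https://example.com/files/payload.iso", 5, 256 * 1024)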
In September 2010, a new service named Burnbit was launched which generates a torrent from any URL using webseeding. There are server-side solutions that provide initial seeding of the file from the web server via standard BitTorrent protocol and when the number of external seeders reach a limit, they stop serving the file from the original source.
RSS feeds
A technique called broadcatching combines RSS feeds with the BitTorrent protocol to create a content delivery system, further simplifying and automating content distribution. Steve Gillmor explained the concept in a column for Ziff-Davis in December 2003. The discussion spread quickly among bloggers (Ernest Miller, Chris Pirillo, etc.). In an article entitled Broadcatching with BitTorrent, Scott Raymond explained:
The RSS feed will track the content, while BitTorrent ensures content integrity with cryptographic hashing of all data, so feed subscribers will receive uncorrupted content. One of the first and most popular software clients (free and open source) for broadcatching is Miro. Other free software clients such as PenguinTV and KatchTV also support broadcatching. The BitTorrent web-service MoveDigital added the ability to make torrents available to any web application capable of parsing XML through its standard REST-based interface in 2006, though this has since been discontinued. Additionally, Torrenthut is developing a similar torrent API that will provide the same features, and help bring the torrent community to Web 2.0 standards. Alongside this release is a first PHP application built using the API called PEP, which will parse any Really Simple Syndication (RSS 2.0) feed and automatically create and seed a torrent for each enclosure found in that feed.
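At its core, broadcatching is just polling an RSS 2.0 feed and handing any torrent enclosures it finds to a BitTorrent client; a minimal sketch with the Python standard library (the feed URL is a placeholder):

import urllib.request
import xml.etree.ElementTree as ET

def torrent_enclosures(feed_url):
    # Download the RSS 2.0 feed and collect enclosure URLs that point at torrents.
    with urllib.request.urlopen(feed_url) as response:
        root = ET.fromstring(response.read())
    links = []
    for item in root.iter("item"):
        for enclosure in item.iter("enclosure"):
            url = enclosure.get("url", "")
            mime = enclosure.get("type", "")
            if mime == "application/x-bittorrent" or url.endswith(".torrent"):
                links.append(url)
    return links

# Each returned URL would then be passed to a BitTorrent client for download.
# for url in torrent_enclosures("https://example.com/show.rss"):
#     print(url)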
Throttling and encryption
Since BitTorrent makes up a large proportion of total traffic, some ISPs have chosen to "throttle" (slow down) BitTorrent transfers. For this reason, methods have been developed to disguise BitTorrent traffic in an attempt to thwart these efforts. Protocol header encryption (PHE) and Message stream encryption/Protocol encryption (MSE/PE) are features of some BitTorrent clients that attempt to make BitTorrent hard to detect and throttle. As of November 2015, Vuze, Bitcomet, KTorrent, Transmission, Deluge, μTorrent, MooPolice, Halite, qBittorrent, rTorrent, and the latest official BitTorrent client (v6) support MSE/PE encryption.
In August 2007, Comcast was preventing BitTorrent seeding by monitoring and interfering with the communication between peers. Protection against these efforts is provided by proxying the client-tracker traffic via an encrypted tunnel to a point outside of the Comcast network. In 2008, Comcast called a "truce" with BitTorrent, Inc. with the intention of shaping traffic in a protocol-agnostic manner. Questions about the ethics and legality of Comcast's behavior have led to renewed debate about net neutrality in the United States. In general, although encryption can make it difficult to determine what is being shared, BitTorrent is vulnerable to traffic analysis. Thus, even with MSE/PE, it may be possible for an ISP to recognize BitTorrent and also to determine that a system is no longer downloading but only uploading data, and terminate its connection by injecting TCP RST (reset flag) packets.
Multitrackers
Another unofficial feature is an extension to the BitTorrent metadata format proposed by John Hoffman and implemented by several indexing websites. It allows the use of multiple trackers per file, so if one tracker fails, others can continue to support file transfer. It is implemented in several clients, such as BitComet, BitTornado, BitTorrent, KTorrent, Transmission, Deluge, μTorrent, rtorrent, Vuze, and Frostwire. Trackers are placed in groups, or tiers, with a tracker randomly chosen from the top tier and tried, moving to the next tier if all the trackers in the top tier fail.
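A sketch of the intended client behaviour in Python (the announce-list below is made up): shuffle within a tier, try each tracker, and only fall back to the next tier once every tracker in the current one has failed.

import random

def announce(announce_list, try_tracker):
    # announce_list: a list of tiers, each tier being a list of tracker URLs.
    for tier in announce_list:
        trackers = tier[:]
        random.shuffle(trackers)          # choose randomly within a tier
        for url in trackers:
            try:
                return try_tracker(url)   # first successful announce wins
            except OSError:
                continue                  # move on to the next tracker in this tier
    raise RuntimeError("all trackers in all tiers failed")

# Hypothetical announce-list with two tiers.
announce_list = [
    ["http://tracker-a.example/announce", "http://tracker-b.example/announce"],
    ["udp://backup.example:6969/announce"],
]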
Torrents with multiple trackers can decrease the time it takes to download a file, but also have a few consequences:
Poorly implemented clients may contact multiple trackers, leading to more overhead-traffic.
Torrents from closed trackers suddenly become downloadable by non-members, as they can connect to a seed via an open tracker.
Peer selection
BitTorrent, Inc. was working with Oversi on new Policy Discover Protocols that query the ISP for capabilities and network architecture information. Oversi's ISP hosted NetEnhancer box is designed to "improve peer selection" by helping peers find local nodes, improving download speeds while reducing the loads into and out of the ISP's network.
Implementations
The BitTorrent specification is free to use and many clients are open source, so BitTorrent clients have been created for all common operating systems using a variety of programming languages. The official BitTorrent client, μTorrent, qBittorrent, Transmission, Vuze, and BitComet are some of the most popular clients.
Some BitTorrent implementations such as MLDonkey and Torrentflux are designed to run as servers. For example, this can be used to centralize file sharing on a single dedicated server which users share access to on the network. Server-oriented BitTorrent implementations can also be hosted by hosting providers at co-located facilities with high bandwidth Internet connectivity (e.g., a datacenter) which can provide dramatic speed benefits over using BitTorrent from a regular home broadband connection. Services such as ImageShack can download files on BitTorrent for the user, allowing them to download the entire file by HTTP once it is finished.
The Opera web browser supports BitTorrent natively. Brave web browser ships with an extension which supports WebTorrent, a BitTorrent-like protocol based on WebRTC instead of UDP and TCP. BitLet allowed users to download Torrents directly from their browser using a Java applet (until browsers removed support for Java applets). An increasing number of hardware devices are being made to support BitTorrent. These include routers and NAS devices containing BitTorrent-capable firmware like OpenWrt. Proprietary versions of the protocol which implement DRM, encryption, and authentication are found within managed clients such as Pando.
Adoption
A growing number of individuals and organizations are using BitTorrent to distribute their own or licensed works (e.g. indie bands distributing digital files of their new songs). Independent adopters report that BitTorrent technology reduces demands on private networking hardware and bandwidth, which is essential for non-profit groups with large amounts of internet traffic.
Some uses of BitTorrent for file sharing may violate laws in some jurisdictions (see legislation section).
Film, video, and music
BitTorrent Inc. has obtained a number of licenses from Hollywood studios for distributing popular content from their websites.
Sub Pop Records releases tracks and videos via BitTorrent Inc. to distribute its 1000+ albums. Babyshambles and The Libertines (both bands associated with Pete Doherty) have extensively used torrents to distribute hundreds of demos and live videos. US industrial rock band Nine Inch Nails frequently distributes albums via BitTorrent.
Podcasting software is starting to integrate BitTorrent to help podcasters deal with the download demands of their MP3 "radio" programs. Specifically, Juice and Miro (formerly known as Democracy Player) support automatic processing of .torrent files from RSS feeds. Similarly, some BitTorrent clients, such as μTorrent, are able to process web feeds and automatically download content found within them.
DGM Live purchases are provided via BitTorrent.
VODO, a service which distributes "free-to-share" movies and TV shows via BitTorrent.
Broadcasters
In 2008, the CBC became the first public broadcaster in North America to make a full show (Canada's Next Great Prime Minister) available for download using BitTorrent.
The Norwegian Broadcasting Corporation (NRK) has since March 2008 experimented with bittorrent distribution, available online. Only selected works in which NRK owns all royalties are published. Responses have been very positive, and NRK is planning to offer more content.
The Dutch VPRO broadcasting organization released four documentaries in 2009 and 2010 under a Creative Commons license using the content distribution feature of the Mininova tracker.
Cloud Service Providers
Amazon AWS's Simple Storage Service (S3) supported sharing of bucket objects via the BitTorrent protocol until the feature was deprecated on April 29, 2021. As of June 13, 2020, the feature was only available in service regions launched before May 30, 2016. For existing customers the feature was extended for an additional 12 months following the deprecation; after April 29, 2022, BitTorrent clients can no longer connect to Amazon S3.
Software
Blizzard Entertainment uses BitTorrent (via a proprietary client called the "Blizzard Downloader", associated with the Blizzard "BattleNet" network) to distribute content and patches for Diablo III, StarCraft II and World of Warcraft, including the games themselves.
Wargaming uses BitTorrent in their popular titles World of Tanks, World of Warships and World of Warplanes to distribute game updates.
CCP Games, maker of the space simulation MMORPG Eve Online, has announced that a new launcher will be released that is based on BitTorrent.
Many software games, especially those whose large size makes them difficult to host due to bandwidth limits, extremely frequent downloads, and unpredictable changes in network traffic, will distribute instead a specialized, stripped down BitTorrent client with enough functionality to download the game from the other running clients and the primary server (which is maintained in case not enough peers are available).
Many major open source and free software projects encourage BitTorrent as well as conventional downloads of their products (via HTTP, FTP etc.) to increase availability and to reduce load on their own servers, especially when dealing with larger files.
Resilio Sync is a BitTorrent-based folder-syncing tool which can act as an alternative to server-based synchronisation services such as Dropbox.
Government
The British government used BitTorrent to distribute details about how the tax money of British citizens was spent.
Education
Florida State University uses BitTorrent to distribute large scientific data sets to its researchers.
Many universities that have BOINC distributed computing projects have used the BitTorrent functionality of the client-server system to reduce the bandwidth costs of distributing the client-side applications used to process the scientific data. If a BOINC distributed computing application needs to be updated (or merely sent to a user), it can do so with little impact on the BOINC server.
The developing Human Connectome Project uses BitTorrent to share their open dataset.
Academic Torrents is a BitTorrent tracker for use by researchers in fields that need to share large datasets.
Others
Facebook uses BitTorrent to distribute updates to Facebook servers.
Twitter uses BitTorrent to distribute updates to Twitter servers.
The Internet Archive added BitTorrent to its file download options for over 1.3 million existing files, and all newly uploaded files, in August 2012. This method is the fastest means of downloading media from the Archive.
By early 2015, AT&T estimated that BitTorrent accounted for 20% of all broadband traffic.
Routers that use network address translation (NAT) must maintain tables of source and destination IP addresses and ports. Because BitTorrent frequently contacts 20–30 servers per second, the NAT tables of some consumer-grade routers are rapidly filled. This is a known cause of some home routers ceasing to work correctly.
Legislation
Although the protocol itself is legal, problems stem from using the protocol to traffic copyright infringing works, since BitTorrent is often used to download otherwise paid content, such as movies and video games. There has been much controversy over the use of BitTorrent trackers. BitTorrent metafiles themselves do not store file contents. Whether the publishers of BitTorrent metafiles violate copyrights by linking to copyrighted works without the authorization of copyright holders is controversial. Various jurisdictions have pursued legal action against websites that host BitTorrent trackers.
High-profile examples include the closing of Suprnova.org, TorrentSpy, LokiTorrent, BTJunkie, Mininova, Oink's Pink Palace and What.cd. BitTorrent search engine The Pirate Bay torrent website, formed by a Swedish group, is noted for the "legal" section of its website in which letters and replies on the subject of alleged copyright infringements are publicly displayed. On 31 May 2006, The Pirate Bay's servers in Sweden were raided by Swedish police on allegations by the MPAA of copyright infringement; however, the tracker was up and running again three days later. In the study used to value NBC Universal in its merger with Comcast, Envisional examined the 10,000 torrent swarms managed by PublicBT which had the most active downloaders. After excluding pornographic and unidentifiable content, it was found that only one swarm offered legitimate content.
In the United States, more than 200,000 lawsuits have been filed for copyright infringement on BitTorrent since 2010. In the United Kingdom, on 30 April 2012, the High Court of Justice ordered five ISPs to block The Pirate Bay.
Security
One concern is the UDP flood attack. BitTorrent implementations often use μTP for their communication. To achieve high bandwidths, the underlying protocol used is UDP, which allows spoofing of source addresses of internet traffic. It has been possible to carry out denial-of-service attacks in a P2P lab environment, where users running BitTorrent clients act as amplifiers for an attack on another service. However, this is not always an effective attack because ISPs can check if the source address is correct.
Several studies on BitTorrent found files available for download containing malware. In particular, one small sample indicated that 18% of all executable programs available for download contained malware. Another study claims that as much as 14.5% of BitTorrent downloads contain zero-day malware, and that BitTorrent was used as the distribution mechanism for 47% of all zero-day malware they have found.
See also
Anonymous P2P
Anti-Counterfeiting Trade Agreement
Bencode
Cache Discovery Protocol
Comparison of BitTorrent clients
Comparison of BitTorrent sites
Comparison of BitTorrent tracker software
Glossary of BitTorrent terms
Magnet URI scheme
Simple file verification
Super-seeding
Torrent poisoning
References
Further reading
External links
Specification
Unofficial BitTorrent Protocol Specification v1.0 at wiki.theory.org
Unofficial BitTorrent Location-aware Protocol 1.0 Specification at wiki.theory.org
Application layer protocols
Computer-related introductions in 2001
File sharing
Web 2.0
Automotive Grade Linux
Automotive Grade Linux (AGL) is an open source project hosted by The Linux Foundation that is building an open operating system and framework for automotive applications. AGL was launched in 2012 with founding members including Jaguar Land Rover, Nissan, Toyota, DENSO Corporation, Fujitsu, HARMAN, NVIDIA, Renesas, Samsung, and Texas Instruments (TI). Today, AGL has 146 members.
Release History
Release notes contain details for each of the following:
On June 30, 2014, AGL announced their first release, which was based on Tizen and was primarily for demo applications.
AGL expanded the first reference platform with the Unified Code Base (UCB) distribution. The first UCB release, nicknamed Agile Albacore, was released in January 2016 and leverages software components from AGL, Tizen and GENIVI Alliance.
UCB 2.0, nicknamed Brilliant Blowfish, was made available in July 2016 and included new features like rear seat display, video playback, audio routing and application framework.
UCB 3.0, or Charming Chinook was released in January 2017 with Smart Device Link for Mobile Integration and a new Window Manager & SDK.
UCB 4.0 (Daring Dab) was announced in early 2017 and released in August; features include Secure Over-the-Air (OTA), SmartDeviceLink integration, and speech recognition APIs.
UCB 5.0 (Electric Eel) was released in January 2018. Improved features included wider and more robust hardware support, support for control from multiple surfaces, audio management and OTA updates.
UCB 6.0 (Funky Flounder) was made available in October 2018. Features include telematics systems, electronic instrument clusters.
UCB 7.0 (Grumpy Guppy) was released in March 2019, featuring a speech recognition API.
UCB 8.0 (Happy Halibut) was released in August 2019 and decreased the footprint of AGL while increasing the modularity. It added Alexa integration as well as better Audio and CAN support.
UCB 9.0 (Itchy Icefish) was made available in April 2020.
UCB 10.0 (Jumping Jellyfish) was made available in November 2020.
UCB 11.0 (Kooky Koi) was made available in February 2021.
UCB 12.0 (Lucky Lamprey) was made available in July 2021.
UCB 12.91 (Magic Marlin) was made available in December 2021.
Adoption History
On May 31, 2017, AGL announced that the 2018 Toyota Camry will be the first Toyota vehicle on the market with the AGL-based system in the United States.
On January 30, 2019, it was reported that the Mazda3 was using AGL.
As of April 2020 Mercedes Benz, Subaru and Toyota produce vehicles which make use of the UCB for their vehicles.
References
External links
Tizen
Automotive Grade Linux website
Automotive Linux Wiki
Automotive software
Dashboard head units
Embedded Linux
Embedded operating systems
Linux Foundation projects
Power Computing Corporation
Power Computing Corporation (often referred to as Power Computing) was the first company selected by Apple Inc. to create Macintosh-compatible computers ("Mac clones"). Stephen “Steve” Kahng, a computer engineer best known for his design of the Leading Edge Model D, founded the company in November 1993. Power Computing started out with financial backing from Olivetti (US$5 million) and $4 million of Mr. Kahng's money.
The first Mac-compatible (clone) PC shipped in May 1995. Like Dell Computer, Power Computing followed a direct, build-to-order sales model. In one year, Power Computing shipped 100,000 units with revenues of $250 million in the first year. Power Computing was the first company to sell $1,000,000 of products on the Internet.
Power Computing released upgraded models until 1997 with revenues reaching $400 million a year. The Mac clone business was stopped after Steve Jobs returned as interim CEO of Apple in July 1997. In September, Apple bought the core assets of Power Computing for $100 million in Apple stock and terminated the Mac cloning business.
History
Power Computing Corporation was founded on 11 November 1993 in Milpitas, California, backed by $5 million from Olivetti and $4 million of Mr. Kahng's money. At the MacWorld Expo in January 1995, just days after receiving notice he had the license to clone Macintosh computers, Kahng enlisted Mac veteran Michael Shapiro to help build the company. Shapiro helped to develop the original logo and brand and worked with Kahng to build the initial management team. Power Computing opened manufacturing and operations offices in Austin, Texas at the recently abandoned facilities of CompuAdd and engineering offices in Cupertino, California, staffed largely by members of Apple's original Power Macintosh team. In 1997, PCC relocated its headquarters to a location directly across I-35 from Dell's main campus, and remained there until Apple acquired PCC's assets in 1997. Mr. Kahng set out to create a simplified Mac design that made it cheaper and faster to produce the machines. He then targeted the mail-order market, where Power Computing could get a quicker return on its money than it could by selling through distributors.
"With direct mail, you get your money back in days by credit card instead of the 30 to 60 days it takes for the resale channel to repay," Mr. Kahng said.
At that time, Apple was leaning towards giving licenses to big time computer makers. Initially, even with Mr. Kahng's reputation as a "master cloner", getting Apple to take him seriously was a challenge. He ended up bringing Olivetti people with him to meetings. Apple engineers gave him the help he needed to make a Mac prototype. The team reduced the size of the Apple main circuit board so that it could fit into a standard PC box. They also used off-the-shelf PC power supplies and monitors.
A few days before the end of the year, it was announced that Apple Computer picked Power Computing to be its first Macintosh clone maker. Jim Gable, Apple's director of Mac licensing was quoted in The Wall Street Journal saying "[Mr. Kahng] is clever and fleet of foot. We want him to succeed."
Power Computing's goal was to have clones available for as little as $1,000 each starting in March or April 1995. John C. Dvorak, a computer columnist at MacUser magazine, remarked, "Apple is not going to know what hit them. Stephen Kahng is tenacious." When the machine was released, Macworld's review said
“The first clones work as well as Apple's Macs. That alone represents an auspicious start to Apple's reversal of its decade-long go-it-alone strategy. Although these first clones introduce no compelling new technologies, breathtaking features, or stunning industrial designs, they prove that Mac clones can be legitimate alternatives to Apple's own Macs.”
Initial machines
The initial clones were available in desktop and tower configurations, and were based on the PowerPC 601 80 MHz, 100 MHz and 110 MHz microprocessors. They were comparable to Apple Computer's Power Macintosh 7100 and 8100 class of computers. Pricing ranged from $1,995–2,899.
“Power Computing's system design (except for the clock-oscillator chip that controls the CPU and bus speed, the two models' motherboards are identical) suggests a thoughtful, sophisticated approach. This sophistication derives, in part, from help from Apple, as well as from the fact that two key Apple engineers recently joined Power Computing.”
Unlike Apple at the time, Power Computing pressed for direct sales. After a customer placed an order for a semi-customized configuration, the system was delivered the next day. Following the delivery of the system, Power Computing called the customer to surmise their needs and offer technical support and customer service. In addition, Power Computing set a goal of a 3-minute response time for all inquiries.
In May 1995, shortly after the original clone announcement, Power Computing teamed up with Austin, Texas based Metrowerks to offer the Power Computing CodeStation. The CodeStation was a package consisting of the recently announced Power Series clone, rebranded and bundled with the latest PowerPC version of CodeWarrior (CW6 Gold which introduced Magic Cap support). CodeStations were sold through Metrowerks at discounted developer prices and it is unknown exactly how many units were sold.
At the end of July 1995, Power Computing announced that it had successfully ramped the volume production capability of its Power 100 system. The efficiencies provided by volume production allowed Power Computing to lower the base configuration price of a "Power 100 Starter System" to $1,699. In addition, the company instigated a comprehensive quick-ship program that allowed popular configurations to ship the same day. Power Computing advertised models up to the "Power 120 XL", a $5,499 machine built around the PowerPC 601+ chip, a 2GB SCSI hard drive, 17 inch Sony monitor, 4X-speed CD-ROM, built-in Ethernet, and 32MB RAM.
At the end of October 1995, Power Computing introduced the world's fastest Macintosh-compatible computer, the PowerWave, based on the PowerPC 604 microprocessor. Per an article in the Austin American-Statesman, Power Computing said its machine would far outperform Windows-compatible machines based on Intel's Pentium processors.
At the early 1996 Macworld trade show in San Francisco, Power Computing found itself the star attraction because Apple was so preoccupied with its mounting financial woes that then-CEO Michael Spindler cancelled an appearance. PCC got another break when a computer firm that had spent $170,000 erecting an immense booth pulled out at the last moment, allowing Mr. Kahng to pick up the prime exhibiting space for $30,000.
At that Macworld, the PowerCurve — a line of mid-range, CPU-upgradeable Mac OS systems based on the PowerPC 601 and the industry-standard PCI expansion bus — was introduced. Unique to the PowerCurve 601/120 was the native support of VGA–style monitors.
Market success
In May 1996, just one year after Power Computing started selling Mac clones, the company reached the 100,000 units sold milestone. The number of employees had grown to 300. And as noted in an article in The Wall Street Journal (WSJ) by Jim Carlton, Power CEO Steve Kahng “still hasn’t taken his (golf) clubs out of the bag” (he had vowed not to play another round of his beloved golf until he had shipped the first 30,000 Mac clones).
That same WSJ article noted that one-half of Power Computing's customers represent people who would have otherwise purchased a computer from Apple. The others are people who might have bought a non-Mac computer.
"There is no question Apple is losing sales to us, but we are also expanding the Mac market," says Geoff Burr, Power Computing's vice president of sales and marketing.
Still, unless Apple can rapidly expand its cloning operations -- a goal of new Apple CEO Gilbert Amelio -- to boost flagging Mac market share and generate enough new licensing and software revenue to offset sales lost to cloners, Apple could see its belated cloning campaign backfire.
In June 1996, Mr. Kahng persuaded a unit of Lockheed Martin Corp. to buy 3,000 of his computers rather than Apple's. Though a longtime Apple customer, Lockheed Martin said Power beat out Apple's bid by agreeing to such extras as loading in special engineering software before shipping the machines out, a request that Apple declined. This was the largest sale in the history of Macs or Mac-compatible computers at the time.
Kahng was able to leverage his strong relationship with IBM to get access to the fastest PowerPC processors sooner than anyone else. As a result, starting in April 1996 and continuing through 1997, Power Computing regularly put out the fastest computer system in either platform (Mac OS or WinTel).
In April 1996, Power Computing unveiled the PowerTower, based on the 180 MHz and 166 MHz PowerPC 604 processor (announced by IBM on the same day). These were the fastest Mac OS personal computers available at the time.
Three months later, in July 1996, Power Computing was back with an even faster system – the PowerTower Pro which marked the worldwide debut of the new PowerPC 604e microprocessor featuring clock speeds of up to 225 MHz, making the PowerTower Pro the fastest personal computer available.
May 27, 1997 – PowerTower Pro 250 outperformed all comparable Pentium and Pentium II class Windows-based systems that were shipping at the time.
Aug. 4, 1997 – PowerTower Pro G3 275 and PowerTower Pro G3 250 would have been the world's first desktop systems using the new PowerPC generation of processors except that they were never built.
At Macworld Expo 1997, the company presented a military-themed campaign that urged the Mac faithful to “Fight Back.” Power Computing employees were outfitted in camouflage. The video wall looped “why we fight” propaganda. And “Steve Says” posters, flyers and T-shirts were ubiquitous inside the Moscone Center as well as in the streets surrounding the convention center (where Power Computing logoed Hummers, with bullhorns blazing, circled the center). However, the end was near.
Acquisition by Apple
In July, Apple's CEO Gil Amelio was ousted by Apple's Board of Directors, and Steve Jobs soon returned as interim CEO. Jobs believed that Apple had started to license clones too late to repeat the business model pioneered by Microsoft in the early 1980s.
"Apple has to let go of this ghost and invent the future," Mr. Jobs said. Instead of expanding the share of the market that used computers based on the Macintosh system, the decision to license clones simply ate into Apple's own sales of hardware, he said.
At MacWorld Boston in August, Power Computing President Joel Kocher unsuccessfully tried to convince attendees to rally against Apple's stiff new licensing policies. He and other executives resigned soon afterwards as Power Computing's board chose to be acquired instead.
On September 2, 1997, Apple Computer bought key assets of Power Computing for more than $100 million in Apple stock and roughly $10 million in cash. As part of the deal, Power Computing became an Apple subsidiary and Apple got back the license that allowed Power Computing to sell Macintosh-based machines. Apple also got some engineers and other employees who were absorbed into Apple's workforce; the rest were laid off. Some of them helped create Apple's next generation of technologies like the iMac.
Originally, Power Computing announced that they would be spun off by Apple and start making Wintel clones. However, Power Computing was forced to halt operations in December 1997, when the company was hit with lawsuits from its suppliers. As the parent company, Apple had to settle the lawsuits out of court and pay undisclosed amounts of money on behalf of Power Computing. As a result, Apple decided to instead absorb Power Computing into Apple and sell off any assets. By late January 1998, the last of Power Computing's physical assets were auctioned off, and Power Computing shareholders were mailed Apple Computer shares representing their pro rata share in the now-defunct corporation.
Anyone who had a Power Computing Macintosh clone was given a free upgrade up to Mac OS 8.1 by Apple under the Power Computing name. Ironically, this made Power Computing one of two Macintosh clone makers to get a Mac OS 8 upgrade disk (the other was UMAX, which got it under an agreement with Apple). Apple continued to provide technical support for any Power Computing machine until December 31, 2004.
Machine Upgrades
Power Computing's machines were among the most popular Macintosh clones ever made. Any 603- or 604-equipped Power Computing machine can officially run up to Mac OS 8.1, because Apple provided users of Power Computing machines with Mac OS 8 upgrade disks as part of the acquisition (most other Macintosh clones can only officially go up to Mac OS 7.6). However, despite officially only going up to Mac OS 8.1, any 603- or 604-equipped Power Computing machine is capable of being upgraded to Mac OS 9.1, although this is not officially supported by Apple.
Powered by a PowerPC 603e or a 604e processor, Power Computing's machines cannot run Mac OS X natively, but with the addition of a G3 or G4 processor upgrade and the use of XPostFacto 4.0, they could run several versions of Mac OS X up to 10.4 Tiger, with some limitations.
A number of Power Computing community websites have appeared over the years.
See also
Macintosh clone
Notes
References
Power Computing press releases (issued via BusinessWire)
Markoff, John. "For Apple, Clones and Competition." The New York Times 29 December 1994
Egan, Diane. "Mac Attack Begins: Apple Licenses OS." Electronic Buyers' News 2 January 1995
Rebello, Kathy. "IT JUST MAY BE THE YEAR OF THE APPLE It's leaner, it's signing up clone makers-and the Intel and Windows woes won't hurt a bit." Business Week 16 January 1995
Piller, Charles. "First clones. (Power Computing Macintosh clones; other upcoming clone machines discussed)." Macworld 1 April 1995
Carlton, Jim. "King Kahng: Master of Cheap Clones May Hold Key to Fate Of Apple Computers --- He Is Making First Copies Of the Fabled Macintosh, Which Risks Sales Loss --- `We Want Him to Succeed'." The Wall Street Journal 14 April 1995
Rizzo, John. "Clones' corporate clout. (compatibility of upcoming Macintosh clones with PC networks used in business)." MacUser 1 May 1995
Crabb, Don. "Note to Power Computing: make portable clones, too. (open letter to Power Computing CEO Stephen Kahng beseeching better portable designs than Apple is producing)." MacWEEK 15 May 1995
Moran, Susan. "Apple seen getting boost from Mac clones in South Korea." Reuters News 24 September 1995
Ladendorf, Kirk. "MAKING WAVES; With today's introduction of its PowerWave machines, Power Computing steps up from mere Macintosh clonemaker to technological innovator." Austin American-Statesman 30 October 1995
Ristelhueber, Robert. "Power Computing banks on aggressive designs and mail order channel. (Company Business and Marketing)." Electronic Business 1 November 1995
Ladendorf, Kirk. "Power Computing locates space it needs in Round Rock." Austin American-Statesman 29 December 1995
Newsbytes. "Macworld - Power Computing Offers New Mac Clone" 11 January 1996
Ryer, Kelly and Pearlstein, Joanna. "Power halts meltdown after operations crisis." MacWeek 25 March 1996
Carlton, Jim. "Power Computing Gains Towering Presence as Cloner --- CEO `King Kahng' Snatches Some of Apple's Revenue as It Copies the Mac." The Wall Street Journal 20 May 1996
Burrows, Peter. "Up Front: SILICON SAGAS APPLE COULD LEARN AT ITS CLONE'S FEET." Business Week 5 August 1996
Walsh, Jeff. "Apple freezes Mac OS May halt licensing OS to third parties." InfoWorld 25 August 1997
Ortiz, Catalina. "Apple buying Macintosh clone maker Power Computing for $100 million." AP Newswires 2 September 1997
Markoff, John. "Apple Decides Cloning Isn't Its Route Back To Profitability." The New York Times Section D; Business/Financial Desk 3 September 1997
External links
All Power Computing Mac Clones (at EveryMac.com)
Power Computing: Fighting Back for the Mac or Stealing Apple’s Customers?
Power Computing ads
1998 mergers and acquisitions
American companies established in 1993
American companies disestablished in 1998
Apple Inc. acquisitions
Computer companies established in 1993
Computer companies disestablished in 1998
Defunct computer companies of the United States
Defunct computer hardware companies
Macintosh clones
Rsync
rsync is a utility for efficiently transferring and synchronizing files between a computer and a storage drive and across networked computers by comparing the modification times and sizes of files. It is commonly found on Unix-like operating systems and is under the GPL-3.0-or-later license.
Rsync is written in C as a single threaded application. The rsync algorithm is a type of delta encoding, and is used for minimizing network usage. Zlib may be used for additional data compression, and SSH or stunnel can be used for security. Rsync is the facility typically used for synchronizing software repositories on mirror sites used by package management systems.
Rsync is typically used for synchronizing files and directories between two different systems. For example, if the command rsync local-file user@remote-host:remote-file is run, rsync will use SSH to connect as user to remote-host. Once connected, it will invoke the remote host's rsync and then the two programs will determine what parts of the local file need to be transferred so that the remote file matches the local one.
Rsync can also operate in a daemon mode (rsyncd), serving and receiving files in the native rsync protocol (using the "rsync://" syntax).
History
Andrew Tridgell and Paul Mackerras wrote the original rsync, which was first announced on 19 June 1996. It is similar in function and invocation to rdist (rdist -c), created by Ralph Campbell in 1983 and released under the Berkeley Software Distribution. Tridgell discusses the design, implementation, and performance of rsync in chapters 3 through 5 of his Ph.D. thesis in 1999. It is currently maintained by Wayne Davison.
Because of the flexibility, speed, and scriptability of rsync, it has become a standard Linux utility, included in all popular Linux distributions. It has been ported to Windows (via Cygwin, Grsync, or SFU), FreeBSD, NetBSD, OpenBSD, and macOS.
Use
Similar to cp, rcp and scp, rsync requires the specification of a source and of a destination, of which at least one must be local.
Generic syntax:
rsync [OPTION] … SRC … [USER@]HOST:DEST
rsync [OPTION] … [USER@]HOST:SRC [DEST]
where SRC is the file or directory (or a list of multiple files and directories) to copy from, DEST is the file or directory to copy to, and square brackets indicate optional parameters.
rsync can synchronize Unix clients to a central Unix server using rsync/ssh and standard Unix accounts. It can be used in desktop environments, for example to efficiently synchronize files with a backup copy on an external hard drive. A scheduling utility such as cron can carry out tasks such as automated encrypted rsync-based mirroring between multiple hosts and a central server.
Examples
A command line to mirror FreeBSD might look like:
$ rsync -avz --delete ftp4.de.FreeBSD.org::FreeBSD/ /pub/FreeBSD/
The Apache HTTP Server supports rsync only for updating mirrors.
$ rsync -avz --delete --safe-links rsync.apache.org::apache-dist /path/to/mirror
The preferred (and simplest) way to mirror a PuTTY website to the current directory is to use rsync.
$ rsync -auH rsync://rsync.chiark.greenend.org.uk/ftp/users/sgtatham/putty-website-mirror/ .
A way to mimic the capabilities of Time Machine (macOS);
see also Time rsYnc Machine (tym).
$ date=$(date "+%FT%H-%M-%S") # rsync interprets ":" as separator between host and port (i. e. host:port), so we cannot use %T or %H:%M:%S here, so we use %H-%M-%S
$ rsync -aP --link-dest=$HOME/Backups/current /path/to/important_files $HOME/Backups/back-$date
$ ln -nfs $HOME/Backups/back-$date $HOME/Backups/current
Make a full backup of system root directory:
$ rsync -avAXHS --progress --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} / /path/to/backup/folder
Delete all files and directories, within a directory, extremely fast:
# Make an empty directory somewhere, which is the first path, and the second path is the directory you want to empty.
$ rsync -a --delete /path/to/empty/dir /path/to/dir/to/empty
Connection
An rsync process operates by communicating with another rsync process, a sender and a receiver. At startup, an rsync client connects to a peer process. If the transfer is local (that is, between file systems mounted on the same host) the peer can be created with fork, after setting up suitable pipes for the connection. If a remote host is involved, rsync starts a process to handle the connection, typically Secure Shell. Upon connection, a command is issued to start an rsync process on the remote host, which uses the connection thus established. As an alternative, if the remote host runs an rsync daemon, rsync clients can connect by opening a socket on TCP port 873, possibly using a proxy.
Rsync has numerous command line options and configuration files to specify alternative shells, options, commands, possibly with full path, and port numbers. Besides using remote shells, tunnelling can be used to have remote ports appear as local on the server where an rsync daemon runs. Those possibilities allow adjusting security levels to the state of the art, while a naive rsync daemon can be enough for a local network.
Algorithm
Determining which files to send
By default, rsync determines which files differ between the sending and receiving systems by checking the modification time and size of each file. If time or size is different between the systems, it transfers the file from the sending to the receiving system. As this only requires reading file directory information, it is quick, but it will miss unusual modifications which change neither.
Rsync performs a slower but comprehensive check if invoked with --checksum. This forces a full checksum comparison on every file present on both systems. Barring rare checksum collisions, this avoids the risk of missing changed files at the cost of reading every file present on both systems.
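A rough Python illustration of the two modes (the paths are placeholders, and the use of MD5 for the full-file comparison is only illustrative): the quick check looks solely at size and modification time, while the checksum mode reads both files completely.

import hashlib
import os

def quick_check_differs(path_a, path_b):
    # Default behaviour: treat files as unchanged when size and mtime match.
    stat_a, stat_b = os.stat(path_a), os.stat(path_b)
    return stat_a.st_size != stat_b.st_size or int(stat_a.st_mtime) != int(stat_b.st_mtime)

def checksum_differs(path_a, path_b, chunk=1 << 20):
    # --checksum-style behaviour: hash the entire contents of both files.
    def file_digest(path):
        h = hashlib.md5()
        with open(path, "rb") as f:
            while True:
                block = f.read(chunk)
                if not block:
                    break
                h.update(block)
        return h.digest()
    return file_digest(path_a) != file_digest(path_b)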
Determining which parts of a file have changed
The rsync utility uses an algorithm invented by Australian computer programmer Andrew Tridgell for efficiently transmitting a structure (such as a file) across a communications link when the receiving computer already has a similar, but not identical, version of the same structure.
The recipient splits its copy of the file into chunks and computes two checksums for each chunk: the MD5 hash, and a weaker but easier to compute 'rolling checksum'. It sends these checksums to the sender.
The sender computes the checksum for each rolling section in its version of the file having the same size as the chunks used by the recipient's. While the recipient calculates the checksum only for chunks starting at full multiples of the chunk size, the sender calculates the checksum for all sections starting at any address. If any such rolling checksum calculated by the sender matches a checksum calculated by the recipient, then this section is a candidate for not transmitting the content of the section, but only the location in the recipient's file instead. In this case, the sender uses the more computationally expensive MD5 hash to verify that the sender's section and recipient's chunk are equal. Note that the section in the sender may not be at the same start address as the chunk at the recipient. This allows efficient transmission of files which differ by insertions and deletions. The sender then sends the recipient those parts of its file that did not match, along with information on where to merge existing blocks into the recipient's version. This makes the copies identical.
The rolling checksum used in rsync is based on Mark Adler's adler-32 checksum, which is used in zlib, and is itself based on Fletcher's checksum.
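A sketch of such a weak rolling checksum in Python: two running sums are kept, and both can be updated in constant time as the window slides forward by one byte, which is what lets the sender cheaply test a checksum at every offset (the 16-bit modulus and window size here are illustrative).

MOD = 1 << 16  # keep each running sum within 16 bits

def weak_checksum(block):
    a = sum(block) % MOD
    b = sum((len(block) - i) * byte for i, byte in enumerate(block)) % MOD
    return a, b, (b << 16) | a

def roll(a, b, old_byte, new_byte, block_len):
    # Slide the window one byte forward: drop old_byte, append new_byte.
    a = (a - old_byte + new_byte) % MOD
    b = (b - block_len * old_byte + a) % MOD
    return a, b, (b << 16) | a

data = b"the quick brown fox jumps over the lazy dog"
window = 16
a, b, s = weak_checksum(data[0:window])
a, b, s = roll(a, b, data[0], data[window], window)
# Rolling one byte forward matches recomputing the checksum from scratch.
assert s == weak_checksum(data[1:window + 1])[2]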
If the sender's and recipient's versions of the file have many sections in common, the utility needs to transfer relatively little data to synchronize the files. If typical data compression algorithms are used, files that are similar when uncompressed may be very different when compressed, and thus the entire file will need to be transferred. Some compression programs, such as gzip, provide a special "rsyncable" mode which allows these files to be efficiently rsynced, by ensuring that local changes in the uncompressed file yield only local changes in the compressed file.
Rsync supports other key features that aid significantly in data transfers or backup. They include compression and decompression of data block by block using zlib, and support for protocols such as ssh and stunnel.
Variations
The rdiff utility uses the rsync algorithm to generate delta files containing the difference from file A to file B (like the utility diff, but in a different delta format). The delta file can then be applied to file A, turning it into file B (similar to the patch utility). rdiff works well with binary files.
The rdiff-backup script maintains a backup mirror of a file or directory either locally or remotely over the network on another server. rdiff-backup stores incremental rdiff deltas with the backup, with which it is possible to recreate any backup point.
The librsync library used by rdiff is an independent implementation of the rsync algorithm. It does not use the rsync network protocol and does not share any code with the rsync application. It is used by Dropbox, rdiff-backup, duplicity, and other utilities.
The acrosync library is an independent, cross-platform implementation of the rsync network protocol. Unlike librsync, it is wire-compatible with rsync (protocol version 29 or 30). It is released under the Reciprocal Public License and used by the commercial rsync software Acrosync.
The duplicity backup software written in Python allows for incremental backups with simple storage backend services like local file system, sftp, Amazon S3 and many others. It utilizes librsync to generate delta data against signatures of the previous file versions, encrypting them using gpg, and storing them on the backend. For performance reasons, a local archive-dir is used to cache backup chain signatures, but they can be re-downloaded from the backend if needed.
On macOS 10.5 and later, a special -E or --extended-attributes switch allows retaining much of the HFS file metadata when syncing between two machines that support this feature. This is achieved by transmitting the resource fork along with the data fork.
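On such systems, a sync that carries this metadata across might look like the following; the host name and paths are illustrative:
$ rsync -aE ~/Documents/ user@othermac.local:Documents/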
zsync is an rsync-like tool optimized for many downloads per file version. zsync is used by Linux distributions such as Ubuntu for distributing fast-changing beta ISO image files. zsync uses the HTTP protocol and .zsync files containing pre-calculated rolling hashes to minimize server load while still permitting diff transfer for network optimization.
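A typical client invocation is sketched below; the URL is illustrative, and the optional -i flag points zsync at an existing local file to reuse as a seed for already-downloaded data:
$ zsync -i previous.iso http://cdimage.example.org/daily/current.iso.zsync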
Rclone is an open-source tool inspired by rsync that focuses on cloud and other high latency storage. It supports more than 50 different providers and provides an rsync-like interface for cloud storage.
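A minimal sketch, assuming a cloud remote named "remote" has already been set up with rclone config; the remote name and paths are illustrative:
$ rclone sync /home/user/photos remote:photos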
rsync applications
See also
casync
Remote Differential Compression
List of TCP and UDP port numbers
Grsync - a graphical user interface for rsync
Notes
References
External links
rsync algorithm - 1998-11-09
rsync Examples (How to use rsync)
1996 software
Backup software for Linux
Data synchronization
Free backup software
Free file transfer software
Free network-related software
Free software programmed in C
Network file transfer protocols
Unix network-related software |
2303228 | https://en.wikipedia.org/wiki/Alluvium%20%28peercasting%29 | Alluvium (peercasting) | Alluvium is open source peercasting software developed by the Foundation for Decentralization Research, first released in 2003. It comprises three components, Core, Media Player, and Server. Alluvium allows video and audio programming to be broadcast over the Internet using swarming technology. It is powered by Onion Networks' Swarmcast, and is notable for its incorporation of server-side time-based playlists, and client software which examines those playlists and begins streaming content from the server (and available peers) per that schedule, simplifying the creation of continuous-broadcast video and audio.
Technical overview
Alluvium is a technology for low-cost streaming media broadcasts. It differs in method from server-to-client streaming servers such as icecast, Real Server, and QuickTime Streaming Server. It requires only a standard web server and client software. No additional modules or CGI scripts are required for its operation.
Requirements
Web Server
The web server handles static files: content, and the playlist(s). The Alluvium playlist file is a text file, residing on the web server, written in the Alluvium playlist format, which is based on the RSS 1.0 news format. The playlist file specifies the play order of URLs that can be hosted anywhere on the web. All RSS tags used are standard tags from existing schemas. An Alluvium playlist file can be generated using the Broadcaster playlist generation tool.
The web server is configured to deliver Alluvium playlists with the MIME type Content-Type: application/x-alluvium for files with the extension .rss.
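As an illustration, on an Apache-style web server this mapping can be added with a single standard directive; Alluvium itself does not require a particular server, so this is only one possible configuration:
AddType application/x-alluvium .rss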
Client software
The client software, running on each listener's computer, scans through a playlist file until it finds an entry which is scheduled for the current time, then fetches that media. Files are downloaded using the Open Content Network (OCN) utilizing Swarmcast swarming download technology. After the first file download has started, the client immediately sends it to a locally generated icecast-compatible stream. The client's media player can then be directed to the local stream and listen to it exactly as though it was a normal icecast stream.
Swarming download operation
The client software first checks with the OCN gateway, which stores special headers for all of the files being distributed through the OCN. If the gateway does not know about a particular URL, it will fetch the necessary information from the URL and then cache it. The information stored by the gateway includes what is needed to swarm-download the file, such as a hash tree.
Among the information obtained by the client from the gateway is a list of addresses for other clients who are also downloading or have recently downloaded the file. Clients download multiple parts of the file simultaneously from each other. When a certain part of the file is unavailable from other clients, a client will fetch it from the original source URL and then share that part with the other clients, minimizing the load on the server which stores the content files. The majority of data transfer happens between peers. Priority for downloading is given to chunks earlier in the file, so that file playback can happen immediately.
This swarming architecture offers savings in bandwidth and processor usage. Because most transfers happen between listeners, the source server has much less load. Also, unlike icecast, servers which serve files for Alluvium stations do not decode the files, so broadcasts can be done from low-cost, obsolete hardware with sufficiently fast I/O and network speeds.
History
Alluvium was developed as part of the Tristero project, hosted at SourceForge, by Brandon Wiley. The source code, still in beta, is available as part of the Tristero project at tristero.cvs.sourceforge.net
Alluvium was unveiled at CodeCon 2 in February 2003, generating interest in the peer-to-peer, open source, and streaming radio communities. In 2004, CodeCon 3 was broadcast live using Alluvium 2.0.
Alluvium was further developed and incorporated into software developed at and named after ActLab.TV, a peercasted TV and radio service operated by the ActLab at the University of Texas at Austin.
References
External links
Alluvium on SourceForge Last updated February 21, 2003.
Alluvium information Official site.
Peercasting
Peer-to-peer software |
302327 | https://en.wikipedia.org/wiki/Castle%20Wolfenstein | Castle Wolfenstein | Castle Wolfenstein is a 1981 action-adventure game that was developed by Muse Software for the Apple II home computer. It is one of the earliest games to be based on stealth mechanics. An Atari 8-bit family port was released in 1982 and was followed by versions for Commodore 64 (1983) and MS-DOS (1984).
The game takes place during World War II. The player takes the role of an Allied prisoner of war who is held captive in the fictional Castle Wolfenstein. After escaping from the cell, the player's objective is to find the Nazis' secret war plans and escape from the castle. Nazi soldier enemies can be dealt with by impersonating, sneaking, or killing them.
The game was received positively amongst critics and became one of the best-selling games of the early 1980s. It is considered to have had a direct influence on modern stealth and first-person shooter games. The game was praised for its graphics, and gameplay, but criticized for its long waiting times when opening chests.
Gameplay
Castle Wolfenstein is a two-dimensional action-adventure game played from a top-down perspective using a keyboard, joystick, or paddles. It has also been described as a maze game. There are eight difficulty levels, determined by the player's rank. The player takes the role of an Allied spy who has been captured by Nazis and imprisoned in a dungeon within Castle Wolfenstein for interrogation by the SS Stormtroopers. While the spy is waiting for interrogation, a dying prisoner emerges from a hiding place and hands the player a fully loaded pistol with 10 rounds and three grenades before passing away. The objective is to escape from the castle; if the player also finds the battle plans before escaping, they are promoted, the castle's layout changes, the complexity of the next run increases, and the game starts again.
The game takes place in a procedurally generated castle of approximately 60 rooms that house standard Nazi guards and SS Stormtroopers, the latter identified by their bulletproof vests marked with the SS insignia. Standard guards can be eliminated with the pistol and have a chance to surrender if the player points the pistol at them, even if it has no ammunition; SS Stormtroopers must be eliminated with grenades because they usually wear body armor. Enemies can be looted once they have surrendered or been eliminated, and may carry ammunition, grenades, and keys which can be used on doors and chests. Doors and chests can be opened more quickly by shooting at them, but this attracts the guards in the room, and if a chest containing ammunition or grenades is shot, it explodes, resulting in immediate death. Chests may contain bulletproof vests, uniforms, and secret documents, or sauerkraut, sausages, and schnapps that do not affect the gameplay. Uniforms allow the player character to pass guards unnoticed, but they are ineffective against SS Stormtroopers. If the player dies from enemy gunfire, the game restarts with the castle's layout preserved and the same chests and guards; if they are killed by their own grenade, the game restarts in a newly generated castle.
Development and release
Castle Wolfenstein was developed by Silas Warner at Muse Software and the game's cover art was drawn by John Benson.
The game was initially conceptualized as a game set in the mid-1980s, described by Warner as "a guy running around rooms", but he did not know how to develop the idea further. He was uninterested in using space as a setting, believing there were already too many space games on the market. The concept changed after Warner watched the 1961 British-American war film The Guns of Navarone and was amazed by the Allied commandos who broke into a German fortress to destroy its artillery battery. On the same day, he played Berzerk, a multi-directional shooter arcade game in which the player navigates a maze of laser-shooting robots. He decided to use the same concept but with Nazi soldiers instead of robots. His idea was to take the basic concept of an arcade shoot 'em up, in which players dodge enemies with the intent of killing them, and change the objective to escaping the enemy guards and their castle, with shooting guards simply a means to an end and not an end in itself.
Warner implemented procedural level generation in the game, which took 35 to 60 seconds to complete before the gameplay of the original Apple version started; as a result, the game produced a new set of 60 rooms whose arrangement was nearly always different. He designed the game's architecture using three programs, each originally on a separate floppy disk and later integrated into a single disk. The first initialized the graphics and shuffled 64 interchangeable floor plans, the second governed the behavior of the castle's guards, and the third handled the player character's behavior. According to Warner, a lot of work went into synchronizing the programs, and he was satisfied with the result. For the soundtrack, he used his own voice for the German guards, recorded with Apple II software called The Voice, also published by Muse Software. He used German phrases such as Achtung, Schweinhund, and Halt, along with five others.
Muse Software released Castle Wolfenstein in September 1981 for the Apple II, and the game was later ported to other platforms: first to the Atari 8-bit family six months after the Apple release, then to the Commodore 64 in 1983 and to MS-DOS in 1984. Following the game's release, a utility developed by Moxie, The Great Escape Utility, was marketed in 1983, promising bug fixes that sped up the opening of chests and the startup time of the game. It also allowed players to choose their starting location and gain an unlimited number of items. The software is regarded as the first commercial trainer in video gaming.
Reception
According to Harvey Bernstein of Antic, after its release, Castle Wolfenstein "quickly shot to the top of the charts" and became "one of the most popular games for any microcomputer". In the October 1982 issue of Computer Gaming World, associate publisher and game merchandiser Dana Lombardy released an incomplete list of top-selling games as of 30 June 1982, where the game landed in 13th place with 20,000 copies sold. The game ultimately sold about 50,000 copies by 1983.
Andrew Brill of Creative Computing Video and Arcade Games complained about the Apple version's slow gameplay, mainly due to the time taken to open chests that contain "completely useless" items, which Brill regarded as the game's "most frustrating feature", but added that the "thrill of the escape" is "worth the wait". Richard Herring of Ahoy!, reviewing the game's Commodore 64 port, also complained about Castle Wolfenstein's slow gameplay, especially the long time it took to open the chests. He also stated that each room must be loaded from the floppy disk, causing a lag whenever a room is entered, and mentioned a bug in which, if the player character bumps into a wall, the screen "goes into hysterics for a few seconds". Herring added that playing the game with a keyboard is "inconvenient" as the player does not have time to perform game actions quickly enough, but concluded that Castle Wolfenstein has "simple but effective graphics" and called the game "addicting". In a 1991 Computer Gaming World survey of strategy and war games, M. Evan Brooks called the game an "arcade classic" and stated that despite the outdated graphics it had remained in his "fond memories". In 1996, the same magazine listed Castle Wolfenstein as the 116th best game of all time.
Sequels and follow-ups
In 1984, Muse Software released a sequel to Castle Wolfenstein titled Beyond Castle Wolfenstein, which has similar graphics and gameplay to its predecessor and contains a number of updates such as the use of a knife, the ability to bribe guards, and a pass system in which guards periodically summon the player character and ask him or her to show the correct pass. Castle Wolfenstein directly influenced Wolfenstein 3D, developed by id Software. John Romero stated the original idea was to create a 3D Castle Wolfenstein, but the studio did not have the rights to the game during development. Many options for the game's title were proposed and rejected, and eventually id Software bought the rights to use Wolfenstein from Silas Warner. The original concept of Wolfenstein 3D changed significantly because the developers decided the core of the gameplay would be fast and simple, so features such as the ability to drag and loot fallen enemy soldiers were withdrawn.
Further development by other studios led to the emergence of one of the longest-running video game series; as of 2021, there are 13 Wolfenstein games, the most recent of which, Wolfenstein: Youngblood, is a spin-off that was released in 2019.
Legacy
Multiple media outlets considered Castle Wolfenstein significant in shaping the stealth game and first-person shooter genres. Though no more Wolfenstein games were released by Muse Software after Beyond Castle Wolfenstein, the Metal Gear series and several other video games took elements and inspiration from the two original games. GameSpot's Daniel Hindes stated that the first-person shooter genre was "forged" by Castle Wolfenstein, and that the game introduced a number of new stealth mechanics. Casey Alkaisy, marketing manager at DICE, in his review of stealth games on Gamasutra, said the first foundations of the stealth genre were laid down in Pac-Man but its game mechanics only took shape with the advent of Castle Wolfenstein, after which other games using the same ideas began to appear. In its review of the series, Xbox Wire called Castle Wolfenstein a "proto-stealth game" that contains "innovations that would go on to become standards in the stealth genre". When speaking with Retro Gamer, Wolfenstein 3D co-creator John Romero credited Castle Wolfenstein as the "original stealth shooter".
References
Citations
Bibliography
External links
1981 video games
Action-adventure games
Apple II games
Atari 8-bit family games
Commodore 64 games
DOS games
Stealth video games
Wolfenstein
Works set in castles
World War II video games
Top-down video games
Video games about Nazi Germany
Video games developed in the United States
Video games using procedural generation |
13689443 | https://en.wikipedia.org/wiki/Blue%20Yonder | Blue Yonder | Blue Yonder (formerly JDA Software Group) is an American software and consultancy company owned by Japanese conglomerate Panasonic. Headquartered in Scottsdale, Arizona, Blue Yonder provides supply chain management, manufacturing planning, retail planning, store operations and category management offerings. The company has more than 3,000 corporate customers in the manufacturing, distribution, transportation, retail and services industries. Companies acquired over time include Yantriks, Blue Yonder, RedPrairie, i2 Technologies, Manugistics, E3, Intactix, and Arthur.
History
In 1985, James Donald Armstrong and Frederick M. Pakis formed the US-based JDA Software, Inc. in Cleveland, Ohio. After signing a contract with a Phoenix-based automotive retailer in 1987, all eight JDA employees relocated to headquarters in Arizona. After 10 years of operation as a privately held firm, JDA went public on March 15, 1996.
In 1998, JDA completed its first acquisition, the Arthur Retail division.
In April 2000, the company announced the acquisition of Intactix division. In September 2001, JDA acquired E3 Corporation. JDA acquired Manugistics in July 2006.
On November 5, 2009, JDA announced its intent to acquire i2 Technologies, a Dallas-based provider of supply chain management software. The acquisition was completed in January 2010. In June 2010, Dillard's Department Stores won a $246 million judgment against i2, claiming damages from use of two supply chain management systems. JDA announced efforts to reduce or reverse this judgment, noting Dillard's still used the software and had done so since 2000.
On December 21, 2012, RedPrairie with Mike Mayoras as CEO and Martin Hiscox as President and Vice Chairman, with the backing of New Mountain Capital, bought JDA Software for $1.9 billion and took it private, merging RedPrairie and JDA into one organization. The business traded under the JDA brand.
On August 8, 2016, JDA was reported to be exploring its sale to Honeywell International Inc. The sale was challenged on August 16, 2016 by The Blackstone Group, which offered JDA an alternative in the form of a financing plan.
On January 30, 2017, Girish Rishi succeeded Bal Dail as CEO.
On July 2, 2018, JDA announced an agreement to acquire Blue Yonder GmbH. The acquisition was completed on August 7, 2018.
On February 11, 2020, JDA announced that it was renaming itself to Blue Yonder, Inc. On July 23, Blue Yonder announced the acquisition of Yantriks.
On April 21, 2021, Panasonic announced that it had agreed to acquire Blue Yonder. The acquisition was closed on September 17, 2021.
In February 2022, CEO Rishi stepped down.
References
1985 establishments in Ohio
American companies established in 1985
Companies based in Scottsdale, Arizona
Software companies established in 1985
Software companies based in Arizona
Software companies of the United States
Supply chain software companies
1996 initial public offerings
Companies formerly listed on the Nasdaq
Panasonic
American subsidiaries of foreign companies
2016 mergers and acquisitions
2021 mergers and acquisitions |
912618 | https://en.wikipedia.org/wiki/Abertay%20University | Abertay University | Abertay University, formerly the University of Abertay Dundee, is one of two public universities in the city of Dundee, Scotland. In 1872, Sir David Baxter, 1st Baronet of Kilmaron, left a bequest for the establishment of a mechanics' institute in Dundee and the Dundee Institute of Technology was formed in 1888. As early as 1902 it was recognised by the Scottish Education Department as an educational hub, and was one of the first to be designated a central institution, akin to an "industrial university". Abertay gained university status in 1994.
Abertay launched the world's first computer games degree in 1997 and in 2017 held a programme of events celebrating 20 Years of Games. Abertay was also the first to offer a degree in Ethical Hacking, starting in 2006.
History
The following history to 1988 provides a summary account that relies primarily on the book published by Dundee Institute of Technology in 1989, 'The First Hundred Years: 1888-1988'. Where additional sources have been used, post 1988, these have been cited accordingly.
The Baxter bequest (1872)
In 1872 Sir David Baxter, 1st Baronet of Kilmaron, died and bequeathed £20,000 (£1,581,200 adjusting for inflation) for the establishment of a mechanics' institute in Dundee. The Baxter bequest was intended to create an educational establishment permitting young (male) working mechanics and other craftsmen to better themselves. After some years of delay the trustees finalised a scheme and met the conditions of the bequest, and the Dundee Technical Institute opened on 15 October 1888 in grounds purchased from University College, Dundee, adjacent to Small's Wynd. Initially 238 students enrolled and classes were conducted based on the syllabus of the Government Science and Art Department of South Kensington and the City & Guilds of London Institute. Subjects were primarily scientific and technical although applied art was also taught, and jute spinning and textile design were soon added to the portfolio.
In 1901 the Dundee Technical Institute enrolled 723 part-time students and was one of the first education hubs to be recognised as a 'central institution' by the Scotch Education Department. In 1906 a new site in Bell Street, Dundee was purchased to build a larger complex to accommodate a growing student population. In 1911 the completed complex was formally opened as the Dundee Technical College & School of Art. The portfolio had by now expanded again to include marine engineering and navigation.
The First World War retarded enrolments and growth but the vocational nature of the institute meant that its classes were highly relevant to the war effort. Records show that the first women students enrolled in 1914. After the war, the institute continued to expand adding a new school of pharmacy, and more specialist classes in engineering and building. Commercial classes in finance, economics and accounting were added to support trade at home and abroad.
The Duncan of Jordanstone bequest (1909)
In 1909 James Duncan of Jordanstone left £60,000 (£4,993,263 adjusting for inflation) to establish an art college in Dundee. It was only after a lengthy legal battle surrounding this bequest and the right of the existing college to spend the money, that a new scheme was entered into in 1933 permitting the establishment of the Dundee Institute of Art and Technology. The scheme allowed for separate technical and art colleges under a single governance framework. Plans for a new art college were drawn up in 1937. However, owing to the outbreak of the Second World War, plans were delayed and construction did not begin until 1953. The college of art became a formally separate institution, known as the Duncan of Jordanstone College of Art and Design, in 1975, remaining independent until 1994 when it became part of the University of Dundee.
First degrees (1951)
After the Second World War enrolments and the scope of delivery continued to expand, as did the reputation of the institute. By 1951 the institute was teaching courses that led to examinations for the external degrees of the University of London in pharmacy, mechanical, civil, and electrical engineering. In 1955 the National Council for Technological Awards was established and validated diplomas in technology which were equivalent in standard if not in name to honours degrees. In 1963 the Robbins Committee on Higher Education set out the principle that higher education should be available to all who wanted it and were suitably well qualified. The Committee recommended that the government should expand higher education in the UK, particularly in science and technology.
University status (1994)
Abertay University was created in 1994, under government legislation granting the title "University of Abertay Dundee" to the Dundee Institute of Technology. Since 2014 the University has promoted itself as Abertay University. The university's name was formally changed to Abertay University by an Order of Council on 1 September 2019.
Abertay was the first university in the world to offer a "computer games" degree in 1997. Abertay was the UK's first University to be recognised as a Centre for Excellence in Computer Games Education, and is associated with a business support programme for computer game startups.
Campus
Abertay University is situated in the centre of Dundee. The campus buildings include the historic Old College buildings of Dundee Business School, the Bernard King Library, scenes of crime teaching facilities, specialist Ethical Hacking labs designed for research into computer hacking and misuse, and modern computer games labs in the UK Centre for Excellence in Computer Games Education.
The Bernard King Library in Bell Street opened to learners in February 1998 and was formally opened by Queen Elizabeth II on 30 June 1998. The library was voted best new building in Scotland in the 1998 Scottish Design Awards competition. The building has a stone rectangular 'spine' and a curved glass front mimicking an open book. The Library houses an English Language learning centre, a specialist Law library, and an EU funded IT suite. The library was designed with the digital age very much in mind, and although the traditional books still feature, the emphasis was and is very much on providing access to digital information through online subscriptions.
The Student Centre building in Bell Street opened in 2005 providing a home to the Students' Association as well as a trading centre with an art gallery (Hannah Maclure Centre), cinema, student bars, food, and retail trading outlets.
Academic reputation
Abertay is a small university that receives the majority of its funding for teaching rather than research. Nevertheless, according to the results of the Research Excellence Framework 2014 (REF2014) published on 18 December 2014, Abertay was the highest ranked modern university in Scotland for 'research intensity'. The University submitted an increased proportion of staff in REF2014 compared to RAE2008 and achieved an average score of 2.15 - which in REF terms means 'quality that is recognised internationally in terms of originality, significance, and rigour'. This was an improvement from the average score of 1.83, 'national recognition', achieved in RAE2008.
Submissions were made in:
Unit 4 Psychology, Psychiatry and Neurosciences
Unit 5 Biological Sciences
Unit 7 Earth Systems and Environmental Sciences
Unit 15 General Engineering
Unit 20 Law
Unit 23 Sociology
Unit 26 Sport and Exercise Sciences, Leisure and Tourism
Abertay submitted 30% more staff in REF 2014 than in RAE 2008, in seven Units of Assessment (UoAs) compared to six in 2008. Abertay submitted to three Units of Assessment for the first time: Sports Science, Sociology, and Biological Sciences. Abertay had a proportion of its research rated 4* in six of the seven UoAs in 2014, compared to only two of the six units submitted in 2008. Abertay submitted 36% of its academic staff to the REF.
Abertay was the first university in the world to offer a "computer games" degree in 1997. In 2009 it established the UK's first Centre for Excellence in Computer Games Education, and a business support programme. Abertay runs five of the 25 interactive and games degree courses accredited in the UK by Creative Skillset, the industry skills body for the creative sector, more than any other institution.
External accreditation
Abertay is externally peer reviewed under the Enhancement-led Institutional Review (ELIR) method by the Quality Assurance Agency for Higher Education, Scotland (QAAS), on behalf of the Scottish Funding Council (SFC). All provision is benchmarked to the Scottish Credit and Qualifications Framework (SCQF).
In addition, Modules and Programmes offered at Abertay currently have been accredited by the following professional bodies:
Association of Chartered Certified Accountants;
Association of International Accountants;
British Association for Counselling and Psychotherapy;
British Computer Society;
British Council;
British Psychological Society;
Chartered Institute of Management Accountants;
Chartered Institute of Personnel and Development;
Chartered Institution of Water and Environmental Management;
Counselling and Psychotherapy Scotland;
Faculty of Advocates;
Forensic Science Society;
Health and Care Professions Council;
Higher Education Academy;
Institute of Biomedical Science;
Joint Audio Media Education Services;
Joint Board of Moderators;
Law Society of Scotland;
Nursing and Midwifery Council;
Royal Society of Chemistry;
Skillset;
The Independent Game Developers' Association
Research organisation
Research themes
Research at the University is organised into four main themes.
Creative Industries
Environment
Security
Society
Each theme is associated with areas of expertise.
The creative industries research theme focuses on: games research; digital cultures; and digital living. The environment research theme focuses on: environmental science and engineering; food science and innovation; environmental and systems biology; and sustainable technology. The security theme focuses on: cyber security; forensic psychobiology; forensic sciences; and law. The society theme focuses on: business, economics and management; the law of employment; media and culture; mental health and wellbeing; psychology; sociology; and sport performance and exercise.
Governance
The University was established by a statutory instrument The University of Abertay Dundee (Scotland) Order of Council 1994. The Order sets out the objects of the University and the general functions of the University Court to 'conduct the affairs of the University and carry out and promote its objects'. The Order requires that the University Court makes arrangements for a Principal to be appointed to 'discharge the functions of the University Court (other than those delegated to Senate by virtue of article 36(3) of the Order) relating to the organisation and management of the University and to the discipline therein'. The Order requires that the University Court appoints and maintains a Senate, delegating to it 'the functions of the University Court relating to the overall planning, co-ordination, development and supervision of the academic work of the University; and such other functions of the University Court as may be assigned to the Senate by the University Court'.
Notable features
Computer games education
Abertay was the UK's first University to be recognised as a Centre for Excellence in Computer Games Education.
The Centre for Excellence is accredited by Skillset and has strong links with industrial partners from across the broadcast, interactive and wider digital media sectors. These partners include BBC Scotland, BBC Vision, BSkyB, Channel 4, Electronic Arts, Codemasters, Blitz Games Studios, Rare, Sony Computer Entertainment Europe, Microsoft and Disney Interactive.
Ethical Hacking and Cyber-Security
Abertay was the first university in the world to have a degree in Ethical Hacking, differentiating it from other cyber-security degrees by taking a more offensive approach to security. The university began to offer this degree in 2006. In 2020 the university was one of eight in the UK, and the only one in Scotland, to be named an Academic Centre of Excellence in Cybersecurity by the National Cyber Security Centre.
The university is also home to the Ethical Hacking society, which hosts the Securi-Tay conference, the largest student-run security conference in Europe.
Sport
Abertay University has a wide range of sports teams competing under the university banner, from tennis to hockey, which compete in the BUCS (British Universities & Colleges Sport) leagues against other universities and colleges in Scotland. The football, hockey and basketball first teams compete in the highest league (BUCS 1A). The university has a fierce rivalry with Dundee University, competing in a yearly varsity competition. Numerous notable athletes have undertaken their degrees at the university, which facilitates elite sportsmen and sportswomen who have either represented their country or compete at a high level.
Electives for the 21st century
An electives scheme provides opportunities, from 2015/16, for all early years students to broaden their intellectual horizon beyond the standard single or joint Honours degree combinations on offer.
Symbols
Coat of arms
Prior to 1953 no coat of arms was registered in the name of the college. The original Ensigns Armorial were recorded in the Public Register of All Arms and Bearings in Scotland on 25 July 1953, in the name of Dundee Technical College. They were subsequently transferred to Dundee College of Technology in 1977, then to Dundee Institute of Technology in 1988 and, finally, to the University of Abertay Dundee on 25 April 1994.
The arms are described as:
"Party per fess, in chief tierced in pale: 1st, Ermine, a chevron engrailed between three mullets Gules; 2nd, Azure, three chevronels Or; 3rd, Argent, a spray of oak Proper fructed Or between three pheons Azure; in base Azure, a pot of three flowering lilies Argent between two flanches Or each charged with a book Gules."
The top left sector is taken from the arms of Sir David Baxter of Kilmaron, who bequeathed a significant sum of money in order to establish the original Dundee Technical Institute in 1888. The top right sector is taken from the arms of Sir William Dalgleish, who was the senior trustee of what by then was known as Dundee Technical College and School of Art, and who opened the first building – Old College – on Abertay's present campus on Bell Street in 1911. The top middle sector of three chevrons is the heraldic symbol for "technical". The pot of lilies in base is taken from the arms of the City of Dundee, with the books on either side representing education.
Motto
Beatus homo qui invenit sapientiam (Blessed is the man who finds wisdom).
Tartan
Aaron McCauley, a graduate of Abertay, designed and registered the Abertay tartan in 2003. The tartan is based on Abertay's promotional colours of dark blue, red, gold and green from its coat of arms.
Student life
Accommodation
Lyon Street
Meadowside Hall
Keiller Court
Parker House - iQ Student Accommodation
Students' centre
The student centre building was constructed in 2005 at a cost of £6 million. It provides a focal point for student entertainment and recreation and contains numerous outlets including Aroma coffee bar, the Common Room E-Bar and Campus Shop.
The Abertay Students' Association (Abertay SA) is based in the second floor of the Kydd building. Abertay SA co-ordinates all societies and acts as the voice and representation of all of Abertay's students.
Notable alumni and staff
Shehzad Afzal, Game designer
Roger Ball, musician, founding member Average White Band
Vikki Bunce, Scottish field hockey player
Victoria Drummond MBE, first female Merchant Navy engineer in Britain
Malcolm Duncan, musician, founding member Average White Band
Joe FitzPatrick, Scottish National Party MSP, Scottish Government Minister for Parliamentary Business
Stewart Hosie, Scottish National Party MP
David Jones, DMA Design founder – now Rockstar North - creator of GTA and Lemmings franchises
Bella Keyzer (1922–1992), women's equality icon, retrained at Dundee Technical College as a shipyard welder in 1976.
Andrew Mackenzie, Verdant Gin founder
Maurice Malpas, Scottish football player and manager, spent his entire professional playing career at Dundee United F.C. and was capped for the Scotland national football team 55 times
Iain McNicol, General Secretary of the Labour Party (UK)
Stuart McMillan, Scottish National Party MSP
Andy Nicol, Scottish Rugby Union player, team captain; also represented the Barbarians and British Lions
Jude Ower MBE, Playmob founder
Gavin Clydesdale Reid, economist and past President of the Scottish Economic Society (1999–2002)
William Samson, former staff member and Scottish astronomer, mathematician, and computer scientist
Tom Smith, Scottish Rugby Union player and coach, also represented Northampton Saints and British Lions
Dr John N. Sutherland, former Professor of Virtual Reality, Gifu University, Japan.
Sir Brian Souter, Stagecoach founder
Liam Wong, graphic designer, game developer and photographer
References
External links
Abertay University
Abertay Students' Association
Dundee Academy of Sport
Research Repository
Educational institutions established in 1888
1888 establishments in Scotland
Educational institutions established in 1994
1994 establishments in Scotland |
28722785 | https://en.wikipedia.org/wiki/Bashir%20Rameev | Bashir Rameev | Bashir Iskandarovich Rameev (formerly "Rameyev" in English; 1 May 1918 – 16 May 1994) was a Soviet inventor and scientist, one of the founders of Soviet computing, author of 23 patents, including the first patent in the field of electronic computers officially registered in the USSR—a patent for the Automatic Electronic Digital Machine (1948). Rameev's inventions paved the way for the development of a new field in Soviet science—electronic computing—and for the formation of a new branch of industry that supported it.
The central ideas incorporated in Rameev's invention of the electronic computer included: storing programs in computer memory, using binary code, utilizing external devices, and deploying electronic circuits and semiconductor diodes. The first publication about similar technology outside of the USSR appeared in 1949-1950. Rameev also suggested that intermediate computation data be automatically printed on punched tape and sent into the computer's arithmetic device for subsequent processing, meaning that the processing of commands would be performed in the computer's arithmetic device; this is usually referred to as the von Neumann architecture.
Of particular note is Rameev's invention of diode-matrix control circuits, which were used to build his first brainchild, the first serially manufactured Soviet mainframe "Strela" (1954). In the 1950s, the diode-matrix control circuits were not widespread due to their significant dimensions and high power consumption. However, with the subsequent development of microelectronics and the emergence of large-scale integrated circuits, which made it possible to deploy tens or hundreds of thousands of diodes and transistors in a single piece of silicon, the concept of control circuits became viable and commonly used.
"Strela" computers carried out calculations in nuclear physics, rocketry and space research. Notably, one of “Strelas" was used to calculate “Sputnik” orbit trajectory. For the development of "Strela" Rameev and his team were awarded the Stalin Prize of 1st degree, which was the highest Soviet award at that time.
Between 1956 and 1969, Rameev designed and oversaw the manufacturing of 14 different computers including: the multi-purpose "Ural" computer series and the specialized machines “Weather” (“Погода”), "Crystal" (“Кристалл”), "Granite" (“Гранит”), and “Coordinate” (“Координата”). Rameev's "famous computer family 'Ural' existed more than 15 years and had good chances to be one of the corner stones of future Russian computer engineering".
Childhood and youth
Rameev's mother died when he was two. His father was targeted by the Soviets and perished in labor camps during the Stalinist purges. This branded Rameev, who was by then a sophomore at the Moscow Power Engineering Institute, as a son of "the enemy of the people". As a result, he faced coarse, overt and systematic discrimination, which began with university expulsion and job rejections and lasted until the outbreak of World War II. Despite his impeccable record of service in the Soviet Army during World War II, Rameev encountered the same unfounded discrimination when he returned from the front. As a last resort, he wrote a letter to Joseph Stalin asking for help. Instead of helpful intervention, he was summoned to a phone call with a bureaucrat who told him "to live quietly and to not write again". It was then, at the age of 29, that Rameev realized that he had to do something extraordinarily good for his country to prove that he and his family were not "enemies of the people".
References
External links
Russian Virtual Computer Museum
1918 births
1994 deaths
People from Baymak
Bashkir computer scientists
Soviet computer scientists
Bashkir inventors
Soviet inventors
Computer designers
Bashkir scientists
Science and technology in Bashkortostan |
4884406 | https://en.wikipedia.org/wiki/Perpetual%20access | Perpetual access | Perpetual access is the stated continuous access to licensed electronic material after it is no longer accessible through an active paid subscription, whether through library or publisher action. In many cases, the two parties involved in the license agree that it is necessary for the licensee to retain access to these materials after the license has lapsed. Other terms for perpetual access or similar concepts are 'post-cancellation access' and 'continuing access.'
In the licensing of software products, a perpetual license means that a software application is sold on a one-time basis and the licensee can then use a copy of the software forever. The license holder has indefinite access to a specific version of a software program by paying for it only once.
Perpetual access is a term that is used within the library community to describe the ability to retain access to electronic journals after the contractual agreement for these materials has passed. Typically when a library licenses access to an electronic journal, the journal's content remains in the possession of the licensor. The library often purchases the rights to all back issues as well as new issues. When the license elapses, access to all the journal's contents is lost. In a typical print model, the library purchases the journals and retains them for the duration of the contract but also after the contract expires. In order to retain access to journals that were released during the term of a license for digital electronic journals, the library must obtain perpetual access rights.
The need to maintain perpetual access can be seen in the shift from print to electronic material, apparent in both user demand and the advantages of non-print material. Electronic materials rely on a relationship between library and publisher, with a distinct dynamic arising from the publisher's control of the licensed material. This in turn causes issues when the paid subscription with a publisher ends and the continued use of the material becomes uncertain, or the material can no longer be shared. With the shift from physical print material to electronic material, the legality of what it means to own a purchase is an issue. The concept of the first-sale doctrine that formerly allowed more lenient access and use of physical print material is no longer applicable to electronic material due to past legal precedent. This essentially points to the issue that "for libraries, this means that legal ownership of individual titles, the storage unit (often a piece of hardware or software), and the ability to maintain files for future use are tied to the content provider-often a publisher or software developer."
Perpetual access is closely related to digital archiving, which is the preservation of electronic documents. However, archiving rights are "the right to permanently retain an electronic copy of the licensed materials.” Perpetual access rights focus on continual access, and archiving rights focus on continual access and how one receives continual access. Often, if an institution is to retain perpetual access, it must design a way in which to preserve the electronic documents that are granted by the license. Several initiatives have developed methods in which to retain electronic documents and retain perpetual access. The most notable of these are the LOCKSS program and the Ithaka Portico program.
Concerns
With license agreements for perpetual access, communication between publishers and libraries is a large part of the process, as agreement terms and policy understanding are not always clear. Licensing agreements do not always even include perpetual access. In addition, because of the complexity involved, libraries may find themselves choosing electronic material with no understanding of how it may be used once access is gone, as it may be the only option available.
Link rot, negligence or denial of domain renewal, or the closing of an information source are some examples of technical issues that directly affect the ability to maintain perpetual access. Issues like these, for both perpetual access and digital preservation as a whole, have garnered more recent attention through single-discipline efforts and at the government level. One example is the Keepers Registry, which equips libraries with resources to help them navigate perpetual access and digital preservation topics as a whole. Despite the cost-effectiveness of using electronic material in place of print, the cost of maintaining that electronic material, in terms of both time and staffing limitations, is a hindrance to a library's ability to opt for and maintain perpetual access. This in turn creates a barrier, since libraries must make continuous efforts to maintain and monitor the materials beyond the sole act of perpetual access being granted.
Trigger events are another concern for libraries considering perpetual access. A trigger event can occur when electronic material is no longer accessible for six months or longer; one example is when access to the information made available is lost due to a natural event.
Related Initiatives
Several initiatives have developed methods in which to retain electronic documents and retain perpetual access. The most notable of these are the LOCKSS program, the CLOCKSS (Controlled LOCKSS), and the Ithaka Portico program.
See also
Digital preservation
Software license
References
External links
CLOCKSS official site
Definition of Perpetual Access from Oxford University Press
Definition of Perpetual Access from Portico
DSpace official site
Electronic Resource Management: Report of the DLF ERM Initiative
Keepers Registry official site
List of participating publishers and titles in LOCKSS
LOCKSS official site
Perpetual Access and Post-Cancellation Access from LOCKSS
Perpetual Access to Electronic Journals: A Survey of One Academic Research Library's Licenses
Portico official site
Digital rights management |
1871104 | https://en.wikipedia.org/wiki/Sticky%20bit | Sticky bit | In computing, the sticky bit is a user ownership access right flag that can be assigned to files and directories on Unix-like systems.
There are two definitions: one for files, one for directories.
For files, particularly executables, the superuser could tag them to be retained in main memory, even when no longer needed, to minimize the swapping that would occur when the file is needed again and would otherwise have to be reloaded from relatively slow secondary memory. This function has become obsolete due to swapping optimization.
For directories, when a directory's sticky bit is set, the filesystem treats the files in such directories in a special way so only the file's owner, the directory's owner, or root user can rename or delete the file. Without the sticky bit set, any user with write and execute permissions for the directory can rename or delete contained files, regardless of the file's owner. Typically this is set on the /tmp directory to prevent ordinary users from deleting or moving other users' files.
The modern function of the sticky bit refers to directories, and protects directories and their content from being hijacked by non-owners; this is found in most modern Unix-like systems. Files in a shared directory such as /tmp belong to individual owners, and non-owners may not delete, overwrite or rename them.
History
The sticky bit was introduced in the Fifth Edition of Unix (in 1974) for use with pure executable files. When set, it instructed the operating system to retain the text segment of the program in swap space after the process exited. This speeds up subsequent executions by allowing the kernel to make a single operation of moving the program from swap to real memory. Thus, frequently-used programs like editors would load noticeably faster. One notable problem with "stickied" programs was replacing the executable (for instance, during patching); to do so required removing the sticky bit from the executable, executing the program and exiting to flush the cache, replacing the binary executable, and then restoring the sticky bit.
Subsequently this behavior became operative only in HP-UX and UnixWare. Solaris appears to have abandoned this in 2005. The 4.4-Lite release of BSD retained the old sticky bit behavior, but it has been subsequently dropped from OpenBSD (as of release 3.7) and FreeBSD (as of release 2.2.1). No version of Linux has ever supported this traditional behavior; Linux performs caching of executable files in the same way as all files, so re-executing the program to flush the cache is not necessary.
Usage
The most common use of the sticky bit is on directories residing within filesystems for Unix-like operating systems. When a directory's sticky bit is set, the filesystem treats the files in such directories in a special way so only the file's owner, the directory's owner, or root can rename or delete the file. Without the sticky bit set, any user with write and execute permissions for the directory can rename or delete contained files, regardless of the file's owner. Typically, this is set on the /tmp directory to prevent ordinary users from deleting or moving other users' files. This feature was introduced in 4.3BSD in 1986, and today it is found in most modern Unix-like systems.
In addition, Solaris (as of Solaris 2.5) defines special behavior when the sticky bit is set on non-executable files: those files, when accessed, will not be cached by the kernel. This is usually set on swap files to prevent access on the file from flushing more important data from the system cache. It is also used occasionally for benchmarking tests.
The sticky bit is also set by the automounter to indicate that a file has not been mounted yet. This allows programs like ls to ignore unmounted remote files.
Examples
The sticky bit can be set with the chmod command, using either its octal mode 1000 or its symbol t (s is already used by the setuid bit). For example, to add the bit on the directory /usr/local/tmp, one would type chmod +t /usr/local/tmp. Or, to make sure that directory has standard tmp permissions, one could also type chmod 1777 /usr/local/tmp.
To clear it, use chmod -t /usr/local/tmp or chmod 0777 /usr/local/tmp (the latter will also reset the tmp directory to standard permissions).
In Unix symbolic file system permission notation, the sticky bit is represented either by the letter t or T in the final character-place depending on whether the execution bit for the others category is set or unset, respectively. For instance, on Solaris 8, the /tmp directory, which by default has both the others execute bit and the sticky-bit set, shows up as:
$ ls -ld /tmp
drwxrwxrwt 4 root sys 485 Nov 10 06:01 /tmp
If the sticky-bit is set on a file or directory without the execution bit set for the others category (non-user-owner and non-group-owner), it is indicated with a capital T (replacing what would otherwise be -):
# ls -l test
-rw-r--r-- 1 root anygroup 0 Nov 10 12:57 test
# chmod +t test; ls -l test
-rw-r--r-T 1 root anygroup 0 Nov 10 12:57 test
See also
chmod
setuid
References
External links
Unix File and Directory Permissions, 2010, by Wayne Pollock, archived from the original on February 3, 2012
Unix file system technology
File system permissions |
5944343 | https://en.wikipedia.org/wiki/Dean%20Cromwell | Dean Cromwell | Dean Bartlett Cromwell (September 20, 1879 – August 3, 1962), nicknamed "Maker of Champions", was an American athletic coach in multiple sports, principally at the University of Southern California (USC). He was the head coach of the USC track team from 1909 to 1948, excepting 1914 and 1915, and guided the team to 12 NCAA team national championships (1926, 1930–31, 1935–43) and 34 individual NCAA titles. He was the head coach for the U.S. track team at the 1948 Olympic Games in London, and assistant head coach for the U.S. track team at the 1936 Berlin Olympics.
In Berlin he was responsible for the expulsion of the only two Jewish American sprinters from the 4x100m relay team, while trying to appease Adolf Hitler.
Early life
Born in Turner, Oregon, Cromwell moved to southern California with his family as a boy after his father's death, and attended Occidental Prep School and Occidental College, graduating in 1902. While at Occidental, he was a multi-sport standout athlete, playing football and baseball and competing in track and cycling; in 1901 the Helms Athletic Foundation named him the outstanding athlete in southern California. After college, he worked for the telephone company, also continuing to compete in local amateur sports.
Career
After being hired as USC's track coach, he became known for his skill in developing star athletes. His many outstanding pupils included Fred Kelly (1912 gold medalist in the 110m hurdles), Charley Paddock (1920 gold medalist in the 100m and 4 × 100 relay), Bud Houser (1924 gold medalist in the shot put and discus; 1928 gold medalist in the discus), Jess Mortensen (1929 NCAA javelin champion, 1931 world record in the decathlon), Frank Wykoff (1928, 1932 and 1936 gold medalist in the 4 × 100 relay), Ken Carpenter (1936 gold medalist in the discus), Earle Meadows (1936 gold medalist in the pole vault), Louis Zamperini (collegiate record-holder in the mile from 1938–53), Wilbur Thompson (1948 gold medalist in the shot put), Cliff Bourland (1948 gold medalist in the 4 × 400 relay), Bill Sefton (two-time world record holder in the pole vault), and Mel Patton (1948 gold medalist in the 200m and 4 × 100 relay). Athletes coached by Cromwell eventually set individual world records in 14 events and relay world records in three others, and won 12 Olympic gold medals during his time at USC.
Cromwell also served as the head coach of the USC Trojans football program from 1909 to 1910, and from 1916 to 1918. His involvement with USC football goes back even farther; he is known to have officiated USC games as early as 1903, and he played (along with the coaches of both teams) for USC opponent Harvard School in a 1905 game due to the weakness of the Harvard roster. In his first term as coach, 1909 to 1910, he posted a record of 10–1–3, but this was exclusively against southern California competition, with no major colleges on the schedule. Like many schools, USC switched from football to rugby, from 1911 to 1913. Cromwell returned as football coach in 1916, by which time USC's teams had begun to be known as the Trojans. By this point, the university was facing competition which more regularly included major colleges such as California, Utah and Stanford, and his relative lack of expertise in the sport was more readily apparent. World War I also depleted the team's ranks in 1917 and 1918. In his final three years his record was still respectable at 11–7–3, though only 4–4–1 against major colleges. In his final season in 1918, USC was 2–2–2. They did not play a home game in Los Angeles until December 14 due to a citywide ban on public gatherings during the Spanish flu epidemic. Cromwell was replaced as head football coach following the season by Gus Henderson. During his tenure, Cromwell compiled a 21–8–6 record. Apart from Sam Barry, who took over the 1941 team in the wake of Howard Jones' death, Cromwell was the last USC football coach for whom it was not his primary sport. He also coached the USC basketball team in 1918, though they only played two games against the Los Angeles Athletic Club, losing both.
After retiring, Cromwell continued to serve as an advisor in track and field, and briefly was the field announcer for the National Football League's Los Angeles Rams. He died at age 82 at his Los Angeles home after suffering a heart attack; he had suffered a previous attack in March of the same year. He was survived by his wife Gertrude and their three sons; his cremated remains were interred at Twin Oaks Cemetery in Turner, Oregon. He was inducted into the National Track & Field Hall of Fame in its inaugural class in 1974, and into the USC Athletic Hall of Fame in its second class in 1995. The university's track field is named Cromwell Field in his honor. He is also a member of the Occidental College Track and Field Hall of Fame.
Cromwell can be seen as a contestant on the December 16, 1954 edition of You Bet Your Life (season 5, episode 14).
Controversy
In order to curry favor with Avery Brundage, U.S. Olympic Committee Chairman, Cromwell joined the isolationist America First Committee, of which Brundage was a founding member and organizer.
As a coach at the 1936 Berlin Olympics, Cromwell held the only two Jewish American sprinters - Marty Glickman and Sam Stoller - off the 4x100m relay team. It is said that he wanted to please Brundage, then head of the United States Olympic Committee and later head of the IOC, a well-known Nazi supporter who did not want to offend Adolf Hitler by fielding Jewish athletes. Cromwell never expressed regret for his actions.
Head coaching record
Football
Writings
The High Jump, published in 1939 by International Sports, Inc., Indianapolis, Indiana
"Championship Technique In Track And Field, A Book For Athletes, Coaches, And Spectators"
Written in collaboration with Al Wesson. Published in 1941 by Whittlesey House/MaGraw-Hill
Book Company, Inc.
See also
USC Trojans
List of college football head coaches with non-consecutive tenure
References
Additional sources
Porter, David L., ed. (1988). Biographical Dictionary of American Sports: Outdoor Sports. Westport, Connecticut: Greenwood Press.
"Dean Cromwell Mourned by Sports World". Los Angeles Times. August 5, 1962.
2006 USC Football Media Guide
"Marty Glickman at Jewish Virtual Library". Jewishvirtuallibrary.org. Retrieved June 7, 2010; Jewish Athletes – Marty Glickman & Sam Stoller". U.S. Holocaust Museum. Stanley Meisler (July 23, 1996).
"Nazi Games Exhibit Details Discrimination, Deception: Two American Jews were booted off track team, apparently to spare Hitler embarrassment". Los Angeles Times
Glickman tells of '36 Games". Syracuse Herald Journal. January 20, 1980
Mistake of 1936 Olympic Games Not Forgotten". Los Angeles Times. March 29, 1998.
Charles Chi Halevi (April 10, 2000). "Games of Shame".
The Jerusalem Post, Howard Z. Unger (March 31, 1998).
External links
National Track & Field Hall of Fame
Occidental College Track and Field Hall of Fame
Cromwell Field
1879 births
1962 deaths
American track and field coaches
Occidental Tigers baseball players
Occidental Tigers football players
USC Trojans football coaches
USC Trojans men's basketball coaches
USC Trojans track and field coaches
People from Marion County, Oregon
Turner, Oregon
Basketball coaches from California |
11937119 | https://en.wikipedia.org/wiki/Glftpd | Glftpd | glFTPd is a freely available FTP server which runs on Unix, Linux, and BSD operating systems. It has number of features, like logins restricted by a particular set of IP addresses, transfer quotas per-user and per-group basis, and user/groups not stored in the system files, which make it attractive to private warez servers, including topsites. It does have legitimate uses though—a number of web development books recommend it amongst other general purpose FTP servers, and some Linux certification exams of SAIR required knowledge of it. It can integrate with Eggdrop through IRC channels.
History
glFTPd stands for GreyLine File Transfer Protocol Daemon. It was named after the initial developer, GreyLine. The first public release of glFTPd dates back to the beginning of 1998. glFTPd is well known for its detailed user permissions, extensive scripting features, and for securely and efficiently transferring files between sites using FXP.
Support
Support for glFTPd is available on IRC on EFnet in both #glftpd and #glhelp.
See also
Comparison of FTP server software
References
External links
Official website
Installation of the GreyLine FTP daemon on Arch Linux
glFTPD scripts by Turranius
FTP server software
FTP server software for Linux
Unix Internet software |
63676221 | https://en.wikipedia.org/wiki/Exposure%20Notification | Exposure Notification | The (Google/Apple) Exposure Notification (GAEN) system, originally known as the Privacy-Preserving Contact Tracing Project, is a framework and protocol specification developed by Apple Inc. and Google to facilitate digital contact tracing during the COVID-19 pandemic. When used by health authorities, it augments more traditional contact tracing techniques by automatically logging encounters with other notification system users using their Android or iOS smartphone. Exposure Notification is a decentralized reporting based protocol built on a combination of Bluetooth Low Energy technology and privacy-preserving cryptography. It is used as an opt-in feature within COVID-19 apps developed and published by authorized health authorities. Originally unveiled on April 10, 2020, it was first made available on iOS on May 20, 2020 as part of the iOS 13.5 update and on December 14, 2020 as part of the iOS 12.5 update for older iPhones. On Android, it was added to devices via a Google Play Services update, supporting all versions since Android Marshmallow.
The Apple/Google protocol is similar to the Decentralized Privacy-Preserving Proximity Tracing (DP-3T) protocol created by the European DP-3T consortium and the Temporary Contact Number (TCN) protocol by Covid Watch, but is implemented at the operating system level, which allows for more efficient operation as a background process. Since May 2020, a variant of the DP-3T protocol has been supported by the Exposure Notification Interface. Other protocols are constrained in how they operate as they have no special privilege over normal apps. This leads to issues, particularly on iOS devices where digital contact tracing apps running in the background experience significantly degraded performance. The joint approach is also designed to maintain interoperability between Android and iOS devices, which constitute the vast majority of the market.
The ACLU stated the approach "appears to mitigate the worst privacy and centralization risks, but there is still room for improvement". In late April, Google and Apple shifted the emphasis of the naming of the system, describing it as an "exposure notification service", rather than "contact tracing" system.
Technical specification
Typically, digital contact tracing protocols have two major responsibilities: encounter logging and infection reporting. Exposure Notification defines only the encounter logging, which uses a decentralized architecture; the majority of the infection reporting, which is currently centralized, is delegated to individual app implementations.
To handle encounter logging, the system uses Bluetooth Low Energy to send tracking messages to nearby devices running the protocol to discover encounters with other people. The tracking messages contain unique identifiers that are encrypted with a secret daily key held by the sending device. These identifiers, along with the Bluetooth MAC address, change every 15–20 minutes in order to prevent tracking of clients by malicious third parties through observing static identifiers over time.
The sender's daily encryption keys are generated using a random number generator. Devices record received messages, retaining them locally for 14 days. If a user tests positive for infection, the last 14 days of their daily encryption keys can be uploaded to a central server, where it is then broadcast to all devices on the network. The method through which daily encryption keys are transmitted to the central server and broadcast is defined by individual app developers. The Google-developed reference implementation calls for a health official to request a one-time verification code (VC) from a verification server, which the user enters into the encounter logging app. This causes the app to obtain a cryptographically signed certificate, which is used to authorize the submission of keys to the central reporting server.
The received keys are then provided to the protocol, where each client individually searches for matches in their local encounter history. If a match meeting certain risk parameters is found, the app notifies the user of potential exposure to the infection. Google and Apple intend to use the received signal strength (RSSI) of the beacon messages as a source to infer proximity. RSSI and other signal metadata will also be encrypted to resist deanonymization attacks.
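As a rough illustration of this decentralized matching step, the following Python sketch shows how a client might recompute identifiers from published diagnosis keys and compare them against its locally stored encounter log. It is not the actual Exposure Notification API: the derive_identifiers helper, the Encounter record, and the rssi_threshold risk parameter are hypothetical stand-ins for the version-specific details given in the sections below.

```python
# Illustrative sketch only; names and risk parameters are hypothetical.
from dataclasses import dataclass

@dataclass
class Encounter:
    identifier: bytes   # 16-byte rolling identifier received over BLE
    rssi: int           # received signal strength, used to infer proximity
    timestamp: int      # when the beacon was observed

def find_exposures(diagnosis_keys, local_log, derive_identifiers, rssi_threshold=-70):
    """Recompute identifiers from published daily keys and match them locally."""
    matches = []
    for day, key in diagnosis_keys:                      # keys uploaded by infected users
        candidates = set(derive_identifiers(key, day))   # all identifiers for that day
        for encounter in local_log:
            if encounter.identifier in candidates and encounter.rssi >= rssi_threshold:
                matches.append(encounter)                # risk parameters met: flag exposure
    return matches
```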
Version 1.0
To generate encounter identifiers, a persistent 32-byte private Tracing Key (tk) is first generated by a client. From this a 16-byte Daily Tracing Key is derived using the algorithm dtk_i = HKDF(tk, NULL, UTF8("CT-DTK") || D_i, 16), where HKDF is a hash-based key derivation function using SHA-256, and D_i is the day number for the 24-hour window the broadcast is in, counted from Unix Epoch Time. These generated keys are later sent to the central reporting server should a user become infected.
From the daily tracing key a 16-byte temporary Rolling Proximity Identifier is generated every 10 minutes with the algorithm RPI_{i,j} = Truncate(HMAC(dtk_i, UTF8("CT-RPI") || TIN_j), 16), where HMAC is a hash-based message authentication code using SHA-256, and TIN_j is the time interval number, representing a unique index for every 10-minute period in a 24-hour day. The Truncate function returns the first 16 bytes of the HMAC value. When two clients come within proximity of each other they exchange and locally store the current RPI_{i,j} as the encounter identifier.
Once a registered health authority has confirmed the infection of a user, the user's Daily Tracing Keys for the past 14 days are uploaded to the central reporting server. Clients then download this report and individually recalculate every Rolling Proximity Identifier used in the report period, matching them against the user's local encounter log. If a matching entry is found, then contact has been established and the app presents a notification to the user warning them of potential infection.
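A minimal sketch of the version 1.0 derivation, using only the Python standard library, is shown below. It follows the scheme described above but is not the reference implementation: the "CT-DTK" and "CT-RPI" info strings come from the published cryptography specification, while the byte widths and endianness chosen here for the day number and time interval number are assumptions made for illustration.

```python
# Illustrative sketch of the v1.0 key schedule; integer encodings are assumptions.
import hashlib, hmac, os, struct

HASH_LEN = 32  # SHA-256 output size

def hkdf_sha256(ikm, info, length=16, salt=b"\x00" * HASH_LEN):
    # Minimal RFC 5869 HKDF (extract-then-expand); one block suffices for length <= 32.
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm = hmac.new(prk, info + b"\x01", hashlib.sha256).digest()
    return okm[:length]

tracing_key = os.urandom(32)  # persistent per-device Tracing Key (tk)

def daily_tracing_key(tk, day_number):
    # dtk_i = HKDF(tk, NULL, "CT-DTK" || D_i, 16)
    return hkdf_sha256(tk, b"CT-DTK" + struct.pack("<I", day_number))

def rolling_proximity_identifier(dtk, interval_number):
    # RPI_{i,j} = Truncate(HMAC(dtk_i, "CT-RPI" || TIN_j), 16)
    mac = hmac.new(dtk, b"CT-RPI" + struct.pack("<I", interval_number), hashlib.sha256)
    return mac.digest()[:16]
```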
Version 1.1
Unlike version 1.0 of the protocol, version 1.1 does not use a persistent tracing key; rather, every day a new random 16-byte Temporary Exposure Key (tek_i) is generated. This is analogous to the daily tracing key from version 1.0. Here i denotes the time at which the key was generated, with time discretized in 10-minute intervals starting from Unix Epoch Time. From this, two 128-bit keys are calculated: the Rolling Proximity Identifier Key (RPIK_i) and the Associated Encrypted Metadata Key (AEMK_i). RPIK_i is calculated with the algorithm RPIK_i = HKDF(tek_i, NULL, UTF8("EN-RPIK"), 16), and AEMK_i using the algorithm AEMK_i = HKDF(tek_i, NULL, UTF8("EN-AEMK"), 16).
From these values a temporary Rolling Proximity Identifier (RPI_{i,j}) is generated every time the BLE MAC address changes, roughly every 15–20 minutes. The following algorithm is used: RPI_{i,j} = AES-128(RPIK_i, PaddedData_j), where AES-128 is an AES cryptography function with a 128-bit key, PaddedData_j is one 16-byte block containing UTF8("EN-RPI"), zero padding and the interval number ENIN_j, where j denotes the Unix Epoch Time at the moment the roll occurs and ENIN_j is the corresponding 10-minute interval number. Next, additional Associated Encrypted Metadata is encrypted. What the metadata represents is not specified, likely to allow the later expansion of the protocol. The following algorithm is used: AEM_{i,j} = AES-128-CTR(AEMK_i, RPI_{i,j}, Metadata), where AES-128-CTR denotes AES encryption with a 128-bit key in CTR mode. The Rolling Proximity Identifier and the Associated Encrypted Metadata are then combined and broadcast using BLE. Clients exchange and log these payloads.
Once a registered health authority has confirmed the infection of a user, the user's Temporary Exposure Keys and their respective interval numbers i for the past 14 days are uploaded to the central reporting server. Clients then download this report and individually recalculate every Rolling Proximity Identifier starting from interval number i, matching them against the user's local encounter log. If a matching entry is found, then contact has been established and the app presents a notification to the user warning them of potential infection.
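A comparable sketch of the version 1.1 identifier generation is shown below, using the third-party cryptography package for HKDF and AES. Again this is illustrative rather than authoritative: the "EN-RPIK", "EN-AEMK" and "EN-RPI" labels follow the published specification, but the exact layout of the padded block and the use of the RPI as the CTR-mode counter are stated here as assumptions.

```python
# Illustrative sketch of the v1.1 scheme; the padded-block layout is an assumption.
import os, struct
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def hkdf16(key, info):
    return HKDF(algorithm=hashes.SHA256(), length=16, salt=None, info=info).derive(key)

tek = os.urandom(16)             # random daily Temporary Exposure Key
rpik = hkdf16(tek, b"EN-RPIK")   # Rolling Proximity Identifier Key
aemk = hkdf16(tek, b"EN-AEMK")   # Associated Encrypted Metadata Key

def rolling_proximity_identifier(rpik, interval_number):
    # One AES-128 block over "EN-RPI" || zero padding || interval number (layout assumed).
    padded = b"EN-RPI" + b"\x00" * 6 + struct.pack("<I", interval_number)
    encryptor = Cipher(algorithms.AES(rpik), modes.ECB()).encryptor()
    return encryptor.update(padded) + encryptor.finalize()

def associated_encrypted_metadata(aemk, rpi, metadata):
    # AES-128 in CTR mode keyed with AEMK, with the RPI as the initial counter block.
    encryptor = Cipher(algorithms.AES(aemk), modes.CTR(rpi)).encryptor()
    return encryptor.update(metadata) + encryptor.finalize()
```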
Version 1.2
Version 1.2 of the protocol is identical to version 1.1, only introducing minor terminology changes.
Privacy
Preservation of privacy was referred to as a major component of the protocol; it is designed so that no personally identifiable information can be obtained about the user or their device. Apps implementing Exposure Notification are only allowed to collect personal information from users on a voluntary basis. Consent must be obtained by the user to enable the system or publicize a positive result through the system, and apps using the system are prohibited from collecting location data. As an additional measure, the companies stated that it would sunset the protocol by-region once they determine that it is "no longer needed".
The Electronic Frontier Foundation raised concerns that the protocol was vulnerable to "linkage attacks": once a device's set of daily encryption keys has been revealed, sufficiently capable third parties who had recorded beacon traffic could retroactively turn that information into tracking information, although only for areas in which they had already recorded beacons, only for a limited time segment, and only for users who had disclosed their COVID-19 status.
On April 16, the European Union started the process of assessing the proposed system for compatibility with privacy and data protection laws, including the General Data Protection Regulation (GDPR). On April 17, 2020, the UK's Information Commissioner's Office, a supervisory authority for data protection, published an opinion analyzing both Exposure Notification and the Decentralized Privacy-Preserving Proximity Tracing protocol, stating that the systems are "aligned with the principles of data protection by design and by default" (as mandated by the GDPR).
Deployment
Exposure Notification is compatible with devices supporting Bluetooth Low Energy and running Android 6.0 "Marshmallow" or newer with Google mobile services, or iOS 13.5 or newer. On iOS, it is serviced via operating system updates. On Android, it is serviced via updates to Google Play Services (by means of Google Play), ensuring compatibility with the majority of Android devices released outside of Mainland China, and not requiring it to be integrated into an Android firmware (which would hinder deployment), although it is not compatible with Huawei devices released since May 2019 due to the US trade ban on Huawei. Apple and Google released reference implementations for apps utilizing the system, which can be used as a base.
Exposure Notification apps may only be released by public health authorities. To discourage fragmentation, each country will typically be restricted to one app, although Apple and Google stated that they would accommodate regionalized approaches if a country elects to do so.
On September 1, 2020, the consortium announced "Exposure Notifications Express" (EN Express), a system designed to ease adoption of the protocol by health authorities by removing the need to develop an app themselves. Under this system, a health authority provides parameters specific to their implementation (such as thresholds, branding, messaging, and key servers), which is then processed to generate the required functionality. On Android, this data is used to generate an app, while on iOS, the functionality is integrated directly at the system level on iOS 13.7 and newer without a dedicated app. On December 14, 2020, Apple released iOS 12.5, bringing support for Exposure Notifications to older iPhones.
The last update on the Exposure Notification partnership was a year-end review issued by Google in December 2020, which stated: "we plan to keep you updated here with new information again next year". However, nothing was issued on the one-year anniversary of the launch of the Exposure Notification API, despite significant developments in the pandemic such as vaccination, variants, digital health passports, app adoption challenges, and growing interest in venue check-in QR codes (and notification on that basis) for a virus that is mostly transmitted through the air. The published Frequently Asked Questions (FAQ) document has not been revised since May 2020. Basic support continues to be provided through the app stores for apps released by authorized public health agencies, including enforcement of the personal privacy protection framework, as demonstrated in the case of the UK NHS app and its contact tracers.
In June 2021, Google faced allegations that it had automatically downloaded Massachusetts' "MassNotify" app to Android devices without user consent. Google clarified that it had not actually downloaded the app to user devices, and that Google Play Services was being used to deploy an EN Express configuration profile, allowing it to be activated from the Exposure Notification section of the Google Settings app on an opt-in basis.
Adoption
As of May 21, at least 22 countries had received access to the protocol. Switzerland and Austria were among the first to back the protocol. On April 26, after initially backing PEPP-PT, Germany announced it would back Exposure Notification, followed shortly after by Ireland and Italy. Despite already adopting the centralised BlueTrace protocol, Australia's Department of Health and Digital Transformation Agency are investigating whether the protocol could be implemented to overcome limitations of its COVIDSafe app. On May 25, Switzerland became the first country to launch an app leveraging the protocol, SwissCovid, beginning with a small pilot group.
In England, the National Health Service (NHS) trialed both an in-house app on a centralized platform developed by its NHSX division, and a second app using Exposure Notification. On June 18, the NHS announced that it would focus on using Exposure Notification to complement manual contact tracing, citing tests on the Isle of Wight showing that it had better cross-device compatibility (and would also be compatible with other European approaches), but that its distance calculations were not as reliable as the centralized version of the app, an issue which was later rectified. Later, it was stated that the app would be supplemented by QR codes at venues. A study of the impact of Exposure Notification in England and Wales estimated that it averted 8,700 (95% confidence interval 4,700–13,500) deaths out of the 32,500 recorded from its introduction on 24 September 2020 to 31 December 2020.
Canada launched its COVID Alert app, co-developed in partnership with BlackBerry Limited and Shopify, on July 31 in Ontario.
In May 2020, Covid Watch launched the first calibration and beta testing pilot of the GAEN APIs in the United States at the University of Arizona. In August 2020, the app launched publicly for a phased roll-out in the state of Arizona.
The U.S. Association of Public Health Laboratories (APHL) stated in July 2020 that it was working with Apple, Google, and Microsoft on a national reporting server for use with the protocol, which it stated would ease adoption and interoperability between states.
In August 2020, Google stated that at least 20 U.S. states had expressed interest in using the protocol. In Alabama, the Alabama Department of Public Health, University of Alabama at Birmingham, and the University of Alabama System deployed the "GuideSafe" app for university students returning to campus, which includes Exposure Notification features. On August 5, the Virginia Department of Health released its "COVIDWise" app, making it the first U.S. state to release an Exposure Notification-based app for the general public. North Dakota and Wyoming released an EN app known as "Care19 Alert", developed by ProudCrowd and using the APHL server (the app is a spin-off of an existing location-logging application that ProudCrowd had developed primarily for use by students travelling to attend college football away games).
Maryland, Nevada, Virginia, and Washington, D.C. have announced plans to use EN Express. In September, Delaware, New Jersey, New York, and Pennsylvania all adopted "COVID Alert" apps developed by NearForm, which are based on its COVID Tracker Ireland app. Later that month, the Norwegian Institute of Public Health announced that it would lead development of an Exposure Notification-based app for the country, which replaces a centralized app that had ceased operations in June 2020 after the Norwegian Data Protection Authority ruled that it violated privacy laws.
Alternatives
Some countries, such as France, have pursued centralized approaches to digital contact tracing, in order to maintain records of personal information that can be used to assist in investigating cases. The French government asked Apple in April 2020 to allow apps to perform Bluetooth operations in the background, which would allow the government to create its own system independent of Exposure Notification.
On August 9, the Canadian province of Alberta announced plans to migrate to the EN-based COVID Alert from its BlueTrace-based ABTraceTogether app. This did not occur, and on November 6 Premier of Alberta Jason Kenney announced that the province would not do so, arguing that ABTraceTogether was "from our view, simply a better and more effective public health tool", and that they would be required to phase out ABTraceTogether if they did switch. British Columbia has also declined to adopt COVID Alert, with provincial health officer Bonnie Henry stating that COVID Alert was too "non-specific".
Australia's officials have stated its COVIDSafe, which is based on Singapore's BlueTrace, will not be shifting from manual intervention.
In the United States, states such as California and Massachusetts declined to use the technology, opting for manual contact tracing. California later reversed course and adopted the system in December 2020.
Chinese vendor Huawei (which cannot include Google software on its current Android products due to U.S. sanctions) added an OS-level DP-3T API known as "Contact Shield" to its Huawei Mobile Services stack in June 2020, which the company states is intended to be interoperable with Exposure Notification.
References
External links
Official Website (Google)
Official Website (Apple)
Announcement (Google)
Announcement (Apple)
Overview presentation (Google)
Technical specifications (Apple)
Exposure Notification: Frequently Asked Questions (Apple/Google)
Overview of the version 1.0 of Contact Tracing protocol by Apple & Google
Overview of version 1.2 and comparison with version 1.0
Mobile applications
Software associated with the COVID-19 pandemic
Scientific and technical responses to the COVID-19 pandemic
Google
Apple Inc.
Digital contact tracing protocols
Digital contact tracing protocols with decentralized reporting |
53847202 | https://en.wikipedia.org/wiki/Hesione%20%28mythology%29 | Hesione (mythology) | In Greek mythology, the name Hesione (/hɪˈsaɪ.əniː/; Ancient Greek: Ἡσιονη) refers to various mythological figures:
Hesione, a daughter of Oceanus.
Hesione, also called Isonoe, one of the Danaids who became the lover of Zeus and bore a son by him, Orchomenos.
Hesione, a Trojan princess and daughter of Laomedon.
Hesione, one of the names given to the wife of Nauplius, who was the father of Palamedes, Oiax and Nausimedon. The mythographer Apollodorus reports that, according to Cercops, Nauplius' wife was Hesione, and that in the Nostoi (Returns), an early epic from the Trojan cycle of poems about the Trojan War, his wife was Philyra, but that according to the "tragic poets" his wife was Clymene.
Hesione, daughter of Celeus, was one of the sacrificial victims of the Minotaur. She may have been the sister of another victim, Porphyrion, if their fathers, both named Celeus, were one and the same.
Other
Wonder Woman (comic book), 1976, volume 1, issues #226 & 227 - a golden robot formerly owned by Hephaestus, the god of fire
Notes
References
Aeschylus, translated in two volumes. 1. Prometheus Bound by Herbert Weir Smyth, Ph. D. Cambridge, MA. Harvard University Press. 1926. Online version at the Perseus Digital Library. Greek text available from the same website.
Apollodorus, Apollodorus, The Library, with an English Translation by Sir James George Frazer, F.B.A., F.R.S. in 2 Volumes. Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1921. Online version at the Perseus Digital Library.
Dictys Cretensis, from The Trojan War. The Chronicles of Dictys of Crete and Dares the Phrygian translated by Richard McIlwaine Frazer, Jr. (1931-). Indiana University Press. 1966. Online version at the Topos Text Project.
Gantz, Timothy, Early Greek Myth: A Guide to Literary and Artistic Sources, Johns Hopkins University Press, 1996, Two volumes: (Vol. 1), (Vol. 2).
Hard, Robin, The Routledge Handbook of Greek Mythology: Based on H.J. Rose's "Handbook of Greek Mythology", Psychology Press, 2004, . Google Books.
Maurus Servius Honoratus, In Vergilii carmina comentarii. Servii Grammatici qui feruntur in Vergilii carmina commentarii; recensuerunt Georgius Thilo et Hermannus Hagen. Georgius Thilo. Leipzig. B. G. Teubner. 1881. Online version at the Perseus Digital Library
Women in Greek mythology
Characters in Greek mythology |
11248067 | https://en.wikipedia.org/wiki/Death%20%28play%29 | Death (play) | Death is a play by Woody Allen. It was first published in 1975, along with God, and other short stories in Woody Allen's book Without Feathers. It is a comedic version of Eugène Ionesco's 1959 play The Killer. His 1991 film Shadows and Fog was based on this play.
Plot
Kleinman, a meek salesman, is awoken late one night by a mob led by a man named Hacker, who forces him to join their vigilante group dedicated to catching a serial killer who frequently changes his modus operandi. Hacker claims to have a plan to catch the maniac, but when Kleinman asks about what he has to do, each man in the group says that they are only aware of their own part of the plan so the killer won't catch on. They march him to the street to stand guard and leave him on his own to await his part in the plan.
Kleinman is eventually joined by a doctor, who tells him that his interest in the case is to catch the killer so he can understand a psychopathic mind. The doctor leaves, and Kleinman hears screams in the night. He then meets a prostitute, Gina, and the two contemplate death and the possibility of life in the universe. Gina eventually leaves too, before the doctor returns mortally wounded by the maniac. A policeman and another man find the body. The man tells Kleinman that Hacker was murdered, but by a rogue faction of his vigilante mob who splintered off when they disagreed with his ideas on how to catch the killer. The two mobs arrive and demand that Kleinman join them, before getting into a large fight.
A third mob shows up, having hired a clairvoyant named Hans Spiro to identify the killer. Spiro says that Kleinman is the maniac, and the mobs join to perform a kangaroo court and sentence him to death. Just before Kleinman's hanging, a man arrives to tell the group that the killer has been spotted. The mob apologizes, and all run off. Kleinman is again left alone, before the real maniac, who resembles Kleinman, enters. The two converse briefly, and the killer admits to being a psychopath but insists that he can kill easily by pretending to be sane before he stabs Kleinman and exits. Kleinman is discovered dying by the mob, who bicker over his body until he expires. After Kleinman's death, another man arrives and tells the mob that the killer has been spotted in a different location, and they all run off again.
Characters
Kleinman
Hank
Al
Sam
Hacker
John
Victor
Anna
Doctor
Gina
Man
Policeman
Bill
Frank
Don
Hans Spiro
Assistant
Henry
Abe
Maniac
References
1975 plays
Plays by Woody Allen
Comedy plays
American plays adapted into films
Plays based on other plays |
28264175 | https://en.wikipedia.org/wiki/Software%20Freedom%20Conservancy | Software Freedom Conservancy | Software Freedom Conservancy is an organization that provides a non-profit home and infrastructure support, including legal services, for free/open source software projects. The organization was established in 2006, with the help of the Software Freedom Law Center (SFLC). As of June 2018, the organization had over 40 member projects.
History
Software Freedom Conservancy (SFC) was established in 2006, with the backing of the Software Freedom Law Center.
In 2007 Conservancy started coordinating GNU General Public License compliance and enforcement actions, primarily for the BusyBox project (see BusyBox GPL lawsuits).
In October 2010, Conservancy hired its first Executive Director, Bradley M. Kuhn and a year later, its first General Counsel, Tony Sebro.
In May 2012, Conservancy took on GPL compliance and enforcement for several other member projects, as well as for a number of individual Linux kernel developers. In March 2014, Conservancy appointed Karen Sandler as its Executive Director, with Bradley M. Kuhn taking on the role as Distinguished Technologist.
In February 2015, the Outreachy program (formerly the Free and Open Source Software Program for Women) announced that it was moving from The GNOME Project to become part of Conservancy.
As of July 2015, Conservancy had 30 member projects, including QEMU, Boost, BusyBox, Git, Inkscape, Samba, Sugar Labs and Wine.
In May 2016, Yorba Foundation assigned the copyrights of the projects it has developed to Software Freedom Conservancy. This includes copyrights for Shotwell, Geary, Valencia, gexiv2. The calendar application California is absent of the bundle because of an oversight on Yorba's part.
In November 2017, the SFC reported that the Software Freedom Law Center had demanded the invalidation of the SFC's trademark.
Member projects
Current projects
The following projects are members of Software Freedom Conservancy:
ArgoUML
Backdrop CMS
Bongo
Boost
Bro Network Security Monitor
Buildbot
BusyBox
Clojars
Common Workflow Language open standards project
Coreboot
Darcs
Debian Copyright Aggregation Project
Etherpad
Evergreen
Geary
Gevent
Git
Godot
GPL Compliance Project for Linux Developers
Harvey OS
Homebrew
Inkscape
K-3D
Kallithea
Kohana
Libbraille
LibreHealth
Linux XIA
Mercurial
Metalink
North Bay Python
OpenChange
OpenTripPlanner
OpenWrt
Outreachy
phpMyAdmin
PyPy
QEMU
Reproducible Builds
Samba
Selenium
Spec-Ops
Squeak
Sugar Labs
SurveyOS
SWIG
Teaching Open Source
Twisted
uClibc
Wine
Former projects
These projects have since been removed from the Software Freedom Conservancy's current project list since 2016:
Foresight Linux
gexiv2
Shotwell
Valencia
Directors
Conservancy's directors are:
Jeremy Allison
Kate Chapman
Dr. Laura Fortunato
Mark Galassi (Vice President and Chair)
Bradley M. Kuhn (President)
Mike Linksvayer
Martin Michlmayr (Treasurer)
Tony Sebro
The Board Secretary is Karen Sandler.
Past directors include:
Tom Tromey
Ian Lance Taylor
Peter T. Brown (Treasurer)
Stormy Peters
Litigation
In July 2010, Conservancy announced it had prevailed in court against Westinghouse Digital, receiving an injunction as part of a default judgement.
In March 2015, Conservancy announced it was funding litigation by Christoph Hellwig against VMware for violation of his copyrights in its ESXi product. The case will be heard in the district court of Hamburg, Germany. VMware stated that it believed the case was without merit and expressed disappointment that Conservancy had resorted to litigation.
Trademarks
Software Freedom Conservancy has held the registered word trademark for Git, under US500000085961336, since February 3, 2015.
See also
Apache Software Foundation (ASF)
Free Software Foundation (FSF)
Open Source Initiative (OSI)
Software Freedom Law Center (SFLC)
Software in the Public Interest (SPI)
References
External links
501(c)(3) organizations
Free and open-source software organizations
Non-profit organizations based in New York City
Organizations established in 2006
Fiscal sponsorship organizations |
50660269 | https://en.wikipedia.org/wiki/2015%E2%80%932016%20SWIFT%20banking%20hack | 2015–2016 SWIFT banking hack | In 2015 and 2016, a series of cyberattacks using the SWIFT banking network were reported, resulting in the successful theft of millions of dollars. The attacks were perpetrated by a hacker group known as APT 38 whose tactics, techniques and procedure overlap with the infamous Lazarus Group who are believed to be behind the Sony attacks. Experts agree that APT 38 was formed following the March 2013 sanctions and the first known operations connected to this group occurred in February 2014. If the attribution to North Korea is accurate, it would be the first known incident of a state actor using cyberattacks to steal funds.
The attacks exploited vulnerabilities in the systems of member banks, allowing the attackers to gain control of the banks' legitimate SWIFT credentials. The thieves then used those credentials to send SWIFT funds transfer requests to other banks, which, trusting the messages to be legitimate, then sent the funds to accounts controlled by the attackers.
First reports
The first public reports of these attacks came from thefts from Bangladesh central bank and a bank in Vietnam.
A $101 million theft from the Bangladesh central bank via its account at the New York Federal Reserve Bank was traced to hacker penetration of SWIFT's Alliance Access software, according to a New York Times report. It was not the first such attempt, the society acknowledged, and the security of the transfer system was undergoing new examination accordingly.
Soon after the reports of the theft from the Bangladesh central bank, a second, apparently related, attack was reported to have occurred on a commercial bank in Vietnam.
Both attacks involved malware written to both issue unauthorized SWIFT messages and to conceal that the messages had been sent. After the malware sent the SWIFT messages that stole the funds, it deleted the database record of the transfers then took further steps to prevent confirmation messages from revealing the theft. In the Bangladeshi case, the confirmation messages would have appeared on a paper report; the malware altered the paper reports when they were sent to the printer. In the second case, the bank used a PDF report; the malware altered the PDF viewer to hide the transfers.
Furthermore, news agency Reuters reported on 20 May 2016 that there had already been a similar case in Ecuador in early 2015 when Banco del Austro funds were transferred to bank accounts in Hong Kong. Neither Banco del Austro nor Wells Fargo, who were asked to conduct the transactions, initially reported the movements to SWIFT as suspicious; implications that the actions actually were a theft only emerged during a BDA lawsuit filed against Wells Fargo.
Expanded scope and suspicions of North Korea
After the initial two reports, two security firms reported that the attacks involved malware similar to that used in the 2014 Sony Pictures Entertainment hack and impacted as many as 12 banks in Southeast Asia. Both attacks are attributed to a hacker group nicknamed Lazarus Group by researchers. Symantec has linked the group with North Korea. If North Korea's involvement is true, it would be the first known incident of a state actor using cyberattacks to steal funds.
Ramifications
International relations
If the attack did originate in North Korea, the thefts would have profound implications for international relations. It would be the first known instance of a state actor using cyber attacks to steal funds.
The thefts may also have implications for the regime of international sanctions that aim to isolate North Korea's economy. The theft may represent a significant percentage of North Korea's current GDP.
SWIFT system
Trust in the SWIFT system has been an important element in international banking for decades. Banks consider SWIFT messages trustworthy, and can thus follow the transmitted instructions immediately. In addition, the thefts themselves can threaten the solvency of the member banks. "This is a big deal, and it gets to the heart of banking," said SWIFT's CEO, Gottfried Leibbrandt, who added, "Banks that are compromised like this can be put out of business."
Following the attacks, SWIFT announced a new regime of mandatory controls required of all banks using the system. SWIFT will inspect member banks for compliance, and inform regulators and other banks of noncompliance.
SWIFT officials have made repeated remarks that attacks on the system are expected to continue. In September 2016, SWIFT announced that three additional banks had been attacked. In two of the cases, the hackers succeeded in sending fraudulent SWIFT orders, but the receiving banks found them to be suspicious and discovered the fraud. According to SWIFT officials, in the third case, a patch to the SWIFT software allowed the attacked bank to detect the hackers before messages were sent.
See also
Illicit activities of North Korea
References
2015 crimes in the United States
2016 crimes in the United States
Cyberattacks on banking industry
Cyberwarfare in the United States
Society for Worldwide Interbank Financial Telecommunication
Hacking in the 2010s
Data breaches in the United States |
8385835 | https://en.wikipedia.org/wiki/Robert%20David%20Stevens | Robert David Stevens | Robert David Stevens (born 1965) is a professor of bio-health informatics. and Head of Department of Computer Science at The University of Manchester
Education
Stevens gained his Bachelor of Science degree in biochemistry from the University of Bristol in 1986, a Master of Science degree in bioinformatics in 1991 and a DPhil in Computer Science in 1996, both from the University of York.
Career and research
Stevens' current research interests are the construction of biological ontologies, such as the Gene Ontology, and the reconciliation of semantic heterogeneity in bioinformatics. This research has been funded by the Engineering and Physical Sciences Research Council (EPSRC), the Biotechnology and Biological Sciences Research Council (BBSRC) and the European Union.
Stevens has been Principal investigator for a range of research projects including Ondex, ComparaGrid, SWAT (Semantic Web Authoring Tool) and the Ontogenesis Network.
Stevens served as Program Chair and co-organiser for the International Conference on Biomedical Ontology (ICBO) 2012 and co-founded the UK Ontology Network. He has also participated in the Health care and Life Sciences Interest Group (HCLSIG) of the World Wide Web Consortium (W3C). Stevens is currently on the editorial board of the Journal of Biomedical Semantics. Stevens started as a lecturer, then became a senior lecturer, Reader and became a Professor in August 2013.
Stevens has taught on several undergraduate and postgraduate courses on software engineering, databases, bioinformatics and runs introductory and advanced courses on the Web Ontology Language. He has been the main doctoral advisor to five successful PhD students and co-supervised several others.
Since July 2016 he has served as Head of Department of Computer Science at The University of Manchester.
References
Alumni of the University of York
Alumni of the University of Bristol
Academics of the University of Manchester
People associated with the Department of Computer Science, University of Manchester
Living people
1965 births |
2301817 | https://en.wikipedia.org/wiki/Kurt%20Wallander | Kurt Wallander | Kurt Wallander () is a fictional Swedish police inspector created by Swedish crime writer Henning Mankell (1948 – 2015). He is the protagonist of many mystery novels set in and around the town of Ystad, south-east of the city of Malmö, in the southern province of Scania. Wallander has been portrayed on screen by the actors Rolf Lassgård, Krister Henriksson, Sir Kenneth Branagh and Adam Pålsson.
Biography
As a young police officer, he was nearly killed when a drunk whom he was questioning stabbed him with a butcher's knife (this is mentioned in the account of his first case). Wallander was once married, but his wife Mona left him and he has since had a difficult relationship with his rebellious only child, Linda, who barely survived a suicide attempt when she was fifteen. He also had issues with his late father, an artist who painted the same landscape 7,000 times for a living; the elder Wallander strongly disapproved of his son's decision to join the police force and frequently derided him for it.
Wallander is a great fan of the opera; while in his car he regularly listens to recordings of famous opera singers such as Maria Callas, and when he can find the time goes to opera performances, sometimes crossing over to Copenhagen, Denmark for this purpose. At one time, Wallander had dreamed of making opera his life, leaving the police force and becoming the impresario of his friend, Sten Widén, a tenor who aspired to sing opera. But Widén's voice was not good enough and the dream came to naught—a crushing disappointment in Wallander's life (as in Widén's).
Inspector Wallander has few close friends and is known for his less-than-desirable lifestyle; he consumes too much alcohol and junk food, exercises very little, and sometimes struggles with anger. He frequently regards the crimes he investigates on a very personal level, throwing himself into catching criminals and going against the orders of his superiors to try to solve a case, often with negative effects on his emotional stability.
Over the years he has grown increasingly disillusioned with his work and often wonders whether he should have become a police officer at all. He was once falsely sued and harassed for police brutality and still lives with the guilt of having shot and killed a man in the fog, an act which drove him into depression and nearly led to his resignation. His relationships with his colleagues are tentative; they are alternately amazed by his intellect and frustrated by his brusque manner and aggressive tactics.
He is frequently at loose ends socially and with his family. After the breakup of his marriage, he had an affair with Annette Brolin, the prosecutor with whom he was working on some cases — but she was married and had children, and would not consider divorcing for his sake ("Faceless Killers"). In later years, he maintains a somewhat inconsistent romantic relationship with Baiba Liepa, a woman in Riga, Latvia, whom he met while investigating a murder there, until it eventually dissolves. Over the course of the series he is diagnosed with diabetes, and towards the end of his career he suffers from memory lapses, discovering he has developed Alzheimer's disease, with which his father was also afflicted.
Novels
The following Kurt Wallander novels have been translated into English. They are listed in the order that they were originally published in Sweden:
Mördare utan ansikte (1991; English translation by Steven T. Murray: Faceless Killers, 1997)
Hundarna i Riga (1992; English translation by Laurie Thompson: The Dogs of Riga, 2001)
Den vita lejoninnan (1993; English translation by Laurie Thompson: The White Lioness, 1998)
Mannen som log (1994; English translation by Laurie Thompson: The Man Who Smiled, 2005)
Villospår (1995; English translation by Steven T. Murray: Sidetracked, 1999)
Den femte kvinnan (1996; English translation by Steven T. Murray: The Fifth Woman, 2000)
Steget efter (1997; English translation by Ebba Segerberg: One Step Behind, 2002)
Brandvägg (1998; English translation by Ebba Segerberg: Firewall, 2002)
Pyramiden (1999; short stories; English translation by Ebba Segerberg with Laurie Thompson: The Pyramid, 2008)
Handen (2004; novella; originally published in Dutch (2004) as Het Graf (The Grave). Published in Swedish, 2013. English translation by Laurie Thompson: An Event in Autumn, 2014)
Den orolige mannen (2009; English translation by Laurie Thompson: The Troubled Man, 2011)
The following novel features Wallander's daughter Linda in the lead, while he is a secondary character:
Innan frosten (2002; English translation by Ebba Segerberg: Before the Frost, 2005)
It was intended as the first of a spinoff trilogy. However Mankell was so distraught after the suicide of Johanna Sällström, the actress playing the character at the time in the Swedish TV series, that he decided to abandon the series after only the first novel.
The order that the novels occur in the timeline of the series is shown below (with the title of the English translation shown in parentheses). Note that there is some overlap in the timeline among the novels as there are three separate series.
Pyramiden (The Pyramid)
Mördare utan ansikte (Faceless Killers)
Hundarna i Riga (The Dogs of Riga)
Den vita lejoninnan (The White Lioness)
Mannen som log (The Man Who Smiled)
Villospår (Sidetracked)
Den femte kvinnan (The Fifth Woman)
Steget efter (One Step Behind)
Brandvägg (Firewall)
Innan frosten (Before the Frost)
Handen (An Event in Autumn)
Den orolige mannen (The Troubled Man)
TV and film
Film series (Swedish)
Between 1994 and 2007, all nine Wallander novels published at the time were made into films in Sweden starring Rolf Lassgård as Wallander:
TV series (Swedish)
From 2005 to 2006, 13 new stories, starring Krister Henriksson as Kurt Wallander and Johanna Sällström as Linda Wallander, were produced. The first film, based on Before the Frost, was released in cinemas. The rest are original stories not based on any of Mankell's books, and were released on DVD, with the exception of Mastermind which was also released in cinemas.
Two of these films were directed by BAFTA award-winning Swedish director Jonas Grimås, who outside Sweden is best known for his work on British television such as the 1990s crime drama Second Sight (Kingdom of the Blind) starring Clive Owen, police drama series Heartbeat, and Hamish Macbeth.
Yellow Bird announced in March 2008 that 13 new Swedish-language Wallander films were to be made with Krister Henriksson. Production started in 2008. These new films were to have a more political slant than the previous films starring Henriksson. The first production in the second series was given a cinematic launch in Sweden on 9 January 2009 before being released on DVD. The theme over the closing credits is "Quiet Night", sung by Anna Ternheim. The remaining films were scheduled to be released on DVD during early 2010.
A third series consisting of six episodes was released in 2013. This is the last season with Krister Henriksson. In these final episodes, Kurt Wallander suffers from memory problems because of Alzheimer’s disease, and he cannot continue to work as a policeman.
Den orolige mannen
Försvunnen
Sveket
Saknaden
Mordbrännaren
Sorgfågeln.
TV series (British)
The novels have also been adapted as twelve television films for the BBC, produced by Yellow Bird and Left Bank Pictures. The series stars Kenneth Branagh as Wallander. The episodes have not been filmed in the order in which the original novels were published, resulting in changes to the backstories of the lead characters in the films. The first series consisted of the novels Sidetracked, Firewall and One Step Behind. These three were shot on location in Ystad in the summer of 2008, with a combined budget of £6 million ($12 million). They aired in late 2008 on the BBC.
A second series of Wallander adaptations was commissioned by the BBC from the same production team in 2008. Broadcast in January 2010, the second series was composed of adaptations of Faceless Killers, The Man Who Smiled, and The Fifth Woman.
The third series began shooting in Ystad and Riga, Latvia in the Summer of 2011 and continued into the winter. Broadcast in July 2012, it consists of adaptations of An Event in Autumn, The Dogs of Riga and Before the Frost. While the novel Before the Frost has Wallander's daughter Linda as its protagonist detective, the story was adapted for television so that Wallander himself became the lead.
The BBC Wallander series concluded in May 2016 with a three-episode fourth series consisting of an adaptation of The White Lioness and a two-episode adaptation of Mankell's final Wallander novel, The Troubled Man.
Special appearances
Mankell's friend, the writer Jan Guillou, used Kurt Wallander in the 10th book of his Carl Hamilton series. Guillou and Mankell also co-wrote the Swedish crime-drama miniseries Talismanen, in which Kurt Wallander again appears as a supporting character, this time portrayed by actor Lennart Jähkel.
Young Wallander
Starting in September 2020, the streaming service Netflix launched a new English-language series based on the character, entitled Young Wallander. The series depicts Wallander (Adam Pålsson) as a rookie detective in present-day Malmö, as opposed to the usual setting of Ystad. The show is not based on any of the novels, nor does it feature any of the familiar supporting characters from Mankell's works.
Young Wallander is a Swedish-UK co-production, with Pålsson the sole Swedish actor, amidst a mostly British cast. In November 2020, the series was renewed for a second season.
See also
Author Maj Sjöwall and Per Wahlöö's Swedish detective character Martin Beck
References
Further reading
External links
Branagh's Wallander - Website relating to the BBC's English-language Wallander starring Kenneth Branagh
Official Henning Mankell site
Comprehensive Henning Mankell fan site
Kurt Wallander's Ystad
In the Footsteps of Wallander PDF document from Ystad Tourist Office with map of places referred to in the novels.
Article in Variety concerning plans to bring Wallander to British television
Yellow Bird site
Interesting TV on Wallander - Analysis of all three major Wallander Film/TV adaptations and associations with each of the Mankell novels.
Fictional Scandinavian people
Characters in crime novel series
Fictional Swedish police detectives
Thriller film characters
Book series introduced in 1991
Literary characters introduced in 1991
Crime novel series
Detective novels
Mystery novels
Swedish crime novels
Swedish detective novels
Swedish mystery novels
Novels set in Sweden |
48428787 | https://en.wikipedia.org/wiki/Gackpoid | Gackpoid | , is a software product developed by Internet Co., Ltd. for the Vocaloid software. His voice is sampled from Japanese singer and actor Gackt. The mascot of the software is called , after Gackt's alias name. Gackpo is sometimes referred to as Gackpo Camui or Gakupo Kamui, and usually referred to as Kamui Gakupo.
Development
Gackpoid was developed by Internet Co., Ltd. using Yamaha Corporation's Vocaloid 2 synthesizer software as their first venture into the voice synthesizer industry. The initial version was released on July 31, 2008. The name "Gackpoid", meaning "Gackt-like Vocaloid", was chosen by Gackt himself during the voice recording process. Camui Gackpo, the software's mascot, was designed by manga artist Kentaro Miura (notable for manga Berserk) and chosen by Internet Co. from a pool of several competing designs.
His vocals were among the Vocaloid 2 male vocals used as a reference in the creation of VY2.
Additional software
V3 Gackpoid, an update to the original Gackpoid using the Vocaloid 3 synthesizer software, was released on July 13, 2012. It was released as a package with three different vocal tones: Native (the basic voice), Power, and Whisper.
On July 25, 2013, Internet Co., Ltd. announced that Mac OS X versions of all of their software, including Gackpoid, were in development using the Vocaloid Editor for Cubase NEO software for Macintosh. The OS X versions were rolled out as a free download for all registered users in October 2013, though Internet Co. advised users to choose only one version—Mac or Windows—due to only one license being available per user.
On December 2, 2014, Noboru Murakami, president of Internet Co., Ltd., stated that an update of Gackpoid from Vocaloid 3 to Vocaloid 4 was forthcoming, but declined to specify release dates or schedules. Gackpoid V4, an update to the voice library using the Vocaloid 4 synthesizer software, was released on April 30, 2015. It added growl samples to each of the three voice banks and contained general improvements to the V3 program.
Both the Vocaloid 3 version (once imported into the newer engine) and the Vocaloid 4 version of the software have access to the new Vocaloid 4 "Cross-Synthesis" system, though vocals can only be crossed with others from their respective engine versions.
Murakami had expressed hope of one day making a Gackpoid English vocal. He noted, however, that English vocals could only happen if they were profitable.
V3 Gackpoid and VY2v3 were also the focus of vol.4 of the Vocaloid-P data series.
Characteristics
Gackpo is designed to resemble a samurai clad in a jinbaori, a type of kimono which was used as a battle surcoat along with parts of traditional armor, while carrying a katana.
Contest
On June 12, 2009, Nico Nico Douga announced a "Gackpoid Contest", in partnership with Dwango and CELL, to encourage song creators to use Gackpoid for the creation of quality, original songs. Gackt had previously hinted at the contest on June 10 during a Nico Nico live broadcast challenge, where he announced that he would not appear in another broadcast until viewers created "spirited" music using the software in an upcoming competition. Competitors were asked to attach a "Gackpoid Contest" tag to their submitted videos in order to compete in the contest, which accepted submissions until the end of August. Finalist entries would be awarded 100,000 yen (US$1,220) each. Gackt promised viewers that he would choose one or more of the winning songs to cover for eventual recording and release.
On December 15, Nico Nico Douga aired a live broadcast announcing ten finalists for the contest, which would win "Excellence Prizes". The official winning entry was declared to be Episode.0, by mathru/Kanimiso-P, during the “NicoNico Daikaigi 2011 in Taiwan” event, where Gackt announced that Episode.0 and another finalist, natsu-P's Paranoid Doll, had been covered by him and would be released together as his thirty-ninth single. Kanimiso-P was awarded 300,000 yen (US$3,660) and the nine other Excellence Prize winners 100,000 yen. The single was released on July 13, 2011.
See also
Megpoid
List of Vocaloid products
References
External links
Official website
Vocaloids introduced in 2008
Fictional singers
Japanese idols
Japanese popular culture |
2695900 | https://en.wikipedia.org/wiki/PC-MOS/386 | PC-MOS/386 | PC-MOS/386 is a multi-user, multitasking computer operating system produced by The Software Link (TSL), announced at COMDEX in November 1986 for February 1987 release. PC-MOS/386, a successor to PC-MOS, can run many MS-DOS programs on the host machine or a terminal connected to it. Unlike MS-DOS, PC-MOS/386 is optimized for the Intel 80386 processor; however early versions will run on any x86 computer. PC-MOS/386 used to be proprietary, but it was released as open-source software in 2017.
History
The last commercial version produced was v5.01, compatible with MS-DOS 5. It required a memory management unit (MMU) to support memory protection, so was not compatible with 8086 and 8088 processors.
MMU support for 286 class machines was provided using a proprietary hardware shim inserted between the processor and its socket. 386 machines did not require any special hardware.
Multi-user operation suffered from the limitations of the day including the inability of the processor to schedule and partition running processes. Typically swapping from a foreground to a background process on the same terminal used the keyboard to generate an interrupt and then swap the processes. The cost of RAM (over US$500/Mb in 1987) and the slow and expensive hard disks of the day limited performance.
PC-MOS terminals could be x86 computers running terminal emulation software communicating at 9600 or 19200 baud, connected via serial cables. Speeds above this required specialized hardware boards which increased cost, but the speed was not a serious limitation for interacting with text-based programs.
PC-MOS also figured prominently in the lawsuit Arizona Retail Systems, Inc. v. The Software Link, Inc., where Arizona Retail Systems claimed The Software Link violated implied warranties on PC-MOS. The case is notable because The Software Link argued that it had disclaimed the implied warranties via a license agreement on the software's shrinkwrap licensing. The result of the case, which Arizona Retail Systems won, helped to establish US legal precedent regarding the enforceability of shrinkwrap licenses.
There was a year 2000 problem-like issue in this operating system, first manifesting on 1 August 2012 rather than 1 January 2000: files created on the system from this date on would no longer work.
On 21 July 2017 PCMOS/386 was relicensed under GPL v3 and its source code uploaded to GitHub, with the "year 2012" issue corrected.
Commands
Commands supported by PC-MOS Version 4 are:
ABORT
ADDDEV
ADDTASK
ALIAS
AUTOCD
BATECHO
BREAK
CALL
RETURN
CD
CLASS
CLS
COMMAND
COMPFILE
COPY
DATE
DEBUG
DIR
DIRMAP
DISKCOPY
DISKID
DOT
ECHO
ED
ENVSIZE
ERASE
EXCEPT
EXPORT
FILEMODE
FLUSH
FOR
FORMAT
GOTO
HDSETUP
HELP
IF
IMPORT
INSERT
KEY
KEYMAP
MD
MORE
MOS
MOSADM
MSORT
MSYS
NEXT
ONLY
PATH
PAUSE
PRINT
PROMPT
RD
REL
REM
REMDEV
REMTASK
RENAME
SEARCH
SET
SIGNOFF
SIGNON
SPOOL
STOP
SWITCH
TEXT
ENDTEXT
TIME
TYPE
VERIFY
WVER
See also
DoubleDOS
Multiuser DOS - Digital Research's unrelated multi-user operating system
VM/386 - unrelated multi-tasking DOS environment
Virtual DOS machine
Multiuser DOS Federation
FreeDOS
Timeline of operating systems
References
1987 software
Discontinued operating systems
Disk operating systems
DOS variants
Formerly proprietary software
Free software operating systems
Assembly language software
X86 operating systems |
93485 | https://en.wikipedia.org/wiki/UserLand%20Software | UserLand Software | UserLand Software is a US-based software company, founded in 1988, that sells web content management, as well as blogging software packages and services.
Company history
Dave Winer founded the company after leaving Symantec in the spring of 1988. Jean-Louis Gassée, who resigned as chief of Apple's product development in 1990, came to serve on UserLand's board of directors.
Frontier
UserLand's first product release of April 1989 was UserLand IPC, a developer tool for interprocess communication that was intended to evolve into a cross-platform RPC tool. In January 1992 UserLand released version 1.0 of Frontier, a scripting environment for the Macintosh which included an object database and a scripting language named UserTalk. At the time of its original release, Frontier was the only system-level scripting environment for the Macintosh, but Apple was working on its own scripting language, AppleScript, and started bundling it with the MacOS 7 system software. As a consequence, most Macintosh scripting work came to be done in the less powerful, but free, scripting language provided by Apple.
UserLand responded to AppleScript by re-positioning Frontier as a Web development environment, distributing the software free of charge with the "Aretha" release of May 1995. In late 1996, Frontier 4.1 had become "an integrated development environment that lends itself to the creation and maintenance of Web sites and management of Web pages sans much busywork," and by the time Frontier 4.2 was released in January 1997, the software was firmly established in the realms of website management and CGI scripting, allowing users to "taste the power of large-scale database publishing with free software."
Frontier's NewsPage suite came to play a pivotal role in the emergence of blogging through its adoption by Jorn Barger, Chris Gulker, and others in the 1997–98 period.
UserLand launched a Windows version of Frontier 5.0 in January 1998 and began charging for licenses again with the 5.1 release of June 1998.
Frontier subsequently became the kernel for two of UserLand's products, Manila and Radio UserLand, as well as Dave Winer's OPML Editor, all of which support the UserTalk scripting language.
UserLand eventually placed Frontier under the open source GNU General Public License with the 10.0a1 release of September 28, 2004. Frontier is now maintained by the Frontier Kernel Project.
Early Web building applications
UserLand developed two pioneering Web building applications, AutoWeb in early 1995 and Clay Basket later that year. Both applications went through a free public beta period, yet neither was ever released in a 1.0 version. In 1996 Clay Basket was abandoned in favor of improved Web publishing functionality built into Frontier.
Manila
Launched as part of Frontier 6.1 in November 1999, Manila is a content management system that allows the hosting of web sites and their editing through a browser. Within days of releasing Manila, UserLand set up a free Manila hosting service, EditThisPage.com, which quickly became a popular weblogging service.
Radio UserLand
Radio UserLand is a client-side weblog system that hosts blogs on UserLand's servers for an annual software license fee. The software includes an RSS aggregator and was one of the first applications to both send and receive audio files as RSS enclosures (see podcasting). UserLand was an early adopter of the RSS syndication method, merging Winer's Scripting News XML format with Netscape's RSS.
First released as a public beta under the name Pike in March 2000, the software came to be released in sync with Manila version numbering: the initial release of 2001 was named Radio UserLand 7.0, and its only major upgrade, released in 2002, was Radio UserLand 8.0. The software is no longer considered to be under active development.
XML-based protocols and formats
UserLand counts among the earliest adopters of XML, with first experiments made in late 1997. The company was involved in the development, specification and implementation of several XML formats and was noted for its commitment to openness.
XML-RPC
Created in 1998 by UserLand Software and Microsoft, XML-RPC is a remote procedure call protocol that uses XML to encode its calls and HTTP as a transport mechanism.
UserLand first included a stable XML-RPC framework with its 5.1.3 release of Frontier in August 1998 and subsequently made extensive use of XML-RPC in its Frontier-based products, Manila and Radio UserLand. XML-RPC is also used in the MetaWeblog API.
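As a rough illustration of what an XML-RPC call looks like from a client's point of view, the following minimal sketch uses Python's standard xmlrpc.client module; the endpoint URL and method name are invented for this example and do not refer to an actual UserLand service.

import xmlrpc.client

# Connect to a hypothetical XML-RPC endpoint.
server = xmlrpc.client.ServerProxy("http://example.com/RPC2")

# The call below is serialized as an XML document and sent via HTTP POST;
# the XML response is decoded back into native Python values.
state_name = server.examples.getStateName(41)
print(state_name)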
SOAP
SOAP evolved from XML-RPC and was designed as an object-access protocol by Dave Winer, Don Box, Bob Atkinson, and Mohsen Al-Ghosein in 1998, with backing from Microsoft, where Atkinson and Al-Ghosein worked at the time.
SOAP 1.1 was submitted to the W3C by Microsoft, IBM, and UserLand, amongst others, on May 9, 2000. Version 1.2 of the proposed standard became a W3C recommendation on June 24, 2003.
RSS
RSS (Really Simple Syndication) is a family of Web feed formats used to publish frequently updated works—such as blog entries, news headlines, audio, and video—in a standardized format. An RSS document (which is called a "feed", "web feed", or "channel") includes full or summarized text, plus metadata such as publishing dates and authorship.
Between 1999 and 2003, UserLand contributed various versions of the RSS specification. For an overview of the process see the History of web syndication technology.
Using RSS, UserLand also ran one of the first Web aggregators, My.UserLand.Com, which allowed users to follow numerous weblogs from a single web page.
UserLand's RSS advocacy led it to develop RSS feeds for The New York Times Company. The original feeds used a variation on standard RSS, and the feeds were only publicized to UserLand Radio bloggers.
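For illustration, the sketch below shows a minimal, invented RSS 2.0 feed being read with Python's standard xml.etree.ElementTree module; the feed contents are hypothetical and only meant to show the channel/item structure described above.

import xml.etree.ElementTree as ET

# A minimal, invented RSS 2.0 feed: one channel containing a single item.
feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <link>http://example.com/</link>
    <description>An invented feed used for illustration.</description>
    <item>
      <title>First post</title>
      <link>http://example.com/first-post</link>
      <pubDate>Sat, 07 Sep 2002 00:00:01 GMT</pubDate>
      <description>Summary text of the entry.</description>
    </item>
  </channel>
</rss>"""

# Print each entry's title and link, as a feed aggregator would.
root = ET.fromstring(feed)
for item in root.iter("item"):
    print(item.findtext("title"), "-", item.findtext("link"))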
OPML
Outline Processor Markup Language (OPML) is an XML format for outlines. Originally developed in 2000 as a native file format for Radio UserLand's outliner application, it has since been adopted for other uses, the most common being to exchange lists of web feeds between web feed aggregators.
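A minimal sketch of such an exchange is shown below, using an invented OPML subscription list read with Python's standard xml.etree.ElementTree module; the feed names and URLs are placeholders, not real subscriptions.

import xml.etree.ElementTree as ET

# A minimal, invented OPML subscription list: one outline element per feed.
subscriptions = """<?xml version="1.0"?>
<opml version="2.0">
  <head><title>Subscriptions</title></head>
  <body>
    <outline text="Example Weblog" type="rss" xmlUrl="http://example.com/rss.xml"/>
    <outline text="Another Feed" type="rss" xmlUrl="http://example.org/feed.xml"/>
  </body>
</opml>"""

# Print the feed addresses, as an aggregator importing the list would.
root = ET.fromstring(subscriptions)
for outline in root.iter("outline"):
    print(outline.get("text"), "->", outline.get("xmlUrl"))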
References
External links
Official website
Frontier kernel open source project
Software companies established in 1988
Software companies based in California
Software companies of the United States |
68951236 | https://en.wikipedia.org/wiki/Home%20of%20Peace%20Cemetery%20%28Colma%2C%20California%29 | Home of Peace Cemetery (Colma, California) | Home of Peace Cemetery, also known as Navai Shalome, is a Jewish cemetery established in 1889, and is located at 1299 El Camino Real in Colma, California. The cemetery contains the Emanu-El Mausoleum, owned by and serving the Congregation Emanu-El of San Francisco. It is one of four Jewish cemeteries near the city of San Francisco and it shares an adjacent space next to the Hills of Eternity Memorial Park (also a Jewish cemetery, and also founded in 1889).
History
Emanu-El Hart (or the "Old Jewish Cemetery") was built in 1847 at Gough Street and Vallejo Street in San Francisco; by 1860 the remains were relocated to an area that is now Mission Dolores Park, which served as a cemetery for the Congregation Emanu-El and the Congregation Sherith Israel. When the city of San Francisco started to see dramatic growth in population, it was decided to move the cemetery outside of the city to Colma, and Home of Peace Cemetery and Hills of Eternity Memorial Park were established, with each cemetery serving a different congregation.
Notable burials
Aaron Fleishhacker (1820–1898), Kingdom of Bavaria-born American businessman; founded paper box manufacturer, A. Fleishhacker & Co.
Herbert Fleishhacker (1872–1957), businessman, civic leader and philanthropist.
Abraham Haas (1847–1921), Kingdom of Bavaria-born American businessman, co-founder of the Hellman, Haas & Co.
Alfred Hertz (1872–1942), Prussian-born conductor.
Florence Prag Kahn (1866–1948), teacher, politician, and the first Jewish woman to serve in the United States Congress.
Julius Kahn (1861–1924), Grand Duchy of Baden-born American politician, United States Congressman.
Simon Koshland (1825–1896), Kingdom of Bavaria-born American businessman, and wool merchant.
Charles Lane (1905–2007), actor, appearing in many Frank Capra films.
Philip N. Lilienthal (1849–1908), banker and philanthropist; initially interred at the family vault at Home of Peace Cemetery and later moved to Salem Fields Cemetery, in Brooklyn, New York.
Joseph Owades (1919–2005), biochemist and brewer of light and industrially produced beer.
Ignatz Steinhart (1840–1917), banker, entrepreneur, philanthropist; namesake of the former Steinhart Aquarium in San Francisco.
Levi Strauss (1829–1902), German Confederation-born American businessman; founder of Levi Strauss & Co. and the first blue jeans.
Adolph Sutro (1830–1898), Prussian-born American engineer, politician and philanthropist; served as the 24th mayor of San Francisco from 1895 until 1897.
Walter Wanger (1894–1968), film producer.
James David Zellerbach (1892–1963), businessman, United States diplomat and ambassador.
See also
List of cemeteries in California
Bereavement in Judaism
References
Cemeteries in San Mateo County, California
History of San Mateo County, California
1889 establishments in California
Jewish cemeteries in California
Protected areas of San Mateo County, California |
42570568 | https://en.wikipedia.org/wiki/Aplos%20Software | Aplos Software | Aplos Software is a privately held company that specializes in software as a service for nonprofit organizations. Their primary focus is simple software to manage the essential nonprofit tasks of fund accounting, nonprofit tax preparation and donor management for small, mid-sized, and large non-profit organizations.
History
Aplos Software was founded in 2009 in Fresno, California by Dan Kelly and Tim Goetz, a certified public accountant. Goetz previously served as an executive pastor of a church and helped found two nonprofits. He could not find the low-cost fund accounting solution he wanted for his nonprofits, so he joined with a Fresno-based investor who shared his vision of serving the nonprofit sector with simple, affordable software and founded Aplos Software.
After initially developing a desktop fund accounting software program, in 2011 Aplos Software launched its fund accounting software, Aplos Accounting, as an online product, also known as software as a service, tailored specifically to small and mid-sized non-profit organizations and religious corporations.
The company raised over $3.4 million in funding through an angel investor to expand the development of its web-based nonprofit software suite, $2 million of which was raised in 2014.
Aplos launched an integration with Church Community Builder, a church management platform in January 2016.
In February 2016, Aplos raised $4 million in additional funding through a private venture capital fund.
In June 2017, Aplos announced a merger with Portalbuzz, a membership management platform that specializes in software and website portals for service clubs.
Aplos and Gusto, a payroll and HR platform, launched an integrated solution that allows payroll and reimbursements to be tracked within the accounting of nonprofits and churches.
Key Areas of Development
Aplos Software focused its software on simplifying the primary back office tasks required to manage a nonprofit, private foundation, foundation (nonprofit), charitable organization or church.
Fund Accounting
Nonprofit fund accounting differs from business accounting because it is often necessary to track restricted and unrestricted funds separately. This often occurs when a donor or grant specifies that the organization must use the funds for a specific purpose. The accounting software must track how these funds were used and how much is available. Aplos first launched its online fund accounting software, Aplos Accounting, in 2011. The most popular competing accounting product for small and mid-sized nonprofits is QuickBooks.
In October 2012, Aplos Software launched Aplos Oversight, an online software which provides an administrator or accountant real-time access to the accounting of multiple organizations.
Form 990 Preparation
In summer of 2012, Aplos Software was approved as an IRS efile provider to submit IRS tax forms on behalf of tax-exempt organizations and in October 2012 launched Aplos e-File, a tax preparation and filing software for IRS Form 990-N. IRS Form 990-N is an annual electronic IRS filing for tax-exempt organizations with less than $50,000 in annual gross receipts. In 2013, Aplos added tax preparation and e-file software for IRS Form 990-EZ and its required schedules to Aplos e-File. IRS Form 990-EZ is the short form of the full Form 990 IRS tax forms and is available to organizations with up to $200,000 in gross receipts and $500,000 in assets.
Donor Management
Aplos Software also focused on expanding the fundraising and donation tracking aspects of its software since 2012 to make it more successful for the vertical market of nonprofits. It launched a contributions management module in 2012 that tracked donations within Aplos Accounting and created contribution statements, which are annual giving receipts required by the IRS. In July 2013 it began offering a donor management module, and in May 2017 it expanded on its module to launch a stand-alone product as Aplos Donor Management.
According to the Chronicle of Philanthropy, nonprofits are increasingly creating initiatives to accept donations online and growth in online giving outpaced traditional methods in 2012. To keep pace with the growing popularity and requests by its customers for an online giving platform, Aplos Software expanded its fundraising functionality in January 2014 by partnering with WePay, an online payment portal, to add the ability for nonprofits to accept online donations.
See also
Comparison of accounting software
Fund accounting
Non-profit organization
Alternative giving
Software as a service
IRS Form 990
References
External links
www.aplos.com
Non-profit technology
Cloud applications |
11356402 | https://en.wikipedia.org/wiki/CNGrid | CNGrid | CNGrid (Chinese: 中国国家网格) is the Chinese national high performance computing network supported by 863 Program.
Research and development
China National Grid (CNGrid) is a major project supported by the Hi-Tech Research and Development (863) Program of China. CNGrid is the new generation test-bed of information infrastructure aggregating high-performance computing and transaction processing capabilities.
Through resource sharing, coordinated work, and service mechanisms, CNGrid supports applications in areas such as scientific research, resources and the environment, advanced manufacturing, and information services. Through technological innovation, CNGrid promotes the construction of the national information industry and the development of related industries.
China National Grid Software, named CNGrid GOS, is a suite of grid software with independent intellectual property, developed by the CNGrid software R&D project team. It mainly includes system software, a CA certificate management system, a testing environment, three business-version subsystems (a high-performance computing gateway, a data grid, and a grid workflow), and a monitoring system.
This project is undertaken by seven organizations including Institute of Computing Technology of Chinese Academy of Sciences, Jiangnan Institute of Computing Technology, Tsinghua University, National University of Defense Technology, Beihang University, Computer Network Information Center of Chinese Academy of Sciences, and Shanghai Supercomputing Center.
CNGrid GOS system software
CNGrid GOS system software (VegaGOS) provides functionality including global naming management, VO (virtual organization) management, user management, resource management, and application runtime management. VegaGOS introduces a number of innovations in global naming management, distributed resource management, virtual organizations (agora), grid process (grip) technology, and grid security, and it supports a variety of domain applications.
(1) Naming. Naming is a decentralized, name-stable management system for global objects (Gnodes). It supports locating objects by a globally unique identifier with low latency and a high success ratio, and it supports attribute-match searching with low latency and a high recall ratio. Naming is a fundamental component used to construct the whole VegaGOS system: as a reusable component, it forms a global layer of virtual names that decouples applications from unstable physical addresses.
(2) Resource management. Resources in VegaGOS take various forms and are accessed in different ways, which makes heterogeneous resources difficult to describe and manage. The resource controller mechanism (RController) was introduced to import and manage heterogeneous resources in a unified way; it provides functions for resources such as create, destroy, access control, access, and reading and writing properties.
(3) VO management. The virtual organization in VegaGOS, called Agora, manages distributed resources, users, and access control policies, and provides single sign-on and a single system image. As a commonly trusted third-party super-organization, Agora achieves unified cross-domain access control while preserving the autonomy of its members.
(4) Grid application runtime management. Grid applications need to maintain user identities during runtime in order to enforce access control. In VegaGOS, grid process technology, abbreviated as grip, not only maintains user identities and other application runtime context, but also manages the resources occupied by an application and supports collaboration among applications.
(5) Application-level tools. VegaGOS provides a range of application-level tools that bring the traditional command-line mode of high-performance computing into the grid environment, including Portal, GShell, VegaSSH, and GOSClient. Portal provides a friendly Web-based interface for using VegaGOS. GShell is a grid shell, similar to a GNU bash environment, that runs applications within a grip; VegaSSH provides single sign-on to any grid node for use of the back-end high-performance computing resources; and GOSClient is a set of client tools, including GShell, that can be installed independently to use the VegaGOS system.
See also
EUChinaGRID
External links
homepage
WebHPC Production of CNGrid GOS
Science and technology in China
Internet in China |
16523806 | https://en.wikipedia.org/wiki/Od%20%28Unix%29 | Od (Unix) | od is a command on various operating systems for displaying ("dumping") data in various human-readable output formats. The name is an acronym for "octal dump" since it defaults to printing in the octal data format.
Overview
The od program can display output in a variety of formats, including octal, hexadecimal, decimal, and ASCII. It is useful for visualizing data that is not in a human-readable format, like the executable code of a program, or where the primary form is ambiguous (e.g. some Latin, Greek and Cyrillic characters looking similar).
od is one of the earliest Unix programs, having appeared in Version 1 of AT&T Unix. It is also specified in the POSIX standards. The implementation of od used on Linux systems is usually provided by the GNU Core Utilities.
Since it predates the Bourne shell, its existence causes an inconsistency in the shell's do loop syntax. Other loops and logical blocks are opened by a keyword and closed by that keyword reversed, e.g. if ... fi and case ... esac, but because od was already the name of an existing command, the do block is instead closed by done.
The command is available as a separate package for Microsoft Windows as part of the UnxUtils collection of native Win32 ports of common GNU Unix-like utilities. The command has also been ported to the IBM i operating system.
Example session
Normally a dump of an executable file is very long. The head program prints out the first few lines of the output. Here is an example of a dump of the "Hello world" program, piped through head.
% od hello | head
0000000 042577 043114 000401 000001 000000 000000 000000 000000
0000020 000002 000003 000001 000000 101400 004004 000064 000000
0000040 003610 000000 000000 000000 000064 000040 000006 000050
0000060 000033 000030 000006 000000 000064 000000 100064 004004
0000100 100064 004004 000300 000000 000300 000000 000005 000000
0000120 000004 000000 000003 000000 000364 000000 100364 004004
0000140 100364 004004 000023 000000 000023 000000 000004 000000
0000160 000001 000000 000001 000000 000000 000000 100000 004004
0000200 100000 004004 002121 000000 002121 000000 000005 000000
0000220 010000 000000 000001 000000 002124 000000 112124 004004
Here is an example of od used to diagnose the output of echo, where the user types "Hello" followed by a literal tab and ^C character:
% echo "Hello ^C" | od -cb
0000000 H e l l o \t 003 \n
110 145 154 154 157 011 003 012
0000010
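The same data can also be dumped with hexadecimal addresses and byte values. The following is a sketch of typical GNU od output for the -A x (hexadecimal addresses) and -t x1 (one-byte hexadecimal values) options; exact spacing may differ between implementations.

% echo hello | od -A x -t x1
000000 68 65 6c 6c 6f 0a
000006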
See also
Hex editor
Hex dump
References
External links
od - GNU Core Utilities manpage
Unix SUS2008 utilities |
43444434 | https://en.wikipedia.org/wiki/CYREN | CYREN | Cyren Inc. is a cloud-based, Internet security technology company providing security as a service (SECaaS) and threat intelligence services to businesses. Services include email security, web security, DNS security, cloud sandboxing, inbound/outbound anti-spam services, real-time phishing detection and blocking, ransomware protection, URL filtering, IP reputation for email, malware attack detection, anti-malware and IP intelligence, botnet attack prevention, and cloud threat lookup. Cyren also provides endpoint protection, including anti-malware for mobile, URL filtering for mobile, and inbound/outbound Internet of Things (IoT) gateway protection. Major corporate clients using Cyren's services include Microsoft, Google, Check Point, Dell, T-Mobile, and Intel.
Cyren currently employs approximately 220 people, with headquarters in McLean, Virginia (USA) and offices in Herzliya (Israel), Berlin (Germany), Bracknell (UK) and Reykjavík (Iceland). Its common stock is listed on the Nasdaq stock exchange under the ticker symbol CYRN. In January 2019, Cyren announced that it was voluntarily delisting from the Tel Aviv Stock Exchange.
Cyren is among the most well-funded cybersecurity firms in the Washington, DC metro area according to a 2018 study. In 2018 Cyren made news for its research into phishing trends, particularly the prevalence of Microsoft Office, Office365, and Outlook as the brands most targeted by phishing kits. In the same year, the firm provided primary support to the Icelandic police during their investigation of the largest cyberattack to hit the country.
The company estimates that its security cloud currently processes more than 25 billion security transactions generated by over 1.3 billion users in 180 countries to detect cyber threats as they emerge.
Company history
Commtouch was incorporated as a private company under the laws of the State of Israel on February 10, 1991 by Gideon Mantel, a former officer in a “special bomb-squad unit” for the Israel Defense Forces (IDF). Wired magazine observed that the early company culture at Commtouch encouraged “being a fighter” as their Israeli employees had completed several years of military service.
The Israeli venture capital company Gemini Israel Ventures, which at the time was supported by the "Yozma" government program (which doubled any private investment with government money; see Yitzhak Rabin), made an investment in Commtouch.
In 1997 Isabel Maxwell became President of Commtouch. According to Maxwell, she convinced Microsoft co-founder Bill Gates to make an investment in the business. In 1999 Microsoft co-founder Paul Allen also made an investment in Commtouch of $20-million.
Commtouch went public in 1999.
In September 2010, the company acquired the Command Antivirus division of Authentium, and in October 2012, completed the acquisition of the antivirus business FRISK Software International.
In November 2012, the company completed acquisition of Eleven GmbH, which enabled the company to accelerate delivery of private label cloud-based security services, specifically designed for OEM and service provider markets.
In January 2014, the company received shareholder approval to change its name from the original of “Commtouch” to Cyren Ltd.
Technology
Cyren's cloud-based security services are delivered via two platforms: Cyren Cloud Security (CCS) and Threat Intelligence Services (TIS). The CCS SaaS security platform is designed for enterprises and sold either directly or through channel partners. Services include Web Security, Email Security, DNS Security and Cloud Sandboxing. Cyren TIS offers cloud-based cyber threat detection application program interfaces (APIs), and software development kits (SDKs) to technology and security vendors, including Google and Microsoft. Cyren TIS services include Email Security, Web Security, Endpoint Security and Advanced Threat Protection. These platforms are powered by Cyren GlobalView™, Cyren's global security cloud that analyzes 25 billion security transactions each day to identify emerging threats in real time.
Recent announcements
In February 2019, Cyren announced the intent of Lior Samuelson, the Chief Executive Officer and Chairman of the Board, to step down as CEO. On April 24, 2019, it was announced Brett Jackson would be taking on the role of CEO.
In January 2019, Cyren announced that it would be voluntarily delisting from the Tel Aviv Stock Exchange (TASE) in an effort to simplify regulatory filings and concentrate fragmented trading volume onto the Nasdaq exchange. The company’s ordinary shares were delisted from trading on the TASE on April 10, 2019. Cyren will continue to maintain its headquarters in Herzliya, Israel and operate as an Israeli-registered company.
In July 2017, the private equity firm Warburg-Pincus announced it had acquired a 21.3% stake in Cyren for $US19.6 million.
Awards
In 2018 and 2019, Cyren Email Security received a first-place gold award in the email security category in the Cybersecurity Excellence Awards, and placed at the top in the anti-malware category in 2017.
See also
Web Application Security
Antivirus software
Antispam features
Endpoint Security
Mobile Security
Network Security
Internet Security
Content filtering
Cloud Security
Phishing
Botnets
Mobile Malware
References
Companies listed on the Nasdaq
Security companies of Israel
Technology companies established in 1990
Software companies of Israel
Companies based in Netanya
Companies listed on the Tel Aviv Stock Exchange |
18775928 | https://en.wikipedia.org/wiki/Ushahidi | Ushahidi | Ushahidi is an open source software application which utilises user-generated reports to collate and map data. It uses the concept of crowdsourcing, serving as an initial model for what has been termed "activist mapping" - the combination of social activism, citizen journalism and geographic information. Ushahidi allows local observers to submit reports using their mobile phones or the Internet, creating an archive of events with geographic and time-date information. The Ushahidi platform is often used for crisis response, human rights reporting, and election monitoring. Ushahidi (Swahili for "testimony", closely related to shahidi, which means "witness") was created in the aftermath of Kenya's disputed 2007 presidential election to collect eyewitness reports of violence sent by email and text message and place them on a Google Maps map.
The Ushahidi platform has been used by the United Nations Department of Field Services and Peacekeeping, in response to the Haiti earthquake in 2010, to monitor the Nigerian elections in 2011, by the Obama Campaign for America in 2012, by the Nepalese army to respond to the earthquake of 2015, and by local activist groups such as Humanitarian Tracker to monitor violence in the Syrian civil war and HarassMap to help women report sexual harassment.
Successful deployment of crisis mapping applications like Ushahidi benefits from careful attention to how the technology fits into the relevant cultural settings, and focusing on realistic goals.
Products
Ushahidi
Ushahidi v2 was built on the Kohana web framework, a fork of the CodeIgniter framework. It includes support for Nexmo wholesale SMS API and Clickatell SMS Gateway (Budgetsms.net SMS Gateway is planned). Furthermore, the official Ushahidi-hosted websites use the commercial service. Ushahidi provides the option of using OpenStreetMap maps in its user interface, but requires the Google Maps API for geocoding. Ushahidi is often set up using a local SMS gateway created by a local FrontlineSMS set-up.
Ushahidi v3 was released in September 2015. As an improvement to the v2 platform it is built as an API with a web client. It allows for custom survey creation, and the running of multiple surveys on a single deployment, amongst other feature improvements from v2 such as embeddable maps and surveys, analytics, private deployments, and management of roles and permissions. It is built on the Laravel PHP web framework. It is open source under the AGPL license.
Ushahidi v4 was released in 2018, and replaces Kohana with Lumen.
Releases and codenames
1.0 Mogadishu – 10 December 2009
1.2 Haiti – ~22 January 2010
2.0 Luanda – 22 November 2010
2.1 Tunis – 9 August 2011
2.2 Juba – 13 March 2012
2.3 Juba – 24 April 2012
3.0 - September 2015
4.0 - 1 Oct 2018
Crowdmap
Crowdmap is designed and built by the team behind Ushahidi, a platform that was originally built to crowdsource crisis information. As the platform evolved, so did its users. Crowdmap now allows users to set up their own deployments of Ushahidi without having to install it on a web server. Since its release in 2010, prominent deployments of Crowdmap have documented the global Occupy movement and the 2011 London anti-cuts protest. The original Crowdmap was a hosted version of the Ushahidi v2 open source software platform.
On 31 December 2010, the Ushahidi team announced a new version of Crowdmap, that differed from the Ushahidi v2 codebase: Checkins, a geosocial add-on to Crowdmap that allows users to create a white-label alternative to sites like Foursquare and Gowalla. Rather than filling out submission forms online, checkins allow Crowdmap users to expedite data entry to their deployment, focussing first on location and adding more detailed information later. Ushahidi describes the effort as "checkins with a purpose".
SwiftRiver (discontinued)
SwiftRiver was designed as a suite of intelligence and real-time data gathering products that complement Ushahidi's mapping and visualization products. Often referred to as the SwiftRiver Initiative the goal of the project was "to democratize access to the tools for making sense of information". The project attracted a lot of interest from newsrooms.
In December 2014, Ushahidi announced that it would stop development and support and reallocate the resources.
SwiftRiver was a free and Open-source platform that helped people make sense of a lot of information in a short amount of time. It was born out of the need to understand and act upon a wave of massive amounts of crisis data that tends to overwhelm in the first 24 hours of a disaster. There had been a great deal of interest in Swift for other industries such as newsrooms, political analysts and marketers as an open-source alternative to more expensive, proprietary intelligence software platforms. The SwiftRiver platform offered applications which combine natural language/artificial intelligence process, data-mining for SMS and Twitter, and verification algorithms for different sources of information.
Rollcall
Ushahidi built RollCall after a team member was involved in the terrorist attack at Westgate Mall in Nairobi in 2013. Rollcall is a quick way to check in with the people someone is responsible for during critical situations. Rollcall sends a one-click message with a binary question such as "Are you okay?" to a pre-prepared contact list, such as a list of colleagues, parents at a school, or an embassy's list of citizens in a country at that time, via text, email, mobile app, and Slack. The recipients respond with a "yes" or "no", allowing the organization responsible for them to quickly triage who is in danger.
History
Beginnings in Kenya
Ushahidi (Swahili for "testimony" or "witness") is a website created in the aftermath of Kenya's disputed 2007 presidential election (see 2007–2008 Kenyan crisis) that collected eyewitness reports of violence sent in by email and text-message and placed them on a Google map. It is also the name of the open source software developed for that site, which has since been improved, released freely, and used for a number of similar projects around the globe.
The Kenyan site was developed and run by several bloggers and software developers, all current or former residents of Kenya. They include Erik Hersman, Juliana Rotich, Ory Okolloh, and David Kobia. The site was initially proposed by Okolloh, developed cheaply, and put online within a few days. International media, government sources, NGOs, and Kenyan journalists and bloggers were used to verify eyewitness testimony. The site was later also used to facilitate donations from abroad.
An analysis by Harvard's Kennedy School of Government found that Ushahidi was better overall at reporting acts of violence as they began. The data collected by Ushahidi was superior to that reported by the mainstream media in Kenya at the time. The service was also better at reporting non-fatal violence as well as information coming in from rural areas.
On 23 December 2010, Ushahidi Co-founder and Executive Director Ory Okolloh announced that she was stepping down from her role to become Manager of Policy for Africa at Google.
Post-Kenya crisis uses
Soon after its initial use in Kenya, the Ushahidi software was used to create a similar site to track anti-immigrant violence in South Africa in May 2008. The software has since been used to map violence in eastern Congo, beginning in November 2008. Ushahidi was used in Kenya, Malawi, Uganda, and Zambia in June 2009 to track pharmacy stockouts in several Southeast African countries. It was also used to monitor elections in Mexico and India, among other projects, and by Al Jazeera to collect eyewitness reports during the 2008–09 Gaza War.
The post election violence in Kenya was the subject of a Harvard Humanitarian Institute study and mentioned in a Berkman Center report.
2010
Haiti
In 2010, due to the earthquake in Haiti, Patrick Meier launched a joint effort between Ushahidi, The Fletcher School of Law & Diplomacy at Tufts University, UN OCHA/Colombia and the International Network of Crisis Mappers (CM*Net) to start the Haiti implementation. A few hours later many humanitarian/tech workers joined this initiative. Nearly 40,000 independent reports were sent to the Ushahidi Haiti Project, of which nearly 4,000 distinct events were plotted. The project instance was an impressive proof of concept for the application of crisis mapping and crowdsourcing to large scale catastrophes and a novel approach to the rapidly evolving field of crisis informatics.
Chile
Only a month after the Haiti earthquake, the 2010 earthquake in Chile prompted Patrick Meier to launch Ushahidi-Chile within hours of the initial quake. The Chile site is co-managed with the School of International and Public Affairs, Columbia University in the United States, supported by Chilean Americans.
Louisiana, U.S.
On 20 April 2010 BP's offshore Deepwater Horizon oil rig exploded killing eleven workers and precipitating the largest accidental offshore oil spill in the history of the petroleum industry. On 3 May the Louisiana Bucket Brigade (LABB) publicly released the Oil Spill Crisis Map, the first application of the Ushahidi platform in a humanitarian response in the United States.
In the years since the BP oil spill, LABB continues to use the map (now the iWitness Pollution Map) as a repository of eyewitness reports and photos documenting the impacts of petrochemical pollution on human health and the environment. Reports to the map come from cities all over Louisiana, including Baton Rouge, St. Rose, and Chalmette. Since 2010 LABB has collected over 14,000 reports, making it the largest and longest-running deployment of an Ushahidi instance.
Washington, D.C.
In the wake of winter storms, the Washington Post and the web development company PICnet used the software to create a site mapping blocked roads and other information.
Italy
Elena Rapisardi, together with Giovanni Lotto, launched the first Italian crowdmap, Open Foreste Italiane, to list and map information for preventing and managing forest fires; the significance of this project was reported on the Ushahidi blog.
Though OpenForeste did not completely achieve its goals, it was significant for two reasons: (1) unlike previous instances, the platform was used in the absence of an active crisis or emergency to collect, map, share and spread information in order to manage future and potential emergencies, combining an awareness of the possibilities of Web 2.0 with a different approach to natural risk prevention; (2) it brought to Italy the knowledge and potential of Ushahidi, crowdmapping and the social use of crowdsourcing, which were then used in the following years in several instances, both private and public, especially by local Civil Protection structures building on this new approach to the Ushahidi platform (see here a non-complete crowdmap of Italian Crowdsourcing Projects).
Russia
Ushahidi was used in Russia to set up a "map of help" for voluntary workers needed after the 2010 Russian wildfires.
2011
Christchurch
Using Ushahidi, the Christchurch Recovery Map website was launched less than 24 hours after the February 2011 Christchurch earthquake in Christchurch, New Zealand. The site maps locations of services such as food, water, toilets, fuel, ATMs, and medical care. Information was gathered via Twitter using the #eqnz hashtag, SMS messages, and email. The site was founded by a group of web professionals, and maintained by volunteers.
Middle East
This software allowed pro-democracy demonstrators across the Middle East to organise and communicate what was happening around them in early 2011. On 2 March, the UN Office for the Coordination of Humanitarian Affairs (OCHA) requested that the Standby Volunteer Task Force be activated for Libya. The Task Force's Tech Team set up a password protected Ushahidi platform almost immediately and several days later launched a public version at OCHA's request. This allowed users to contribute relevant information about ground conditions as they occurred.
Italy
In July 2011, Giuseppe Calamita created the first crowdmap to monitor a WiMAX/LTE Internet service provider, in order to identify issues not caused by the ISP itself (jammers, etc.).
India
India Citizen Reports has been using Ushahidi since 2011 to collect and disseminate reports in various categories like civic problems, crimes and corruption. TelecomMap.com uses Ushahidi to map 3G network quality and Wi-Fi hotspots.
Australia
Australian Broadcasting Corporation used Ushahidi to map the Queensland floods in January.
United States
The MightyMoRiver Project used Ushahidi's hosted service Crowdmap to track the Missouri River floods of 2011.
Macedonia
Transparency Watch Project is using the Ushahidi platform to track reported cases of corruption in the Republic of Macedonia. PrijaviKorupcija is a joint project by Transparency International and the Center for International Relations allowing citizens to report cases of corruption by sending an SMS from their mobile phones via ONE, sending an email, using the web form, using the hashtag #korupcijaMK on Twitter, or by phone call.
Nigerian Elections
Ushahidi was used to monitor the Nigerian 2011 elections under the project Reclaim Naija. A published article in the Journal of Information Technology & Politics by Catie Snow Bailard & Steven Livingston showed that, "Controlling for a number of factors, we find that the number and nature of crowdmap reports generated by citizens is significantly correlated with increased voter turnout (by 8%) in the 2011 Nigerian presidential election as a result of providing officials with improved information about the functionality of local polling stations."
2012
Balkans (Bosnia and Herzegovina, Serbia, Montenegro, Macedonia)
Al Jazeera Balkans deployed Ushahidi crisis mapping platform on 5 February 2012 to track the snow/cold emergency in the Balkans.
2013
Kenya Elections
Ushahidi helped run the Uchaguzi partnership to monitor the 2013 Kenyan elections. The deployment gathered over 8,000 reports; a report, "Uchaguzi: A Qualitative and Quantitative Analysis of ICTS, Statebuilding, and Peacebuilding in Kenya", showed that 75% of reporters said their report was responded to.
2014
Kenya
Ushahidi announced Ping (now called Rollcall) in response to the attacks on Westgate Mall in Nairobi. The software was used to map out all the blood drive center locations in Nairobi and let users quickly identify places to donate, see which blood types were in demand, and identify whether equipment or volunteers were needed at any locations. Among the goals of this map was to help ensure that when the Kenyan population came out to donate blood, they would know which donation centers needed their blood type the most.
2015
Crowd-sourced data were used extensively during the relief and rescue operations following the April 2015 Nepal earthquake via Quakemap.org. Kathmandu Living Labs (KLL), a volunteer organization, set up a platform using Ushahidi to collect and manage data from the crowd. KLL also conducted a first level of verification, providing the Nepalese Army with the following details: location, number of affected people, number of dead/injured, support requirements, level of urgency, and contact information.
The Nepalese army report said that "Crowd Sourcing is one of the common approaches to collect information from the public. Although it is not new in the context of disaster management, but during April Earthquake in Nepal, this approach was appropriately used in a matured way. The systematic process defined by Kathmandu Living Labs volunteers marked a path in utilization of crowd sourced data by implementing agency like Nepalese Army."
2016
Ushahidi created USAelectionmonitor.com to monitor the USA 2016 Presidential Election. After the election Ushahidi set up Documenthate.org to monitor the spike in hate crimes against minorities in the USA post election. Ushahidi partnered with journalist and activist Shaun King and non-profit journalism group Propublica. The New Yorker covered the story, saying: "Now Shaun King, a writer for the Daily News, is working with the open-source software company Ushahidi to create a map of post-election intimidation. “Thousands of people have emailed me incident reports over the past seven days,” King wrote me in an e-mail. “The team at Ushahidi is helping me go through them, verify them the best we can, catalogue and then map them, then share them.” The aim is to raise awareness of politically motivated violence and help people stay safe, report it to authorities as needed, and create a database of such incidents."
2017
Ushahidi ran the Uchaguzi partnership to monitor the 2017 Kenyan elections, the fourth country-wide election monitoring effort for the organization. Ushahidi integrated a Facebook Messenger bot to allow the 7 million Facebook users in Kenya to report via Messenger. The platform received over 7000 reports on election day.
As of 2017, the platform reported that it had been deployed over 125,000 times in over 160 countries, although most of these were for evaluation, training or curiosity.
Awards
Ushahidi has received several awards in recognition of its effectiveness and creativity, the latest being the MacArthur Award. The awards received by Ushahidi so far include the following:
The MacArthur Award – 2013
Global Adaptation Index Prize – 2012
Funding of US$1.4 million from the Omidyar Network
Criticism
Sexual harassment allegations
On 9 July 2017, allegations surfaced online of Ushahidi covering up a sexual harassment incident perpetrated by one of the members of their senior leadership. Ushahidi released a statement the next day confirming they were aware of the allegations which were the subject of an ongoing internal inquiry.
Former Ushahidi board member and co-founder Ory Okolloh strongly condemned the laxity with which the board dealt with the issue on 11 July, saying that “more clarity on steps that have been taken so far and the relevant timelines should be shared and those found culpable either by their action or inaction should resign”.
Ushahidi released a second statement on 17 July 2017 detailing the chronology of events, which showed that the incident occurred on 19 January 2017 and was reported to the board on 4 May 2017, and that on 5 May the accused was placed on temporary leave, given due process, and an investigation was undertaken. After back and forth with both parties' lawyers, the board held an inquiry on 3 July, and the accused was alerted of his firing 14 days later on 17 July, the date of the statement.
Angela Kabari came out publicly on 20 July as the victim in a Medium post detailing a 6-month ordeal and called for the resignation of the entire Ushahidi board that consisted of David Kobia, Erik Hersman, Juliana Rotich and Jenny Stefanotti at the time. She identified Daudi Were, Ushahidi's Executive Director, as the accused. Her statement to the board when she reported the matter included an un-notarized transcript of the recording of the incident and claimed an earlier occurrence the previous year. In her post, she said she encountered 11 other victims some of whom were current employees of the organisation and that the board members were aware of Daudi's misconduct in separate incidents spanning 10 years, however, no others came forward. She castigated the board for lack of support, victim shaming, slander, delays and character attacks. Angela resigned on 28 June citing frustration due to “continual stalling and inaction from the board 55 days after my complaint.”
Daudi Were was alerted of his dismissal on 17 July, formally dismissed on 22 July, and Ushahidi released an apology to Angela for the harassment she experienced while working at Ushahidi.
Ushahidi staff and staff from Ushahidi related organizations, under the banner "Women in Tech Kenya", released a statement on 24 July that supported Angela's speaking out, condemned Daudi Were's conduct, applauded his dismissal, and supporting the board following due process.
The Ushahidi board did not resign and through a series of posts defended their conduct, saying that they followed due process in the pursuit of justice. In a statement released on 28 July, Ushahidi said that due to the legal process, agreed upon by all parties, all communications had to go through the legal representation and called Ms. Kabari's claims that they did not care about her disingenuous for not understanding the neutral position the board undertook in the sexual harassment investigation.
See also
Crisis mapping
Uchaguzi
Commons-based peer production
Cognitive Surplus
References
External links
Ushahidi
NetSquared: "Remixing the Web for Social Change"
TED (conference): TED
Institute for Interactive Journalism
United for Africa
Ushahidi Nexmo plugin How To
Internet-related activism
Internet-based activism
Electoral fraud
International political websites
Science and technology in Kenya
Kenyan political websites
Internet properties established in 2008
Swahili words and phrases
Google Maps
Crowdsourcing
Web mapping
OpenStreetMap
Mass media in Nairobi
Emergency management software |
6367813 | https://en.wikipedia.org/wiki/Bibus | Bibus | Bibus is reference management software designed for OpenOffice.org packages and Microsoft Word in particular, with the goal of creating an open source bibliographic software package that will allow easy formatting of the bibliographic index in OpenOffice.org Writer and Microsoft Word. It is based on Python and wxWidgets, making it platform-independent in principle. It functions on all 32-bit versions of Microsoft Windows (95/98/NT/2000/XP), POSIX (Linux/BSD/UNIX-like OSes) and, to a limited extent, Mac OS X. Bibus is free software released under the GNU GPL v2+.
Features include searching and reference uploading from MEDLINE using eTBLAST or PubMed and user reference library independence, making document exchange between collaborative writers easier.
A command-line version is available on GitHub (https://github.com/linsujie/bibcmd), and a Python 3 version is available on SourceForge (https://sourceforge.net/projects/biblioassist/).
See also
Comparison of reference management software
External links
Innovation.swmed.edu
Wiki.servies.openoffice.org
Free software programmed in Python
Software that uses wxWidgets |
37558587 | https://en.wikipedia.org/wiki/High%20Speed%20LAN%20Instrument%20Protocol | High Speed LAN Instrument Protocol | HiSLIP (High-Speed LAN Instrument Protocol) is a TCP/IP-based protocol for remote instrument control of LAN-based test and measurement instruments. It was specified by the IVI Foundation and is intended to replace the older VXI-11 protocol. Like VXI-11, HiSLIP is normally used via a library that implements the VISA API.
Version 1.4 of the LAN eXtensions for Instrumentation (LXI) standard recommends HiSLIP as “LXI HiSLIP Extended Function for LXI based instrumentation”.
Benefits
HiSLIP fixes several problems with the VXI-11 protocol (which synchronously sends GPIB commands via SunRPC):
New asynchronous “overlap mode” to help applications fully utilize Ethernet performance
Support for both shared and exclusive instrument locking
Support for IPv6
Features
HiSLIP can operate in two different modes:
In “overlap mode”, input and output data are buffered between the client and server and a series of independent queries can be sent by a client without having to wait for each to complete before sending the next. The responses are sent back in the order in which the queries were sent. This asynchronous operation helps applications to fully utilize Ethernet performance.
There is also a slower “synchronized mode”, in which a client is required to read the result of each query before it can send another. It is intended for backwards compatibility with the capabilities of GPIB, VXI-11, and USB-TMC instruments.
HiSLIP clients (VISA libraries) have to support both modes. HiSLIP servers (instruments) need to support at least one of them, but can also support both.
A HiSLIP client contacts a server by opening two TCP connections, both to port 4880, and sends packetized messages on both:
The “synchronous channel” carries normal bi-directional ASCII command traffic (e.g., SCPI), and synchronous GPIB meta-messages (END, triggers, etc.).
The “asynchronous channel” carries GPIB-like meta-messages that need to be treated at higher priority and independent of the data path (e.g., device clear, service request).
Usage
To migrate from VXI-11 to HiSLIP, a user of a VISA library and instrument that support both merely has to change the VISA resource string used to address the instrument. The shortest possible version of a VXI-11 VISA resource string is "TCPIP::<IP address|hostname>::INSTR". To use the HiSLIP communication channel, such a VISA resource string needs to be changed to: "TCPIP::<IP address|hostname>::hislip0::INSTR". If the HiSLIP server is using a port other than the default of 4880, then it must be specified in the resource string as: "TCPIP::<IP address|hostname>::hislip0[,port#]::INSTR".
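As a rough illustration, the following sketch opens a HiSLIP session from Python using the PyVISA library (one implementation of the VISA API); the instrument address is a placeholder, and PyVISA is only one of several VISA libraries that support HiSLIP.

import pyvisa

# Open a HiSLIP session to a hypothetical instrument at 192.0.2.10;
# the default HiSLIP port (4880) is assumed, so no port is specified.
rm = pyvisa.ResourceManager()
inst = rm.open_resource("TCPIP::192.0.2.10::hislip0::INSTR")

# Ordinary SCPI command/response traffic travels over the synchronous channel.
print(inst.query("*IDN?"))
inst.close()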
References
External links
http://www.rohde-schwarz.de/appnote/1MA208 Fast Remote Instrument Control with HiSLIP - Application Note
Input/output
Electronic test equipment
Computer buses |
58434424 | https://en.wikipedia.org/wiki/HeySpace | HeySpace | HeySpace is a web-based task management application founded in 2018 by Time Solutions. The program is a mix of Slack and Trello, combining an online chat facet of the former with project management of the latter.
Time Solutions, the maker of HeySpace, is an IT company based in Wrocław, founded in 2009 by Kamil Rudnicki, then a 21-year-old student.[1] The main investors in Time Solutions are Asseco Poland and Venture Incubator, which financed Time Solutions in 2011.
History
HeySpace made its debut on May 3, 2018, when it was officially presented on Time Solutions' blog. Until July 2018 it was used only by the company itself; after that date, the app was made available to the rest of the world. The tool was inspired by the company's internal need to combine the communication function provided by Slack with the task management function offered by apps such as Trello. Mixing both features in one tool allowed for greater integration of projects within the company.
Features
HeySpace offers numerous features whose aim is to facilitate team collaboration and communication. The tool provides its users with a chat board that can be divided into private and public spaces; the same division applies to sticky-note boards. It is also possible to create task sheets directly from a conversation.
The user of HeySpace can also import their tasks from the currently used software (such as Asana, Trello, Jira, Todoist, or Wrike) via the available support.
Besides task management and collaboration, the tool enables percent-complete tracking, milestone tracking, status tracking, project planning, file sharing, workflow management, idea management, resource management and many more.
Reception
On August 21, 2018, after appearing on Product Hunt, HeySpace received the #4 Product of the Day award. The application was also granted "Great User Experience 2018" and "Rising Star 2018" awards by Finances Online, and the tool has received many positive reviews around the world.
See also
Computer and network surveillance
Project management software
Comparison of project management software
References
Project management
Project management software
Task management software
Communication software |
15988516 | https://en.wikipedia.org/wiki/Modeling%20Maturity%20Levels | Modeling Maturity Levels | Modeling Maturity Levels is a classification system defined by Anneke Kleppe and Jos Warmer in their book MDA Explained (Addison-Wesley). The levels characterize the role of modeling in a software project.
The concept shows resemblance to the way software processes are rated with the Capability Maturity Model.
There are six levels:
Level 0 No Specification: the specification of software is not written down. It is kept in the minds of the developers
Level 1 Textual Specification: the software is specified by a natural language text (be it English or Chinese or something else), written down in one or more documents
Level 2 Text with Models: a textual specification is enhanced with several models to show some of the main structures of the system
Level 3 Models with Text: the specification of software is written down in one or more models. In addition to these models, natural language text is used to explain details, the background, and the motivation of the models, but the core of the specifications lies in the models.
Level 4 Precise Models: the specification of the software is written down in one or more models. Natural language can still be used to explain the background and motivation of the models, but it takes on the same role as comments in source code.
Level 5 Models only: the models are precise and detailed enough to allow complete code generation. The code generators at this level have become as trustworthy as compilers, therefore no developer needs to even look at the generated code.
References
T. Mettler, "Thinking in terms of design decisions when developing maturity models", International Journal of Strategic Decision Sciences, 1(4), 2010, pp. 76–87.
T. Mettler, P. Rohner and R. Winter, "Towards a Classification of Maturity Models in Information Systems", in: A. D'Atri, M. De Marco, A. M. Braccini and F. Cabiddu (Eds.), Management of the Interconnected World, Berlin, Heidelberg: Physica, 2010, pp. 333–340.
Anneke G. Kleppe, Jos B. Warmer and Wim Bast, MDA Explained: The Model Driven Architecture – Practice and Promise, Addison-Wesley Professional, April 2003.
External links
Getting Started with Modeling Maturity Levels
Unified Modeling Language
Maturity models |
1160713 | https://en.wikipedia.org/wiki/Telecomsoft | Telecomsoft | Telecomsoft was a British video game publisher and a division of British Telecom. The company was founded by Dr. Ederyn Williams in 1984 and operated three separate labels: Firebird, Rainbird, and Silverbird. The first employee was James Leavey, seconded from elsewhere in BT, who, along with Tony Rainbird, became the driving force behind the company in the early days.
History
Telecomsoft was founded in 1984, when computer games were the fastest-growing sector of the computer software market.
Despite a turnover of over £6 million in 1987/88, British Telecom sold the three labels to MicroProse in 1989 in a deal reported to be worth around £2,000,000 after a failed management buyout. MicroProse sold the Silverbird label soon after acquisition, but continued to use the Rainbird and Firebird labels for a short period.
Labels
Firebird
British Telecom brought in Tony Rainbird, owner of budget software publisher Micro-Gold, to help set up the first Telecomsoft label, Firebird.
Originally named Firefly Software, the label had to be renamed when it was discovered that the name had already been registered by another company.
The first titles to be published on the Firebird Silver label in November 1984 were The Wild Bunch for the ZX Spectrum, Booty for the Commodore 64 and Bird Strike for the BBC Micro.
Although there were doubts as to whether or not the market could afford to sustain a range of budget titles, the Firebird Silver releases were successful. In February 1985, Booty was the third best selling video game in the UK, behind only Ghostbusters and Daley Thompson's Decathlon.
While Firebird Silver would release budget titles priced at £2.50, Firebird Gold would release more prestigious titles at a higher price. Firebird Gold established itself just as well as its budget counterpart. The label became synonymous with many classic 8-bit titles such as Elite, Revs, and The Sentinel.
In October 1985, the budget range was relaunched as the lower priced Firebird Silver 199 Range and a full price label, Firebird Hot, was created to publish titles such as Costa Capers, the sequel to Technician Ted. A further label called Firebird Super Silver was a short-lived mid-price range which published titles such as Chimera and the Amstrad CPC version of Booty at £3.99.
Firebird's success allowed them to acquire a number of third party developers, see Telecomsoft acquisitions below, and they also established a deal with Ultimate Play the Game, whereby they would convert and publish a number of their successful ZX Spectrum games to the Commodore 64.
As the Rainbird label became the home of Telecomsoft's premium products, the Gold and Hot labels slowly merged into a single full price range which went on to publish Mike Singleton's Dark Sceptre and the home conversions of Bubble Bobble.
A final overhaul of the Firebird brand was conducted in early 1988 as the budget titles became rebranded as Silverbird.
Silverbird
Rather than attempt to juggle a potentially confusing budget label with the same branding as their full price software, Telecomsoft decided to rebrand their Silver 199 budget label as a single Silverbird range. Two price points were established for 8-bit software (£1.99 and £2.99) while a few budget 16-bit titles were priced at £9.99. These various price points were differentiated between by their own particular style of packaging.
Rather than simply republish their existing range of budget software, Silverbird published a range of titles that hadn't previously been released at a budget price point. This included many original new titles as well as older full-price titles acquired from other publishers.
Following MicroProse's acquisition of Telecomsoft, the US publisher sold off the Silverbird label to Tudor Enterprises, a British publisher. They published a compilation pack of old Silverbird titles and a small number of original titles before closing down their software publishing operations.
Rainbird
The Rainbird label was established in November 1985 by Tony Rainbird. For legal reasons, the label's original name, Bluebird, had to be changed, although it still retained Tony Rainbird's original idea of releasing all its games in striking blue packaging.
The 16-bit home computer market, largely represented by the Atari ST and Amiga, was just beginning to take off in 1986 and the Rainbird label was an ideal opportunity to capitalise on it. Rather than concentrate on the more simplistic arcade action games that had dominated the 8-bit era, Rainbird aimed to introduce cutting edge simulators, adventure games and utilities to the full-price market.
Rainbird formed partnerships with a number of developers who would produce their next range of games. Magnetic Scrolls and Argonaut Software were amongst the first developers to benefit from a publishing deal with the label. Realtime Games, a successful ZX Spectrum developer who specialised in fast 3D action games, converted Starglider to the ZX Spectrum and developed Carrier Command.
The company republished enhanced versions of adventure games by Level 9 Computing, beginning with their Middle-earth trilogy: Colossal Adventure (itself an enhanced conversion of Adventure by Will Crowther and Don Woods), Adventure Quest and Dungeon Adventure, these last two featuring the Demon Lord Agaliarept. Rainbird published this sequence as Jewels of Darkness and references to Middle-earth were expunged. Rainbird also published Level 9's Silicon Dreams trilogy: Snowball was followed by Return to Eden and The Worm in Paradise.
MicroProse continued to use the Rainbird label for a number of years, after its acquisition of Telecomsoft.
Acquisitions
Beyond Software
One of Telecomsoft's earliest acquisitions was Beyond Software. Originally set up by the EMAP publishing group in 1983, Beyond published numerous titles on the ZX Spectrum, Commodore 64 and Amstrad CPC, but met with very little success until the release of Mike Singleton's Lords of Midnight in 1984. The success of the Tolkien-esque strategy game allowed Beyond to establish a distribution deal with American developer First Star, as well as a publishing deal with developer Denton Designs.
After being acquired by Telecomsoft in late 1985 for a six-figure sum, Beyond continued to operate as a separate label, mostly releasing games that had already been in development for some time, as well as a number of conversions of existing titles. Telecomsoft did very little with the Beyond label beyond these releases. A number of other titles, such as Star Trek: The Rebel Universe, were released on the Firebird label.
Odin Computer Graphics
References
BT Group
MicroProse
Defunct video game companies of the United Kingdom
Video game development companies
Video game publishers |
20583874 | https://en.wikipedia.org/wiki/Juraj%20Hromkovi%C4%8D | Juraj Hromkovič | Juraj Hromkovič (born 1958) is a Slovak computer scientist and professor at ETH Zürich. He is the author of numerous monographs and scientific publications in the fields of algorithmics, computational complexity theory, and randomization.
Biography
Hromkovič was born in 1958 in Bratislava. He studied at Comenius University, where he received his Ph.D. in 1986 (Dr. rer. nat.), habilitated in 1989 (Theoretical Cybernetics and Mathematical Informatics), and worked as a lecturer from 1989 to 1990. From 1989 to 1994, he was a visiting professor in the group of Burkhard Monien at the University of Paderborn. In 1994, he received a professorship at the Institute of Informatics at the University of Kiel. From 1997 to 2003, he led the Chair of Computer Science 1 at RWTH Aachen. Since 2004, he has been professor of Information Technology and Education at ETH Zurich, the Swiss Federal Institute of Technology in Zurich.
Alongside active research in various fields of theoretical computer science (about 170 publications), the main focus of his work is education for computer science teachers and communicating the basics of computer science to non-professionals.
References
External links
Homepage of the chair of Information Technology and Education at ETH Zürich
Homepage of the Center for Informatics Education (ABZ) of ETH Zürich
Juraj Hromkovič at Mathematics Genealogy Project
Slovak computer scientists
Theoretical computer scientists
1958 births
Living people
Scientists from Bratislava
Comenius University alumni
ETH Zurich faculty |
4813336 | https://en.wikipedia.org/wiki/DotProject | DotProject | dotProject is a web-based, multi-user, multi-language project management application. It is free and open source software, and is maintained by an open community of volunteer programmers.
History
dotProject was originally developed by Will Ezell at dotmarketing, Inc. to be an open source replacement for Microsoft Project, using a very similar user interface but including project management functionality. Begun in 2000, the project was moved to SourceForge in October 2001, and, from version 2.1.8 onwards, is hosted on GitHub.
The project stalled in late 2002 when the original team moved to dotCMS. Subsequently, Andrew Eddie and Adam Donnison, two of the more active developers, were granted administration rights to the project. Andrew continued to work on the project until he moved on to Mambo and later Joomla. Adam remains an administrator.
In late 2007, the new dotProject team began a major redevelopment using the Zend Framework, with version 3 (dP3) the expected target release to be utilising it. A fork called web2project was initiated at the same time.
Since 2018, the dotProject core team has focused their efforts on keeping dotProject compatible with the latest versions of PHP and MySQL/MariaDB and updating its dependent packages; the overall look and feel remains largely the same as it was in the late 2000s.
Overview of the main features
dotProject is mostly a task-oriented project management system, predating contemporary tools that address methodologies such as Agile software development. Instead, it follows the "waterfall" model: tasks, run sequentially and/or in parallel, are assigned to different members of one or more teams, with dependencies established between tasks and milestones. It can display such relationships visually using Gantt charts.
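To illustrate how such dependencies constrain the order of work, the following is a minimal sketch in Python (dotProject itself is written in PHP, and the function below is hypothetical): tasks are sorted so that each one appears only after everything it depends on.

```python
from collections import deque

def schedule_order(tasks, depends_on):
    """Return tasks in an order compatible with their dependencies (topological sort).

    tasks      -- iterable of task names
    depends_on -- dict mapping a task to the list of tasks it depends on
    """
    remaining = {t: set(depends_on.get(t, ())) for t in tasks}
    ready = deque(t for t, deps in remaining.items() if not deps)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for t, deps in remaining.items():
            if task in deps:
                deps.remove(task)
                if not deps:
                    ready.append(t)
    if len(order) != len(remaining):
        raise ValueError("cyclic dependency between tasks")
    return order

# Example: design must finish before coding, coding before testing.
print(schedule_order(
    ["testing", "coding", "design"],
    {"coding": ["design"], "testing": ["coding"]},
))  # ['design', 'coding', 'testing']
```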
It is not specifically designed for software project management but can be used by most kinds of project-oriented service companies (such as design studios, architects, media producers, lawyer offices, and the like), all of which organise their work conceptually in similar ways. Unlike most contemporary software project management tools, dotProject cannot be easily integrated with the usual constellation of 'business tools'; instead, it is a complete, standalone application, not requiring anything else besides a platform that supports PHP (it is web server agnostic) and MySQL/MariaDB. Except for drawing Gantt graphics, it has a reasonably small footprint in terms of memory and disk space requirements.
In spite of its conceptual simplicity, dotProject can nevertheless be extended or integrated with other tools. It comes with a series of plugins, most of which are pre-activated; there is even a repository of independently maintained 'mods' (or plugins) available on SourceForge, which includes a risk management module (released in late 2020), among others.
While dotProject is self-contained in terms of user authentication and management, it can also integrate with an external LDAP server, as well as synchronise its users with a phpNuke installation. Further authentication methods can be developed separately but are currently not part of the core software.
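For illustration only, the general shape of an LDAP credential check looks like the sketch below, written with the Python ldap3 library; the host name, DN template and surrounding fallback logic are assumptions, and dotProject's real implementation is PHP code not shown here.

```python
from ldap3 import Server, Connection

def ldap_authenticate(username: str, password: str) -> bool:
    """Try to bind to the directory with the user's credentials."""
    # Hypothetical directory layout; adjust the DN template to the real schema.
    user_dn = f"uid={username},ou=people,dc=example,dc=com"
    server = Server("ldap://ldap.example.com")
    conn = Connection(server, user=user_dn, password=password)
    ok = conn.bind()          # a successful bind means the credentials are valid
    conn.unbind()
    return ok
```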
The core of dotProject focuses on Companies, which may have subunits known as Departments, which, in turn, have Users. Companies can be internal or external; thus, a project can be shared/viewed by customers, by giving them access via a special Role. Roles have a reasonably complex permissions system, allowing a certain degree of fine-tuning of what kind of information can be viewed and/or edited by the users. There is even the possibility of having a 'public' role with no access to any information but nevertheless able to submit tickets via the integrated ticketing system.
Projects, in turn, are linked to one company and (optionally) one or more departments in that company; users assigned to a specific project, however, may come from any company or department — thus allowing cross-company development, or the involvement of external users (independent consultants, freelancers, or even the clients and their intermediaries).
Projects are divided into Tasks, which can have all sorts of dependencies between them; tasks can also have subtasks, and they can be assigned to specific milestones. This allows the establishment of complex relationships between the team members, the many projects they might be involved in, and the amount of work to be distributed among all. As is common with other project management tools, tasks can be created as mere stubs and completed later; assigned and reassigned to team members; or even moved across projects (or becoming subtasks of other tasks).
Team members are expected to register the amount of time they spend on each task, which is accomplished via Logs. These are often one-line comments with an estimate of the time consumed (but can optionally have much more information); dotProject will take those logs into account when calculating the workload, the overall cost of the project so far (and compare it to the budget), as well as figuring out what tasks are being completed in due time or are overdue. Depending on the company style and its level of activity tracking — according to their business culture — time-tracking can be as simple as just closing a task, or it might involve several logs until a supervisor deems that the task can be safely closed.
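A minimal sketch of the kind of bookkeeping described above (hypothetical field names and rates, not dotProject's actual schema): summing logged hours per task and comparing the resulting cost with a budget.

```python
def project_cost(logs, hourly_rates):
    """logs: list of dicts with 'user', 'task' and 'hours'; hourly_rates: dict user -> rate."""
    total = 0.0
    hours_per_task = {}
    for log in logs:
        hours_per_task[log["task"]] = hours_per_task.get(log["task"], 0.0) + log["hours"]
        total += log["hours"] * hourly_rates[log["user"]]
    return total, hours_per_task

logs = [
    {"user": "alice", "task": "design mock-ups", "hours": 3.5},
    {"user": "bob",   "task": "design mock-ups", "hours": 1.0},
    {"user": "alice", "task": "client review",   "hours": 2.0},
]
cost, per_task = project_cost(logs, {"alice": 80.0, "bob": 60.0})
budget = 500.0
print(f"cost so far: {cost:.2f} (budget {budget}), over budget: {cost > budget}")
```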
All these activities are tracked and made part of the overall project history. Optionally, dotProject can send emails to the involved parties, triggered by special conditions — such as a task being overdue, or having been completed so that a customer can be invoiced. While dotProject is not a fully-fledged invoicing system, it can produce enough data output to send reasonably detailed invoices to customers. At the same time, via its reporting facility, the management or the board can get properly formatted reports about ongoing projects, besides having access to the Gantt charts.
Communication between team members can be as simple as leaving comments on tasks and/or logs, but dotProject also includes a minimalistic Forum facility. These are usually assigned to a single project (but each project can have several separate forums, with separate moderators, serving different purposes).
And while dotProject is not a sophisticated document management system, it nevertheless allows files to be uploaded to a special directory, also assigned to specific projects/tasks, and under control of the permission system (file names get hashed, and only someone with the proper permission will be able to retrieve those files). There is a very simple built-in file management system to allow for file uploading and categorising with metadata. The file folder can theoretically be mounted on an external file system on a cloud storage provider — so long as this is achieved at the operating system level; dotProject, by itself, does not connect directly to any storage provider. dotProject also includes a very simple versioning system.
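The hashed-filename idea can be sketched as follows (a rough illustration under assumed names; the real module is PHP and its exact hashing scheme is not described in this article): the uploaded file is stored under a hash, and the original name is kept only as metadata that the permission system controls access to.

```python
import hashlib, os, shutil

def store_upload(src_path: str, storage_dir: str, metadata: dict) -> str:
    """Copy an uploaded file into storage under a content-derived hashed name."""
    with open(src_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    os.makedirs(storage_dir, exist_ok=True)
    shutil.copyfile(src_path, os.path.join(storage_dir, digest))
    # The original name and project/task assignment live only in the metadata store,
    # which the permission system consults before revealing or serving the file.
    metadata[digest] = {"original_name": os.path.basename(src_path)}
    return digest
```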
Tasks and milestones are also integrated into the built-in Calendar module, which is usually the preset entry point of the user — allowing them to keep up with the tasks they're involved in, or those that they supervise. There is some flexibility in how the information is presented. It is unknown if there is a way to automatically subscribe to a specific calendar; by contrast, Contacts, a module that allows editing the data related to each user, also permits exports using the vCard format.
Support and community
As of 2021, the dotProject community mostly volunteers time to reply on dotProject's GitHub issues, but there is no other form of getting any support.
As of May 2013, there were over 50,210 registered users in the dotProject forums and an average of 500–700 downloads each day.
As of April 2021, the original website mentioned before — which included a rich community of users — does not exist any longer, although https://dotproject.net/ is still actively maintained and points to some key resources (mostly on GitHub).
See also
Comparison of time tracking software
List of project management software
References
External links
Official web site
SourceForge.net Project of the Month for April 2009
https://books.google.com/books?id=XS8K8OydSEcC&pg=PA122&dq=dotProject+software&ei=Dt_YSrGBGYeENMPPzIIP#v=onepage&q=dotProject%20software&f=false
https://books.google.com/books?id=D6sGjfl5htkC&pg=PA128&dq=dotProject+software&ei=Dt_YSrGBGYeENMPPzIIP#v=onepage&q=dotProject%20software&f=false
https://books.google.com/books?id=uICMLDbOC54C&pg=PT166&dq=dotProject+software&ei=Dt_YSrGBGYeENMPPzIIP#v=onepage&q=dotProject%20software&f=false
https://books.google.com/books?id=ntR2Yprl4JwC&pg=PA176&dq=dotProject+software&lr=&ei=i9_YSvvtEo_YNeHqoP8O#v=onepage&q=dotProject%20software&f=false
Lee Jordan, Project Management with dotProject.
Project management software
Free project management software
PHP software
2000 software
Business software for Linux |
50501391 | https://en.wikipedia.org/wiki/AIX%20Toolbox%20for%20Linux%20Applications | AIX Toolbox for Linux Applications | The AIX Toolbox for Linux Applications is a collection of GNU tools for IBM AIX. These tools are available for installation using Red Hat's RPM format.
Licensing
Each of these packages includes its own licensing information, and while IBM has made the code available to AIX users, the code is provided as-is and has not been thoroughly tested. The Toolbox is meant to provide a core set of the most common development tools and libraries, along with the more popular GNU packages.
References
External links
AIX Toolbox for Linux Applications
Programming tools
Free compilers and interpreters
Free software programmed in C
Free software programmed in C++
System administration
Red Hat
UNIX System V
IBM operating systems
Power ISA operating systems
PowerPC operating systems |
45270199 | https://en.wikipedia.org/wiki/Craig%20Gotsman | Craig Gotsman | Craig Gotsman is the Dean of the Ying Wu College of Computing at the New Jersey Institute of Technology (NJIT), where he is a Distinguished Professor. Before NJIT, he was co-founder of the Cornell Tech campus in New York City and Founding Director of the Jacobs Technion-Cornell Institute there. Gotsman has also co-founded several technology startup companies and consulted to many large technology corporations.
Early life
Born in the UK, Gotsman spent his early childhood in South Africa. His family immigrated to Israel in 1973.
He was awarded all his academic degrees, including a PhD in Computer Science in 1991, from the Hebrew University of Jerusalem.
During 1984–89, Gotsman served as an officer in the Technological R&D Unit of the Israel Defense Forces, retiring from active reserve duty in 2005 with the rank of major.
Academic career
Specializing in computer graphics and geometry processing, Gotsman joined the Computer Science Department at the Technion – Israel Institute of Technology in Haifa as an assistant professor in 1991. In 2005 he co-founded the Center for Graphics and Geometric Computing, and in 2006 he became the first incumbent of the Technion's Hewlett-Packard Chair in Computer Engineering.
Gotsman was Visiting Professor at Harvard University and Research Scientist at MIT during 2003–2004, Visiting Professor at INRIA, Sophia Antipolis in 2006 and Visiting Professor at ETH Zurich in 2010. After helping co-found the Cornell Tech campus in New York City, he served in leadership roles and on the faculty there during 2012–2016.
Gotsman has published over 150 research papers. He is a Fellow of the National Academy of Inventors and a Fellow of the Academy of Europe (Academia Europaea).
Cornell Tech
Gotsman played a leading role in the formation of the Cornell Tech campus in New York City.
Cornell Tech is an applied sciences campus dedicated to fostering innovation and producing entrepreneurial engineers, a project conceived and driven by former Mayor Michael Bloomberg and the New York City Economic Development Corporation, with the purpose of growing the tech sector of NYC.
In 2011, Gotsman helped develop the proposal to establish an Applied Sciences campus, submitted by Cornell University and Technion to the City of New York. The proposal subsequently won the bid, competing against a number of groups of international universities, including Massachusetts Institute of Technology and Stanford University.
Technion was cited as "the MIT of Israel" and a key player because of its innovation culture and contribution to the emergence of Israel as a global technological superpower, as documented in the book "Startup Nation".
In 2011, Gotsman was appointed Deputy Senior Vice-President (equivalent to Vice Provost) at Technion, responsible for the joint Technion-Cornell venture. In February 2012 he was appointed Founding Director of the joint Jacobs Technion-Cornell Innovation Institute at Cornell Tech. In this role, Gotsman developed a number of novel academic and entrepreneurial programs, including the successful Runway program, supporting PhDs forming commercial ventures based on their deep technical expertise. He also engaged in faculty and student recruiting, corporate relations, media relations and fund raising. In April 2013, he helped raise a $133M naming gift from Joan and Irwin M. Jacobs of San Diego.
New Jersey Institute of Technology
In 2017, Gotsman was named Distinguished Professor and Dean of the Ying Wu College of Computing at New Jersey Institute of Technology (NJIT), a public R1 university with a significantly diverse student body. As such, he is chief executive of one of the few dedicated Colleges of Computing in the US. The Ying Wu College of Computing (YWCC) enrolls 3,500 of NJIT's 12,000 students, graduating 900 computing professionals at all levels every year, making it the largest generator of computing talent in the New York metro area. At NJIT, Gotsman was instrumental in expanding the college with a new department of Data Science, a research Institute for Data Science, a new graduate-level instructional facility in Jersey City and a joint Institute for Future Technologies with Ben-Gurion University of Israel.
Entrepreneurship
Gotsman co-founded three startup companies: Virtue3D was founded in 1997 and developed advanced technologies for Web-based 3D computer graphics based on Technion intellectual property. The technology was eventually acquired by German Mental Images, itself later acquired by NVIDIA. Estimotion – a precursor to Waze – was founded in 2000 and developed technologies for real-time traffic-based applications for cellular phones. The principal investors were Partner/Orange Communications and Shlomo Group. Estimotion was acquired by British ITIS Holdings, itself later acquired by INRIX. CatchEye – commercializing 3D camera-related video-processing technology that Gotsman developed with colleagues at ETH Zurich – was founded in 2014 and acquired in 2017.
In 2006, Gotsman also co-founded Geometrika, an active consulting company that develops graphics and geometric software technologies.
Other Activities
Outreach
While at Technion, Gotsman served as Associate Dean for Computing during 2001–2003. He also founded its Industrial Affiliates Program and Alumni Program and served as Associate Dean for External Relations during 2005–2008.
Industrial Activity
Gotsman served as consultant for HP Labs in Haifa and spent summers during 1993–1996 at HP Labs in Palo Alto. He has also consulted for companies in Israel, Europe and the US, including Nokia, Shell Oil, Disney, Intel, Rafael, Autodesk and Samsung.
Public Service
In 2014, Gotsman served as a technology expert on the New York City Metropolitan Transportation Authority Reinvention Commission, appointed by the governor of NY State. In its report, the commission recommended a number of reforms to the public transportation systems in New York City. Gotsman is also active in the entrepreneurial community in New York and New Jersey, in particular through the New York City Economic Development Corporation and the New Jersey Economic Development Authority. Gotsman is a member of the Board of the New Jersey Innovation Institute (NJII).
Select Bibliography
Chen, R. and Gotsman, C. "Generalized As-Similar-As-Possible Warping with Applications in Digital Photography", Computer Graphics Forum, 35(2):81-92, 2016.
Kuster C., Popa T., Bazin J.-C., Gotsman C. and Gross M. "Gaze Correction for Home Video Conferencing", ACM Transactions on Graphics (Proc. SIGGRAPH Asia), 31(6):174, 2012.
Xu Y., Liu L., Gotsman C. and Gortler S.J. "Capacity-Constrained Delaunay Triangulation for Point Distributions", Computers and Graphics (Proc. SMI), 35(3):510–516, 2011.
Ben-Chen M., Weber O. and Gotsman C. "Variational Harmonic Maps for Space Deformation", ACM Transactions on Graphics 28(3), (Proc. SIGGRAPH), 2009.
Weber O., Ben-Chen M. and Gotsman C. "Complex Barycentric Coordinates with Applications to Image Deformation", Computer Graphics Forum, 28(2):587–597, 2009.
Liu L., Zhang L., Xu Y., Gotsman C. and Gortler S.J. "A Local/Global Approach to Mesh Parameterization", Computer Graphics Forum, 27(5):1495–1504, (Proc. Symp. Geometry Proc.), 2008.
Gortler S.J., Gotsman C. and Thurston D. "Discrete One-forms on Meshes and Applications to 3D Mesh Parameterization", Computer Aided Geometric Design, 33(2):83–112, 2006.
Sumner R.W., Zwicker M., Gotsman C. and Popovic J. "Mesh-based Inverse Kinematics", ACM Transactions on Graphics, 24(3):488–495, (Proc. SIGGRAPH), 2005.
Bogomjakov A. and Gotsman C. "Universal Rendering Sequences for Transparent Vertex Caching of Progressive Meshes", Computer Graphics Forum, 21(2):137–148, 2002.
Karni Z. and Gotsman C. "Spectral Compression of Mesh Geometry", Computer Graphics (Proc. SIGGRAPH), 279–286, 2000.
Touma C. and Gotsman C. "Triangle Mesh Compression", Proc. Graphics Interface 98, 1998.
Rabinovich B. and Gotsman C. "Visualization of Large Terrains in Resource-Limited Computing Environments", Proc. IEEE Visualization, 1997.
References
Computer scientists
1964 births
Living people |
56785838 | https://en.wikipedia.org/wiki/Marie-Paule%20Cani | Marie-Paule Cani | Marie-Paule Cani (born 1965) is a French computer scientist conducting advanced research in the fields of shape modeling and computer animation. She has contributed to over 200 research publications, which have attracted around 5,000 citations.
In 2007, Cani received the national Irène Joliot-Curie Prize in recognition of her work mentoring women in computer science; she seeks to strengthen the presence of women in scientific careers and mentors doctoral students. In 2011, she won a Eurographics award for outstanding technical contributions to the creation of 3D content.
In 1999, the Institut Universitaire de France awarded her junior membership.
In 2019, she was elected to the French Academy of Sciences.
Education
1987: M.Sc. in Computer Science, École Normale Supérieure & University Paris XI, France.
1990: Ph.D. in Computer Graphics, University Paris XI, France.
1995: Habilitation in Computer Science, Institut National Polytechnique de Grenoble.
Positions held
In 2014, Cani became the chair of computer science at the Collège de France.
Since May 2017, Cani has been professor of computer science at École Polytechnique, Paris-Saclay, France. Prior to this, she held the same position at Grenoble INP from 1997, where she headed the INRIA research group EVASION, part of the Laboratoire Jean Kuntzmann, a joint lab of CNRS and Université Grenoble Alpes. She became a full professor in 1997. From 1993 to 1997, she served as an assistant professor at the Institut National Polytechnique de Grenoble. She started her academic career in 1990 as a lecturer at the École Normale Supérieure, Paris.
References
1965 births
Living people
French computer scientists
French women computer scientists |
237036 | https://en.wikipedia.org/wiki/Advanced%20Video%20Coding | Advanced Video Coding | Advanced Video Coding (AVC), also referred to as H.264 or MPEG-4 Part 10, Advanced Video Coding (MPEG-4 AVC), is a video compression standard based on block-oriented, motion-compensated coding. It is by far the most commonly used format for the recording, compression, and distribution of video content, used by 91% of video industry developers. It supports resolutions up to and including 8K UHD.
The intent of the H.264/AVC project was to create a standard capable of providing good video quality at substantially lower bit rates than previous standards (i.e., half or less the bit rate of MPEG-2, H.263, or MPEG-4 Part 2), without increasing the complexity of design so much that it would be impractical or excessively expensive to implement. This was achieved with features such as a reduced-complexity integer discrete cosine transform (integer DCT), variable block-size segmentation, and multi-picture inter-picture prediction. An additional goal was to provide enough flexibility to allow the standard to be applied to a wide variety of applications on a wide variety of networks and systems, including low and high bit rates, low and high resolution video, broadcast, DVD storage, RTP/IP packet networks, and ITU-T multimedia telephony systems. The H.264 standard can be viewed as a "family of standards" composed of a number of different profiles, although its "High profile" is by far the most commonly used format. A specific decoder decodes at least one, but not necessarily all, profiles. The standard describes the format of the encoded data and how the data is decoded, but it does not specify algorithms for encoding video; that is left open as a matter for encoder designers to select for themselves, and a wide variety of encoding schemes have been developed. H.264 is typically used for lossy compression, although it is also possible to create truly lossless-coded regions within lossy-coded pictures or to support rare use cases for which the entire encoding is lossless.
H.264 was standardized by the ITU-T Video Coding Experts Group (VCEG) of Study Group 16 together with the ISO/IEC JTC1 Moving Picture Experts Group (MPEG). The project partnership effort is known as the Joint Video Team (JVT). The ITU-T H.264 standard and the ISO/IEC MPEG-4 AVC standard (formally, ISO/IEC 14496-10 – MPEG-4 Part 10, Advanced Video Coding) are jointly maintained so that they have identical technical content. The final drafting work on the first version of the standard was completed in May 2003, and various extensions of its capabilities have been added in subsequent editions. High Efficiency Video Coding (HEVC), a.k.a. H.265 and MPEG-H Part 2, is a successor to H.264/MPEG-4 AVC developed by the same organizations, while earlier standards are still in common use.
H.264 is perhaps best known as being the most commonly used video encoding format on Blu-ray Discs. It is also widely used by streaming Internet sources, such as videos from Netflix, Hulu, Amazon Prime Video, Vimeo, YouTube, and the iTunes Store, Web software such as the Adobe Flash Player and Microsoft Silverlight, and also various HDTV broadcasts over terrestrial (ATSC, ISDB-T, DVB-T or DVB-T2), cable (DVB-C), and satellite (DVB-S and DVB-S2) systems.
H.264 is restricted by patents owned by various parties. A license covering most (but not all) patents essential to H.264 is administered by a patent pool managed by MPEG LA.
The commercial use of patented H.264 technologies requires the payment of royalties to MPEG LA and other patent owners. MPEG LA has allowed the free use of H.264 technologies for streaming Internet video that is free to end users, and Cisco Systems pays royalties to MPEG LA on behalf of the users of binaries for its open source H.264 encoder.
Naming
The H.264 name follows the ITU-T naming convention, where the standard is a member of the H.26x line of VCEG video coding standards; the MPEG-4 AVC name relates to the naming convention in ISO/IEC MPEG, where the standard is part 10 of ISO/IEC 14496, which is the suite of standards known as MPEG-4. The standard was developed jointly in a partnership of VCEG and MPEG, after earlier development work in the ITU-T as a VCEG project called H.26L. It is thus common to refer to the standard with names such as H.264/AVC, AVC/H.264, H.264/MPEG-4 AVC, or MPEG-4/H.264 AVC, to emphasize the common heritage. Occasionally, it is also referred to as "the JVT codec", in reference to the Joint Video Team (JVT) organization that developed it. (Such partnership and multiple naming is not uncommon. For example, the video compression standard known as MPEG-2 also arose from the partnership between MPEG and the ITU-T, where MPEG-2 video is known to the ITU-T community as H.262.) Some software programs (such as VLC media player) internally identify this standard as AVC1.
History
Overall history
In early 1998, the Video Coding Experts Group (VCEG – ITU-T SG16 Q.6) issued a call for proposals on a project called H.26L, with the target to double the coding efficiency (which means halving the bit rate necessary for a given level of fidelity) in comparison to any other existing video coding standards for a broad variety of applications. VCEG was chaired by Gary Sullivan (Microsoft, formerly PictureTel, U.S.). The first draft design for that new standard was adopted in August 1999. In 2000, Thomas Wiegand (Heinrich Hertz Institute, Germany) became VCEG co-chair.
In December 2001, VCEG and the Moving Picture Experts Group (MPEG – ISO/IEC JTC 1/SC 29/WG 11) formed a Joint Video Team (JVT), with the charter to finalize the video coding standard. Formal approval of the specification came in March 2003. The JVT was (is) chaired by Gary Sullivan, Thomas Wiegand, and Ajay Luthra (Motorola, U.S.: later Arris, U.S.). In July 2004, the Fidelity Range Extensions (FRExt) project was finalized. From January 2005 to November 2007, the JVT was working on an extension of H.264/AVC towards scalability by an Annex (G) called Scalable Video Coding (SVC). The JVT management team was extended by Jens-Rainer Ohm (RWTH Aachen University, Germany). From July 2006 to November 2009, the JVT worked on Multiview Video Coding (MVC), an extension of H.264/AVC towards 3D television and limited-range free-viewpoint television. That work included the development of two new profiles of the standard: the Multiview High Profile and the Stereo High Profile.
Throughout the development of the standard, additional messages for containing supplemental enhancement information (SEI) have been developed. SEI messages can contain various types of data that indicate the timing of the video pictures or describe various properties of the coded video or how it can be used or enhanced. SEI messages are also defined that can contain arbitrary user-defined data. SEI messages do not affect the core decoding process, but can indicate how the video is recommended to be post-processed or displayed. Some other high-level properties of the video content are conveyed in video usability information (VUI), such as the indication of the color space for interpretation of the video content. As new color spaces have been developed, such as for high dynamic range and wide color gamut video, additional VUI identifiers have been added to indicate them.
Fidelity range extensions and professional profiles
The standardization of the first version of H.264/AVC was completed in May 2003. In the first project to extend the original standard, the JVT then developed what was called the Fidelity Range Extensions (FRExt). These extensions enabled higher quality video coding by supporting increased sample bit depth precision and higher-resolution color information, including the sampling structures known as Y′CBCR 4:2:2 (a.k.a. YUV 4:2:2) and 4:4:4. Several other features were also included in the FRExt project, such as adding an 8×8 integer discrete cosine transform (integer DCT) with adaptive switching between the 4×4 and 8×8 transforms, encoder-specified perceptual-based quantization weighting matrices, efficient inter-picture lossless coding, and support of additional color spaces. The design work on the FRExt project was completed in July 2004, and the drafting work on them was completed in September 2004.
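As a rough illustration of these integer transforms (a sketch only, not the normative process, since the normalization that H.264 folds into quantization is omitted), the following applies the widely documented 4×4 forward core transform matrix to a toy residual block.

```python
import numpy as np

# Core matrix of the 4x4 forward integer transform used by H.264
# (an integer approximation of the DCT; scaling is folded into quantization).
Cf = np.array([[1,  1,  1,  1],
               [2,  1, -1, -2],
               [1, -1, -1,  1],
               [1, -2,  2, -1]])

def forward_4x4(block):
    """Apply the 4x4 integer core transform Y = Cf * X * Cf^T with exact integer arithmetic."""
    X = np.asarray(block, dtype=np.int64)
    return Cf @ X @ Cf.T

residual = np.arange(16).reshape(4, 4) - 8   # a toy 4x4 residual block
print(forward_4x4(residual))
```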
Five other new profiles (see version 7 below) intended primarily for professional applications were then developed, adding extended-gamut color space support, defining additional aspect ratio indicators, defining two additional types of "supplemental enhancement information" (post-filter hint and tone mapping), and deprecating one of the prior FRExt profiles (the High 4:4:4 profile) that industry feedback indicated should have been designed differently.
Scalable video coding
The next major feature added to the standard was Scalable Video Coding (SVC). Specified in Annex G of H.264/AVC, SVC allows the construction of bitstreams that contain layers of sub-bitstreams that also conform to the standard, including one such bitstream known as the "base layer" that can be decoded by a H.264/AVC codec that does not support SVC. For temporal bitstream scalability (i.e., the presence of a sub-bitstream with a smaller temporal sampling rate than the main bitstream), complete access units are removed from the bitstream when deriving the sub-bitstream. In this case, high-level syntax and inter-prediction reference pictures in the bitstream are constructed accordingly. On the other hand, for spatial and quality bitstream scalability (i.e. the presence of a sub-bitstream with lower spatial resolution/quality than the main bitstream), the NAL (Network Abstraction Layer) is removed from the bitstream when deriving the sub-bitstream. In this case, inter-layer prediction (i.e., the prediction of the higher spatial resolution/quality signal from the data of the lower spatial resolution/quality signal) is typically used for efficient coding. The Scalable Video Coding extensions were completed in November 2007.
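For the temporal-scalability case described above, a toy sketch (not the actual SVC bitstream syntax; the dyadic GOP layout and layer assignment are illustrative assumptions) shows how a lower-frame-rate sub-stream falls out of simply dropping frames above a target temporal layer.

```python
def temporal_id(frame_index: int, gop_size: int = 8) -> int:
    """Temporal layer of a frame in a dyadic hierarchical GOP (layer 0 = lowest frame rate)."""
    pos = frame_index % gop_size
    if pos == 0:
        return 0
    tid = gop_size.bit_length() - 1          # log2(gop_size) for power-of-two GOP sizes
    while pos % 2 == 0:                      # every factor of two moves one layer down
        pos //= 2
        tid -= 1
    return tid

def extract_sub_stream(frames, max_tid, gop_size=8):
    """Keep only frames whose temporal layer does not exceed max_tid."""
    return [f for i, f in enumerate(frames) if temporal_id(i, gop_size) <= max_tid]

frames = list(range(16))                         # 16 frames at the full frame rate
print(extract_sub_stream(frames, max_tid=1))     # quarter rate: [0, 4, 8, 12]
```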
Multiview video coding
The next major feature added to the standard was Multiview Video Coding (MVC). Specified in Annex H of H.264/AVC, MVC enables the construction of bitstreams that represent more than one view of a video scene. An important example of this functionality is stereoscopic 3D video coding. Two profiles were developed in the MVC work: Multiview High profile supports an arbitrary number of views, and Stereo High profile is designed specifically for two-view stereoscopic video. The Multiview Video Coding extensions were completed in November 2009.
3D-AVC and MFC stereoscopic coding
Additional extensions were later developed that included 3D video coding with joint coding of depth maps and texture (termed 3D-AVC), multi-resolution frame-compatible (MFC) stereoscopic and 3D-MFC coding, various additional combinations of features, and higher frame sizes and frame rates.
Versions
Versions of the H.264/AVC standard include the following completed revisions, corrigenda, and amendments (dates are final approval dates in ITU-T, while final "International Standard" approval dates in ISO/IEC are somewhat different and slightly later in most cases). Each version represents changes relative to the next lower version that is integrated into the text.
Version 1 (Edition 1): (May 30, 2003) First approved version of H.264/AVC containing Baseline, Main, and Extended profiles.
Version 2 (Edition 1.1): (May 7, 2004) Corrigendum containing various minor corrections.
Version 3 (Edition 2): (March 1, 2005) Major addition containing the first amendment, establishing the Fidelity Range Extensions (FRExt). This version added the High, High 10, High 4:2:2, and High 4:4:4 profiles. After a few years, the High profile became the most commonly used profile of the standard.
Version 4 (Edition 2.1): (September 13, 2005) Corrigendum containing various minor corrections and adding three aspect ratio indicators.
Version 5 (Edition 2.2): (June 13, 2006) Amendment consisting of removal of prior High 4:4:4 profile (processed as a corrigendum in ISO/IEC).
Version 6 (Edition 2.2): (June 13, 2006) Amendment consisting of minor extensions like extended-gamut color space support (bundled with above-mentioned aspect ratio indicators in ISO/IEC).
Version 7 (Edition 2.3): (April 6, 2007) Amendment containing the addition of the High 4:4:4 Predictive profile and four Intra-only profiles (High 10 Intra, High 4:2:2 Intra, High 4:4:4 Intra, and CAVLC 4:4:4 Intra).
Version 8 (Edition 3): (November 22, 2007) Major addition to H.264/AVC containing the amendment for Scalable Video Coding (SVC) containing Scalable Baseline, Scalable High, and Scalable High Intra profiles.
Version 9 (Edition 3.1): (January 13, 2009) Corrigendum containing minor corrections.
Version 10 (Edition 4): (March 16, 2009) Amendment containing definition of a new profile (the Constrained Baseline profile) with only the common subset of capabilities supported in various previously specified profiles.
Version 11 (Edition 4): (March 16, 2009) Major addition to H.264/AVC containing the amendment for Multiview Video Coding (MVC) extension, including the Multiview High profile.
Version 12 (Edition 5): (March 9, 2010) Amendment containing definition of a new MVC profile (the Stereo High profile) for two-view video coding with support of interlaced coding tools and specifying an additional supplemental enhancement information (SEI) message termed the frame packing arrangement SEI message.
Version 13 (Edition 5): (March 9, 2010) Corrigendum containing minor corrections.
Version 14 (Edition 6): (June 29, 2011) Amendment specifying a new level (Level 5.2) supporting higher processing rates in terms of maximum macroblocks per second, and a new profile (the Progressive High profile) supporting only the frame coding tools of the previously specified High profile.
Version 15 (Edition 6): (June 29, 2011) Corrigendum containing minor corrections.
Version 16 (Edition 7): (January 13, 2012) Amendment containing definition of three new profiles intended primarily for real-time communication applications: the Constrained High, Scalable Constrained Baseline, and Scalable Constrained High profiles.
Version 17 (Edition 8): (April 13, 2013) Amendment with additional SEI message indicators.
Version 18 (Edition 8): (April 13, 2013) Amendment to specify the coding of depth map data for 3D stereoscopic video, including a Multiview Depth High profile.
Version 19 (Edition 8): (April 13, 2013) Corrigendum to correct an error in the sub-bitstream extraction process for multiview video.
Version 20 (Edition 8): (April 13, 2013) Amendment to specify additional color space identifiers (including support of ITU-R Recommendation BT.2020 for UHDTV) and an additional model type in the tone mapping information SEI message.
Version 21 (Edition 9): (February 13, 2014) Amendment to specify the Enhanced Multiview Depth High profile.
Version 22 (Edition 9): (February 13, 2014) Amendment to specify the multi-resolution frame compatible (MFC) enhancement for 3D stereoscopic video, the MFC High profile, and minor corrections.
Version 23 (Edition 10): (February 13, 2016) Amendment to specify MFC stereoscopic video with depth maps, the MFC Depth High profile, the mastering display color volume SEI message, and additional color-related VUI codepoint identifiers.
Version 24 (Edition 11): (October 14, 2016) Amendment to specify additional levels of decoder capability supporting larger picture sizes (Levels 6, 6.1, and 6.2), the green metadata SEI message, the alternative depth information SEI message, and additional color-related VUI codepoint identifiers.
Version 25 (Edition 12): (April 13, 2017) Amendment to specify the Progressive High 10 profile, hybrid log–gamma (HLG), and additional color-related VUI code points and SEI messages.
Version 26 (Edition 13): (June 13, 2019) Amendment to specify additional SEI messages for ambient viewing environment, content light level information, content color volume, equirectangular projection, cubemap projection, sphere rotation, region-wise packing, omnidirectional viewport, SEI manifest, and SEI prefix.
Version 27 (Edition 14): (August 22, 2021) Amendment to specify additional SEI messages for annotated regions and shutter interval information, and miscellaneous minor corrections and clarifications.
Patent holders
|}
Applications
The H.264 video format has a very broad application range that covers all forms of digital compressed video from low bit-rate Internet streaming applications to HDTV broadcast and Digital Cinema applications with nearly lossless coding. With the use of H.264, bit rate savings of 50% or more compared to MPEG-2 Part 2 are reported. For example, H.264 has been reported to give the same Digital Satellite TV quality as current MPEG-2 implementations with less than half the bitrate, with current MPEG-2 implementations working at around 3.5 Mbit/s and H.264 at only 1.5 Mbit/s. Sony claims that 9 Mbit/s AVC recording mode is equivalent to the image quality of the HDV format, which uses approximately 18–25 Mbit/s.
To ensure compatibility and problem-free adoption of H.264/AVC, many standards bodies have amended or added to their video-related standards so that users of these standards can employ H.264/AVC. Both the Blu-ray Disc format and the now-discontinued HD DVD format include the H.264/AVC High Profile as one of three mandatory video compression formats. The Digital Video Broadcast project (DVB) approved the use of H.264/AVC for broadcast television in late 2004.
The Advanced Television Systems Committee (ATSC) standards body in the United States approved the use of H.264/AVC for broadcast television in July 2008, although the standard is not yet used for fixed ATSC broadcasts within the United States. It has also been approved for use with the more recent ATSC-M/H (Mobile/Handheld) standard, using the AVC and SVC portions of H.264.
The CCTV (Closed Circuit TV) and Video Surveillance markets have included the technology in many products.
Many common DSLRs use H.264 video wrapped in QuickTime MOV containers as the native recording format.
Derived formats
AVCHD is a high-definition recording format designed by Sony and Panasonic that uses H.264 (conforming to H.264 while adding additional application-specific features and constraints).
AVC-Intra is an intraframe-only compression format, developed by Panasonic.
XAVC is a recording format designed by Sony that uses level 5.2 of H.264/MPEG-4 AVC, which is the highest level supported by that video standard. XAVC can support 4K resolution (4096 × 2160 and 3840 × 2160) at up to 60 frames per second (fps). Sony has announced that cameras that support XAVC include two CineAlta cameras—the Sony PMW-F55 and Sony PMW-F5. The Sony PMW-F55 can record XAVC with 4K resolution at 30 fps at 300 Mbit/s and 2K resolution at 30 fps at 100 Mbit/s. XAVC can record 4K resolution at 60 fps with 4:2:2 chroma sampling at 600 Mbit/s.
Design
Features
H.264/AVC/MPEG-4 Part 10 contains a number of new features that allow it to compress video much more efficiently than older standards and to provide more flexibility for application to a wide variety of network environments. In particular, some such key features include:
Multi-picture inter-picture prediction including the following features:
Using previously encoded pictures as references in a much more flexible way than in past standards, allowing up to 16 reference frames (or 32 reference fields, in the case of interlaced encoding) to be used in some cases. In profiles that support non-IDR frames, most levels specify that sufficient buffering should be available to allow for at least 4 or 5 reference frames at maximum resolution. This is in contrast to prior standards, where the limit was typically one; or, in the case of conventional "B pictures" (B-frames), two.
Variable block-size motion compensation (VBSMC) with block sizes as large as 16×16 and as small as 4×4, enabling precise segmentation of moving regions. The supported luma prediction block sizes include 16×16, 16×8, 8×16, 8×8, 8×4, 4×8, and 4×4, many of which can be used together in a single macroblock. Chroma prediction block sizes are correspondingly smaller when chroma subsampling is used.
The ability to use multiple motion vectors per macroblock (one or two per partition) with a maximum of 32 in the case of a B macroblock constructed of 16 4×4 partitions. The motion vectors for each 8×8 or larger partition region can point to different reference pictures.
The ability to use any macroblock type in B-frames, including I-macroblocks, resulting in much more efficient encoding when using B-frames. This feature was notably left out from MPEG-4 ASP.
Six-tap filtering for derivation of half-pel luma sample predictions, for sharper subpixel motion-compensation. Quarter-pixel motion is derived by linear interpolation of the half-pel values, to save processing power (a sketch of this interpolation appears after this feature list).
Quarter-pixel precision for motion compensation, enabling precise description of the displacements of moving areas. For chroma the resolution is typically halved both vertically and horizontally (see 4:2:0) therefore the motion compensation of chroma uses one-eighth chroma pixel grid units.
Weighted prediction, allowing an encoder to specify the use of a scaling and offset when performing motion compensation, and providing a significant benefit in performance in special cases—such as fade-to-black, fade-in, and cross-fade transitions. This includes implicit weighted prediction for B-frames, and explicit weighted prediction for P-frames.
Spatial prediction from the edges of neighboring blocks for "intra" coding, rather than the "DC"-only prediction found in MPEG-2 Part 2 and the transform coefficient prediction found in H.263v2 and MPEG-4 Part 2. This includes luma prediction block sizes of 16×16, 8×8, and 4×4 (of which only one type can be used within each macroblock).
Integer discrete cosine transform (integer DCT), a type of discrete cosine transform (DCT) where the transform is an integer approximation of the standard DCT. It has selectable block sizes and exact-match integer computation to reduce complexity, including:
An exact-match integer 4×4 spatial block transform, allowing precise placement of residual signals with little of the "ringing" often found with prior codec designs. It is similar to the standard DCT used in previous standards, but uses a smaller block size and simple integer processing. Unlike the cosine-based formulas and tolerances expressed in earlier standards (such as H.261 and MPEG-2), integer processing provides an exactly specified decoded result.
An exact-match integer 8×8 spatial block transform, allowing highly correlated regions to be compressed more efficiently than with the 4×4 transform. This design is based on the standard DCT, but simplified and made to provide exactly specified decoding.
Adaptive encoder selection between the 4×4 and 8×8 transform block sizes for the integer transform operation.
A secondary Hadamard transform performed on "DC" coefficients of the primary spatial transform applied to chroma DC coefficients (and also luma in one special case) to obtain even more compression in smooth regions.
Lossless macroblock coding features including:
A lossless "PCM macroblock" representation mode in which video data samples are represented directly, allowing perfect representation of specific regions and allowing a strict limit to be placed on the quantity of coded data for each macroblock.
An enhanced lossless macroblock representation mode allowing perfect representation of specific regions while ordinarily using substantially fewer bits than the PCM mode.
Flexible interlaced-scan video coding features, including:
Macroblock-adaptive frame-field (MBAFF) coding, using a macroblock pair structure for pictures coded as frames, allowing 16×16 macroblocks in field mode (compared with MPEG-2, where field mode processing in a picture that is coded as a frame results in the processing of 16×8 half-macroblocks).
Picture-adaptive frame-field coding (PAFF or PicAFF) allowing a freely selected mixture of pictures coded either as complete frames where both fields are combined for encoding or as individual single fields.
A quantization design including:
Logarithmic step size control for easier bit rate management by encoders and simplified inverse-quantization scaling
Frequency-customized quantization scaling matrices selected by the encoder for perceptual-based quantization optimization
An in-loop deblocking filter that helps prevent the blocking artifacts common to other DCT-based image compression techniques, resulting in better visual appearance and compression efficiency
An entropy coding design including:
Context-adaptive binary arithmetic coding (CABAC), an algorithm to losslessly compress syntax elements in the video stream knowing the probabilities of syntax elements in a given context. CABAC compresses data more efficiently than CAVLC but requires considerably more processing to decode.
Context-adaptive variable-length coding (CAVLC), which is a lower-complexity alternative to CABAC for the coding of quantized transform coefficient values. Although lower complexity than CABAC, CAVLC is more elaborate and more efficient than the methods typically used to code coefficients in other prior designs.
A common simple and highly structured variable length coding (VLC) technique for many of the syntax elements not coded by CABAC or CAVLC, referred to as Exponential-Golomb coding (or Exp-Golomb); a sketch of this code also appears after this feature list.
Loss resilience features including:
A Network Abstraction Layer (NAL) definition allowing the same video syntax to be used in many network environments. One very fundamental design concept of H.264 is to generate self-contained packets, to remove the header duplication as in MPEG-4's Header Extension Code (HEC). This was achieved by decoupling information relevant to more than one slice from the media stream. The combination of the higher-level parameters is called a parameter set. The H.264 specification includes two types of parameter sets: Sequence Parameter Set (SPS) and Picture Parameter Set (PPS). An active sequence parameter set remains unchanged throughout a coded video sequence, and an active picture parameter set remains unchanged within a coded picture. The sequence and picture parameter set structures contain information such as picture size, optional coding modes employed, and macroblock to slice group map.
Flexible macroblock ordering (FMO), also known as slice groups, and arbitrary slice ordering (ASO), which are techniques for restructuring the ordering of the representation of the fundamental regions (macroblocks) in pictures. Typically considered an error/loss robustness feature, FMO and ASO can also be used for other purposes.
Data partitioning (DP), a feature providing the ability to separate more important and less important syntax elements into different packets of data, enabling the application of unequal error protection (UEP) and other types of improvement of error/loss robustness.
Redundant slices (RS), an error/loss robustness feature that lets an encoder send an extra representation of a picture region (typically at lower fidelity) that can be used if the primary representation is corrupted or lost.
Frame numbering, a feature that allows the creation of "sub-sequences", enabling temporal scalability by optional inclusion of extra pictures between other pictures, and the detection and concealment of losses of entire pictures, which can occur due to network packet losses or channel errors.
Switching slices, called SP and SI slices, allowing an encoder to direct a decoder to jump into an ongoing video stream for such purposes as video streaming bit rate switching and "trick mode" operation. When a decoder jumps into the middle of a video stream using the SP/SI feature, it can get an exact match to the decoded pictures at that location in the video stream despite using different pictures, or no pictures at all, as references prior to the switch.
A simple automatic process for preventing the accidental emulation of start codes, which are special sequences of bits in the coded data that allow random access into the bitstream and recovery of byte alignment in systems that can lose byte synchronization.
Supplemental enhancement information (SEI) and video usability information (VUI), which are extra information that can be inserted into the bitstream for various purposes such as indicating the color space used for the video content or various constraints that apply to the encoding. SEI messages can contain arbitrary user-defined metadata payloads or other messages with syntax and semantics defined in the standard.
Auxiliary pictures, which can be used for such purposes as alpha compositing.
Support of monochrome (4:0:0), 4:2:0, 4:2:2, and 4:4:4 chroma sampling (depending on the selected profile).
Support of sample bit depth precision ranging from 8 to 14 bits per sample (depending on the selected profile).
The ability to encode individual color planes as distinct pictures with their own slice structures, macroblock modes, motion vectors, etc., allowing encoders to be designed with a simple parallelization structure (supported only in the three 4:4:4-capable profiles).
Picture order count, a feature that serves to keep the ordering of the pictures and the values of samples in the decoded pictures isolated from timing information, allowing timing information to be carried and controlled/changed separately by a system without affecting decoded picture content.
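The Exp-Golomb coding mentioned among the entropy coding features above is simple enough to sketch in a few lines. The following is a minimal illustration of order-0 unsigned Exp-Golomb coding, written for this article rather than taken from any reference encoder: a value v is written as a run of leading zeros followed by the binary form of v + 1.

    def exp_golomb_encode(v):
        # code word for unsigned v: (len - 1) leading zeros, then binary of v + 1
        bits = bin(v + 1)[2:]
        return "0" * (len(bits) - 1) + bits

    def exp_golomb_decode(bitstring):
        # count leading zeros, then read that many further bits after the first 1
        zeros = 0
        while bitstring[zeros] == "0":
            zeros += 1
        value = int(bitstring[zeros:2 * zeros + 1], 2) - 1
        return value, bitstring[2 * zeros + 1:]

    # exp_golomb_encode(0) -> "1", exp_golomb_encode(3) -> "00100"

Small values get short code words, which suits syntax elements whose most probable values lie near zero.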
These techniques, along with several others, help H.264 to perform significantly better than any prior standard under a wide variety of circumstances in a wide variety of application environments. H.264 can often perform radically better than MPEG-2 video—typically obtaining the same quality at half of the bit rate or less, especially on high bit rate and high resolution video content.
Like other ISO/IEC MPEG video standards, H.264/AVC has a reference software implementation that can be freely downloaded. Its main purpose is to give examples of H.264/AVC features, rather than being a useful application per se. Some reference hardware design work has also been conducted in the Moving Picture Experts Group.
The above-mentioned aspects include features in all profiles of H.264. A profile for a codec is a set of features of that codec identified to meet a certain set of specifications of intended applications. This means that many of the features listed are not supported in some profiles. Various profiles of H.264/AVC are discussed in the next section.
Profiles
The standard defines several sets of capabilities, which are referred to as profiles, targeting specific classes of applications. These are declared using a profile code (profile_idc) and sometimes a set of additional constraints applied in the encoder. The profile code and indicated constraints allow a decoder to recognize the requirements for decoding that specific bitstream. (And in many system environments, only one or two profiles are allowed to be used, so decoders in those environments do not need to be concerned with recognizing the less commonly used profiles.) By far the most commonly used profile is the High Profile.
Profiles for non-scalable 2D video applications include the following:
Constrained Baseline Profile (CBP, 66 with constraint set 1) Primarily for low-cost applications, this profile is most typically used in videoconferencing and mobile applications. It corresponds to the subset of features that are in common between the Baseline, Main, and High Profiles.
Baseline Profile (BP, 66) Primarily for low-cost applications that require additional data loss robustness, this profile is used in some videoconferencing and mobile applications. This profile includes all features that are supported in the Constrained Baseline Profile, plus three additional features that can be used for loss robustness (or for other purposes such as low-delay multi-point video stream compositing). The importance of this profile has faded somewhat since the definition of the Constrained Baseline Profile in 2009. All Constrained Baseline Profile bitstreams are also considered to be Baseline Profile bitstreams, as these two profiles share the same profile identifier code value.
Extended Profile (XP, 88) Intended as the streaming video profile, this profile has relatively high compression capability and some extra tricks for robustness to data losses and server stream switching.
Main Profile (MP, 77) This profile is used for standard-definition digital TV broadcasts that use the MPEG-4 format as defined in the DVB standard. It is not, however, used for high-definition television broadcasts, as the importance of this profile faded when the High Profile was developed in 2004 for that application.
High Profile (HiP, 100) The primary profile for broadcast and disc storage applications, particularly for high-definition television applications (for example, this is the profile adopted by the Blu-ray Disc storage format and the DVB HDTV broadcast service).
Progressive High Profile (PHiP, 100 with constraint set 4) Similar to the High profile, but without support of field coding features.
Constrained High Profile (100 with constraint sets 4 and 5) Similar to the Progressive High profile, but without support of B (bi-predictive) slices.
High 10 Profile (Hi10P, 110) Going beyond typical mainstream consumer product capabilities, this profile builds on top of the High Profile, adding support for up to 10 bits per sample of decoded picture precision.
High 422 Profile (Hi422P, 122) Primarily targeting professional applications that use interlaced video, this profile builds on top of the High 10 Profile, adding support for the 4:2:2 chroma sampling format while using up to 10 bits per sample of decoded picture precision.
High 444 Predictive Profile (Hi444PP, 244) This profile builds on top of the High 4:2:2 Profile, supporting up to 4:4:4 chroma sampling, up to 14 bits per sample, and additionally supporting efficient lossless region coding and the coding of each picture as three separate color planes.
The standard also contains four additional all-Intra profiles, defined as simple subsets of the corresponding profiles above. These are intended mostly for professional applications such as camcorders, cameras, and editing systems:
High 10 Intra Profile (110 with constraint set 3) The High 10 Profile constrained to all-Intra use.
High 422 Intra Profile (122 with constraint set 3) The High 4:2:2 Profile constrained to all-Intra use.
High 444 Intra Profile (244 with constraint set 3) The High 4:4:4 Profile constrained to all-Intra use.
CAVLC 444 Intra Profile (44) The High 4:4:4 Profile constrained to all-Intra use and to CAVLC entropy coding (i.e., not supporting CABAC).
As a result of the Scalable Video Coding (SVC) extension, the standard contains five additional scalable profiles, which are defined as a combination of an H.264/AVC profile for the base layer (identified by the second word in the scalable profile name) and tools that achieve the scalable extension:
Scalable Baseline Profile (83) Primarily targeting video conferencing, mobile, and surveillance applications, this profile builds on top of the Constrained Baseline profile to which the base layer (a subset of the bitstream) must conform. For the scalability tools, a subset of the available tools is enabled.
Scalable Constrained Baseline Profile (83 with constraint set 5) A subset of the Scalable Baseline Profile intended primarily for real-time communication applications.
Scalable High Profile (86) Primarily targeting broadcast and streaming applications, this profile builds on top of the H.264/AVC High Profile to which the base layer must conform.
Scalable Constrained High Profile (86 with constraint set 5) A subset of the Scalable High Profile intended primarily for real-time communication applications.
Scalable High Intra Profile (86 with constraint set 3) Primarily targeting production applications, this profile is the Scalable High Profile constrained to all-Intra use.
As a result of the Multiview Video Coding (MVC) extension, the standard contains two multiview profiles:
Stereo High Profile (128) This profile targets two-view stereoscopic 3D video and combines the tools of the High profile with the inter-view prediction capabilities of the MVC extension.
Multiview High Profile (118) This profile supports two or more views using both inter-picture (temporal) and MVC inter-view prediction, but does not support field pictures and macroblock-adaptive frame-field coding.
The Multi-resolution Frame-Compatible (MFC) extension added two more profiles:
MFC High Profile (134) A profile for stereoscopic coding with two-layer resolution enhancement.
MFC Depth High Profile (135)
The 3D-AVC extension added two more profiles:
Multiview Depth High Profile (138) This profile supports joint coding of depth map and video texture information for improved compression of 3D video content.
Enhanced Multiview Depth High Profile (139) An enhanced profile for combined multiview coding with depth information.
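As a rough illustration of how the profile_idc values listed above identify profiles, the following sketch maps a few of the codes to names. It is only a toy lookup: a real decoder also has to examine the constraint_set flags (for example, to tell Constrained Baseline from Baseline), which is glossed over here.

    PROFILE_IDC = {
        66: "Baseline (Constrained Baseline when constraint set 1 is flagged)",
        77: "Main",
        88: "Extended",
        100: "High (Progressive/Constrained High with constraint sets 4/5)",
        110: "High 10",
        122: "High 4:2:2",
        244: "High 4:4:4 Predictive",
        44: "CAVLC 4:4:4 Intra",
        83: "Scalable Baseline",
        86: "Scalable High",
        118: "Multiview High",
        128: "Stereo High",
    }

    def profile_name(profile_idc):
        # fall back for codes this toy table does not cover
        return PROFILE_IDC.get(profile_idc, "unrecognized profile_idc")

For example, profile_name(100) identifies the High Profile family used by the Blu-ray Disc format and DVB HDTV broadcasts mentioned above.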
Feature support in particular profiles
Levels
As the term is used in the standard, a "level" is a specified set of constraints that indicate a degree of required decoder performance for a profile. For example, a level of support within a profile specifies the maximum picture resolution, frame rate, and bit rate that a decoder may use. A decoder that conforms to a given level must be able to decode all bitstreams encoded for that level and all lower levels.
The maximum bit rate for the High Profile is 1.25 times that of the Constrained Baseline, Baseline, Extended and Main Profiles; 3 times for Hi10P, and 4 times for Hi422P/Hi444PP.
The number of luma samples is 16×16=256 times the number of macroblocks (and the number of luma samples per second is 256 times the number of macroblocks per second).
Decoded picture buffering
Previously encoded pictures are used by H.264/AVC encoders to provide predictions of the values of samples in other pictures. This allows the encoder to make efficient decisions on the best way to encode a given picture. At the decoder, such pictures are stored in a virtual decoded picture buffer (DPB). The maximum capacity of the DPB, in units of frames (or pairs of fields), as shown in parentheses in the right column of the table above, can be computed as follows:
MaxDpbFrames = min(floor(MaxDpbMbs / (PicWidthInMbs × FrameHeightInMbs)), 16)
Where MaxDpbMbs is a constant value provided in the table below as a function of level number, and PicWidthInMbs and FrameHeightInMbs are the picture width and frame height for the coded video data, expressed in units of macroblocks (rounded up to integer values and accounting for cropping and macroblock pairing when applicable). This formula is specified in sections A.3.1.h and A.3.2.f of the 2017 edition of the standard.
For example, for an HDTV picture that is 1,920 samples wide (PicWidthInMbs = 120) and 1,080 samples high (FrameHeightInMbs = 68), a Level 4 decoder has a maximum DPB storage capacity of floor(32768/(120*68)) = 4 frames (or 8 fields). Thus, the value 4 is shown in parentheses in the table above in the right column of the row for Level 4 with the frame size 1920×1080.
It is important to note that the current picture being decoded is not included in the computation of DPB fullness (unless the encoder has indicated for it to be stored for use as a reference for decoding other pictures or for delayed output timing). Thus, a decoder needs to actually have sufficient memory to handle (at least) one frame more than the maximum capacity of the DPB as calculated above.
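A small sketch of that computation, using the Level 4 value MaxDpbMbs = 32768 from the worked example above (values for other levels come from the standard's level table and are not reproduced here):

    import math

    def max_dpb_frames(max_dpb_mbs, width_samples, height_samples):
        # picture dimensions are rounded up to whole macroblocks (16x16 samples)
        pic_width_in_mbs = math.ceil(width_samples / 16)
        frame_height_in_mbs = math.ceil(height_samples / 16)
        frames = max_dpb_mbs // (pic_width_in_mbs * frame_height_in_mbs)
        return min(frames, 16)  # the DPB never holds more than 16 frames

    print(max_dpb_frames(32768, 1920, 1080))  # -> 4 frames, i.e. 8 fields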
Implementations
In 2009, the HTML5 working group was split between supporters of Ogg Theora, a free video format which is thought to be unencumbered by patents, and H.264, which contains patented technology. As late as July 2009, Google and Apple were said to support H.264, while Mozilla and Opera supported Ogg Theora (now Google, Mozilla and Opera all support Theora and WebM with VP8). Microsoft, with the release of Internet Explorer 9, added support for HTML5 video encoded using H.264. At the Gartner Symposium/ITXpo in November 2010, Microsoft CEO Steve Ballmer answered the question "HTML 5 or Silverlight?" by saying "If you want to do something that is universal, there is no question the world is going HTML5." In January 2011, Google announced that it was pulling support for H.264 from its Chrome browser and would support only Theora and WebM/VP8, in order to use only open formats.
On March 18, 2012, Mozilla announced support for H.264 in Firefox on mobile devices, due to prevalence of H.264-encoded video and the increased power-efficiency of using dedicated H.264 decoder hardware common on such devices. On February 20, 2013, Mozilla implemented support in Firefox for decoding H.264 on Windows 7 and above. This feature relies on Windows' built in decoding libraries. Firefox 35.0, released on January 13, 2015, supports H.264 on OS X 10.6 and higher.
On October 30, 2013, Rowan Trollope from Cisco Systems announced that Cisco would release both binaries and source code of an H.264 video codec called OpenH264 under the Simplified BSD license, and pay all royalties for its use to MPEG LA for any software projects that use Cisco's precompiled binaries, thus making Cisco's OpenH264 binaries free to use. However, any software projects that use Cisco's source code instead of its binaries would be legally responsible for paying all royalties to MPEG LA. Target CPU architectures include x86 and ARM, and target operating systems include Linux, Windows XP and later, Mac OS X, and Android; iOS was notably absent from this list, because it doesn't allow applications to fetch and install binary modules from the Internet. Also on October 30, 2013, Brendan Eich from Mozilla wrote that it would use Cisco's binaries in future versions of Firefox to add support for H.264 to Firefox where platform codecs are not available. Cisco published the source code to OpenH264 on December 9, 2013.
Although iOS was not supported by the 2013 Cisco software release, Apple updated its Video Toolbox Framework with iOS 8 (released in September 2014) to provide direct access to hardware-based H.264/AVC video encoding and decoding.
Software encoders
Hardware
Because H.264 encoding and decoding requires significant computing power in specific types of arithmetic operations, software implementations that run on general-purpose CPUs are typically less power efficient. However, the latest quad-core general-purpose x86 CPUs have sufficient computation power to perform real-time SD and HD encoding. Compression efficiency depends on video algorithmic implementations, not on whether hardware or software implementation is used. Therefore, the difference between hardware and software based implementation is more on power-efficiency, flexibility and cost. To improve the power efficiency and reduce hardware form-factor, special-purpose hardware may be employed, either for the complete encoding or decoding process, or for acceleration assistance within a CPU-controlled environment.
CPU-based solutions are known to be much more flexible, particularly when encoding must be done concurrently in multiple formats, multiple bit rates and resolutions (multi-screen video), and possibly with additional features such as container format support, advanced integrated advertising features, etc. A CPU-based software solution generally makes it much easier to load-balance multiple concurrent encoding sessions within the same CPU.
The 2nd generation Intel "Sandy Bridge" Core i3/i5/i7 processors introduced at the January 2011 CES (Consumer Electronics Show) offer an on-chip hardware full HD H.264 encoder, known as Intel Quick Sync Video.
A hardware H.264 encoder can be an ASIC or an FPGA.
ASIC encoders with H.264 encoder functionality are available from many different semiconductor companies, but the core design used in the ASIC is typically licensed from one of a few companies such as Chips&Media, Allegro DVT, On2 (formerly Hantro, acquired by Google), Imagination Technologies, and NGCodec. Some companies have both FPGA and ASIC product offerings.
Texas Instruments manufactures a line of ARM + DSP cores that perform H.264 BP encoding of 1080p video at 30 fps on the DSP. This permits flexibility with respect to codecs (which are implemented as highly optimized DSP code) while being more efficient than software on a generic CPU.
Licensing
In countries where patents on software algorithms are upheld, vendors and commercial users of products that use H.264/AVC are expected to pay patent licensing royalties for the patented technology that their products use. This applies to the Baseline Profile as well.
A private organization known as MPEG LA, which is not affiliated in any way with the MPEG standardization organization, administers the licenses for patents applying to this standard, as well as other patent pools, such as for MPEG-4 Part 2 Video, HEVC and MPEG-DASH. The patent holders include Fujitsu, Panasonic, Sony, Mitsubishi, Apple, Columbia University, KAIST, Dolby, Google, JVC Kenwood, LG Electronics, Microsoft, NTT Docomo, Philips, Samsung, Sharp, Toshiba and ZTE, although the majority of patents in the pool are held by Panasonic, Godo Kaisha IP Bridge and LG Electronics.
On August 26, 2010, MPEG LA announced that royalties would not be charged for H.264-encoded Internet video that is free to end users. All other royalties remain in place, such as royalties for products that decode and encode H.264 video, as well as royalties charged to operators of free television and subscription channels. The license terms are updated in 5-year blocks.
Since the first version of the standard was completed in May 2003 and the most commonly used profile (the High profile) was completed in June 2004, a substantial number of the patents that originally applied to the standard have been expiring, although one of the US patents in the MPEG LA H.264 pool lasts at least until 2027.
In 2005, Qualcomm sued Broadcom in US District Court, alleging that Broadcom infringed on two of its patents by making products that were compliant with the H.264 video compression standard. In 2007, the District Court found that the patents were unenforceable because Qualcomm had failed to disclose them to the JVT prior to the release of the H.264 standard in May 2003. In December 2008, the US Court of Appeals for the Federal Circuit affirmed the District Court's order that the patents be unenforceable but remanded to the District Court with instructions to limit the scope of unenforceability to H.264 compliant products.
See also
VC-1, a standard designed by Microsoft and approved as a SMPTE standard in 2006
Comparison of H.264 and VC-1
Dirac (video compression format), a video coding design by BBC Research & Development, released in 2008
VP8, a video coding design by On2 Technologies (later purchased by Google), released in 2008
VP9, a video coding design by Google, released in 2013
High Efficiency Video Coding (ITU-T H.265 or ISO/IEC 23008-2), an ITU/ISO/IEC standard, released in 2013
AV1, a video coding design by the Alliance for Open Media, released in 2018
Versatile Video Coding (ITU-T H.266 or ISO/IEC 23091-3), an ITU/ISO/IEC standard, released in 2020
IPTV
Group of pictures
Intra-frame coding
Inter frame
References
Further reading
External links
MPEG-4 AVC/H.264 Information Doom9's Forum
H.264/MPEG-4 Part 10 Tutorials (Richardson)
(dated December 2007)
(dated April 2009)
(dated May 2010)
High-definition television
Open standards covered by patents
Video codecs
Video compression
Videotelephony
ITU-T recommendations
ITU-T H Series Recommendations
H.26x
ISO standards
MPEG-4 Part 10
IEC standards |
26173989 | https://en.wikipedia.org/wiki/Firewall%20%28computing%29 | Firewall (computing) | In computing, a firewall is a network security system that monitors and controls incoming and outgoing network traffic based on predetermined security rules. A firewall typically establishes a barrier between a trusted network and an untrusted network, such as the Internet.
History
The term firewall originally referred to a wall intended to confine a fire within a line of adjacent buildings. Later uses refer to similar structures, such as the metal sheet separating the engine compartment of a vehicle or aircraft from the passenger compartment. The term was applied in the late 1980s to network technology that emerged when the Internet was fairly new in terms of its global use and connectivity. The predecessors to firewalls for network security were routers used in the late 1980s. Because they already segregated networks, routers could apply filtering to packets crossing them.
Before it was used in real-life computing, the term appeared in the 1983 computer-hacking movie WarGames, and possibly inspired its later use.
Types
Firewalls are categorized as either network-based or host-based systems. Network-based firewalls can be positioned anywhere within a LAN or WAN. They are either a software appliance running on general-purpose hardware, a hardware appliance running on special-purpose hardware, or a virtual appliance running on a virtual host controlled by a hypervisor. Firewall appliances may also offer non-firewall functionality, such as DHCP or VPN services. Host-based firewalls are deployed directly on the host itself to control network traffic or other computing resources. This can be a daemon or service as a part of the operating system or an agent application for protection.
Packet filter
The first reported type of network firewall is called a packet filter, which inspects packets transferred between computers. The firewall maintains an access control list which dictates what packets will be looked at and what action should be applied, if any, with the default action set to silent discard. Three basic actions regarding the packet consist of a silent discard, discard with Internet Control Message Protocol or TCP reset response to the sender, and forward to the next hop. Packets may be filtered by source and destination IP addresses, protocol, and source and destination ports. The bulk of Internet communication in the 20th and early 21st centuries used either Transmission Control Protocol (TCP) or User Datagram Protocol (UDP) in conjunction with well-known ports, enabling firewalls of that era to distinguish between specific types of traffic such as web browsing, remote printing, email transmission, and file transfers.
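The rule-matching idea can be sketched in a few lines. The rules and field names below are invented for illustration and do not correspond to any particular firewall product; the three actions are the ones described above.

    RULES = [
        ({"proto": "tcp", "dst_port": 25}, "forward"),   # allow outgoing mail
        ({"proto": "tcp", "dst_port": 23}, "reject"),    # refuse telnet with a TCP reset
        ({}, "drop"),                                     # default: silent discard
    ]

    def filter_packet(packet):
        # first matching rule wins; the empty rule at the end matches everything
        for match, action in RULES:
            if all(packet.get(field) == value for field, value in match.items()):
                return action
        return "drop"

    print(filter_packet({"proto": "tcp", "src": "192.0.2.7", "dst_port": 25}))  # forward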
The first paper published on firewall technology was in 1987 when engineers from Digital Equipment Corporation (DEC) developed filter systems known as packet filter firewalls. At AT&T Bell Labs, Bill Cheswick and Steve Bellovin continued their research in packet filtering and developed a working model for their own company based on their original first-generation architecture.
Connection tracking
From 1989 to 1990, three colleagues from AT&T Bell Laboratories, Dave Presotto, Janardan Sharma, and Kshitij Nigam, developed the second generation of firewalls, calling them circuit-level gateways.
Second-generation firewalls perform the work of their first-generation predecessors but also maintain knowledge of specific conversations between endpoints by remembering which port number the two IP addresses are using at layer 4 (transport layer) of the OSI model for their conversation, allowing examination of the overall exchange between the nodes.
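A minimal sketch of that bookkeeping, with made-up addresses: the firewall records the layer-4 endpoints of conversations initiated from the trusted side and admits inbound packets only when they belong to a recorded conversation.

    established = set()

    def note_outbound(src_ip, src_port, dst_ip, dst_port):
        established.add((src_ip, src_port, dst_ip, dst_port))

    def allow_inbound(src_ip, src_port, dst_ip, dst_port):
        # a legitimate reply reverses the original source and destination
        return (dst_ip, dst_port, src_ip, src_port) in established

    note_outbound("192.168.1.10", 51000, "203.0.113.7", 443)
    print(allow_inbound("203.0.113.7", 443, "192.168.1.10", 51000))   # True
    print(allow_inbound("198.51.100.9", 443, "192.168.1.10", 51000))  # False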
Application layer
Marcus Ranum, Wei Xu, and Peter Churchyard released an application firewall known as Firewall Toolkit (FWTK) in October 1993. This became the basis for Gauntlet firewall at Trusted Information Systems.
The key benefit of application layer filtering is that it can understand certain applications and protocols such as File Transfer Protocol (FTP), Domain Name System (DNS), or Hypertext Transfer Protocol (HTTP). This allows it to identify unwanted applications or services using a non-standard port, or detect if an allowed protocol is being abused. It can also provide unified security management including enforced encrypted DNS and virtual private networking.
As of 2012, the next-generation firewall provides a wider range of inspection at the application layer, extending deep packet inspection functionality to include, but not be limited to:
Web filtering
Intrusion prevention systems
User identity management
Web application firewall
Endpoint specific
Endpoint based application firewalls function by determining whether a process should accept any given connection. Application firewalls filter connections by examining the process ID of data packets against a rule set for the local process involved in the data transmission. Application firewalls accomplish their function by hooking into socket calls to filter the connections between the application layer and the lower layers. Application firewalls that hook into socket calls are also referred to as socket filters.
Configuration
Setting up a firewall is a complex and error-prone task. A network may face security issues due to configuration errors.
See also
Air gap (networking)
Distributed firewall
DMZ (computing)
Firewall pinhole
Firewalls and Internet Security
Golden Shield Project
Intrusion detection system
Windows Firewall
References
External links
Evolution of the Firewall Industry – discusses different architectures, how packets are processed, and provides a timeline of the evolution.
A History and Survey of Network Firewalls – provides an overview of firewalls at various ISO levels, with references to original papers where early firewall work was reported.
Network management
Firewall software
Packets (information technology)
Data security
Cyberwarfare
American inventions |
323407 | https://en.wikipedia.org/wiki/R.O.B. | R.O.B. | R.O.B. (Robotic Operating Buddy) is a toy robot accessory for the Nintendo Entertainment System (NES). It was launched in July 1985 as the Family Computer Robot in Japan, and in October 1985 as R.O.B. in North America. Its short lifespan yielded only two games in the Robot Series: Gyromite and Stack-Up.
Following the video game crash of 1983, Nintendo alleviated the fearful retail market by rebranding its Japanese Famicom video game console as the Nintendo Entertainment System—a new platform focused on R.O.B. to further reclassify the system as a uniquely sophisticated toy experience instead of simply as a video game console. Computer Entertainer magazine in June 1985 called R.O.B. "the world's only interactive robot".
The NES's extensive marketing plan, centered on R.O.B. and immediately successful, began with the October 1985 test market launch of the NES in Manhattan, New York. The launch was Nintendo's debut in the North American video game console market, which eventually revitalized the entire video game industry. R.O.B. was quietly discontinued a few years later, and is now remembered as a successful Trojan Horse of marketing. He appears as a cameo or playable character in many Nintendo games, especially the Super Smash Bros. series.
History
Development
The new Nintendo of America subsidiary, having already bet the company's own launch upon its conversion of its failed Radar Scope (1980) arcade game cabinets into the successful new Donkey Kong (1981) arcade game, wanted to debut in the home video game console market using the Japanese parent company's successful Famicom system. But the entire video game industry, which had been virtually abandoned following the devastating video game crash of 1983, first needed a relaunch.
Following the crash, many retailers had lost confidence in the Atari-led video game market even while the toy market was strong. With a high volume of low quality products and dead-inventory shovelware, some retailers and industry critics considered video gaming to be a passing fad altogether. Therefore, Nintendo spent much of 1984 re-conceiving its Family Computer (Famicom) brand from Japan to be portrayed to America not as a traditional video game console, but as a new kind of sophisticated entertainment experience altogether.
Nintendo saw the industry's overwhelming trend away from game consoles and toward home computers, but its prototype of a lavish Famicom-based home computer and multimedia package called Advanced Video System (AVS) was poorly received at the January 1985 Winter Consumer Electronics Show, so that system was canceled and redesigned into a cost-reduced toy motif. The Famicom's whimsical appearance was again rebranded with a serious naming and industrial design language similar to the AVS, called the Nintendo Entertainment System (NES). The NES is based on a "control deck", shaped like high-tech videophile equipment with a front-loaded and door-enclosed cartridge port in the style of the modern VCR instead of a top-loaded "video game console".
The Family Computer Robot, a recent niche entry in the Famicom's aftermarket accessory lineup in Japan on July 26, 1985, is a mechanized toy robot with working arms and crude eyesight, resembling "a cross between E.T. and R2-D2". It was designed and patented by veteran Nintendo designer Gunpei Yokoi. Used as a functional companion for playing select video games within a custom playset, it was recolored for the NES and was thrust forth as essential to the NES's new identity as a futuristic, robot-powered experience. The Milwaukee Journal said, "The key to the NES is the interactive robot ... You no longer have to fight only the aliens on the screen; you have a robot to contend with as well." Computer Entertainer called it "the world's only interactive robot", as no other video game system or home computer package ever had one, greatly distinguishing the NES to retailers and consumers alike.
Receiving the first R.O.B. shipment from Japan, Nintendo of America staff remembered their first impression of unboxing and using the robot, initially thrilled with anticipation. Howard Phillips said, "The technology was so cool! ... like voodoo magic ... But then his actual motion was just hysterically slow." Gail Tilden said "That thing was definitely like watching grass grow. It was so slow, and to try and stand there and sales-pitch it in person and try to make it exciting; you had to have the eyes lined up just right or it wouldn't receive the flashes. It was kind of a challenge." Product designer Don James laughed, "[Gyromite] was hard as hell! ... So you really had to think two or three moves ahead to allow him to do what he was going to do. But it's cool to look at, right? ... It was a really neat, unusual little device. And it was fun to play! But again, like Rock 'em Sock 'em Robots, I wouldn't want to do it for 40 hours." Tasked with all of the NES's naming and branding, the sole marketing staff member Gail Tilden said the name was "originally going to be OTTO, which was a play on the word 'auto'", but she settled on Robotic Operating Buddy, or R.O.B.
As the centerpiece of the new NES platform, R.O.B.'s first presentation came at the Summer Consumer Electronics Show in Chicago in June 1985. Nintendo's brochure for attracting distributors shows a prototypical hybrid between AVS and NES with R.O.B., saying, "The future of home entertainment is staring you in the face. Our new video robot is the first of a long line of winners to come from Nintendo." and that R.O.B. is the "star of a new Entertainment System that's programmed to make you rich". The robotic persona reportedly "worked like a charm" to drive intrigued visitors to Nintendo's booth, but nobody signed up to be a distributor of the upcoming NES.
IGN reflected that "[R.O.B.] might have been the key to getting the system into players' hands, and once they had players, Nintendo was convinced the rest would be easy."
Release
Nintendo anticipated that R.O.B.'s flair for futurism, personality, and physicality was to be so crucial to the success of the NES, that the toy was featured prominently in much of the advertising media of the system and its game library, even more than any particular game. The robot was portrayed as a bridge between the player and the game. The retail floor displays were each topped with a huge R.O.B. head model, and the launch party centered on a colossal robot replica with many small silver-plated robot models. The toy robot is the highlighted accessory within the first and most premium NES consumer product offering, the Deluxe Set. This is a boxed bundle containing a Control Deck, R.O.B., Zapper light gun, Gyromite, and Duck Hunt.
The NES was launched in the Deluxe Set at its October 1985 test market in New York City, then in further test markets including Los Angeles, Chicago, and San Francisco, and finally nationwide. The NES's design language with R.O.B. and the Zapper recategorized the retailers' perception of the NES from a video game to a toy. This bypassed the crashed video game stigma and launched it more safely from the toy sections of retail stores next to established hit robot toys like Transformers, Voltron, Go-Bots, Teddy Ruxpin, and Lazer Tag.
Soon in 1985 came the second and final entry in the Robot Series, Stack-Up, packaged separately along with its own physical game pieces. The NES was soon sold much more popularly in the form of only the Control Deck and Super Mario Bros. — without R.O.B. Gyromite and R.O.B. were each also packaged separately as optional purchases. In the following few years, R.O.B. and the two-game Robot Series were quietly discontinued altogether.
Hardware
The patent underlying the R.O.B. product was filed by Gunpei Yokoi as "photosensing video game control system", with the same optical electronics as a NES Zapper, and likewise only functions correctly when coupled with a cathode ray tube (CRT) television and not an LCD. Games can send six distinct commands to R.O.B. by flashing the screen. Both Gyromite and Stack-Up include a test feature, sending an optical flash that should make R.O.B.'s LED light up.
The R.O.B. unit's height is 24 cm (9.6 in). It has a head movement range of a 45° horizontally centered tilt. The arm movement range is 240° left and right with five stopping points, 7 cm (2.75 in) up and down with six stopping points, and 7 cm (2.75 in) between hands when open. The unit has five accessory slots around the hexagonal base, numbered clockwise, starting at the rear-left from the robot's point of view; and notches on the hands allow for specialized parts to be attached for each game. The tinted filter can be optionally attached over the eyes like sunglasses, to compensate for bright televisions or sunlight. The unit is powered by four AA batteries.
Games
Only two officially licensed games were published for R.O.B., which comprise Nintendo's Robot Series: Gyromite and Stack-Up. Computer Entertainer reported Nintendo's supposed plans as stated prior to CES June 1985, for four more nondescript Robot Series games, but they were never released.
Gyromite
The Gyromite retail package consists of the following items: two claws for R.O.B.'s hands; two heavy spinning tops called gyros; two red and blue trays upon which the gyros will rest, causing buttons to be pressed on the second NES controller; one spinner motor for accelerating the gyros; and two black trays upon which the gyros are placed when not in use. Direct game mode is a feature used to learn how to use R.O.B. or to play with R.O.B. without playing the game. Gyromite is a puzzle-platformer in which the character has to collect dynamite before the time runs out, with several red and blue pillars blocking his way. In Game A, the commands are made by pressing START and then pushing the direction in which to move R.O.B., and using the A and B buttons to open and close his arms. If R.O.B. places a gyro on the red or blue button, it pushes the A or B button on the second NES controller, moving the pillar of the corresponding color. If both buttons need to be pressed at the same time, the gyros are placed in a spinner so that they will stay balanced on the button without R.O.B. holding it. Game B has the same controls, except that START does not need to be pressed to make R.O.B. accept a command.
Stack-Up
Stack-Up comes with five trays, five differently colored circular blocks, and two claws worn by R.O.B. for grabbing the blocks. In the Direct game mode, the player makes their block stack match with the one on screen by moving Professor Hector to the button that corresponds to the desired movement. In Memory, the player has to make a list of commands to recreate the displayed block set up, and then R.O.B. follows the list afterward. In Bingo, the player makes the shown block stack, where the color of the block does not matter. There are two enemies: one causes the player to lose a life, and the other makes R.O.B. perform undesired actions.
Aftermarket
In 2014, independent game developer Retrozone produced a limited release NES cartridge titled 8-Bit X-Mas 2014. The title screen features R.O.B. character graphics, and interacts with the toy by making it dance to Christmas music.
Reception
In January 1986, an independent research firm commissioned by Nintendo delivered a survey of 200 NES owners, showing that the most popular given reason for buying an NES was because children wanted the robot—followed by good graphics, variety of games, and the uniqueness and newness of the NES package. The creation and marketing of R.O.B. as a "Trojan Horse" after the video game crash of 1983 was placed fifth in GameSpy's twenty-five smartest moves in gaming history. Yahoo! ranked R.O.B. as one of the craziest video game controllers and noted the unfortunate fact that the peripheral only worked with two games.
By 1987, the two-year-old R.O.B. and Robot Series had received none of Nintendo's promised updates while the rest of the NES's library had exploded with classic flagship franchise-building hits like Super Mario Bros., The Legend of Zelda, and Metroid. In 1987, Mark Seeley of Crash! magazine visited a toy fair in England to observe a playthrough of Gyromite with R.O.B., saying of the struggling demonstrator that he had "never seen anything so complicated and difficult in all my life". In July 1987, Family Computing magazine advocated buying the much cheaper and more entertaining setup of the Control Deck and Super Mario Bros. instead of R.O.B., saying, "Anyone who has seen a Nintendo ad on television would think that R.O.B. is the heart of the system. Not so. R.O.B. is an ingenious idea [but] while R.O.B. is a cute little guy, there isn't much you can do with him. ... [N]either [of his two games] generates much excitement."
In 2018, Owen S. Good of Polygon remembered his childhood experience with the vintage R.O.B. and assessed the setup as "a novel, if almost Rube Goldberg-esque way of 'playing' with its users ... that quickly got dull".
Historian Chris Kohler was unimpressed with the product's long-term entertainment value. "As video game controller peripherals go, R.O.B. was a particularly gimmicky one. Once the novelty of controlling a robot's arms and spinning a glorified top had worn off, usually within days or even hours, R.O.B. got in the way of enjoyment. He required battery replacements too often, and it was immediately apparent that the maze barriers in Gyromite could be turned on and off just as easily by tapping the A and B buttons on a standard controller, which was all that R.O.B.'s complicated motions ended up doing."
In retrospect, Kohler considered R.O.B.'s discontinuation to now be immaterial because the product's whole existence has ultimately amounted to "merely a Trojan horse to get NES systems into American homes". He said "The gambit worked like a charm, and nobody missed R.O.B. or the Zapper once players realized that games played with the standard video game controller, like Super Mario Bros. were much more fun."
Legacy
After many failures, the late addition of R.O.B. gave a key product distinction to the launch of the Nintendo Entertainment System, reclassified the platform as a toy, and served as a Trojan Horse to enable the platform's successful launch. This, in turn, secured the survival of Nintendo of America and revitalized the entire video game industry. A followup promotional poster from Nintendo simply pictured R.O.B. and said, "They said reviving the video game market wasn't humanly possible. It wasn't."
R.O.B. has made cameo appearances in many video games, such as StarTropics (1990) for NES, F-Zero GX (2003), the WarioWare series, The Legend of Zelda: Majora's Mask (2015, 3DS), and the Star Fox series.
R.O.B. is an unlockable character in Mario Kart DS, Super Smash Bros. Brawl, Super Smash Bros. for Nintendo 3DS and Wii U, and Super Smash Bros. Ultimate, each of which refers to R.O.B. as male. In Brawl's adventure mode, The Subspace Emissary, R.O.B. plays a major role in the plot. As part of Super Smash Bros. for Nintendo 3DS and Wii U, R.O.B. has two Amiibo figurines, uniquely produced in both the gray and white NES color scheme and the red and white Famicom color scheme.
R.O.B. is the model for TASBot, a tool-assisted speedrun software bot for video games.
See also
HERO, a programmable robot series for home computers from 1982 to 1995
Topo, a programmable robot series for home computers from 1983 to 1984
Short Circuit, the 1986 sci-fi film starring the robot Johnny Five
WALL-E, the 2008 sci-fi film starring the titular robot
Transformers
Notes
References
1985 robots
Japanese inventions
Nintendo antagonists
Nintendo characters
Nintendo Entertainment System accessories
Nintendo protagonists
Nintendo toys
Robot characters in video games
Robots of Japan
Super Smash Bros. fighters
Toy robots |
436706 | https://en.wikipedia.org/wiki/Dot-decimal%20notation | Dot-decimal notation | Dot-decimal notation is a presentation format for numerical data. It consists of a string of decimal numbers, using the full stop (dot) as a separation character.
A common use of dot-decimal notation is in information technology where it is a method of writing numbers in octet-grouped base-10 (decimal) numbers. In computer networking, Internet Protocol Version 4 (IPv4) addresses are commonly written using the quad-dotted notation of four decimal integers, ranging from 0 to 255 each.
IPv4 address
In computer networking, the notation is associated with the specific use of quad-dotted notation to represent IPv4 addresses and used as a synonym for dotted-quad notation. Dot-decimal notation is a presentation format for numerical data expressed as a string of decimal numbers each separated by a full stop. For example, the hexadecimal number 0xFF000000 may be expressed in dot-decimal notation as 255.0.0.0.
An IPv4 address has 32 bits. For purposes of representation, the bits may be divided into four octets written in decimal numbers, ranging from 0 to 255, concatenated as a character string with full stop delimiters between each number. This octet-grouped dotted-decimal format may more specifically be called "dotted octet" format, or a "dotted quad address".
For example, the address of the loopback interface, usually assigned the host name localhost, is 127.0.0.1. It consists of the four octets, written in binary notation: 01111111, 00000000, 00000000, and 00000001. The 32-bit number is represented in hexadecimal notation as 0x7F000001.
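The conversion between the 32-bit value and its dotted representation is a matter of splitting the number into octets; a short sketch:

    def to_dotted_quad(value):
        # most significant octet first
        return ".".join(str((value >> shift) & 0xFF) for shift in (24, 16, 8, 0))

    def from_dotted_quad(text):
        octets = [int(part) for part in text.split(".")]
        assert len(octets) == 4 and all(0 <= o <= 255 for o in octets)
        return (octets[0] << 24) | (octets[1] << 16) | (octets[2] << 8) | octets[3]

    print(to_dotted_quad(0x7F000001))  # 127.0.0.1, the loopback address above
    print(to_dotted_quad(0xFF000000))  # 255.0.0.0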
No formal specification of this textual IP address representation exists. The first mention of this format in RFC documents was in RFC 780 for the Mail Transfer Protocol published May 1981, in which the IP address was supposed to be enclosed in brackets or represented as a 32-bit decimal integer prefixed by a pound sign. A table in RFC 790 (Assigned Numbers) used the dotted decimal format, zero-padding each number to three digits. RFC 1123 (Requirements for Internet Hosts – Application and Support) of October 1989 mentions a requirement for host software to accept “IP address in dotted-decimal ("#.#.#.#") form”, although it notes “[t]his last requirement is not intended to specify the complete syntactic form for entering a dotted-decimal host number”. An IETF draft intended to define textual representation of IP addresses expired without further activity.
A popular implementation of IP networking, originating in 4.2BSD, contains a function inet_aton() for converting IP addresses in character string representation to internal binary storage. In addition to the basic four-decimals format and 32-bit numbers, it also supported intermediate syntax forms of octet.24bits (e.g. 10.1234567; for Class A addresses) and octet.octet.16bits (e.g. 172.16.12345; for Class B addresses). It also allowed the numbers to be written in hexadecimal and octal representations, by prefixing them with 0x and 0, respectively. These features continue to be supported in some software, even though they are considered non-standard. This means addresses with a component written with a leading zero digit may be interpreted differently in programs that do or do not recognize such formats.
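The leading-zero ambiguity can be illustrated with a small, purely hypothetical parser (it is not the BSD inet_aton itself): the same string yields different addresses depending on whether a leading zero is honored as an octal prefix.

    def parse_octet(text, honor_prefixes):
        if honor_prefixes and text.lower().startswith("0x"):
            return int(text, 16)                 # hexadecimal component
        if honor_prefixes and len(text) > 1 and text.startswith("0"):
            return int(text, 8)                  # leading zero read as octal
        return int(text, 10)                     # plain decimal

    address = "010.0.0.1"
    print([parse_octet(p, True) for p in address.split(".")])   # [8, 0, 0, 1]
    print([parse_octet(p, False) for p in address.split(".")])  # [10, 0, 0, 1]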
A POSIX-conforming variant of inet_aton, the inet_pton() function, supports only the four-decimal variant of IP addresses.
IP addresses in dot-decimal notation are also presented in CIDR notation, in which the IP address is suffixed with a slash and a number, used to specify the length of the associated routing prefix. For example, 127.0.0.1/8 specifies that the IP address has an eight-bit routing prefix, and therefore the subnet mask 255.0.0.0.
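Deriving the subnet mask from the prefix length is a one-line bit operation; for example, the /8 prefix mentioned above expands to 255.0.0.0:

    def prefix_to_mask(prefix_len):
        # set the top prefix_len bits of a 32-bit value
        mask = (0xFFFFFFFF << (32 - prefix_len)) & 0xFFFFFFFF if prefix_len else 0
        return ".".join(str((mask >> s) & 0xFF) for s in (24, 16, 8, 0))

    print(prefix_to_mask(8))   # 255.0.0.0
    print(prefix_to_mask(24))  # 255.255.255.0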
OIDs
Object identifiers use a style of dot-decimal notation to represent an arbitrarily deep hierarchy of objects identified by decimal numbers. They may also use textual words separated by dots, like some computer languages (see inheritance).
Version numbers
Software releases are often given version numbers in dot-decimal notation, with the first number designating major revisions and subsequent numbers designating progressively more minor releases. Version numbers with a leading zero, say "0.1.8", conventionally indicate that the software is still in beta and does not yet have complete features.
Libraries
Libraries use notation systems consisting of decimal numbers separated by dots, such as the older Dewey Decimal Classification and the Universal Decimal Classification, to classify books and other works by subject. The UDC additionally codes works with multiple dot-decimal topics, separated by colons.
Texts
Dot-decimal notation is often used for sections within a large text. This was standardized in ISO 2145.
Medicine
Dot-decimal notation is also used to describe illnesses in a language-neutral way. For instance, the AO Foundation/Orthopaedic Trauma Association (AO/OTA) classification generates numeric codes for describing broken toes. They run 88[meaning a fracture of the phalanges].[number-code of toe, with the big toe=1 and the little toe=5].[number-code of phalanx, counting 1-3 outwards from the foot].[number-code of location on the bone, with 1 being the inner end, 3 the outer, and 2 in-between]. So, for instance, 88.5.3.2 means a fracture to the little toe's outermost bone, in the center. There are other classifications for other fractures and dislocations.
See also
ISO 2145
Decimal section numbering
References
Network addressing |
19602324 | https://en.wikipedia.org/wiki/Network%20encryption%20cracking | Network encryption cracking | Network encryption cracking is the breaching of network encryptions (e.g., WEP, WPA, ...), usually through the use of a special encryption cracking software. It may be done through a range of attacks (active and passive) including injecting traffic, decrypting traffic, and dictionary-based attacks.
Methods
As mentioned above, several types of attacks are possible. More precisely they are:
Decrypting traffic based on tricking access points (active attack)
Injecting traffic based on known plaintext (active attack)
Gathering traffic and performing brute force/dictionary based attacks
Decrypting traffic using statistical analysis (passive attack)
Injecting traffic
Injecting traffic means inserting forged encrypted messages into the network. It may be done if either the key is known (to generate new messages), or if the key is not known and only an encrypted message and plaintext message is gathered, through comparison of the two. Programs able to do the latter are Aireplay and WepWedgie.
Decrypting
Decryption often requires two tools: one for gathering packets and another for analysing the packets and determining the key. Gathering packets may be done through tools such as WireShark or Prismdump, and cracking may be done through tools such as WEPCrack, AirSnort, AirCrack, and WEPLab.
When gathering packets, a large number of them is often required to perform cracking. Depending on the attack used, 5-16 million frames may be required. The attack commands themselves, however, are surprisingly simple.
WEPCrack
Commands to be inputted into WEPCrack are:
perl \progra~1\wepcrack\pcap-getIV.pl
This command generates a log file (ivfile.log) from a capture obtained with WireShark or Prismdump. A capture with at least 5 million frames is required.
perl \progra~1\wepcrack\wepcrack.pl ivfile.log
This command asks WEPCrack to determine the key from the log file.
AirCrack
Aircrack is another program that is even simpler to use, as no commands need to be entered; instead, the user is asked to type in some parameters and click some buttons.
First, airodump is started to gather the packets; for this, the channel and MAC filter are requested, yet the user does not need to know them (0 and p may be entered, respectively). Then AirCrack is started, the file just created by airodump is opened, a 0 is entered, and the program determines the key.
AirSnort
AirSnort is an even simpler program, as it is completely interface-based. Because the attack is only a simple brute-force attack, however, cracking the encryption can take a while (from several days to a few weeks). If traffic is low (only four users or so on the network), the cracking will take at least two weeks.
Comparison of tools
A comparison of the tools noted above may be found at Security Focus.
References
Computer network security
Security |
1928037 | https://en.wikipedia.org/wiki/Reference%20monitor | Reference monitor | In operating systems architecture a reference monitor concept defines a set of design requirements on a reference validation mechanism, which enforces an access control policy over subjects' (e.g., processes and users) ability to perform operations (e.g., read and write) on objects (e.g., files and sockets) on a system. The properties of a reference monitor are captured by the acronym NEAT, which means:
The reference validation mechanism must be Non-bypassable, so that an attacker cannot bypass the mechanism and violate the security policy.
The reference validation mechanism must be Evaluable, i.e., amenable to analysis and tests, the completeness of which can be assured (verifiable). Without this property, the mechanism might be flawed in such a way that the security policy is not enforced.
The reference validation mechanism must be Always invoked. Without this property, it is possible for the mechanism to not perform when intended, allowing an attacker to violate the security policy.
The reference validation mechanism must be Tamper-proof. Without this property, an attacker can undermine the mechanism itself and hence violate the security policy.
For example, Windows 3.x and 9x operating systems were not built with a reference monitor, whereas the Windows NT line, which also includes Windows 2000 and Windows XP, was designed to contain a reference monitor, although it is not clear that its properties (tamperproof, etc.) have ever been independently verified, or what level of computer security it was intended to provide.
The claim is that a reference validation mechanism that satisfies the reference monitor concept will correctly enforce a system's access control policy, as it must be invoked to mediate all security-sensitive operations, must not be tampered with, and has undergone complete analysis and testing to verify correctness. The abstract model of a reference monitor has been widely applied to any type of system that needs to enforce access control and is considered to express the necessary and sufficient properties for any system making this security claim.
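As a toy illustration of the mediation idea (the policy entries and file name below are invented for the example), every security-sensitive operation is funneled through a single check that defaults to denial:

    POLICY = {
        ("alice", "read", "/var/log/auth.log"): True,
        ("alice", "write", "/var/log/auth.log"): False,
    }

    def reference_monitor(subject, operation, obj):
        # always invoked and default-deny; real monitors must also be
        # tamper-proof and small enough to analyse, which code alone cannot show
        return POLICY.get((subject, operation, obj), False)

    def read_file(subject, path):
        if not reference_monitor(subject, "read", path):
            raise PermissionError(f"{subject} may not read {path}")
        with open(path) as handle:
            return handle.read()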
According to Ross Anderson, the reference monitor concept was introduced by James Anderson in an influential 1972 paper. Peter Denning in a 2013 oral history stated that James Anderson credited the concept to a paper he and Scott Graham presented at a 1972 conference.
Systems evaluated at B3 and above by the Trusted Computer System Evaluation Criteria (TCSEC) must enforce the reference monitor concept.
References
See also
Security kernel
Operating system security |
48227895 | https://en.wikipedia.org/wiki/Back%20from%20the%20Grave%2C%20Volume%206 | Back from the Grave, Volume 6 | Back from the Grave, Volume 6 (LP) is the sixth installment in the Back from the Grave series of garage rock compilations assembled by Tim Warren of Crypt Records. It was released in 1986. In keeping with all of the entries in the series, and as indicated in the subheading which reads "17 Loud Unpsychedelic Wild Mid-60s Garage Punkers," this collection generally excludes psychedelic, folk rock, and pop-influenced material in favor of basic primitive rock and roll, usually consisting of songs displaying the rawer and more aggressive side of the genre, often characterized by the use of fuzztone-distorted guitars and rough vocals. The packaging features well-researched liner notes written by Tim Warren which convey basic information about each song and group, such as origin, recording date, and biographical sketches, usually written in a conversational style that includes occasional slang, anecdotes, and humorous asides. The liner notes are noticeably opinionated, sometimes engaging in tongue-in-cheek insults directed at other genres of music. The packaging also includes photographs of the bands, and the front cover features a highly satirical cartoon by Mort Todd which depicts the customarily vengeful deeds of revivified zombies. This time, in a retro-vision of the future replete with flying saucers, the defiantly "earthly" creatures have taken Crypt Records' makeshift fighter plane for a joyride into orbit, not-so-safely depositing their "musically heterodox" victims into the outer reaches of space.
The set begins with "My World Is Upside Down" by the Shames from Ipswich, Massachusetts, who are also later represented on the set with "Special Ones." Two Michigan bands are also included on side one: the Keggs from Detroit, who perform "Girl," and the Ascots from Pontiac ("So Good"). Side two begins with "Varsity Club Song" by the Golden Catalinas from La Crosse, Wisconsin, followed by "Say You Love Me" by Billy & the Kids from Wenatchee, Washington. "Come on Mary" is by the Abandoned, and "Love's a Fire" by the Werps. The set concludes with "Through the Night" by The Trojans Of Evol.
Track listing
Side one
The Shames: "My World Is Upside Down"
Long John and the Silvermen: "Heart Filled with Love"
The Keggs: "Girl"
Beaux Jens: "She was Mine"
Shames: "Special Ones"
The Savoys: "Can It Be"
The Ascots: "So Good"
The Barracudas: "Baby Get Lost"
Side two
The Golden Catalinas: "Varsity Club Song"
Billy and the Kids: "Say You Love Me"
The Shandels: "Caroline"
The Shandels: "Mary Mary"
The Abandoned: "Come on Mary"
The Treytones: "Nonymous"
The Bryds: "Your Lies"
The Werps: "Love's a Fire"
The Trojans of Evol: "Through the Night"
Catalogue and release information
Long playing record (Crypt LP-006, rel. 1986)
Back from the Grave, Volumes 5 and 6 (CD)
Back from the Grave, Volumes 5 and 6 (CD) is a re-mastered CD that combines into one disc volumes 5 and 6 of the original LPs in the Back from the Grave series of garage rock compilations put out by Tim Warren of Crypt Records. This CD was released in 2015. Until then, there had been no prior releases of volumes 5 and 6 on CD, as all of the songs which were included on the volume 5 and 6 LPs appeared instead on volumes 1-4 of the old CD series, whose entries differed dramatically from the LPs. However, this CD is part of a new Back from the Grave sub-series which attempts to replicate more faithfully the song selection of the original LPs, bringing the series for the first time into multi-media coherence. Like the LPs, the packaging features well-researched liner notes written by Tim Warren which convey basic information about each song and group, such as origin, recording date, and biographical sketches. The packaging also includes photographs of the bands, and the front cover (taken from the Volume 5 LP) features a highly satirical cartoon by Mort Todd. The track list of the Volumes 5 and 6 CD is similar to the corresponding LPs, but there are some differences.
Track listing
The Jesters Of Newport: "Stormy"
The Warlords: "Real Fine Lady"
The Henchmen: "Livin'"
The Jaguars: "It's Gonna Be Alright"
The Vestells: "Won't You Tell Me"
The Few: "Escape"
The Nobles: "Something Else"
The Keggs: "To Find Out"
The Humans: "Warning"
The Illusions: "City of People"
The Tigermen: "Close That Door"
The Aztex: "The Little Streets in My Town"
The Hatfields: "The Kid from Cinncy"
The Centrees: "She's Good for Me"
The Tikis: "Show You Love"
The Rising Tides: "Take the World as it Comes"
The Shames: "My World Is Upside Down"
Long John and the Silvermen: "Heart Filled with Love"
The Keggs: "Girl"
Beaux Jens: "She was Mine"
Shames: "Special Ones"
The Savoys: "Can It Be"
The Abandoned: "Come on Mary"
The Barracudas: "Baby Get Lost"
The Ascots: "So Good"
The Shames: "Special Ones"
The Golden Catalinas: "Varsity Club Song"
Billy and the Kids: "Say You Love Me"
The Shandels: "Caroline"
The Shandels: "Mary Mary"
The Treytones: "Nonymous"
The Bryds: "Your Lies"
The Trojans of Evol: "Through the Night"
Catalogue and release information
Compact disc (Crypt CD, rel. 2015)
References
Back from the Grave (series)
1986 compilation albums
2015 compilation albums
Crypt Records albums |
438022 | https://en.wikipedia.org/wiki/Nintendo%20DS | Nintendo DS | The Nintendo DS is a handheld game console produced by Nintendo, released globally across 2004 and 2005. The DS, an initialism for "Developers' System" or "Dual Screen", introduced distinctive new features to handheld games: two LCD screens working in tandem (the bottom one being a touchscreen), a built-in microphone and support for wireless connectivity. Both screens are encompassed within a clamshell design similar to the Game Boy Advance SP. The Nintendo DS also features the ability for multiple DS consoles to directly interact with each other over Wi-Fi within a short range without the need to connect to an existing wireless network. Alternatively, they could interact online using the now-defunct Nintendo Wi-Fi Connection service. Its main competitor was Sony's PlayStation Portable during the seventh generation of video game consoles.
Prior to its release, the Nintendo DS was marketed as an experimental "third pillar" in Nintendo's console lineup, meant to complement the Game Boy Advance family and GameCube. However, backward compatibility with Game Boy Advance titles and strong sales ultimately established it as the successor to the Game Boy series. On March 2, 2006, Nintendo launched the Nintendo DS Lite, a slimmer and lighter redesign of the original Nintendo DS with brighter screens and a longer lasting battery. On November 1, 2008, Nintendo released the Nintendo DSi, another redesign with several hardware improvements and new features, although it lost backwards compatibility for Game Boy Advance titles and a few DS games that used the GBA slot. On November 21, 2009, Nintendo released the Nintendo DSi XL, a larger version of the DSi.
All Nintendo DS models combined have sold 154.02 million units, making it the best-selling Nintendo system, the best-selling handheld game console to date, and the second best-selling video game console of all time, overall, behind Sony's PlayStation 2. The Nintendo DS line was succeeded by the Nintendo 3DS family in February 2011, which maintains backward compatibility with nearly all Nintendo DS and DSi software except for some software that requires the GBA slot for use.
History
Development
Development on the Nintendo DS began around mid-2002, following an original idea from former Nintendo president Hiroshi Yamauchi about a dual-screened console. On November 13, 2003, Nintendo announced that it would be releasing a new game product in 2004. The company did not provide many details, but stated it would not succeed the Game Boy Advance or GameCube. On January 20, 2004, the console was announced under the codename "Nintendo DS". Nintendo released only a few details at that time, saying that the console would have two separate, 3-inch TFT LCD display panels, separate processors, and up to 1 gigabit (128 megabytes) of semiconductor memory. Nintendo's president at the time, Satoru Iwata, said, "We have developed Nintendo DS based upon a completely different concept from existing game devices in order to provide players with a unique entertainment experience for the 21st century." He also expressed optimism that the DS would help put Nintendo back at the forefront of innovation and move away from the conservative image that had been ascribed to the company in years past. In March 2004, a document containing most of the console's technical specifications was leaked, also revealing its internal development name, "Nitro". In May 2004, the console was shown in prototype form at E3 2004, still under the name "Nintendo DS". On July 28, 2004, Nintendo revealed a new design that was described as "sleeker and more elegant" than the one shown at E3 and announced Nintendo DS as the device's official name. Following lukewarm GameCube sales, Hiroshi Yamauchi stressed the importance of its success to the company's future, making a statement which can be translated from Japanese as, "If the DS succeeds, we will rise to heaven, but if it fails we will sink to hell."
Launch
President Iwata referred to Nintendo DS as "Nintendo's first hardware launch in support of the basic strategy 'Gaming Population Expansion'", because the touch-based device "allows users to play intuitively". On September 20, 2004, Nintendo announced that the Nintendo DS would be released in North America on November 21, 2004, for US$149.99. It was set to release on December 2, 2004, in Japan for JP¥15,000; on February 24, 2005, in Australia for A$199.95; and on March 11, 2005, in Europe for €149.99 (£99.99 in the United Kingdom). The console was released in North America with a midnight launch event at Universal CityWalk EB Games in Los Angeles, California. The console was launched quietly in Japan compared to the North American launch; one source cited the cold weather as the reason. Nintendo President Satoru Iwata also commented separately on the European launch.
North America and Japan
The Nintendo DS was launched in North America for US$149.99 on November 21, 2004; in Japan for JP¥15,000 on December 2 in the color "Titanium". Well over three million preorders were taken in North America and Japan; preorders at online stores were launched on November 3 and ended the same day, as merchants had already sold their allotment. Initially, Nintendo planned to deliver one million units combined at the North American and Japanese launches; when it saw the preorder numbers, it brought another factory online to ramp up production. Nintendo originally slated 300,000 units for the U.S. debut; 550,000 were shipped, and just over 500,000 of those sold through in the first week. Later in 2005, the manufacturer's suggested retail price for the Nintendo DS was dropped to US$129.99.
Both launches proved to be successful, but Nintendo chose to release the DS in North America prior to Japan, a first for a hardware launch from the Kyoto-based company. This choice was made to get the DS out for the largest shopping day of the year in the U.S. (the day after Thanksgiving, also known as "Black Friday"). Perhaps partly due to the release date, the DS met unexpectedly high demand in the United States, selling 1 million units by December 21, 2004. By the end of December, the total number shipped worldwide was 2.8 million, about 800,000 more than Nintendo's original forecast. At least 1.2 million of them were sold in the U.S. Some industry reporters referred to it as "the Tickle Me Elmo of 2004". In June 2005, Nintendo informed the press that a total of 6.65 million units had been sold worldwide.
As is normal for electronics, some units were reported as having problems with stuck pixels in either of the two screens. Return policies for LCD displays vary between manufacturers and regions; however, in North America, Nintendo chose to replace a system with stuck pixels only if the owner claimed that they interfered with the gaming experience. There were two exchange programs in place for North America. In the first, the owner of the defective DS would provide a valid credit card number and, afterward, Nintendo would ship a new DS system to the owner along with shipping supplies to return the defective system. In the second, the owner of the defective DS would ship their system to Nintendo for inspection, after which Nintendo technicians would either ship a replacement system or fix the defective system. The first option allowed the owner to have a new DS in 3–5 business days.
Multiple games were released alongside the DS during its North American launch on November 21, 2004. At launch there was one pack-in demo, in addition to the built-in PictoChat program: Metroid Prime Hunters: First Hunt (published by Nintendo as a demo for Metroid Prime Hunters, a game released in March 2006). At the time of the "Electric Blue" DS launch in June 2005, Nintendo bundled the system with Super Mario 64 DS.
In Japan, launch games were released alongside the system on December 2, 2004; these included The Prince of Tennis 2005 -Crystal Drive- (Konami) and Puyo Puyo Fever (Puyo Pop Fever) (Sega).
Europe
The DS was released in Europe on March 11, 2005, for €149. A small supply of units was available prior to this in a package with a promotional "VIP" T-shirt, Metroid Prime Hunters - First Hunt, a WarioWare: Touched! demo and a pre-release version of Super Mario 64 DS, through the Nintendo Stars Catalogue; the bundle was priced at £129.99 for the UK and €189.99 for the rest of Europe, plus 1,000 of Nintendo's "star" loyalty points (to cover postage). Within months of launch, 1 million DS units had been sold in Europe, setting a sales record for a handheld console.
The European release of the DS, like the U.S., was originally packaged with a Metroid Prime Hunters: First Hunt demo. The European packaging for the console is noticeably more "aggressive" than that of the U.S./Japanese release. The European game cases are additionally about 1/4 inch thicker than their North American counterparts and transparent rather than solid black. Inside the case, there is room for one Game Boy Advance game pak and a DS card with the instructions on the left side of the case.
Australia and New Zealand
The DS launched in Australia and New Zealand on February 24, 2005. It retailed in Australia for AU$199 and in New Zealand for NZ$249. Like the North American launch, it includes the Metroid Prime Hunters - First Hunt demo. The first week of sales for the system broke Australian launch sales records for a console, with 19,191 units sold by the 27th.
China
"iQue DS", the official name of the Chinese Nintendo DS, was released in China on June 15, 2005. The price of the iQue DS was 980 RMB (roughly US$130) as of April 2006. This version of the DS includes updated firmware to block out the use of the PassMe device, along with the new Red DS. Chinese launch games were Zhi Gan Yi Bi (Polarium) (Nintendo/iQue) and Momo Waliou Zhizao (WarioWare: Touched!) (Nintendo/iQue). The iQue was also the name of the device that China received instead of the Nintendo 64.
Games available on launch
Promotion
The system's promotional slogans revolve around the word "Touch" in almost all countries, with the North American slogan being "Touching is good."
The Nintendo DS was seen by many analysts to be in the same market as Sony's PlayStation Portable, although representatives from both companies stated that each system targeted a different audience. Time magazine awarded the DS a Gadget of the Week award.
At the time of its release in the United States, the Nintendo DS retailed for US$149.99. The price dropped to US$129.99 on August 21, 2005, one day before the releases of Nintendogs and Advance Wars: Dual Strike.
Nine official colors of the Nintendo DS were available through standard retailers. Titanium-colored units were available worldwide, while Electric Blue was exclusive to North and Latin America. There was also a red version bundled with the game Mario Kart DS. Graphite Black, Pure White, Turquoise Blue, and Candy Pink were available in Japan. Mystic Pink and Cosmic Blue were available in Australia and New Zealand. Japan's Candy Pink and Australia's Cosmic Blue were also available in Europe and North America through a Nintendogs bundle, although the colors were referred to simply as pink and blue. These colors were available only for the original-style Nintendo DS; a different, more limited set of colors was used for the Nintendo DS Lite.
Sales
As of March 31, 2016, all Nintendo DS models combined have sold 154.02 million units, making it the best-selling handheld game console to date, and the second best-selling video game console of all time.
Legacy
The success of the Nintendo DS introduced touchscreen controls and wireless online gaming to a wide audience. According to Damien McFerran of Nintendo Life, the "DS was the first encounter many people had with touch-based tech, and it left an indelible impression."
The DS established a large casual gaming market, attracting large non-gamer audiences and establishing touchscreens as the standard controls for future portable gaming devices. According to Jeremy Parish, writing for Polygon, the Nintendo DS laid the foundations for touchscreen mobile gaming on smartphones. He stated that the DS "had basically primed the entire world for" the iPhone, released in 2007, and that the DS paved the way for iPhone gaming mobile apps. However, the success of the iPhone "effectively caused the DS market to implode" by the early 2010s, according to Parish.
The success of the DS paved the way for its successor, the Nintendo 3DS, a handheld gaming console with a similar dual-screen setup that can display images on the top screen in stereoscopic 3D.
On January 29, 2014, Nintendo announced that Nintendo DS games would be added to the Wii U's Virtual Console, with the first game, Brain Age: Train Your Brain in Minutes a Day!, being released in Japan on June 3, 2014.
Hardware
The Nintendo DS design resembles that of the multi-screen games from the Game & Watch line, such as Donkey Kong and Zelda, which were also made by Nintendo.
The lower display of the Nintendo DS is overlaid with a resistive touchscreen designed to accept input from the included stylus, the user's fingers, or a curved plastic tab attached to the optional wrist strap. The touchscreen lets users interact with in-game elements more directly than by pressing buttons; for example, in the included chatting software, PictoChat, the stylus is used to write messages or draw.
The handheld features four lettered buttons (X, Y, A, B), a directional pad, and Start, Select, and Power buttons. On the top of the device are two shoulder buttons, a game card slot, a stylus holder and a power cable input. The bottom features the Game Boy Advance game card slot. The overall button layout resembles that of the Super Nintendo Entertainment System controller. When using backward compatibility mode on the DS, buttons X and Y and the touchscreen are not used as the Game Boy Advance line of systems do not feature these controls.
It also has stereo speakers providing virtual surround sound (depending on the software) located on either side of the upper display screen. This was a first for a Nintendo handheld, as the Game Boy line of systems had only supported stereo sound through the use of headphones or external speakers. A built-in microphone is located below the left side of the bottom screen. It has been used for a variety of purposes, including speech recognition, chatting online between and during gameplay sessions, and minigames that require the player to blow or shout into it.
Technical specifications
The system's 3D hardware consists of a rendering engine and a geometry engine, which perform transform and lighting, Transparency Auto Sorting, Transparency Effects, Texture Matrix Effects, 2D Billboards, Texture Streaming, texture-coordinate transformation, perspective-correct texture mapping, per-pixel Alpha Test, per-primitive alpha blending, texture blending, Gouraud Shading, cel shading, z-buffering, W-Buffering, 1bit Stencil Buffer, per-vertex directional lighting and simulated point lighting, Depth Test, Stencil Test, Render to Texture, Lightmapping, Environment Mapping, Shadow Volumes, Shadow Mapping, Distance Fog, Edge Marking, Fade-In/Fade-Out, Edge-AA. Sprite special effects: scrolling, scaling, rotation, stretching, shear. However, it uses point (nearest neighbor) texture filtering, leading to some titles having a blocky appearance. Unlike most 3D hardware, it has a set limit on the number of triangles it can render as part of a single scene; the maximum amount is about 6144 vertices, or 2048 triangles per frame. The 3D hardware is designed to render to a single screen at a time, so rendering 3D to both screens is difficult and decreases performance significantly. The DS is generally more limited by its polygon budget than its pixel fill rate. There are also 512 kilobytes of texture memory, and the maximum texture size is 1024 × 1024 pixels.
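Because the hardware enforces a hard per-scene cap rather than degrading gradually, DS developers had to budget geometry explicitly. The following minimal sketch (illustrative only, not Nintendo tooling; all names are hypothetical) shows how a game engine might track the per-frame limits quoted above:

MAX_VERTICES_PER_FRAME = 6144   # hardware limit described above
MAX_TRIANGLES_PER_FRAME = 2048

class FrameGeometryBudget:
    def __init__(self):
        self.vertices = 0
        self.triangles = 0

    def try_submit(self, mesh_vertices, mesh_triangles):
        # Returns True if the mesh still fits in this frame's geometry budget.
        if (self.vertices + mesh_vertices > MAX_VERTICES_PER_FRAME or
                self.triangles + mesh_triangles > MAX_TRIANGLES_PER_FRAME):
            return False  # the mesh must be simplified or culled for this frame
        self.vertices += mesh_vertices
        self.triangles += mesh_triangles
        return True

budget = FrameGeometryBudget()
print(budget.try_submit(900, 300))    # True: well within both limits
print(budget.try_submit(6000, 1900))  # False: would exceed the hardware caps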
The system has 656 kilobytes of video memory and two 2D engines (one per screen). These are similar to (but more powerful than) the Game Boy Advance's single 2D engine.
The Nintendo DS is compatible with Wi-Fi (IEEE 802.11, legacy mode). Wi-Fi is used for accessing the Nintendo Wi-Fi Connection, competing with other users playing the same Wi-Fi compatible game, using PictoChat, or, with a special cartridge and RAM extension, browsing the internet.
Nintendo claims the battery lasts a maximum of 10 hours under ideal conditions on a full four-hour charge. Battery life is affected by multiple factors including speaker volume, use of one or both screens, use of wireless connectivity, and use of backlight, which can be turned on or off in selected games such as Super Mario 64 DS. The battery is user-replaceable using only a Phillips-head screwdriver. After about 500 charges the battery life starts dropping.
Users can close the Nintendo DS system to trigger its 'sleep' mode, which pauses the game being played and saves battery life by turning off the screens, speakers, and wireless communications; however, closing the system while playing a Game Boy Advance game will not put the Nintendo DS into sleep mode, and the game will continue to run normally. Certain DS games (such as Animal Crossing: Wild World) also will not pause, though the backlight, screens, and speakers will turn off. Additionally, the DS will not go into sleep mode while certain games are saving. Some games, such as The Legend of Zelda: Phantom Hourglass, even use the closing motion needed to enter sleep mode as an unorthodox way of solving puzzles. Looney Tunes: Duck Amuck has a game mode in which the player must close the DS to play, helping Daffy Duck hunt a monster using the shoulder buttons.
Accessories
Although the secondary port on the Nintendo DS does accept and support Game Boy Advance cartridges (but not Game Boy or Game Boy Color cartridges), Nintendo emphasized that the main intention for its inclusion was to allow a wide variety of accessories to be released for the system.
Due to the lack of a second port on the Nintendo DSi, it is not compatible with any accessory that uses it.
Rumble Pak
The Rumble Pak was the first official expansion slot accessory. In the form of a Game Boy Advance cartridge, the Rumble Pak vibrates to reflect the action in compatible games, such as when the player bumps into an obstacle or loses a life. It was released in North America and Japan in 2005 bundled with Metroid Prime Pinball. In Europe, it was first available with the game Actionloop, and later Metroid Prime Pinball. The Rumble Pak was also released separately in those regions.
Headset
The Nintendo DS Headset is the official headset for the Nintendo DS. It plugs into the headset port (which is a combination of a standard 3.5 mm (1/8 in) headphone connector and a proprietary microphone connector) on the bottom of the system. It features one earphone and a microphone, and is compatible with all games that use the internal microphone. It was released alongside Pokémon Diamond and Pearl in Japan, North America, and Australia.
Browser
On February 15, 2006, Nintendo announced a version of the cross-platform web browser Opera for the DS system. The browser can use one screen as an overview, a zoomed portion of which appears on the other screen, or both screens together to present a single tall view of the page. The browser went on sale in Japan and Europe in 2006, and in North America on June 4, 2007. Browser operation requires that an included memory expansion pak is inserted into the GBA slot. The DSi has an internet browser available for download from the Nintendo DSi shop for free.
Wi-Fi USB Connector
This USB-flash-disk-sized accessory plugs into a PC's USB port and creates a miniature hotspot/wireless access point, allowing a Wii and up to five Nintendo DS units to access the Nintendo Wi-Fi Connection service through the host computer's Internet connection. When used under Linux or macOS, it acts as a regular wireless adapter, connecting to wireless networks; an LED blinks when data is being transferred. There is also a hacked driver for Windows XP/Vista/7/8/10 that makes it function the same way. The Wi-Fi USB Connector has been discontinued from retail stores.
MP3 Player
The Nintendo MP3 Player (a modified version of the device known as the Play-Yan in Japan) was released on December 8, 2006, by Nintendo of Europe at a retail price of £29.99/€30. The add-on uses removable SD cards to store MP3 audio files, and can be used in any device that features support for Game Boy Advance cartridges; however, due to this, it is limited in terms of its user-interface and functionality, as it does not support using both screens of the DS simultaneously, nor does it make use of its touch-screen capability. It is not compatible with the DSi, due to the lack of the GBA slot, but the DSi includes a music player via SD card. Although it stated on the box that it is only compatible with the Game Boy Micro, Nintendo DS and Nintendo DS Lite, it is also compatible with the Game Boy Advance SP and Game Boy Advance.
Guitar grip controller
The Guitar grip controller comes packaged with the game Guitar Hero: On Tour and is plugged into the GBA game slot. It features four colored buttons like the ones found on regular Guitar Hero guitar controllers for the stationary consoles, though it lacks the fifth orange button found on the guitar controllers. The DS Guitar Hero controller comes with a small "pick-stylus" (which is shaped like a guitar pick, as the name suggests) that can be put away into a small slot on the controller. It also features a hand strap. The game works with both the DS Lite and the original Nintendo DS as it comes with an adapter for the original DS. The Guitar Grip also works with its sequels, Guitar Hero On Tour: Decades, Guitar Hero On Tour: Modern Hits, and Band Hero.
Later models
Nintendo DS Lite
The Nintendo DS Lite is the first redesign of the Nintendo DS. While retaining the original model's basic characteristics, it features a sleeker appearance, larger stylus, longer-lasting battery, and brighter screens. Nintendo considered a larger model of the Nintendo DS Lite for release, but decided against it as sales of the original redesign were still strong. It was the final DS to have backwards compatibility with Game Boy Advance games. As of March 31, 2014, shipments of the DS Lite had reached 93.86 million units worldwide, according to Nintendo.
Nintendo DSi
The Nintendo DSi is the second redesign of the Nintendo DS. It is based on the unreleased larger Nintendo DS Lite model. While similar to the previous DS redesign, new features include two 0.3 megapixel digital cameras (one inner and one outer), a larger 3.25 inch display, internal and external content storage, compatibility with WPA wireless encryption, and connectivity to the Nintendo DSi Shop.
The Nintendo DSi XL (DSi LL in Japan) is a larger design of the Nintendo DSi, and the first model of the Nintendo DS family of consoles to be a size variation of a previous one. It features larger screens with wider view angles, improved battery life, and a greater overall size than the original DSi. While the original DSi was specifically designed for individual use, Nintendo president Satoru Iwata suggested that DSi XL buyers give the console a "steady place on a table in the living room", so that it might be shared by multiple household members.
Software and features
Nintendo Wi-Fi Connection
Nintendo Wi-Fi Connection was a free online game service run by Nintendo. Players with a compatible Nintendo DS game could connect to the service via a Wi-Fi network using a Nintendo Wi-Fi USB Connector or a wireless router. The service was launched in North America, Australia, Japan & Europe throughout November 2005. An online compatible Nintendo DS game was released on the same day for each region.
Additional Nintendo DS Wi-Fi Connection games and a dedicated Nintendo DS web browser were released afterwards. Nintendo later believed that the online platform's success directly propelled the commercial success of the entire Nintendo DS platform. The Nintendo Wi-Fi Connection served as part of the basis of what would become the Wii. Most functions (for games on both the DS and Wii consoles) were discontinued worldwide on May 20, 2014.
Download Play
With Download Play, it is possible for users to play multiplayer games with other Nintendo DS systems, and later Nintendo 3DS systems, using only one game card. Players must have their systems within wireless range (up to approximately 65 feet) of each other for the guest system to download the necessary data from the host system. Only certain games supported this feature and usually played with much more limited features than the full game allowed.
Download Play is also utilized to migrate Pokémon from fourth generation games into the fifth generation Pokémon Black and White, an example of a task requiring two different game cards, two handheld units, but only one player.
Some Nintendo DS retailers featured DS Download Stations that allowed users to download demos of current and upcoming DS games; however, due to memory limitations, the downloads were erased once the system was powered off. The Download Station was made up of 1 to 8 standard retail DS units, with a standard DS card containing the demo data. On May 7, 2008, Nintendo released the Nintendo Channel for download on the Wii. The Nintendo Channel used WiiConnect24 to download Nintendo DS demos through it. From there, a user can select the demo they wish to play and, similar to the Nintendo DS Download Stations at retail outlets, download it to their DS and play it until it is powered off.
Multi-Card Play
Multi-Card Play, like Download Play, allows users to play multiplayer games with other Nintendo DS systems. In this case, each system requires a game card. This mode is accessed from an in-game menu, rather than the normal DS menu.
PictoChat
PictoChat allows users to communicate with other Nintendo DS users within local wireless range. Users can enter text (via an on screen keyboard), handwrite messages or draw pictures (via the stylus and touchscreen). There are four chatrooms (A, B, C, D) in which people can go to chat. Up to sixteen people can connect in any one room.
On Nintendo DS and Nintendo DS Lite systems, users can only write messages in black. However, on the DSi and DSi XL, the pen color automatically cycles through the spectrum of the rainbow, meaning the user cannot choose a specific color.
PictoChat was not available for the subsequent Nintendo 3DS series of systems.
Firmware
Nintendo's own firmware boots the system. A health and safety warning is displayed first, after which the main menu is loaded. The main menu presents the player with four main options to select: play a DS game, use PictoChat, initiate DS Download Play, or play a Game Boy Advance game. The main menu also has secondary options such as turning the backlight on or off, adjusting the system settings, and setting an alarm.
The firmware also features a clock, several options for customization (such as boot priority for when games are inserted and GBA screen preferences), and the ability to input user information and preferences (such as name, birthday, favorite color, etc.) that can be used in games.
It supports the following languages: English, Japanese, Spanish, French, German, and Italian.
Games
Compatibility
The Nintendo DS is backward compatible with Game Boy Advance (GBA) cartridges. The smaller Nintendo DS game cards fit into a slot on the top of the system, while Game Boy Advance games fit into a slot on the bottom. The Nintendo DS, like the Game Boy Micro, is not backward compatible with games made for the original Game Boy and Game Boy Color, because it does not include the Sharp Z80-compatible processor and is physically incompatible with Game Boy and Game Boy Color cartridges. The original Game Boy sound processor, however, is still included to maintain compatibility with GBA games that use the older sound hardware.
The handheld does not have a port for the Game Boy Advance Link Cable, so multiplayer and GameCube–Game Boy Advance link-up modes are not available in Game Boy Advance titles. Only single-player mode is supported on the Nintendo DS, as is the case with Game Boy Advance games played via the Virtual Console on the Nintendo 3DS (Ambassadors only) and Wii U.
The Nintendo DS only uses one screen when playing Game Boy Advance games. The user can configure the system to use either the top or bottom screen by default. The games are displayed within a black border on the screen, due to the slightly different screen resolution between the two systems (256 × 192 px for the Nintendo DS, and 240 × 160 px for the Game Boy Advance).
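The size of that border follows directly from the two resolutions; the short calculation below is purely illustrative (a hypothetical helper, not system code):

DS_WIDTH, DS_HEIGHT = 256, 192
GBA_WIDTH, GBA_HEIGHT = 240, 160

side_border = (DS_WIDTH - GBA_WIDTH) // 2   # 8 pixels on the left and right
top_border = (DS_HEIGHT - GBA_HEIGHT) // 2  # 16 pixels on the top and bottom
print(side_border, top_border)              # prints: 8 16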
Nintendo DS games inserted into the top slot are able to detect the presence of specific Game Boy Advance games in the bottom slot. In many such games, either stated in-game during gameplay or explained in its instruction manual, extra content can be unlocked or added by starting the Nintendo DS game with the appropriate Game Boy Advance game inserted. Among those games were the popular Pokémon Diamond and Pearl or Pokémon Platinum, which allowed the player to find more/exclusive Pokémon in the wild if a suitable Game Boy Advance cartridge was inserted. Some of the content can stay permanently, even after the GBA game has been removed.
Additionally, the GBA slot can be used to house expansion paks, such as the Rumble Pak, Nintendo DS Memory Expansion Pak, and Guitar Grips for the Guitar Hero: On Tour series. The Nintendo DSi and the DSi XL have an SD card slot instead of a second cartridge slot and cannot play Game Boy Advance games or Guitar Hero: On Tour.
Regional division
The Nintendo DS is region free in the sense that any console will run a Nintendo DS game purchased anywhere in the world; however, the Chinese iQue DS games cannot be played on other versions of the original DS, whose firmware chip does not contain the required Chinese character glyph images; this restriction is removed on Nintendo DSi and 3DS systems. Although the Nintendo DS of other regions cannot play the Chinese games, the iQue DS can play games of other regions. Also, as with Game Boy games, some games that require both players to have a Nintendo DS game card for multiplayer play will not necessarily work together if the games are from different regions (e.g. a Japanese Nintendo DS game may not work with a North American copy, even though some titles, such as Mario Kart DS and Pokémon Diamond and Pearl versions are mutually compatible). With the addition of the Nintendo Wi-Fi Connection, certain games can be played over the Internet with users of a different region game.
Some Wi-Fi enabled games (e.g. Mario Kart DS) allow the selection of opponents by region. The options are "Regional" ("Continent" in Europe) and "Worldwide", as well as two non-location specific settings. This allows the player to limit competitors to only those opponents based in the same geographical area. This is based on the region code of the game in use.
The Nintendo DSi, however, has a region lock for DSiWare downloadable games, as well as DSi-specific cartridges. It still runs normal DS games of any region, however.
Media specifications
Nintendo DS games use a proprietary solid state mask ROM in their game cards. The mask ROM chips are manufactured by Macronix and have an access time of 150 ns. Cards range from 8–512 MiB (64 Mib to 4 Gib) in size (although data on the maximum capacity has not been released). Larger cards have a 25% slower data transfer rate than more common smaller cards. The cards usually have a small amount of flash memory or an EEPROM to save user data such as game progress or high scores. However, a few games have no save memory, such as Electroplankton. The game cards are about half the width and depth of Game Boy Advance cartridges and weigh around 3.5 g (0.12 oz).
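Since the capacities above are quoted both in mebibytes (MiB) and in megabits/gigabits (Mib/Gib), a quick conversion (1 byte = 8 bits) confirms the two ranges describe the same sizes; the helper below is purely illustrative:

def megabits_to_mebibytes(megabits):
    return megabits / 8  # 8 bits per byte

for megabits in (64, 4096):  # 4096 Mib = 4 Gib
    print(f"{megabits} Mib = {megabits_to_mebibytes(megabits):.0f} MiB")
# 64 Mib = 8 MiB, 4096 Mib (4 Gib) = 512 MiB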
Hacking and homebrew
Since the release of the Nintendo DS, a great deal of hacking has occurred involving the DS's fully rewritable firmware, Wi-Fi connection, game cards that allow SD storage, and software use. There are now many emulators that run on the DS, emulating systems such as the NES, SNES, Sega Master System, Sega Mega Drive, Neo-Geo Pocket, Neo-Geo MVS (arcade), and older handheld consoles like the Game Boy Color.
There are a number of cards which either have built-in flash memory or a slot which can accept SD or microSD cards (such as the DSTT, R4 and EZ-Flash V/Vi). These cards typically enable DS owners to use their console to play MP3s and videos, and to perform other non-gaming functions traditionally reserved for separate devices.
In South Korea, many video game consumers use illegal copies of video games, including for the Nintendo DS. In 2007, 500,000 copies of DS games were sold, while sales of DS hardware units were 800,000.
Another modification device, the Action Replay, manufactured by Datel, allows the user to input cheat codes that modify games, granting the player infinite health, power-ups, access to any part of the game, infinite in-game currency, the ability to walk through walls, and various other abilities depending on the game and code used.
See also
List of Nintendo DS and 3DS flash cartridges
Notes
References
External links
Products introduced in 2004
Backward-compatible video game consoles
Handheld game consoles
Regionless game consoles
2000s toys
2010s toys
IQue consoles
Seventh-generation video game consoles
Discontinued handheld game consoles |
32338503 | https://en.wikipedia.org/wiki/Android-x86 | Android-x86 | Android-x86 is an open source project that provides an unofficial port of the Android mobile operating system, developed by the Open Handset Alliance, to devices powered by x86 processors rather than RISC-based ARM chips.
Developers Chih-Wei Huang and Yi Sun originated the project in 2009. The project began as a series of patches to the Android source code to enable Android to run on various netbooks, tablets and ultra-mobile PCs. Huang is the current project maintainer. Currently active developers include Mauro Rossi and Michael Goffioul.
Overview
The OS is based on the Android Open Source Project (AOSP) with some modifications and improvements. Some components are developed by the project to allow it to run on the PC architecture. For instance, some low-level components, such as the kernel and HALs, are replaced to better suit the platform. The OS enables OpenGL ES hardware acceleration via Mesa if supported GPUs are detected, including Intel GMA, AMD Radeon, Nvidia chipsets (via Nouveau), VMware, and QEMU. Without a supported GPU, the OS can run in non-accelerated mode via software rendering. Since release 7.1, the software renderer has been implemented via the SwiftShader project.
Like a normal Linux distribution, the project releases pre-built ISO images which can run in live mode or be installed to a hard disk on the target system. Since release 4.4-r2, the project has also released an efi_img which can be used to create a live USB drive bootable on UEFI systems. Since release 4.4-r4, UEFI support has been merged into the ISO images and efi_img was marked as deprecated.
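As an illustration of how such an ISO image is typically turned into live media, the sketch below performs a raw copy of a release image onto a USB device; it is not an official Android-x86 tool, and the file name and device path are hypothetical placeholders (the target device's existing contents are destroyed):

import shutil

ISO_PATH = "android-x86.iso"  # path to a downloaded release image (hypothetical name)
USB_DEVICE = "/dev/sdX"       # target USB block device; writing requires root privileges

with open(ISO_PATH, "rb") as source, open(USB_DEVICE, "wb") as target:
    shutil.copyfileobj(source, target, length=4 * 1024 * 1024)  # raw copy in 4 MiB chunks
print("Image written; the USB drive can now be booted in live mode.")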
Apart from AOSP, the following components (an incomplete list) were developed from scratch or derived from other open source projects to form the Android-x86 codebase:
Kernel
Installer
drm_gralloc and gbm_gralloc
Mesa
SwiftShader
Audio
Camera
GPS
Lights
Radio Interface Layer
Sensors
More components may be added in updated versions.
Distros
Remix OS
Phoenix OS
Bliss OS
Prime OS
OPENTHOS
Related projects
Project Celadon
A related project, Celadon (formerly Android-IA), has been produced by Intel and runs on newer UEFI devices. The project states that its intention is to drive Android support and innovation on Intel Architecture in addition to providing a venue for collaboration. It re-used the drm_gralloc graphics HAL module from Android-x86 in order to support Intel HD Graphics hardware. While still known as Android-IA, the project provided a FAQ with more detailed information.
Remix OS
Jide Technologies partnered with Chih-Wei Huang, the main developer of Android-x86, on Remix OS, a closed-source derivative of Android-x86 designed for use on conventional PCs. The first beta of Remix OS was made available on March 1, 2016. The project was discontinued on July 17, 2017.
Android TV x86
In late 2020, a senior member of XDA Developers created Android TV x86 to provide Android TV for PCs, which "should work out of the box because the ROM has its roots in the Android-x86 project".
See also
Linux
Chromium OS
Ubuntu
Anbox - A free and open-source compatibility layer that aims to allow mobile applications and mobile games developed for Android to run on Linux distributions.
BlueStacks
DuOS-M
List of operating systems
References
External links
Android (operating system)
X86 operating systems
Operating system distributions bootable from read-only media
Linux distributions without systemd
Linux distributions |
39437062 | https://en.wikipedia.org/wiki/Debian%20version%20history | Debian version history | Debian releases do not follow a fixed schedule. Recent releases have been made roughly biennially by the Debian Project.
Debian always has at least three release branches active at any time: "stable", "testing" and "unstable". The stable release is the most recent and up-to-date version of Debian. The testing release contains packages that have been tested from unstable. Testing has significantly more up-to-date packages than stable and is a close approximation of the future stable release. The unstable release (also known as sid) is the release where active development takes place. It is the most volatile version of Debian.
When the Debian stable branch is replaced with a newer release, the current stable becomes an "oldstable" release. When the Debian stable branch is replaced again, the oldstable release becomes the "oldoldstable" release. Oldoldstable is eventually moved to the archived releases repository.
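As an illustration of how these branches appear on an installed system, the sketch below (a hypothetical helper) reads the release number and codename from /etc/os-release, the standard identification file shipped with Debian's stable releases; on testing and unstable these fields may be absent:

from pathlib import Path

def debian_release():
    info = {}
    for line in Path("/etc/os-release").read_text().splitlines():
        key, sep, value = line.partition("=")
        if sep:
            info[key] = value.strip().strip('"')
    return info.get("VERSION_ID"), info.get("VERSION_CODENAME")

version, codename = debian_release()
print(f"Debian {version} ({codename})")  # e.g. "Debian 11 (bullseye)" on the current stable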
The most recent version of Debian is Debian 11, codename "Bullseye". The next upcoming release of Debian is Debian 12, codename "Bookworm".
Debian distribution codenames are based on the names of characters from the Toy Story films. Debian's unstable trunk is named after Sid, a character who regularly destroyed his toys.
Release history
Debian 1.0 was never released, as a vendor accidentally shipped a development release with that version number. The package management system dpkg and its front-end dselect were developed and implemented on Debian in a previous release. A transition from the a.out binary format to the ELF binary format had already begun before the planned 1.0 release. The only supported architecture was Intel 80386 (i386).
Debian 1.1 (Buzz)
Debian 1.1 (Buzz), released 17 June 1996, contained 474 packages. Debian had fully transitioned to the ELF binary format and used Linux kernel 2.0.
Debian 1.2 (Rex)
Debian 1.2 (Rex), released 12 December 1996, contained 848 packages maintained by 120 developers.
Debian 1.3 (Bo)
Debian 1.3 (Bo), released 5 June 1997, contained 974 packages maintained by 200 developers.
Point releases:
1.3.1 ()
1.3.1r1 (Release date unknown)
1.3.1r2 (Release date unknown)
1.3.1r3 (Release date unknown)
1.3.1r4 (Release date unknown)
1.3.1r5 (Release date unknown)
1.3.1r6 ()
Debian 2.0 (Hamm)
Debian 2.0 (Hamm), released 24 July 1998, contained over 1,500 packages maintained by over 400 developers. A transition was made to libc6 and Debian was ported to the Motorola 68000 series (m68k) architectures.
Point releases:
2.0r1 ()
2.0r2 ()
2.0r3 ()
2.0r4 ()
2.0r5 ()
Debian 2.1 (Slink)
Debian 2.1 (Slink), released 9 March 1999, contained about 2,250 packages. The front-end APT was introduced for the package management system and Debian was ported to Alpha and SPARC.
Point releases:
2.1r1 (Possibly never released)
2.1r2 ()
2.1r3 ()
2.1r4 ()
2.1r5 ()
Debian 2.2 (Potato)
Debian 2.2 (Potato), released 14–15 August 2000, contained 2,600 packages maintained by more than 450 developers. New packages included the display manager GDM, the directory service OpenLDAP, the security software OpenSSH and the mail transfer agent Postfix. Debian was ported to the PowerPC and ARM architectures.
Point releases:
2.2r1 ()
2.2r2 ()
2.2r3 ()
2.2r4 ()
2.2r5 ()
2.2r6 ()
2.2r7 ()
Debian 3.0 (Woody)
Debian 3.0 (Woody), released 19 July 2002, contained around 8,500 packages maintained by more than 900 developers. KDE was introduced and Debian was ported to the following architectures: IA-64, PA-RISC (hppa), mips and mipsel and IBM ESA/390 (s390).
Point releases:
3.0r1 ()
3.0r2 ()
3.0r3 ()
3.0r4 ()
3.0r5 ()
3.0r6 ()
Debian 3.1 (Sarge)
Debian 3.1 (Sarge), released 6 June 2005, contained around 15,400 packages. debian-installer and OpenOffice.org were introduced.
Point releases:
3.1r1 ()
3.1r2 ()
3.1r3 ()
3.1r4 ()
3.1r5 ()
3.1r6 ()
3.1r7 ()
3.1r8 () this is the final update for codename Sarge.
Debian 4.0 (Etch)
Debian 4.0 (Etch), released 8 April 2007, contained around 18,000 packages maintained by more than 1,030 developers. Debian was ported to x86-64 (amd64) and support for the Motorola 68000 series (m68k) architecture was dropped. This version introduced utf-8 and udev device management by default.
Point releases:
4.0r1 ()
4.0r2 ()
4.0r3 ()
4.0r4 ()
4.0r5 ()
4.0r6 ()
4.0r7 ()
4.0r8 ()
4.0r9 () this is the final update for codename Etch
Debian 5.0 (Lenny)
Debian 5.0 (Lenny), released 14 February 2009, contained more than 23,000 packages. Debian was ported to the ARM EABI (armel) architecture.
Point releases:
5.0.1 ()
5.0.2 ()
5.0.3 ()
5.0.4 ()
5.0.5 ()
5.0.6 ()
5.0.7 ()
5.0.8 ()
5.0.9 ()
5.0.10 () this is the final update for codename Lenny.
Debian 6.0 (Squeeze)
Debian 6.0 (Squeeze), released 6 February 2011, contained more than 29,000 packages. The default Linux kernel included was deblobbed beginning with this release. The web browser Chromium was introduced and Debian was ported to the kfreebsd-i386 and kfreebsd-amd64 architectures (while that port was later discontinued), and support for the Intel 486, Alpha, and PA-RISC (hppa) architectures was dropped.
Squeeze was the first release of Debian in which non-free firmware components (aka "binary blobs") were excluded from the "main" repository as a matter of policy.
Point releases:
6.0.1 ()
6.0.2 ()
6.0.3 ()
6.0.4 ()
6.0.5 ()
6.0.6 ()
6.0.7 ()
6.0.8 ()
6.0.9 ()
6.0.10 () this is the final update for codename Squeeze.
Squeeze long term support reaches end-of-life ()
Debian 7 (Wheezy)
Debian 7 (Wheezy), released 4 May 2013, contained more than 36,000 packages. Support for UEFI was added and Debian was ported to the armhf and IBM ESA/390 (s390x) architectures.
Point releases:
7.1 ()
7.2 ()
7.3 ()
7.4 ()
7.5 ()
7.6 ()
7.7 ()
7.8 ()
Debian 8.0 codename Jessie releases, Wheezy becomes oldstable ()
7.9 ()
7.10 ()
7.11 () this is the final update for codename Wheezy.
Debian 9.0 codename Stretch releases, Wheezy becomes oldoldstable ()
Wheezy long term support reached end-of-life ()
Wheezy extended long term support reached end-of-life ().
Debian 8 (Jessie)
Debian 8 (Jessie), released 25 April 2015, contained more than 43,000 packages, with systemd installed by default instead of init. (sysvinit and upstart packages are provided as alternatives.) Debian was ported to the ARM64 and ppc64le architectures, while support for the IA-64, kfreebsd-amd64 and kfreebsd-i386, IBM ESA/390 (s390) (only the 31-bit variant; the newer 64-bit s390x was retained) and SPARC architectures were dropped.
Long term support ended June 2020.
Point releases:
8.1 ()
8.2 ()
8.3 ()
8.4 ()
8.5 ()
8.6 ()
8.7 ()
8.8 ()
Debian 9.0 codename Stretch releases, Jessie becomes oldstable ()
8.9 ()
8.10 ()
Regular security support updates have been discontinued ()
8.11 () this is the final update for codename Jessie.
Debian 10.0 codename Buster releases, Jessie becomes oldoldstable ()
Jessie long term support reaches end-of-life ()
Jessie extended long term support reaches end-of-life ()
Debian 9 (Stretch)
Debian 9 (Stretch) was released on 17 June 2017, two years and two months after Debian 8.0, and contained more than 51,000 packages. The final minor update, called a "point release", is version 9.13, released in July 2020. Major upgrades include the Linux kernel going from version 3.16 to 4.9, the GNOME desktop going from 3.14 to 3.22, KDE Plasma 4 upgraded to Plasma 5, LibreOffice 4.3 upgraded to 5.2, and Qt upgraded from 4.8 to 5.7. LXQt has been added as well.
The Intel i586 (Pentium), i586/i686 hybrid and PowerPC architectures are no longer supported as of Stretch.
Point releases:
9.1 ()
9.2 ()
9.3 ()
9.4 ()
9.5 ()
9.6 ()
9.7 ()
9.8 ()
9.9 ()
Stretch becomes oldstable, Buster is the current stable release ()
9.10 ()
9.11 ()
9.12 ()
9.13 () this is the final update for codename Stretch.
Stretch long term support reaches end-of-life ()
Debian 10 (Buster)
Debian 10 (Buster) was released on 6 July 2019, two years and a month after Debian 9 (Stretch). Debian 10 contains 57,703 packages, supports UEFI Secure Boot, has AppArmor enabled by default, uses LUKS2 as the default LUKS format, and uses Wayland for GNOME by default.
Debian 10 ships with Linux kernel version 4.19. Available desktops include Cinnamon 3.8, GNOME 3.30, KDE Plasma 5.14, LXDE 0.99.2, LXQt 0.14, MATE 1.20, Xfce 4.12. Key application software includes LibreOffice 6.1 for office productivity, VLC 3.0 for media viewing, and Firefox ESR for web browsing.
Point releases:
10.1 ()
10.2 ()
10.3 ()
10.4 ()
10.5 ()
10.6 ()
10.7 ()
10.8 ()
10.9 ()
10.10 ()
Buster becomes oldstable, Bullseye is the current stable release ()
10.11 ()
Debian 11 (Bullseye)
Debian 11 (Bullseye) was released on 14 August 2021. It is based on the Linux 5.10 LTS kernel and will be supported for five years.
On 12 November 2020, it was announced that "Homeworld", by Juliette Taka, would be the default theme for Debian 11, after winning a public poll held with eighteen choices.
Bullseye dropped the remaining Qt4/KDE 4 libraries and Python 2. The first of the code freezes, readying Debian 11 for release, began on 12 January 2021.
Bullseye does not support the older big-endian 32-bit MIPS architectures.
Bullseye shipped with Qt 5.15 and KDE Plasma 5.20. Available desktops include GNOME 3.38, KDE Plasma 5.20, LXDE 11, LXQt 0.16, MATE 1.24, and Xfce 4.16.
Development freeze timetable:
January 12, 2021: transition freeze
February 12, 2021: soft freeze
March 12, 2021: hard freeze
July 17, 2021: full freeze
August 14, 2021: release
Point releases:
11.1 ()
11.2 ()
Debian 12 (Bookworm)
Debian 12 (Bookworm) is the current testing release of Debian and is the next release candidate for Debian.
Debian 12 is expected to have link-time optimization (LTO) enabled by default.
Debian 12 is not expected to have Qt 6 as there isn't an active maintainer for it.
Debian 13 (Trixie)
Trixie is expected to be the codename for Debian 13.
Release table
When a release transitions to long-term support phase (LTS-phase), security is no longer handled by the main Debian security team. Only a subset of Debian architectures are eligible for Long Term Support, and there is no support for packages in backports.
Release timeline
Port timeline
Many past architectures, plus some that have not yet achieved release status, are available from the debian-ports repository.
See also
Summary of Debian version history
Fedora version history
Linux Mint version history
Ubuntu version history
References
External links
Debian Releases at Debian Wiki
Debian Releases at debian.org
Debian
Lists of operating systems
Software version histories |
40326059 | https://en.wikipedia.org/wiki/Cyber%20Security%20and%20Information%20Systems%20Information%20Analysis%20Center | Cyber Security and Information Systems Information Analysis Center | Cyber Security and Information Systems Information Analysis Center (CSIAC) is a United States Department of Defense (DoD) Information Analysis Center (IAC) sponsored by the Defense Technical Information Center (DTIC). The CSIAC is a consolidation of three predecessor IACs: the Data & Analysis Center for Software (DACS), the Information Assurance Technology IAC (IATAC) and the Modeling & Simulation IAC (MSIAC), with the addition of the Knowledge Management and Information Sharing technical area.
CSIAC, one of three IACs sponsored by DTIC, performs the Basic Center of Operations (BCO) functions necessary to fulfill the mission and objectives applicable to the DoD Research, Development, Test and Evaluation (RDT&E) and Acquisition communities’ needs. These activities focus on the collection, analysis, synthesizing/processing and dissemination of Scientific and Technical Information (STI).
The BCO functions, specifically the collection and dissemination of STI, produce several valuable resources (e.g., reports, tool databases, data collections, etc.) in the CSIAC's core technology areas (Cybersecurity, Information Assurance, Software Engineering, Modeling & Simulation, and Knowledge Management/Information Sharing).
CSIAC's mission is to provide the DoD with a central point of access for Information Assurance and Cybersecurity to include emerging technologies in system vulnerabilities, R&D, models, and analysis to support the development and implementation of effective defense against information warfare attacks.
The CSIAC is chartered to leverage best practices and expertise from government, industry, and academia on cyber security and information technology.
History
The United States is vulnerable to Information Warfare attacks because its economic, social, military, and commercial infrastructures demand timely, accurate, and reliable information services. This vulnerability is complicated by the dependence of DoD information systems on commercial or proprietary networks which are readily accessed by both users and adversaries. The identification of the critical paths and key vulnerabilities within the information infrastructure is an enormous task. Recent advances in information technology have made information systems easier to use, less expensive, and more available to a wide spectrum of potential adversaries.
The security of our nation depends on the survivability, authenticity, and continuity of DoD information systems. These systems are vulnerable to external attacks, due in part to the necessary dependence on commercial systems and the increased use of the Internet. The survivability, authenticity, and continuity of DoD information systems is of supreme importance to the Warfighter. With the increasing amount of concern and Information Warfare activities requiring rapid responses, it is difficult to ensure that all appropriate agencies and organizations are given the knowledge and tools to protect from, react to, and defend against Information Warfare attacks. CSIAC has been established under the direction of the Defense Technical Information Center and the integrated sponsorship of the Assistant Secretary of Defense for Research and Engineering (ASDR&E); Assistant to Secretary of Defense/Networks and Information Integration; and the Joint Chiefs of Staff.
CSIAC serves as a central authoritative source for Cyber Security vulnerability data, information, methodologies, models, and analyses of emerging technologies relating to the survivability, authenticity, and continuity of operation of Information Systems critical to the nation's defense in support of the agencies' frontline missions.
CSIAC operates as a specialized subject focal point, supplementing DTIC services within DoD Directive 3200.12, DoD Scientific and Technical Information Program (STIP), dated February 11, 1998.
DTIC Realignment and Restructuring
Given the evolving Defense environment, as well as recent congressional guidance, the Defense Technical Information Center (DTIC) recognized an opportunity to reshape the IACs to better respond to DoD mission needs. As a result, DTIC is realigning and consolidating the IAC program structure to achieve several objectives:
Expand the IAC program scope and increase synergy across related technology areas
Increase opportunities for small business
Expand the industrial base accessible through the IACs
To achieve these objectives, DTIC is forming new, consolidated IAC Basic Centers of Operation (BCOs). The BCOs are managed by both industry and academia. The DoD establishes IAC BCOs in areas of strategic importance, such as cyber security and information systems. An IAC BCO serves as the center for its technical community, and as such must maintain connection with all of the key stakeholders within that community, in order to understand on-going activities, current information, future strategies and information needs.
This mission remains unchanged in the new IAC structure. However, what the new approach brings is expanded scope, increased focus on technical information needs, and enhanced agility, as the Defense environment continues to evolve.
BCOs will still analyze and synthesize scientific and technical information (STI). However, they also are to take on an expanded role in program analysis and integration by assessing and shaping nearly $6 billion in Technical Area Tasks (TATs). TATs are a companion offering of the IAC Program, by which DTIC leverages industry and academia's best and brightest to conduct research and analysis, developing innovative solutions to the most challenging requirements. IAC BCOs will ensure consistency with, and reduce duplication of, prior or other ongoing work, helping to ensure that TATs are more responsive both to customer needs and to broader DoD imperatives. BCOs also are to ensure that TAT results are properly documented and made available for broad dissemination. This approach both achieves cost savings and reduces risks, ensuring that in this time of shrinking budgets and evolving requirements, the Defense community leverages all available knowledge to identify and implement innovative solutions.
Creation of CSIAC
The CSIAC BCO represents the first awarded BCO under the new DTIC structure. As its name suggests, CSIAC's main technical focus is in Cyber Security and Information Systems. CSIAC merges the software engineering technology area of the DACS, the modeling & simulation technology area of the MSIAC, and the information assurance technology area of the IATAC all together. It will also address two new technology focus areas: knowledge management and information sharing. Additionally, CSIAC will expand into other areas of importance and closely monitor new technologies as they emerge.
Steering Committee
CSIAC operates under the direction of our Government Steering Committee. The committee is made up of 19 individuals from Government, DoD and the research and development (R&D) community, including representation from the Defense Information Assurance Program (DIAP), Joint Task Force for Global Network Operations (JTF-GNO), National Security Agency (NSA), Naval Postgraduate School (NPS), Office of the Secretary of Defense (OSD), and the Navy Information Operations Command - Norfolk, to name a few. The steering committee meets once a year and provides input and feedback to CSIAC's operations, particularly our information collection and information dissemination efforts. Additionally, the topics of the technical reports that CSIAC authors are dictated by the Steering Committee.
Sponsors
The Cyber Security and Information Systems Information Analysis Center (CSIAC) is a U.S. Department of Defense Information Analysis Center (IAC) sponsored by the Defense Technical Information Center (DTIC), and Assistant Secretary of Defense for Research and Engineering (ASDR&E). CSIAC is hosted by Quanterion Solutions Incorporated.
Team Members
The CSIAC team consists of a BCO, Tier 1 team members, and Tier 2 team members.
BCO
Quanterion Solutions Incorporated, a small business in Utica, New York, was awarded the CSIAC contract in the fall of 2012.
Tier 1 Team Members
AEgis Technologies
Assured Information Security (AIS)
SRC
Syracuse University
George Mason University
The University of Southern California
Tier 2 team members
In addition to the Tier 1 team members, the CSIAC team includes Tier 2 organizations that provide reach-back support with subject matter experts (SMEs) who assist with technical inquiries, State-of-the-Art Reports (SOARs), and Core Analysis Tasks (CATs). The Tier 2 organizations of the CSIAC team include:
Survice Engineering Company
WetStone
Aptima
Minerva Engineering
The Griffiss Institute
State University of New York Institute of Technology (SUNY IT)
Utica College
Services
Community of Practice
CSIAC's strategy to address the broadened scopes of the three IACs (DACS, IATAC, MSIAC), as well as the new areas of knowledge management and information sharing is to build and facilitate a Community of Practice (CoP) for the cyber security and information systems community.
The CSIAC website (www.thecsiac.com) provides the infrastructure for the CoP and serves as its catalyst. The website is member driven and encourages participation from the CSIAC community, supported by CSIAC resources and activities. It emphasizes unifying CSIAC resources and members by supporting conversations and collaboration.
The CoP supports the entire operation of the CSIAC, including information collection, analysis, and dissemination.
Subject Matter Expert (SME) Network
CSIAC's Subject Matter Expert (SME) Network is one of the center's most valuable resources for the user community. Its members provide a wealth of knowledge and information to the center through a variety of means. For example, SMEs are the main contributors of journal articles and webinar presentations. They are also available to respond to inquiries, assist with State-of-the-Art Reports (SOARs), and perform research and analysis to support Core Analysis Tasks (CATs).
SME qualifications
CSIAC SMEs are those individuals who are considered to be experts in the fields that fall within the CSIAC's technical domain (i.e., cybersecurity, information assurance, software engineering, Modeling & Simulation, and Knowledge Management/Information Sharing). No single criterion provides the basis for being considered an expert, but instead it is based on a combination of factors, including an individual's:
Education (i.e., undergraduate, graduate and doctoral degrees)
Work experience (years in the field, positions held, past programs, etc.)
Publications
The database consists of a wide range of SMEs from various backgrounds. Among those are members of the CSIAC's technical staff, key individuals from team member organizations, retired senior military leaders, leading academic researchers, and industry executives.
Technical inquiries
The CSIAC provides up to four hours of free technical inquiry research to answer users' most pressing technical questions. Technical inquiries submitted online are sent directly to an analyst, who identifies the staff member, CSIAC team member, or Subject Matter Expert (SME) best suited to answer the question. A completed response can take up to 10 working days, though responses are typically delivered sooner.
Core Analysis Task (CAT) program
Challenging technical problems that are beyond the scope of a basic inquiry (i.e., require more than four hours of research) can be solved by initiating a Core Analysis Task (CAT). CATs are separately funded work efforts over and above basic CSIAC products and services. Through the CAT program, the CSIAC can be utilized as a contracting vehicle, enabling the DoD to obtain specialized support for specific projects. These projects, however, must be within the CSIAC's technical domain (cybersecurity, Information Assurance, Software Engineering, Modeling & Simulation, and Knowledge Management/Information Sharing).
Some of the advantages of the IAC CAT program include:
Minimal start-work delay – Not only does the CSIAC provide the DoD and other agencies with a contract vehicle, but it is also a pre-competed, single-award, cost-plus-fixed-fee (CPFF) indefinite-delivery/indefinite-quantity (IDIQ) contract. Work can begin on a project in as little as 4–6 weeks after the order is placed.
Expansive Technical Domain – the CSIAC's broad scope (Cybersecurity, Information Assurance, Software Engineering, Modeling & Simulation and Knowledge Management/Information Sharing) provides numerous resources for potential projects, and is especially valuable for efforts that cross multiple domains.
Subject Matter Expert (SME) Network – The CSIAC is able to leverage reach-back support from its expansive SME Network, including technical experts from the CSIAC staff, team members, or the greater community, to complete CATs and other projects.
Scientific and Technical Information (STI) Repositories – As a consolidation of three predecessor IACs, the CSIAC has a wealth of data and information to support the completion of CATs.
Apply the Latest Findings – Draw from the most recent studies performed for agencies across the DoD, as the results from all CSIAC CATs and SNIM Technical Area Tasks (TATs) are collected, stored and used to support future efforts by the CSIAC.
Scientific and Technical Information (STI) Program
CSIAC collects IA/DIO-related STI to share with the DoD, other federal agencies, their contractors, and the research and engineering (R&E) community. The STI program is governed by DoD Directive 3200.12, DoD STI Program.
CSIAC has thousands of IA/DIO-related documents in its technical repository. The collection is a combination of classified and unclassified material. All of CSIAC's documents are uploaded to DTIC Online Access Control (DOAC), an online repository of STI from all of DTIC's IACs.
CSIAC's library facilitates knowledge sharing between diverse groups and organizations, and all STI is readily accessible to the IA/DIO community within the classification and secondary distribution instructions.
All STI collected by CSIAC is relevant to IA/CS research, development, engineering, testing, evaluation, production, operation, use, or maintenance. STI is collected in many forms including text-based documents, multimedia, and rich media files. Some topic areas include: Biometrics, Computer Network Attack, Computer Network Defense, Cyber Terrorism, Hacking, Information Warfare, Network-centric Warfare, Malicious Code, Product Evaluations, among others. CSIAC collects unclassified submissions from across all of the IA/CS community.
Events Calendar
The CSIAC maintains an online calendar of events related to the interests of its members. The Events Calendar is also available as an RSS feed and as an HTML view on the CSIAC website.
Products
S2CPAT
Software & Systems Cost & Performance Analysis Toolkit (S2CPAT) is a web-based toolkit with the goal of capturing and analyzing software engineering data from completed software projects that can be used to improve:
the quality of software–intensive systems
the ability to predict the development of software–intensive systems with respect to effort and schedule
S2CPAT currently allows users to search for similar software projects and use the data to support:
Rough order of magnitude estimates for software development effort and schedule
Project planning and management: life cycle model information, key risks, lessons learned, templates, estimation heuristics
Software engineering research
The S2CPAT repository contains Software Resources Data Report (SRDR) data provided by the US Air Force. This data has been sanitized for public release by DoD and validated by a DoD-funded academic research team.
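The rough-order-of-magnitude estimates that S2CPAT supports can be pictured with simple analogy-based arithmetic: scale the productivity observed on similar completed projects to the size of the new project. The sketch below is purely illustrative; the figures and field names are invented examples, not SRDR data, and the calculation is not the toolkit's actual estimation method.

    # Illustrative analogy-based estimate: effort for a new project is scaled
    # from the median productivity (hours per KSLOC) of similar past projects.
    # All figures are made-up examples, not SRDR data.
    from statistics import median

    similar_projects = [
        {"ksloc": 40, "effort_hours": 52000},
        {"ksloc": 25, "effort_hours": 30000},
        {"ksloc": 60, "effort_hours": 84000},
    ]

    def rom_effort(new_ksloc, history):
        hours_per_ksloc = median(p["effort_hours"] / p["ksloc"] for p in history)
        return new_ksloc * hours_per_ksloc

    print(round(rom_effort(35, similar_projects)))   # rough effort in staff-hours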
Reports
CSIAC publishes three types of reports on current Cyber Security and Information Systems topics:
State-of-the-art Reports (SOAR) investigate developments in IA issues. Past SOAR topics include: Insider Threat, Software Security Assurance, Risk Management for the Off-the-Shelf Information Communications Technology Supply Chain, and Measuring Cyber Security and Information Assurance.
Critical Reviews and Technology Assessments (CR/TA) evaluate and synthesize the latest available information resulting from recent R&D findings. They offer comparative assessments of technologies and/or methodologies based on specific technical characteristics. Topics include Wireless Wide Area Network (WWAN) Security, Network-Centric Warfare, and Biotechnology.
Tools Reports outline a current technology and provide an objective listing of currently available products. Topics for tools reports include firewalls, vulnerability assessment, Intrusion Detection Systems, and malware.
Journal
CSIAC's Journal of Cyber Security & Information Systems is a quarterly technical journal written from a DoD perspective and contains the following: synopses and critiques of significant, newly acquired reports and/or journal articles; summaries of the initiation of new R&D programs; listing or calendar of future conferences, symposia, etc.; and summaries of significant technological breakthroughs and significant new technological applications and highlights of any other outstanding developments. News from various DoD CSIAC programs that would be of interest to other DoD organizations may also be included. The journal is distributed in print and electronic format to registered CSIAC subscribers free of charge and is available for viewing and download from the CSIAC website.
Journal articles are sourced through direct invitations, published calls for papers, and unsolicited submissions. Direct invitations are the most common method: conference presenters are typically contacted and asked to write an article based on their presentation. Authors are solicited for their expertise or experience relative to the theme of the journal issue.
IA Digest
The CSIAC's Information Assurance (IA) Digest is a semi-weekly news summary for information assurance and software reliability professionals protecting the Global Information Grid (GIG). It is transmitted in an HTML-formatted email and provides links to articles and news summaries across a spectrum of cyber security, information assurance, and information systems topics.
Webinars
References
External links
CSIAC Home Page
DoD Information Analysis Center Home Page
DoD Research and Engineering Enterprise Home Page
DoD Chief Information Officer Home Page
Defense Online Access Control
United States Department of Defense
Computer security organizations
Cyberwarfare in the United States |
1120399 | https://en.wikipedia.org/wiki/Maze%20War | Maze War | Maze, later expanded and renamed to Maze War, is a 3D networked first-person shooter maze game originally developed by Steve Colley, Greg Thompson, and Howard Palmer for the Imlac PDS-1 computer. It was largely developed between the summer of 1972 and fall of 1973, at which point it included shooter elements and soon after was playable over ARPANET between multiple universities. It is considered the earliest first-person shooter; ambiguity over its development timeline has led it to be considered, along with Spasim (March 1974, on PLATO), to be one of the "joint ancestors" of the genre.
Although the first-person shooter genre did not crystallize for many years, Maze War influenced first-person games in other genres, particularly RPGs. The Maze War style view was first adopted by Moria in 1975, an early RPG on the PLATO network, and further popularized by Ultima and Wizardry, eventually appearing in bitmapped form in games like Dungeon Master, Phantasy Star, Eye of the Beholder and countless others.
Gameplay
Players wander around a maze, able to move backward or forward, turn right or left in 90-degree increments, and peek around corners through doorways. The game uses simple tile-based movement, in which the player moves from square to square. Other players appear as their names, as figures, or, in the later Xerox version, as eyeballs. When a player sees another player, they can shoot or otherwise negatively affect them. Players gain points for shooting other players and lose them for being shot. Some versions (like the X11 port) had a cheat mode in which the player running the server could see the other players' positions on the map. The original MIT Imlac version had cheat keys to knock out a wall in the player's local copy of the maze, which made it possible to walk through walls as seen by other players. Occasionally in later versions, a duck also appears in the passage.
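The tile-based movement and 90-degree turns can be illustrated with a short, modern sketch. Python is used here purely for illustration; the original game was written for the Imlac, and the small map below is invented rather than taken from the actual game.

    # Minimal sketch of Maze War style tile movement: the player occupies one
    # grid square, faces one of four compass directions, turns in 90-degree
    # steps, and may only step into open ('0') squares.
    MAZE = [
        "11111",
        "10001",
        "10101",
        "10001",
        "11111",
    ]
    DIRS = {"N": (0, -1), "E": (1, 0), "S": (0, 1), "W": (-1, 0)}
    ORDER = "NESW"

    class Player:
        def __init__(self, x, y, facing="N"):
            self.x, self.y, self.facing = x, y, facing

        def turn(self, clockwise=True):
            i = ORDER.index(self.facing)
            self.facing = ORDER[(i + (1 if clockwise else -1)) % 4]

        def step(self, forward=True):
            dx, dy = DIRS[self.facing]
            if not forward:
                dx, dy = -dx, -dy
            nx, ny = self.x + dx, self.y + dy
            if MAZE[ny][nx] == "0":      # walls block movement
                self.x, self.y = nx, ny

    p = Player(1, 3)
    p.step()    # move one square north along the corridor
    p.turn()    # now facing east
    print(p.x, p.y, p.facing)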
Innovations
Features either invented for Maze War or disseminated by it include:
First-person 3-D perspective. Players saw the playing field as if they themselves were walking around in it, with the maze walls rendered in one-point perspective (see the sketch after this list). This makes the game one of the first, if not the first, first-person shooters.
Avatars. Players were represented to each other as eyeballs. While some earlier games represented players as spacecraft or as dots, this was probably the first computer game to represent players as organic beings.
Player's position depicted on level map. Representation of a player's position on a playing field map. Unlike the playing field of a side-view or second-person perspective, this is only used for position reference as opposed to being the primary depiction of play. It does not normally depict opponents. The combination of a first-person view and a top-down, second-person view has been used in many games since.
Level editor. A program was written to edit the playing field design.
Network play. In 1973 it became probably the first game ever played between two peer-to-peer computers, as opposed to earlier multiplayer games, which were generally based on a minicomputer or mainframe with players using either terminals or specialized controls.
Client-server networked play. An updated version may well have been the first client-server game, with workstations running the client connecting to a mainframe running a server program. This version could be played across the ARPANET in 1977.
Observer mode. In the 1974 and 1977 versions, a graphics terminal could be used by observers to watch the game in progress without participating.
Modifying clients in order to cheat at the game.
Encrypting source code to prevent cheating.
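As a rough illustration of the one-point-perspective rendering mentioned in the list above, the following sketch computes the screen rectangle for each successive corridor section, shrinking it toward the screen centre with depth. The screen size and scaling constant are arbitrary assumptions for illustration and are not values from the Imlac program.

    # Illustrative one-point-perspective corridor: each wall section one tile
    # further away is drawn as a smaller rectangle around the screen centre.
    WIDTH, HEIGHT = 512, 512
    CX, CY = WIDTH // 2, HEIGHT // 2

    def wall_rect(depth, scale=200):
        # Screen rectangle for the corridor cross-section 'depth' tiles ahead;
        # the apparent size shrinks as depth increases.
        half = scale // (depth + 1)
        return (CX - half, CY - half, CX + half, CY + half)

    for d in range(4):                   # nearest four corridor sections
        print(d, wall_rect(d))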
Development
1973, Imlacs at NASA
It was originally written by Steve Colley (later a founder of nCUBE) in 1972–1973 on the Imlac PDS-1s at the NASA Ames Research Center in California. He had written a program for portraying and navigating mazes from a first-person perspective. The maze was depicted in memory with a 16 by 16 bit array. Colley, together with Greg Thompson and Howard Palmer, developed the MazeWars program at NASA in Jim Hart's Computation Division second-floor lab. Colley writes:
1973, Imlacs at MIT
The original Imlac networked version was limited to two players, with the Imlacs directly cabled to each other. In the fall of 1973, Greg Thompson brought the game with him when he went away to college at the Massachusetts Institute of Technology (MIT). There, at J. C. R. Licklider's MIT Project MAC Dynamic Modelling Laboratory, Thompson and Dave Lebling expanded Maze Wars into a full multiplayer game that could operate between Imlacs over the early ARPANET (the predecessor to the modern Internet). According to Lebling, the version of the game Thompson brought did not have shooting in it yet: "you wandered around in a maze, and that was it." In addition to the network connectivity, they added the shooting, and Lebling wrote a maze editor with which players could create mazes in a text file of '1's for walls and '0's for spaces. As described in the history of interactive computing book The Dream Machine:
The clients ran on Imlacs which had 56 kbit/s serial connections, allowing them to communicate with a PDP-10 computer running MIT's Incompatible Timesharing System (ITS). A server program on the mainframe coordinated up to eight Imlac clients playing against each other. By using terminal servers, Imlacs at other colleges that were connected to the ARPANET could connect to the server at MIT and play against players located across the United States. The server program would also implement robot players and drive a playing monitor that would display a top-down map of the maze showing the locations of human and robotic players on an Evans & Sutherland LDS-1 Line Drawing System. Later a level editor was written so that different playing fields could have different designs.
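The text-file maze format used by Lebling's editor, with '1' characters for walls and '0' characters for open space, can be parsed in a few lines. The sketch below is a modern illustration only; the sample map and the idea of returning a set of wall coordinates are assumptions, not details of the original tools.

    # Illustrative parser for a Maze War style map made of '1' (wall) and
    # '0' (open space) characters; the sample rows are invented.
    SAMPLE = [
        "11111",
        "10001",
        "11111",
    ]

    def parse(rows):
        """Return the set of (x, y) wall coordinates found in the rows."""
        return {(x, y) for y, row in enumerate(rows)
                       for x, ch in enumerate(row) if ch == "1"}

    walls = parse(SAMPLE)
    print((1, 1) in walls)   # False: an open corridor square
    print((0, 0) in walls)   # True: a wall square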
1976, TTL at MIT
For a class in the fall of 1976, Greg Thompson (computer design), Mark Horowitz (display processor) and George Woltman (firmware) built a "hardware" version of Maze entirely from 7400 series TTL circuits, essentially creating a Maze computer with 256 16-bit instructions and 256 bytes of RAM, dedicated solely to playing Maze for up to 4 human players and 4 robots. Arcade games such as Pong had used this approach before. The TTL version of Maze used Tektronix oscilloscopes to display vector graphics, which was natural, since the Imlacs also used vector displays. This version introduced a full third dimension, with a four-level maze that players could climb up and down between. Robot players were also implemented in the firmware; their skill level was controlled by simply adjusting the clock rate of the system. The game was so popular that even though it had been built as an MIT 6.111/6.112 class project, it was kept assembled and operational for over a year.
1977, Xerox
In 1977, Jim Guyton, a staff member at Xerox's Palo Alto Research Center (PARC), rewrote Mazewar for the Xerox Alto and other Xerox Star machines. This was the first raster-display version of Mazewar. It made use of the Alto's Ethernet network, using the Xerox PUP network protocol. The Data General servers used on the network were capable of gatewaying games to remote office locations, allowing people at several Xerox sites to play against each other. Mazewar could thus be played in four different configurations: peer-to-peer with two Imlacs, client-server with Imlacs and a PDP-10, in pure hardware, and over Ethernet and PUP.
Several programmers at PARC cheated by modifying the code so that they could see the positions of other players on the playing field map. This upset the authors enough that the source code was subsequently stored in an encrypted form, the only program on the system to receive this protection. This is interesting in light of the fact that this laboratory housed many of the most important programming developments of the time, including the first graphical user interfaces.
1986, Digital Equipment Corporation
In 1982, Jim Guyton showed Christopher (Kent) Kantarjiev Mazewar at RAND.
Kent later interned at Digital Equipment Corporation's Western Research Lab (DEC WRL) in Palo Alto during his Ph.D. studies. Several former PARC employees worked at WRL, and one of them, Gene McDaniel, gave Kent a hard copy of the Mesa source code listing from the Xerox version of Maze, and the bitmap file that is used for the display.
The X Window System had been newly released as a result of collaborative efforts between DEC and MIT. Kent wrote a networked version of Mazewar, which he released in December 1986. This version used UDP port 1111, and could be played by Unix workstations running X Window across the Internet. This was probably the second game which directly used TCP/IP, and the first which could be played across the Internet (1983's SGI Dogfight used broadcast packets and thus could not transit a router).
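A present-day analogue of the X11 version's networking can be sketched with standard UDP sockets. Only the port number, 1111, comes from the text above; the packet layout (player id, x, y, facing) is an assumption made for illustration and is not the historical wire format.

    # Hedged sketch: send and receive a Maze War style position update over
    # UDP port 1111. The four-byte payload layout is invented for illustration.
    import socket
    import struct

    MAZEWAR_PORT = 1111

    def send_position(sock, host, player_id, x, y, facing):
        packet = struct.pack("!BBBB", player_id, x, y, facing)   # network byte order
        sock.sendto(packet, (host, MAZEWAR_PORT))

    def recv_position(sock):
        data, addr = sock.recvfrom(4)
        return struct.unpack("!BBBB", data), addr

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_position(sock, "127.0.0.1", player_id=1, x=3, y=7, facing=2)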
1992, Oracle SQL*Net
Using Kent's code and earlier code from MIT, Jack Haverty and others at Oracle created a version of Maze running over Oracle SQL*Net over TCP/IP, Novell SPX/IPX, DECnet, and Banyan Vines at Fall Interop 92 on a number of workstations, including Unix machines from Sun, IBM, and SGI, as well as DEC VMS workstations and MS-Windows. Attendees could play against each other at stations placed throughout the Moscone Convention Center in San Francisco.
Legacy
A 30th anniversary retrospective was hosted by the Vintage Computer Festival held at the Computer History Museum in Mountain View, California, on November 7, 2004.
Alumni
Steve Colley subsequently worked on very early versions of Mars rover technology for NASA, and found that his 3D perspective work on Maze Wars was useful for this project.
Greg Thompson subsequently was CTO at nCUBE, a Sr. Director at Cisco, CTO for Multimedia at Huawei, and is currently a mentor at the i-GATE Innovation Hub in Livermore CA.
Howard Palmer later became a Senior Engineer at Netscape, Chief Architect at Resonate, and is currently a software developer at Stanford University.
Dave Lebling went on to co-found Infocom in 1979, creating text-based interactive fiction games such as Zork and Enchanter.
Mark Horowitz became a founder of Rambus, Chair of the EE department at Stanford University, and is currently a professor at Stanford.
George Woltman subsequently was a programmer at Data General and became famous as the founder of the Great Internet Mersenne Prime Search (GIMPS) and author of Prime95 and mprime.
Other versions
1982, Snipes by SuperSet Software. This used semaphores in a shared file which resided on a network drive. It was written as a demonstration program for SuperSet's local area network system, which became Novell NetWare.
1987, MacroMind MazeWars+ for the Apple Macintosh computer. Fully 3D (multiple vertical game levels) and could be played over AppleTalk networks. It was distributed by Apple with new Macintosh computers for some period of time. Later the 1993 title Super Maze Wars by the Callisto Corporation was distributed with some Macintosh computers.
Versions were written for NeXT computers, Palm OS, iPod Touch, and iOS.
References
Sources
External links
The DigiBarn Computer Museum's Maze War 30-Year Retrospective: (Collection of Maze War's history, stories and references)
The DigiBarn Computer Museum's Maze War 30-Year Event Pages: "The First First-person Shooter" (Additional text, images and screenshots)
Powerpoint presentation given at the Computer History Museum November 7, 2004 "The aMazing History of Maze" "(pdf)"
Demo video of Imlac Maze War by Tom Uban at the Computer History Museum event YouTube version
PC Gaming History video of First Person Shooter History - Maze War for Xerox Alto & Xerox Star!
Prior Art video of Xerox Alto Maze War
Ad and press release for MacroMind MazeWars+
1974 video games
First-person shooters
Mainframe games
Maze games
Multiplayer online games
Public-domain software with source code
Video games developed in the United States |
1896953 | https://en.wikipedia.org/wiki/LiVES | LiVES | LiVES (LiVES Editing System) is free video editing software and a VJ tool, released under the GNU General Public License version 3 or later. There are binary versions available for most popular Linux distributions (including Debian, Ubuntu, Fedora, Suse, Gentoo, Slackware, Arch Linux, Mandriva and Mageia). There are also ports for BSD, and it will run under Solaris and IRIX. It has been compiled under OS X Leopard, but not thoroughly tested on that platform. In early 2019, a version for Microsoft Windows was announced, with a release slated for the second half of 2019.
Development
The main developer of LiVES is Gabriel Finch (a.k.a. Salsaman), who is also a video artist and international VJ.
The project began in late 2002, and the 1.0.0 version was released in July 2009.
On the Freecode site, LiVES is listed as the most popular non-linear video editing software.
LiVES was nominated for the category of Best Project for Multimedia in the SourceForge Community Choice Awards 2009.
The LiVES application allows the user to manipulate video in realtime and in non-realtime. The application also has features which go beyond traditional video editing applications: for example, it can be controlled and monitored remotely over a network, and it has facilities for streaming to and from another copy of LiVES. It is resolution and frame rate independent.
LiVES uses a system of plugins for effects, decoders, encoders and video playback. The APIs for these are now well defined, and the application can be easily extended.
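The plugin model can be pictured with a minimal sketch: the host keeps a list of registered effects and calls each one's frame-processing hook in turn during playback or rendering. The interface below is a simplified illustration only and does not reproduce the real LiVES plugin API, whose effect plugins are written in C against the application's own headers.

    # Simplified illustration of an effect-plugin model: each plugin exposes a
    # name and a process_frame() hook that the host calls once per video frame.
    class InvertEffect:
        name = "invert"

        def process_frame(self, frame):
            # For this sketch a frame is just a list of (r, g, b) tuples.
            return [(255 - r, 255 - g, 255 - b) for (r, g, b) in frame]

    class Host:
        def __init__(self):
            self.effects = []

        def register(self, plugin):
            self.effects.append(plugin)

        def render(self, frame):
            for fx in self.effects:      # apply each registered effect in turn
                frame = fx.process_frame(frame)
            return frame

    host = Host()
    host.register(InvertEffect())
    print(host.render([(0, 0, 0), (10, 20, 30)]))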
The current version, 3.2, is based on GTK+ 2.16+ or 3.
Interface
LiVES has two main interfaces: the clip editor, which serves as a repository of video and audio material, and the multitrack window, where multiple clips can be positioned on the timeline.
The clip editor allows free playback at variable play rates, applying of multiple realtime effects and mixing of clips. This mode is mainly used by VJs. Video editors can also use this interface to prepare the clips before entering into multitrack mode.
In multitrack mode, the individual clips can be arranged in layers on the timeline. Further effects and transitions can be applied here, and the audio can be mixed down. The entire timeline can then be rendered, creating a new clip.
Rendering previews are shown in real time.
Features
LiVES' features include:
Near-instant opening for most audio / video formats via libav.
Smooth playback at variable frame rates, forward and in reverse. Display frame rate can be controlled independently of playback frame rate.
Frame accurate cutting and pasting within and between clips.
Saving/re-encoding of clips, selections, and individual frames.
Lossless backup/restore.
Streaming input and output.
Real time blending of clips (various chroma and luma blends).
Ability to edit many file types and sources including remotely located files (with mplayer/ffmpeg libraries), and directories of images (rotoscoping).
Real time capture/recording of interactive (via mouseclicks) external windows.
Encode to any of the 50+ output formats which are now supported (e.g. mjpeg, mpeg4, mpeg1/2, h264, webm, VCD, SVCD, DVD, x264/Blu-ray, ogg/mp4 ogm, Matroska mkv, dv, swf, Ogg Theora, Dirac, MNG, Snow, xvid, Flash and even animated GIF and PDF)
Resampling of video (time stretching) to any frame rate (1 to 200 frame/s - accurate to 8 decimal places); option to auto-resample or speed up/slow down between clips.
Rotation, resizing and trimming of video clips.
Deinterlacing, subtitle removal. Auto deinterlacing for dv can be enabled.
Can load mp3, vorbis, mod, it, xm and wav audio files.
LiVES can also load audio tracks directly off CD to use with video.
Sample accurate cutting and pasting of audio within and between clips.
Resampling of audio (rate, channels, sample size, signedness and endianness); audio is auto-resampled between clips.
Able to record from any external audio source.
Fade in/fade out feature for clips.
Audio speed and direction can be smoothly adjusted; both in real time and when rendering.
Hundreds of effects, including random/targeted zooming, panning of video, colour cycling and colorisation/colour filtering and colour correction.
LADSPA support for audio effects.
Merging/compositing of frames is possible: e.g. frame-in-frame, fade in/out and transparency.
Real time previews when rendered effects are processing.
Support for the Frei0r 1.1 and 1.2 effect plugin architectures, libvisual plugins, and projectM plugins.
Multiple real time effects are possible during playback (VJ mode), these can also be rendered to frames.
Multitrack window with drag and drop and configurable auto-transitions.
Intelligent screen organisation - shows only the information which is relevant, no more and no less.
Support for an almost limitless number of tracks and effects
Non-destructive editing in the multitrack window, with multiple levels of undo/redo.
Full automation/interpolation of effect parameters.
Support for stereo backing audio track + stereo audio track per video track
Automatic gain control for rendering multiple audio channels
Realtime mixing/previewing of audio
Channel mixer volume control + fine grained, time variable per-channel volume and pan control.
Auto-transitioning of audio with video (selectable).
Full crash recovery.
Configurable multi-monitor screen placement.
Simple and intuitive menu layout.
Drag and drop interface.
Remote monitoring and control (via Open Sound Control) of the application can be enabled.
VJ functions can be controlled via keyboard, joystick or MIDI controller.
I18N text support. Translations into at least French, Czech, German, Japanese, Dutch, Portuguese, Spanish, Italian, Russian, Turkish, Hungarian, Slovak, Simplified Chinese, Finnish, Ukrainian, Arabic, Estonian, Uzbek and Hebrew are included.
Support for audio output through pulse audio.
Support for audio output through jack.
Jack transport support (master or client).
Support for .srt and .sub subtitle files.
Vloopback/vloopback2 output for video (Linux only).
MIDI sequence synchronisation (start/stop).
Shuttle controls for FireWire cameras/recorders. Can grab from DV and HDV formats.
Multi-threaded / multi-core for optimised processing.
Able to download and import clips directly from YouTube, Vimeo and many other video sites.
Performances can be recorded in real time and then rendered after playback.
Audio can be switched between internal and external sources with a simple button click.
Support for custom themes / colour schemes (with import and export abilities).
Built-in webcam (unicap) support for real time playback, effects and mixing.
Effect plugins can be linked to provide real time data analysis / processing channels.
Automatic VJ mode, which can optionally be linked to Mixxx and other DJ software.
Integration with projectM - generate video in real time from the audio source.
Can handle in/out streams in LiVES to LiVES or yuv4mpeg format. Streams can be piped from stdout into other applications.
Support for live FireWire and TV card inputs.
Internal support for RGB24, RGBA32, YUVA, YUV, YUV422, YUV420 (jpeg and mpeg), YUYV, YUV411, and UYVY palettes; one step conversion with chroma super and subsampling is implemented.
RFX builder allows rapid prototyping of new effects, transitions, generators, utilities and tools. Custom RFX scripts can be exported to share with others or downloaded and imported. Test scripts are run in a sandbox to allow safe testing of new plugins.
Gallery
Notes
Further reading
The book "Video Wiedergabe, Bearbeitung und Streaming unter Linux", Open Source Press, contains a chapter on LiVES.
Interview with the author for Linux BG magazine, original English version
The LiVES Video Editor and VJ Tool Turns 1.0, Linux Journal, July 2009.
Review of LiVES 1.0.0-pre1 for Linux Journal
Review of LiVES on Linux Insider January 2012.
LiVES: LiVES is a Video Editing System (masters dissertation) October 2013.
External links
Video tutorial part 1 of 4
LiVES episode on hackerpublicradio.org (mp3 audio)
Free multimedia software
Free video software
Free software programmed in C
Free software programmed in Perl
Free software programmed in Python
Video editing software
Video editing software that uses GTK |
7775662 | https://en.wikipedia.org/wiki/Nothing%20Real | Nothing Real | Nothing Real L.L.C., a company founded in October 1996 by Allen Edwards and Arnaud Hervas, developed high-end digital effects software for the feature film, broadcast and interactive gaming industries.
Apple Inc. purchased Nothing Real in February 2002 for its flagship digital effects software, Shake.
History
In 1996, Allen Edwards and Arnaud Hervas founded Nothing Real, and released Shake 1.0 as a command-line tool for image processing to high-end visual effects facilities in early 1997. Edwards and Hervas had met in 1990 while working in R&D at Thomson Digital Images, a Paris company and maker of the 3D software Explore. In 1993, they were among the early employees at Sony Pictures Imageworks in Los Angeles, and later worked at the early Weta in New Zealand on the film "The Frighteners" for Peter Jackson. Their experiences on both the software development and user sides led them to begin development of Shake as a high-speed, high-quality tool specifically designed for the feature film visual effects industry.
Emmanuel Mogenet joined the R&D as a senior developer in the summer of 1997 as Shake 2.0 was being rewritten with a full user interface. In the fall of 1997, Dan Candela (R&D), Louis Cetorelli (head of support) and Peter Warner (designer/expert user) were added to the team. After initially working as a consultant in early 1998, Ron Brinkmann also joined that spring as product manager. This core group were all among the original Sony Imageworks employees.
Shake 2.0 was first shown at the 1998 NAB conference as an alpha demo with a minimal set of nodes, a node view and the player. A more complete beta version of Shake 2.0 was shown at the 1998 SIGGRAPH conference.
Version 2 was released in early 1999 for Windows NT and IRIX, costing $9900 US per license, or $3900 for a render-only license. Over the next few years, Shake rapidly became the standard compositing software in the visual effects industry for feature films.
In 2000, Shake version 2.1 cost US$9,900 plus an annual maintenance fee of approximately US$1,500. The additional render-only license cost US$3,900. The software was available for Linux, Irix and Windows NT.
Apple Computer's purchase of Nothing Real in 2002 changed the way in which Shake was marketed and sold. Apple released Shake for Mac OS X and lowered the price to US$4,950 with an annual maintenance fee of US$1,199. Apple continued to lower Shake's price over the years. In 2006, Apple released Shake 4.1 for US$499 with no annual maintenance fee. On July 30, 2009, Apple discontinued Shake.
Corporate facts
Nothing Real was headquartered in Venice Beach, California.
Approximately one-third of the company’s sales were overseas in the United Kingdom, France, and Japan.
References
External links
Coverage discussing Tremor from The Register
Software companies established in 1996
Apple Inc. software
Apple Inc. acquisitions
2002 mergers and acquisitions |
61823062 | https://en.wikipedia.org/wiki/Command%20and%20control%20structure%20of%20the%20European%20Union | Command and control structure of the European Union | This article outlines the command and control (C2) structure of the European Union's (EU) missions, which are deployed as part of the Common Security and Defence Policy (CSDP). This C2 structure ranges from the political strategic level to the tactical level.
At the military/civilian strategic level, missions are commanded by an operation headquarters (OHQ). For all civilian missions the Civilian Planning and Conduct Capability (CPCC) serves this purpose. For each military mission an OHQ is chosen from a list of available facilities. The EU does not have a permanent military command structure along the lines of the North Atlantic Treaty Organization's (NATO) Allied Command Operations (ACO), although it has been agreed that ACO resources may be used for the conduct of the EU's CSDP missions. The Military Planning and Conduct Capability (MPCC), established in 2017 and to be strengthened in 2020, does however represent the EU's first step in developing a permanent OHQ.
The MPCC and CPCC are counterparts that cooperate through the Joint Support Coordination Cell (JSCC).
The CPCC, MPCC and JSCC are all part of the External Action Service (EEAS), and situated in the Kortenberg building in Brussels, Belgium.
Overview
Civilian missions
Strategic level
All civilian missions are directed on the strategic level by the Civilian Planning and Conduct Capability (CPCC), a directorate of the External Action Service (EEAS) in Brussels, Belgium. The Director of the CPCC acts as Civilian Operation Commander (Civ OpCdr).
Operational level
The CPCC directs the subordinate Head of Mission (HoM), who administers the mission on the operational level.
Military missions and operations
Strategic level
Command options for EU-led missions
For each military mission (certain missions are also referred to as operations), the Council nominates a dedicated OHQ. This section outlines the main OHQ options.
Autonomous operations and missions
Military Planning and Conduct Capability (MPCC) of the EEAS' Military Staff (EUMS) in Brussels, Belgium
Established in 2017, the MPCC is the EU's first permanent OHQ and supersedes the previous EU OPCEN. At present it may run only non-executive operations, but by the end of 2020 it will also be capable of running executive operations of up to 2,500 troops (i.e. the size of one battle group).
National OHQ offered by member states:
Centre de planification et de conduite des opérations (CPCO) in Paris, France
Armed Forces Operational Command (EinsFüKdoBw) in Potsdam, Germany
(EL EU OHQ) in Larissa, Greece
Italian Joint Force Headquarters (ITA-JFHQ) in Centocelle, Rome, Italy
(see Mando de Operaciones, the Spanish national nucleus of the HQ).
The practice of activating ad hoc national OHQs has been criticised as inefficient, due to high start-up costs and the fact that their temporary nature to a certain extent prevents the staff from forming strong working relationships and a 'collective memory'.
Operations with recourse to NATO assets and capabilities
Berlin Plus: An OHQ would be set up within the Supreme Headquarters Allied Powers Europe (SHAPE) in Mons, Belgium. SHAPE is the main headquarters of Allied Command Operations (ACO).
The Berlin Plus agreement requires that the use of NATO assets by the EU is subject to a "right of first refusal", i.e. NATO must first decline to intervene in a given crisis, and contingent on unanimous approval among NATO states, including those outside of the EU. For example, Turkish reservations about Operation Concordia using NATO assets delayed its deployment by more than five months.
Operation Commander
Each OHQ is led by an Operation Commander (OpCdr).
When the MPCC acts as OHQ, the OpCdr is the MPCC Director, who is also Director General of the European Union Military Staff (EUMS).
When the NATO Command Structure (NCS) provides the OHQ, the OpCdr is the Deputy Supreme Allied Commander Europe (DSACEUR).
Operational level
The OHQ directs the subordinate Force Headquarters (FHQ), which carries out the operation on the tactical level (i.e. on the ground). The FHQ is led by a Force Commander (FCdr).
In case the MPCC acts as OHQ, the FHQ is termed Mission Force Headquarters (MFHQ) instead. The MFHQ is led by a Mission Force Commander (MFCdr).
Tactical level
The FCdr/MFCdr directs Component Commanders (CCs) for all service branches that may be required as part of the operation. The military forces within each component are subordinate to the CC.
Civilian-military coordination
In the event that both a military and civilian mission are in the field, the military OHQ and its Operation Commander (OpCdr) coordinate relations on the strategic level horizontally with the Civilian Planning and Conduct Capability (CPCC) and its Civilian Operation Commander (Civ OpCdr). Equally, on the tactical level the military Force Headquarters (FHQ) and its Force Commander (FCdr) coordinate relations horizontally with the civilian Head of Mission (HoM).
If the Military Planning and Conduct Capability acts as OHQ, it will coordinate its relations with the CPCC through the Joint Support Coordination Cell (JSCC).
See also
Command and control
Structure of the Common Security and Defence Policy
List of military and civilian missions of the European Union
Berlin Plus agreement
Structure of NATO
References
External links
European Union Concept for Command and Control, European External Action Service
STRATEGIC COMMAND AND CONTROL (C2) SYSTEM FOR CSDP MISSIONS AND OPERATIONS, Permanent Structured Cooperation
Military of the European Union
European Union |
52154682 | https://en.wikipedia.org/wiki/List%20of%20tech%20companies%20in%20the%20New%20York%20metropolitan%20area | List of tech companies in the New York metropolitan area | Technology companies in the New York City metropolitan area represent a significant and growing economic component of the New York metropolitan area, the most populous combined statistical area in the United States and one of the most populous urban agglomerations in the world. In the region's Silicon Alley, new establishments include those of Israeli companies in New York City, at a rate of ten new startups per month, and the technology sector has been claiming a greater share of New York City's economy since 2010. Tech:NYC, founded in 2016, is a non-profit organization that represents New York City's technology industry with government, civic institutions, business, and the media, and whose primary goals are to attract tech talent to the city and to advocate for policies that will help tech companies grow.
The following is a partial and growing list of notable New York metropolitan area tech companies:
Apps
FanDuel
Moonit
Trello
Artificial intelligence
Enigma Technologies
IBM Watson
Cloud and database services
MongoDB
BetterCloud
IBM
KeyMe
LiveTiles
SimpleReach
SocialFlow
Zeta Global
Digital media
33Across
AppNexus
Arkadium
Behance
DoubleClick
Innovid
Invite Media
JW Player
Kaltura
Mic
SoundCloud
Spotify
Squarespace
Stack Exchange
Taboola
Vimeo
Zola Books
Financial technology (Fintech)
Betterment
Bloomberg L.P.
Current
E-Trade
Finco Services Inc
SeedInvest
Stash (company)
Two Sigma
Hardware
HTC
Huawei
IBM
LG Electronics
Nokia Bell Labs
Samsung Electronics
TCL Corporation
ZTE
Latch
Health services technology
Oscar Health
Phreesia
Zocdoc
Flatiron Health
Life insurance
Haven Life
Software
Animoto
Appetizer Mobile
Atavist
Barnes & Noble
Blackbird Group
CA Technologies
Cockroach Labs
Cognizant
ConsenSys
Datadog
Deeplink
DigitalOcean
Enterproid
Fog Creek Software
Greenhouse Software
Helix Software Company
Impelsys
Infor
letgo
LivePerson
Mediaocean
MongoDB
Oscar Health
Paribus
Primer Archives
Q-Sensei
Safefood 360°
Telmar (company)
Videology
Yext
Perpetual
Software as a service (SaaS)
Diligent
Marpipe
Medidata Solutions
Other services
Andela
Birchbox
Blue Apron
ButterflyMX
Cambridge Analytica
CARTO
ClassPass
Etsy
Foursquare
Gilt Groupe
Integral Ad Science
Jet.com
Kickstarter
littleBits
Mimeo, Inc
OkCupid
Paddle8
Panjiva
Rent the Runway
Seamless
SeatGeek
Shapeways
Sharp Electronics Corporation USA
ShopKeep
Shutterstock
Skillshare
Squarefoot
Peloton
See also
BioValley
List of biotech and pharmaceutical companies in the New York metropolitan area
List of companies based in New York City
Silicon Alley
Silicon Hills
Tech Valley
References
New York
Software companies based in New York City
New York City |
32186915 | https://en.wikipedia.org/wiki/Nikola%20Vu%C4%8Devi%C4%87 | Nikola Vučević | Nikola Vučević (Serbian Cyrillic: Никола Вучевић, ; born 24 October 1990) is a Montenegrin professional basketball player for the Chicago Bulls of the National Basketball Association (NBA). He played college basketball for the University of Southern California before being drafted 16th overall in the 2011 NBA draft by the Philadelphia 76ers.
Vučević, who spent his rookie season with the 76ers, was traded to Orlando Magic before the start of the 2012–13 season as a part of the four-team trade that sent Dwight Howard to the Los Angeles Lakers. He played nine seasons for the Magic and was named an NBA All-Star twice during his tenure with the team. In the middle of the 2020–21 season, the Magic traded Vučević to the Chicago Bulls.
Early life
Vučević was born in Morges, Switzerland, during the time his father, professional basketball player Borislav, played for a club based in nearby Lausanne. He was primarily raised in Belgium, where the family moved in 1994 when his father got a professional contract there. Borislav Vučević played professionally for 24 years, a journeyman career that included stops in Yugoslavia, Switzerland, and Belgium; he was a member of the KK Bosna team that won the European Champions Cup in 1979 and made several appearances for the Yugoslavia national team, primarily at the 1983 Mediterranean Games in Casablanca, Morocco, and EuroBasket 1985 in West Germany. Vučević's mother, Ljiljana Kubura, an ethnic Serb, was a 6-foot-2 forward for the Sarajevo club Željezničar, as well as for the Yugoslavia women's national team.
His family moved to Montenegro when he was a teenager. In 2006, Vučević was a survivor in the Bioče derailment, a train crash that killed at least 45 people and wounded 184 others.
In 2007, seventeen-year-old Vučević was named Montenegro's Best Young Player.
High school
Vučević moved to Simi Valley, California in the United States in October 2007 to play his senior year of high school at Stoneridge Prep. He knew little English, but did speak French, which many of his teammates also spoke. Under coach Babacar Sy, a friend of his father's, he was team captain and led the team in scoring and rebounding with 18 points and 12 rebounds per game.
College career
Vučević played three seasons with the Trojans of the University of Southern California.
Freshman
Vučević missed the first eight games of the season while waiting to have his amateur status confirmed by the NCAA Clearinghouse. He averaged 2.6 points and 2.7 rebounds in 23 games in three starts. Vučević played in his first game with USC on 15 December 2008, against Pepperdine, and had two points, two blocks, and two rebounds in six minutes. He made his first start of the season on 24 January 2009, at Washington State in the Trojans' 46–44 win with a season-high eight points and five rebounds. He also scored eight points on 9 February 2009 at UCLA, and in his second start of the season on 19 February against Washington State. Vučević had a season-best seven rebounds in that game and matched that total on 5 March 2009, vs. Oregon. Vučević scored six points and had four rebounds in the NCAA second-round loss to Michigan State on 22 March. In all, he made 57.8 percent of his shots from the field (26-for-45).
Sophomore
Vučević began to excel in his sophomore season. He scored 18 points and had eight rebounds in the first game of the season against UC Riverside on 17 November 2009, both totals better than any of his freshman games. Vučević had 18 points and 14 rebounds at Texas on 3 December 2009. He scored a career-high 19 points and had 11 rebounds vs. Loyola Marymount on 21 November 2009, for his first career double-double. He matched his career high with 19 points on 9-of-12 shooting at UCLA on 16 January 2010, scoring 17 points in the second half. By the end of the year, he had led USC in scoring five times and in rebounding 20 times, including the last nine games.
Overall, he was the second-best scorer and leading rebounder on the Trojans, with 10.7 points and 9.4 rebounds per game. Vučević led the Pac-10 with 283 rebounds and offensive rebounds per game (6.3) and his 39 blocks were the fourth most in the conference. Vučević's .504 shooting percentage (126-for-250) led USC and was seventh best in the Pac-10. Vučević was named the 2009–10 Pac-10 Most Improved Player, and earned all-Pac-10 second team and Pac-10 honorable-mention all-defensive team honors. He had the second-most blocks ever in a season by a Trojan sophomore and the third-most rebounds. Vučević started all 30 games for USC and posted 10 double-doubles.
Junior
As a junior, Vučević was named Fourth Team All-America by Fox Sports and was selected to the All-Pac-10 first team. In March 2011, Vučević announced that he would forgo his senior year to enter the NBA draft. The website NBAdraft.net projected him as the 23rd pick in the draft.
During his stint with the Trojans, Vučević averaged 11.1 points and 8.0 rebounds per game.
Professional career
Philadelphia 76ers (2011–2012)
On 23 June 2011, Vučević was drafted with the 16th overall pick in the 2011 NBA draft by the Philadelphia 76ers. During the 2011 NBA lockout, Vučević played for Montenegrin team Budućnost Podgorica. Following the conclusion of the lockout, he returned to the United States and signed his rookie scale contract with the 76ers on 9 December 2011. On 22 February 2012, Vučević scored a season-high 18 points in a loss to the Houston Rockets.
Orlando Magic (2012–2021)
2012–13 season
On 10 August 2012, Vučević was traded to the Orlando Magic as a part of the blockbuster four-team deal that sent Dwight Howard to the Los Angeles Lakers. On 31 December 2012, in an overtime loss to the Miami Heat, Vučević set a franchise record with 29 rebounds. On 10 April 2013, he recorded his second straight 20/20 game with a career-high 30 points and 20 rebounds in a 113–103 win over the Milwaukee Bucks.
2013–14 season
On 6 November 2013, Vučević recorded a career-high tying 30 points and 21 rebounds in a 98–90 win over the Los Angeles Clippers. Vučević's strong play over the second half of the 2013–14 season was noticeable on 28 March 2014 when he dominated the Charlotte Bobcats. He overcame a slow start shooting the ball to finish the game with 24 points and 23 rebounds in an overtime victory. Vučević made nine of his last 11 shots to lead a Magic rally in the second half, while also grabbing 16 first-half rebounds and 10 offensive boards in the game, marking the sixth 20-point, 20-rebound game of his career. On 31 March, Vučević earned his first Eastern Conference Player of the Week honor, after he led the Magic to a 2-1 week. He logged a double-double average with 22.7 points (sixth in the conference) and a league-leading 14.3 rebounds. He posted a point-rebound double-double in all three contests. Vučević became the first Magic player to win the Eastern Player of the Week honors since Dwight Howard in 2012.
2014–15 season
On 23 October 2014, Vučević signed a four-year, $53 million contract extension with the Magic. On 3 April 2015, he scored a career-high 37 points in a 97–84 win over the Minnesota Timberwolves.
2015–16 season
On 11 November 2015, Vučević did not start for the Magic against the Los Angeles Lakers, returning to action after a three-game absence with a right knee contusion. Vučević, who had started all 223 games for the Magic over his four-season tenure, came off the bench for the first time and scored 18 points, including a fallaway 18-footer at the buzzer to lift the Magic over the Lakers 101–99. Vučević averaged 18.4 points, 9.0 rebounds and 3.0 assists in his first 12 games of December (1–23 December). Shaquille O'Neal is the only other Magic player to reach those numbers in one month in franchise history. On 7 February 2016, he scored 22 points and hit an 18-footer at the buzzer to lead the Magic over the Atlanta Hawks 96–94, winning for only the third time in 18 games in 2016. On 23 February, he scored a season-high 35 points in a 124–115 win over the Philadelphia 76ers. On 31 March, he returned to action after missing the previous 13 games with a right groin strain. He subsequently came off the bench for just the second time in his Magic tenure, as he scored 24 points in a 114–94 win over the Indiana Pacers. He came off the bench for a further three games before returning to the starting line-up on 8 April against the Miami Heat, where he scored a game-high 29 points in a 112–109 win.
2016–17 season
Vučević started in all 16 games for the Magic to start the 2016–17 season, coming off the bench for just the sixth time in his Magic tenure on 27 November 2016 against the Milwaukee Bucks. Vučević continued to come off the bench throughout December, while also missing three games between 10 and 14 December with a back injury. On 20 December, he had a season-high 26 points and 12 rebounds off the bench in a 136–130 double overtime win over the Miami Heat. He regained his starting spot in mid-January, and as a result, he hit 13 of 18 shots, scored a season-high 30 points and grabbed 10 rebounds to lead the Magic to a 115–109 victory over the Portland Trail Blazers on 13 January. On 7 February 2017, he had 14 points and a season-high 19 rebounds in a 128–104 loss to the Houston Rockets. On 11 March 2017, in a 116–104 loss to Cleveland, Vučević had a team-high 20 points and 16 rebounds after missing the previous four games with a sore right Achilles. On 10 April 2017, he grabbed 10 rebounds against Chicago to move ahead of Shaquille O'Neal into second place in franchise history, trailing only Dwight Howard.
2017–18 season
On 20 October 2017, Vučević scored a career-high 41 points and grabbed 12 rebounds in a 126–121 loss to the Brooklyn Nets. On 9 December 2017, he recorded his first career triple-double with game highs of 31 points, 13 rebounds and 10 assists in a 117–110 loss to the Atlanta Hawks. He became the first Magic center to record a triple-double with assists—Shaquille O'Neal and Dwight Howard each accomplished it with blocks. On 23 December, he suffered a fractured left hand against the Washington Wizards and was subsequently ruled out for six to eight weeks. He returned to action on 22 February 2018, against the New York Knicks after missing 23 games, recording 19 points and six rebounds in a 120–113 loss. On 14 March, he recorded 22 points, nine rebounds and nine assists in a 126–117 win over the Milwaukee Bucks.
2018–19 season
On 20 October 2018, Vučević recorded his second career triple-double with 27 points, 13 rebounds and 12 assists in a 116–115 loss to the Philadelphia 76ers. On 17 November, he had a season-high 36 points and 13 rebounds in a 130–117 win over the Los Angeles Lakers. Eight days later, he had 31 points, 15 rebounds and seven assists in a 108–104 win also against the Lakers. Prior to the second match-up against the Lakers, Lakers coach Luke Walton described Vučević as a "nightmare" to deal with because of the versatility of his offensive game. Following the 108–104 win, Lakers forward LeBron James noted that "He's (Vučević) got our number this year. All you can say." On 19 November, Vučević was named the Eastern Conference Player of the Week, after leading the Magic to a 3-1 record. It was the first time a Magic player had won the award since 2014, when Vučević himself captured the award for the first time in his NBA career. He averaged 27.6 points, 12.3 rebounds, 4.0 assists and 1.3 steals while shooting 58.1 percent from the floor and 47 percent from 3-point range. During his stretch of strong play, Vučević became the first Magic player to post at least 28 points and 10 rebounds in three consecutive games since Dwight Howard. He also became the first Magic player to score 30 points in back-to-back games since Victor Oladipo in March 2015. On 6 January 2019, Vučević had 16 points and 24 rebounds in a 106–96 loss to the Los Angeles Clippers. On 31 January, he received his first All-Star selection in his eight-year career, earning Eastern Conference reserve honors for the 2019 NBA All-Star Game. He became the Magic's first All-Star since Howard in 2012. On 17 March, he scored 17 of his 27 points in the first eight minutes of the game and added 20 rebounds in a 101–91 win over the Atlanta Hawks. Vučević helped the Magic go 22–9 over the final 31 games of the season to clinch their first playoff berth since 2012. In game one of the Magic's first-round playoff series against the Toronto Raptors, Vučević scored 11 points in a 104–101 upset victory. They went on to lose to the Raptors in five games.
2019–20 season
Vučević entered the offseason as an unrestricted free agent. On 6 July 2019, he signed a four-year, $100 million contract to remain with the Magic. On 17 November, Vučević posted 30 points on 11 of 14 shooting, including 3-for-4 from three along with 17 rebounds and six assists in a 125–121 victory against the Washington Wizards, putting up season-highs for points and rebounds while also logging his seventh consecutive double-double and his tenth overall on the season. In his efforts, Vučević set a franchise record for most 30-point, 15-rebound and five-assist games with four, surpassing Shaquille O’Neal with three. The following day, Vučević earned his third Eastern Conference Player of the Week honor, after logging a double-double average with 21.7 points, 14.0 rebounds to go along 4.0 assists and 1.33 blocks per game while shooting 54.2 percent overall and 66.7 percent from 3-point range. He led the Magic in scoring and rebounding and sparked its 3-0 record during the week with victories over the Philadelphia 76ers, the San Antonio Spurs and the Wizards.
2020–21 season
Though the Magic's hot start to the season was largely credited to a balanced attack, Vučević played a key role in it.
On 5 February 2021, Vučević scored a career-high 43 points, grabbed a season-high 19 rebounds and dished out 4 assists in a 123–119 win over the Chicago Bulls. He succeeded in two free throw shots with 2.6 seconds left to play, ending the Magic's fourth losing streak. In his efforts, Vučević joined Shaquille O’Neal and Dwight Howard as the third player in franchise history to log at least 43 points and 19 rebounds in a game. On 19 February, Vučević logged his third career triple-double with 30 points, 16 rebounds and 10 assists in a 124–120 win over the Golden State Warriors, becoming the second center (along with Nikola Jokić) to score a 30-point triple-double with zero turnovers since 1985. He also set a franchise record for the most triple-doubles for the center position. On 23 February, Vučević was named an Eastern Conference reserve for the 2021 NBA All-Star Game.
At the conclusion of his time with the Magic, Vučević ranked at or near the top of a number of key categories in franchise history, including first in career field goals made (4,490), second in rebounds (6,381), third in blocks (550), third in points scored (10,423), and fourth in games played (591). He also helped the franchise end its six-year playoff drought and reach the postseason in two straight seasons.
Chicago Bulls (2021–present)
2020–21 season
On 25 March 2021, the Orlando Magic traded Vučević along with Al-Farouq Aminu to the Chicago Bulls in exchange for Wendell Carter Jr., Otto Porter and two future first-round picks. Vučević had played and started in all 44 games with the Magic in the 2020–21 season, averaging 24.5 points, 11.8 rebounds and 3.8 assists while shooting 40.6% from three-point range and 82.7% from the free throw line. He was the fourth-leading rebounder in the league at the time. On 27 March, Vučević debuted for the Bulls in a 120–104 loss to the San Antonio Spurs, scoring a game-high 21 points, grabbing nine rebounds and dishing out four assists in 32 minutes of action. On 31 March, Vučević logged his first double-double as a Bull with 24 points and 10 rebounds in a 121–116 loss to the Phoenix Suns. On 4 April, Vučević posted his thirty-second double-double of the season and second as a Bull with 22 points and a game-high 13 rebounds along with two assists, two steals and two blocks in a 115–107 win over the Brooklyn Nets, ending the Bulls' longest losing streak of the season at six games. Two days later, Vučević tallied 32 points, 17 rebounds, and five assists in a 113–97 victory against the Indiana Pacers, becoming the third player in franchise history to log at least 30 points, 15 rebounds and five assists in a game, joining Joakim Noah and Pau Gasol. On 26 April, Vučević logged his third straight double-double and the fortieth of the season with 24 points and 11 rebounds in a 110–102 win over the Miami Heat. It was also his tenth double-double in his 18 games as a Bull. After missing two games due to a hip injury, Vučević returned to action on 6 May, recording his sixth consecutive double-double with 29 points and 14 rebounds, in addition to three assists, two steals and a block in a 120–99 victory over the Charlotte Hornets. The next day, Vučević posted his second triple-double of the season and the fourth of his career with 18 points, 14 rebounds and 10 assists in a 121–99 win over the Boston Celtics, becoming the first Bulls player to log a triple-double since Jimmy Butler in 2017.
2021–22 season
On 11 November 2021, Vučević entered the NBA's health and safety protocols after testing positive for COVID-19. After missing seven games, Vučević made his return on 24 November, posting a double-double with 14 points and 13 rebounds in a 118–113 loss to the Houston Rockets. On 29 November, Vučević logged 30 points on 6-of-6 shooting from three, in addition to 14 rebounds and five assists in a 133–119 victory over the Charlotte Hornets, joining Kemba Walker as one of only two players in NBA history to record at least 30 points, 10 rebounds and 5 assists with a 100% 3-point field goal percentage on at least 5 attempts. On 27 December, Vučević scored 24 points, knocking down four 3-pointers to go along with 17 rebounds, six assists and four blocks in a 130–118 victory against the Atlanta Hawks, becoming the first player in league history to post such a stat line. Two days later, Vučević logged a double-double with 16 points and 20 rebounds on 8-of-14 shooting from the field, to go along with one assist, three steals and one block across 33 minutes of play in a win over the Hawks, joining Joakim Noah and Tyson Chandler as only the third Bulls player since Dennis Rodman in the 1990s to grab at least 20 rebounds in fewer than 35 minutes of action. On 4 February 2022, Vučević finished with 36 points, 17 rebounds, four assists and three blocks in a 122–115 win over the Indiana Pacers, becoming the first Bulls player since Michael Jordan in 1996 to log 35 points, 15 rebounds and 3 blocks in a game. On 14 February 2022, Vučević scored 25 points, grabbed 16 rebounds and dished out five assists in a 120–109 win against the San Antonio Spurs.
National team career
Vučević represented the Montenegro Under-20 team at the FIBA Europe Under-20 Championship. He then represented the senior Montenegro national basketball team at FIBA EuroBasket 2011, FIBA EuroBasket 2013 and FIBA EuroBasket 2017. He averaged 5.0 points per game and 3.2 rebounds per game in 2011 while backing up Nikola Peković. With Peković out of the 2013 tournament, Vučević started for the team and put up 7.0 points per game and 4.0 rebounds per game.
Personal life
In 2016, Vučević married longtime girlfriend and fellow Montenegrin Nikoleta Pavlović, sister of former NBA player Aleksandar "Sasha" Pavlović. The couple have two sons, Filip (born 17 December 2018) and Matija (born 1 October 2020).
Vučević is an ethnic Serb and a Serbian Orthodox Christian. He speaks French, Serbian and English. He also holds dual citizenship with Montenegro and Belgium.
Vučević is a fan of KK Crvena zvezda, FK Crvena zvezda and Juventus Torino. On 27 December 2021, he was elected to a five-year term as a member of the Assembly of KK Crvena zvezda.
Career statistics
NBA
Regular season
! Year !! Team !! GP !! GS !! MPG !! FG% !! 3P% !! FT% !! RPG !! APG !! SPG !! BPG !! PPG
|-
| style="text-align:left;"|
| style="text-align:left;"|Philadelphia
| 51 || 15 || 15.9 || .450 || .375 || .529 || 4.8 || .6 || .4 || .7 || 5.5
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 77 || 77 || 33.2 || .519 || .000 || .683 || 11.9 || 1.9 || .8 || 1.0 || 13.1
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 57 || 57 || 31.8 || .507 || || .766 || 11.0 || 1.8 || 1.1 || .8 || 14.2
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 74 || 74 || 34.2 || .523 || .333 || .752 || 10.9 || 2.0 || .7 || .7 || 19.3
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 65 || 60 || 31.3 || .510 || .222 || .753 || 8.9 || 2.8 || .8 || 1.1 || 18.2
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 75 || 55 || 28.8 || .468 || .307 || .669 || 10.4 || 2.8 || 1.0 || 1.0 || 14.6
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 57 || 57 || 29.5 || .475 || .315 || .819 || 9.2 || 3.4 || 1.0 || 1.1 || 16.5
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 80 || 80 || 31.4 || .518 || .364 || .789 || 12.0 || 3.8 || 1.0 || 1.1 || 20.8
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 62 || 62 || 32.2 || .477 || .339 || .784 || 10.9 || 3.6 || .9 || .8 || 19.6
|-
| style="text-align:left;"|
| style="text-align:left;"|Orlando
| 44 || 44 || 34.1 || .480 || .406 || .827 || 11.8 || 3.8 || 1.0 || .6 || 24.5
|-
| style="text-align:left;"|
| style="text-align:left;"|Chicago
| 26 || 26 || 32.6 || .471 || .388 || .870 || 11.5 || 3.9 || .9 || .8 || 21.5
|- class="sortbottom"
| style="text-align:center;" colspan="2"|Career
| 668 || 607 || 30.6 || .496 || .357 || .756 || 10.4 || 2.7 || .9 || .9 || 16.9
|- class="sortbottom"
| style="text-align:center;" colspan="2"|All-Star
| 2 || 0 || 12.5 || .500 || .200 || || 6.0 || 1.5 || 1.0 || .0 || 4.5
Playoffs
! Year !! Team !! GP !! GS !! MPG !! FG% !! 3P% !! FT% !! RPG !! APG !! SPG !! BPG !! PPG
|-
| style="text-align:left;"|2012
| style="text-align:left;"|Philadelphia
| 1 || 0 || 3.0 || .000 || || .500 || 1.0 || .0 || .0 || .0 || 1.0
|-
| style="text-align:left;"|2019
| style="text-align:left;"|Orlando
| 5 || 5 || 29.4 || .362 || .231 || .786 || 8.0 || 3.0 || .4 || 1.0 || 11.2
|-
| style="text-align:left;"|2020
| style="text-align:left;"|Orlando
| 5 || 5 || 37.0 || .505 || .409 || .909 || 11.0 || 4.0 || .8 || .6 || 28.0
|- class="sortbottom"
| style="text-align:center;" colspan="2"|Career
| 11 || 10 || 30.5 || .453 || .368 || .815 || 8.7 || 3.2 || .5 || .7 || 17.9
College
! Year !! Team !! GP !! GS !! MPG !! FG% !! 3P% !! FT% !! RPG !! APG !! SPG !! BPG !! PPG
|-
| style="text-align:left;"|2008–09
| style="text-align:left;"|USC
| 23 || 3 || 11.0 || .578 || .000 || .875 || 2.7 || .3 || .4 || .4 || 2.6
|- class="sortbottom"
| style="text-align:left;"|2009–10
| style="text-align:left;"|USC
| 30 || 30 || 32.3 || .504 || .222 || .718 || 9.4 || 1.2 || .6 || 1.3 || 10.7
|- class="sortbottom"
| style="text-align:left;"|2010–11
| style="text-align:left;"|USC
| 34 || 34 || 34.9 || .505 || .349 || .755 || 10.3 || 1.6 || .5 || 1.4 || 17.1
|- class="sortbottom"
| style="text-align:center;" colspan="2"|Career
| 87 || 67 || 27.7 || .509 || .303 || .746 || 8.0 || 1.1 || .5 || 1.1 || 11.1
See also
List of European basketball players in the United States
List of Montenegrin NBA players
References
External links
Nikola Vučević at draftexpress.com
USC Trojans bio
1990 births
Living people
2019 FIBA Basketball World Cup players
ABA League players
Centers (basketball)
Chicago Bulls players
KK Budućnost players
Members of the Assembly of KK Crvena zvezda
Montenegrin expatriates in Belgium
Montenegrin expatriates in Switzerland
Montenegrin expatriate basketball people in the United States
Montenegrin men's basketball players
National Basketball Association All-Stars
National Basketball Association players from Montenegro
National Basketball Association players from Switzerland
Orlando Magic players
People from Morges
Philadelphia 76ers draft picks
Philadelphia 76ers players
Serbs of Montenegro
USC Trojans men's basketball players |
19210030 | https://en.wikipedia.org/wiki/Music%20Wizard | Music Wizard | Music Wizard Group is a software development firm that develops and publishes software products to teach students to play various musical instruments through MIDI software and a Guitar Hero-like interface. Unlike Guitar Hero, it uses real instruments and teaches to read sheet music as well.
History
Founder and CEO Chris Salter entered Southern Illinois University Carbondale (SIU) in 1978 to study cinematography and began producing films about music. Shortly after, Salter met piano instructor Don Beattie, who had come to the School of Music in 1979, and joined a group piano class. Very inspired by Prof. Beattie's innovative work and approach, he began to take many other music classes while studying directly with Prof. Beattie for the next four years. In parallel, he changed his major to Linguistics and began to study French, Spanish, Chinese, and Japanese. It was in this way that he gained his key insights about developmental linguistics and became curious about the potential of learning music as if it were a native language. He eventually took so many music courses that he ended up earning a double degree in Music and Linguistics from SIU. Combining his two passions, he was able to enter the nationally recognized Ethnomusicology program at UCLA, win a two-year Organization of American States Fellowship to study abroad in Brazil for his master's thesis research, and then return to earn his Master's in Musicology from UCLA in 1990. His thesis insights into the role visual cues play in teaching rhythm and guitar in Brazil led him to think about how such cues might facilitate learning to play other instruments, such as the piano.
Years before, Salter had taken classes to learn how to type, with little success, finding it boring and tedious. With the early Apple computers he played a typing game, and soon he was typing 40 words per minute. It was then that he first had the idea that a piano video game could have the same effect, with the added complexity that precision timing matters much more than it does in typing. Around this time the MIDI protocol for computers to deal with music was created, and the combination of the two concepts led to the possibility of a game, with a twist. Salter's idea was to start with a simple game, but then transition to reading music notation, allowing even very young children to learn to play and read music without the traditional necessity of music theory and notation deciphering as a "prerequisite" to playing.
After years of consideration, Salter decided to form a business to develop and manufacture the Piano Commando game (which went on to become Piano Wizard). Salter incorporated his new business under the name Allegro Multimedia, although the company is better known under the DBA (Doing Business As) Music Wizard.
Understanding how students can benefit from music education, Salter met with Don Beattie, his former piano teacher at the SIU School of Music, to see how piano teachers could best use the game in their classrooms. The plan was to introduce a week-long summer boot camp at the school. After the success of the boot camp, in the fall of 2005, Don and his wife Delayna founded and directed the Piano Wizard Academy at SIU Carbondale. Over the next three years, their work as Academy directors opened new horizons for young children and adult piano beginners, and they were challenged to "productize" their work and create a self-sufficient package that made the most of the video game's potential. Don and Delayna finalized the 100 Song Lesson Series that is now the "Academy music curriculum" for the game. Music Wizard and the Beatties then collaborated on a series of 50 Tutorial DVD and Songbook lessons for the Academy Music Library. These materials were meant to help parents, non-music educators and piano teachers alike build on the concentrated, concrete learning the game provides by layering in artistic technique and music theory as needed. This has proven to be a dramatic success, with hundreds of positive reviews and testimonials. In particular, homeschool families and special-needs communities have embraced it warmly due to the ease, affordability and sometimes dramatic success possible using these tools.
Products
Piano Wizard
Released in 2005, this educational software helps teach the student to play the piano through a patented 4-step learning approach. While the software was available separately, a bundle was made available that included an M-Audio Keystation 49e keyboard. This software is no longer sold separately since the development of the full Piano Wizard Academy.
Piano Wizard Academy
The Piano Wizard Academy version is more popular as it introduces another critical level of music learning, i.e. "Step 5" where the student is helped to get off the game, and read sheet music at the piano. This allows many more musical elements of playing to be introduced by the parent or facilitator, ensuring a deeper more artistic experience rather than just playing a video game. In addition to the core software, it also includes over 50 video lessons and sheet music, to demonstrate how to move children, adults or themselves through the levels of the game, and then to transition off the game to playing a real piano and reading basic musical notation.
The academy version allows unlimited import of MIDI files. The concept is to allow the use of these files as a tool for the student to play (on piano) virtually any song written in Western notation once imported into the program. The game's engine automatically renders game objects and music notes from the imported MIDI file. Currently there are hundreds of thousands of MIDI files online, comprising one thousand years of classical music repertoire, available for free or at low cost. The software allows almost any MIDI file to be opened, with the various tracks available for background play or to learn, and it adjusts to multiple common digital keyboard sizes, making the software extremely versatile. The 100-song Easy Mode curriculum included is the equivalent of about two years of children's piano lessons, though most students get through it in less than six months to a year because of the game's ability to dramatically shorten or eliminate the practice time normally needed to master a song. Each song has fingering indications built in for every note, an option useful for transitioning from the color-coded phases to reading black music notation. The Piano Wizard Academy package includes another 100 songs, most of them custom arranged to fit easily within a 4-octave keyboard. Imported MIDI files, however, are more like a "box of chocolates": they were created by thousands of music enthusiasts over the last couple of decades and uploaded and shared without any filtering or quality control, so the quality of any given MIDI file ranges dramatically. That said, this open approach allows for a much longer and more varied use of the product, making it potentially a music learning system for life.
In 2007, Fisher Price licensed the 4-step method for use in their I Can Play Piano product.
A touchscreen Piano Wizard app is currently in development for iPhone and Android platforms.
Impact of music training on cognitive development
A number of customers in the home school market tried the game to see if the simple, visual, game-like style of the system would allow even special needs children access to the benefits of music education. While not designed specifically for this market, a surprising number of sometimes dramatic stories of cognitive development have emerged spontaneously from customer testimonials. While these success stories with Piano Wizard are anecdotal, not scientific, they are compelling, and often very moving. There is significant scientific research on the effects of music training, especially piano, on the brain. The newest advances in neuroscience, especially the latest technologies like functional MRI, PET scans, diagnostic EKGs, etc., allow scientists to measure the "neuroplasticity" and impact of music on cognitive development. In fact, music training is emerging as a kind of supertonic, having a "metaplastic" effect on the brain, impacting many other areas of cognitive competence, with lifelong benefits. The company is exploring participating in more rigorous and formal neuroscience studies to test the hypothesis that the game system allows a much wider range of people to obtain these known music training benefits faster, easier and more cost effectively. In particular, they are looking not only at participating in studies of early cognitive development, but also at studies involving people affected by autism, Down's syndrome and other special needs, and even seniors with dementia, Alzheimer's, stroke or other forms of brain trauma or damage. The company is currently seeking research partners and non-profits interested in engaging in these more formal kinds of studies.
Guitar Wizard
Guitar Wizard is intended to teach guitar skills. The software version is under development, but there was a toy version called Mattel's I Can Play Guitar, aimed at young children, which has now been discontinued.
A package in development by the manufacturer will bundle the software with an acoustic guitar with a MIDI pickup or guitar controller.
Licensing
See also
Guitar Rising
References
External links
Official site
Music education video games
Musical training software
Software companies based in Colorado
Music video games
Music education organizations
Piano
Software companies of the United States |
47013794 | https://en.wikipedia.org/wiki/WebAssembly | WebAssembly | WebAssembly (sometimes abbreviated Wasm) defines a portable binary-code format and a corresponding text format for executable programs as well as software interfaces for facilitating interactions between such programs and their host environment.
The main goal of WebAssembly is to enable high-performance applications on web pages, "but it does not make any Web-specific assumptions or provide Web-specific features, so it can be employed in other environments as well." It is an open standard and aims to support any language on any operating system, and in practice all of the most popular languages already have at least some level of support.
Announced in 2015 and first released in March 2017, WebAssembly became a World Wide Web Consortium recommendation on 5 December 2019, and it received the Programming Languages Software Award from ACM SIGPLAN in 2021. The World Wide Web Consortium (W3C) maintains the standard with contributions from Mozilla, Microsoft, Google, Apple, Fastly, Intel, and Red Hat.
History
WebAssembly was first announced in 2015, and the first demonstration was executing Unity's Angry Bots in Firefox, Google Chrome, and Microsoft Edge. The precursor technologies were asm.js from Mozilla and Google Native Client, and the initial implementation was based on the feature set of asm.js. The asm.js technology already provides near-native code execution speeds and can be considered a viable alternative for browsers that don't support WebAssembly or have it disabled for security reasons.
In March 2017, the design of the minimum viable product (MVP) was declared to be finished and the preview phase ended. Safari 11 was released with support. In February 2018, the WebAssembly Working Group published three public working drafts for the Core Specification, JavaScript Interface, and Web API.
In June 2019, Chrome 75 was released with WebAssembly threads enabled by default.
Implementations
While WebAssembly was initially designed to enable near-native code execution speed in the web browser, it has also been considered valuable in more general contexts. Since WebAssembly's runtime environments (REs) are low-level virtual stack machines (akin to the JVM or Flash VM) that can be embedded into host applications, some have also found their way into standalone runtime environments such as Wasmtime and Wasmer.
Web browsers
In November 2017, Mozilla declared support "in all major browsers", after WebAssembly was enabled by default in Edge 16. The support includes mobile web browsers for iOS and Android. 95% of installed browsers support WebAssembly. For older browsers, Wasm can be compiled into asm.js by a JavaScript polyfill.
Compilers
WebAssembly implementations usually use either ahead-of-time (AOT) or just-in-time (JIT) compilation, but may also use an interpreter. While the first implementations have landed in web browsers, there are also non-browser implementations for general-purpose use, including Wasmer, Wasmtime or WAMR, wasm3, WAVM, and many others.
Because WebAssembly executables are precompiled, it is possible to use a variety of programming languages to make them. This is achieved either through direct compilation to Wasm, or through implementation of the corresponding virtual machines in Wasm. There have been around 40 programming languages reported to support Wasm as a compilation target.
Emscripten compiles C and C++ to Wasm using Binaryen and LLVM as its back ends. The Emscripten SDK can compile source code in any LLVM-supported language (such as C, C++ or Rust, among others) into a binary file which runs in the same sandbox as JavaScript code. Emscripten provides bindings for several commonly used environment interfaces like WebGL.
As of version 8, a standalone Clang can compile C and C++ to Wasm.
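As a rough illustration of these toolchains, the following minimal C program can be compiled to Wasm; the build commands in the comments are sketches only, and exact flags vary by Emscripten and LLVM version.
// With Emscripten: emcc hello.c -O2 -o hello.html
// With a standalone Clang (8+), the wasm32 target (--target=wasm32) can be
// used together with a Wasm-aware linker such as wasm-ld.
#include <stdio.h>
int main(void)
{
    printf("hello, wasm\n");
    return 0;
}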
WebAssembly's initial aim was to support compilation from C and C++, though support for other source languages such as Rust, .NET languages and AssemblyScript (TypeScript-like) is also emerging. After the MVP release, there are plans to support multithreading and garbage collection, which would make WebAssembly a compilation target for garbage-collected programming languages like C# (supported via Blazor), F# (supported via Bolero with help of Blazor), Python, and even JavaScript, where the browser's just-in-time compilation speed is considered too slow. A number of other languages have some support, including Python, Java, Julia, Zig, Ring and Ruby, as well as Go.
Limitations
In general, WebAssembly does not allow direct interaction with the DOM. All interaction must flow through JavaScript interop.
Absence of garbage collection (although there are plans to address this).
Security considerations (discussed below)
WebAssembly is supported on desktop and mobile, but on the latter, in practice (for non-small memory allocations, such as with the Unity game engine) there are "grave limitations that make many applications infeasible to be reliably deployed on mobile browsers [..] Currently allocating more than ~300MB of memory is not reliable on Chrome on Android without resorting to Chrome-specific workarounds, nor in Safari on iOS."
There is no direct Document Object Model (DOM) access; however, it is possible to create proxy functions for this, for example through stdweb or web_sys when using the Rust language.
All major web browsers allow WebAssembly if Content-Security-Policy is not specified, or if "unsafe-eval" is used, but otherwise the major web browsers behave differently. In practice WebAssembly can't be used on Chrome without "unsafe-eval", while a worker thread workaround is available.
Security considerations
In June 2018, a security researcher presented the possibility of using WebAssembly to circumvent browser mitigations for Spectre and Meltdown security vulnerabilities once support for threads with shared memory is added. Due to this concern, WebAssembly developers put the feature on hold. However, in order to explore these future language extensions, Google Chrome added experimental support for the WebAssembly thread proposal in October 2018.
WebAssembly has been criticized for allowing greater ease of hiding the evidence for malware writers, scammers and phishing attackers; WebAssembly is present on the user's machine only in its compiled form, which "[makes malware] detection difficult". The speed and concealability of WebAssembly have led to its use in hidden crypto mining on the website visitor's device. Coinhive, a now defunct service facilitating cryptocurrency mining in website visitors' browsers, claims their "miner uses WebAssembly and runs with about 65% of the performance of a native Miner." A June 2019 study from the Technische Universität Braunschweig analyzed the usage of WebAssembly in the Alexa top 1 million websites and found the prevalent use was for malicious crypto mining, and that malware accounted for more than half of the WebAssembly-using websites studied. An April 2021 study from Universität Stuttgart found that since then crypto mining has been marginalized, falling to below 1% of all WebAssembly modules gathered from a wide range of sources, also including the Alexa top 1 million websites.
The ability to effectively obfuscate large amounts of code can also be used to disable ad blocking and privacy tools that prevent web tracking like Privacy Badger.
As WebAssembly supports only structured control flow, it is amenable toward security verification techniques including symbolic execution. Current efforts in this direction include the Manticore symbolic execution engine.
WASI
WebAssembly System Interface (WASI) is a simple interface (ABI and API) designed by Mozilla intended to be portable to any platform. It provides POSIX-like features like file I/O constrained by capability-based security. There are also a few other proposed ABI/APIs.
WASI is influenced by CloudABI and Capsicum.
Solomon Hykes, a co-founder of Docker, wrote in 2019, "If WASM+WASI existed in 2008, we wouldn't have needed to create Docker. That's how important it is. WebAssembly on the server is the future of computing." Wasmer, which has reached version 1.0, provides "software containerization, we create universal binaries that work anywhere without modification, including operating systems like Linux, macOS, Windows, and web browsers. Wasm automatically sandboxes applications by default for secure execution".
Specification
Host environment
The general standard provides core specifications for JavaScript API and details on embedding.
Virtual machine
Wasm code (binary code, i.e. bytecode) is intended to be run on a portable virtual stack machine (VM). The VM is designed to be faster to parse and execute than JavaScript and to have a compact code representation. External functionality (such as syscalls) that Wasm binary code may expect is not stipulated by the standard; instead, the host environment that the VM implementation runs in delivers such interfacing via modules.
Wasm program
A Wasm program is designed to be a separate module containing collections of various Wasm-defined values and program type definitions. These are expressed in either binary or textual format (see below) that both have a common structure.
Instruction set
The core standard for the binary format of a Wasm program defines an instruction set architecture (ISA) consisting of specific binary encodings of types of operations which are executed by the VM (without specifying how exactly they must be executed). The list of instructions includes standard memory load/store instructions, numeric, parametric, and control-flow instruction types, and Wasm-specific variable instructions.
The number of opcodes used in the original standard (MVP) was a bit fewer than 200 of the 256 possible opcodes. Subsequent versions of WebAssembly pushed the number of opcodes a bit over 200. The WebAssembly SIMD proposal (for parallel processing) introduces an alternate opcode prefix (0xfd) for 128-bit SIMD. The concatenation of the SIMD prefix, plus an opcode that is valid after the SIMD prefix, forms a SIMD opcode. The SIMD opcodes bring an additional 236 instructions for the "minimum viable product" (MVP) SIMD capability (for a total of around 436 instructions). Those instructions, the "finalized opcodes", are implemented in Google's V8 (in Google Chrome) and in the corresponding engine in Mozilla Firefox (but not enabled in stable versions of the web browsers); there are also additional proposals for instructions for a later "post-SIMD MVP", and a separate "relaxed-simd" proposal is on the table.
These SIMD opcodes are also portable and translate to native instruction sets like x64 and ARM. In contrast, neither Java's JVM nor the CIL supports SIMD at the opcode level, i.e. in the standard; both do have some parallel APIs which provide SIMD speedups. There is an extension for Java adding intrinsics for x64 SIMD, but it is not portable, i.e. not usable on ARM or smartphones. Smartphones can support SIMD by calling assembly code with SIMD, and C# has similar support.
Code representation
In March 2017, the WebAssembly Community Group reached consensus on the initial (MVP) binary format, JavaScript API, and reference interpreter. It defines a WebAssembly binary format, which is not designed to be used by humans, as well as a human-readable WebAssembly text format that resembles a cross between S-expressions and traditional assembly languages.
The table below shows an example of a factorial function written in C and its corresponding WebAssembly code after compilation, shown both in text format (a human-readable textual representation of WebAssembly) and in binary format (the raw bytecode, expressed below in hexadecimal), that is executed by a Web browser or run-time environment that supports WebAssembly.
All integer constants are encoded using a space-efficient, variable-length LEB128 encoding.
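As a sketch of how this encoding works (the function name below is illustrative, not part of any WebAssembly toolchain API): each byte stores seven bits of the value, and the high bit marks whether another byte follows. Sizes and indices use the unsigned variant shown here; signed constants use a closely related signed variant.
#include <stdint.h>
#include <stddef.h>
// Unsigned LEB128 encoding: low 7 bits per byte, high bit set while more
// bytes follow. Returns the number of bytes written to out.
size_t uleb128_encode(uint64_t value, uint8_t *out)
{
    size_t n = 0;
    do {
        uint8_t byte = value & 0x7f;
        value >>= 7;
        if (value != 0)
            byte |= 0x80;   // continuation bit
        out[n++] = byte;
    } while (value != 0);
    return n;
}
// Example: 624485 encodes as the three bytes 0xE5 0x8E 0x26.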
The WebAssembly text format is more canonically written in a folded format using S-expressions. For instructions and expressions, this format is purely syntactic sugar and has no behavioral differences with the linear format. Run through a decompiler, the code above decompiles to:
(module
  (type $t0 (func (param i64) (result i64)))
  (func $f0 (type $t0) (param $p0 i64) (result i64)
    (if $I0 (result i64) ;; $I0 is an unused label name
      (i64.eqz
        (local.get $p0)) ;; the name $p0 is the same as 0 here
      (then
        (i64.const 1))
      (else
        (i64.mul
          (local.get $p0)
          (call $f0 ;; the name $f0 is the same as 0 here
            (i64.sub
              (local.get $p0)
              (i64.const 1))))))))
Note that a module is implicitly generated by the compiler. The function is actually referenced by an entry in the type table in the binary, hence the type section emitted by the decompiler. The compiler and decompiler can be accessed online.
Notes
See also
Architecture Neutral Distribution Format (ANDF)
UNCOL
Java bytecode
Common Language Runtime
LLVM
Compilation
Software portability
References
External links
W3C Community Group
WebAssembly Design
with info on browser compatibility and specifications (WebAssembly JavaScript API)
Assembly languages
Computer-related introductions in 2015
World Wide Web Consortium standards
Web programming |
11025173 | https://en.wikipedia.org/wiki/DPubS | DPubS | DPubS (Digital Publishing System), developed by Cornell University Library and Penn State University Libraries, is a free open access publication management software. DPubS arose out of Project Euclid, an electronic publishing platform for journals in mathematics and statistics. DPubS is free software released under Educational Community License.
History
Cornell University Library's involvement in digital publishing dates back to the 1980s. In partnership with the Xerox Corporation and the Commission on Preservation and Access, Cornell developed an early digital imaging project to preserve books in a fragile condition. Initially focused upon republishing mathematics titles, this effort expanded to include projects in agricultural history, home economics and American studies.
The “serials crisis” in the 1980s and 1990s likely encouraged Cornell University Library and other academic libraries and institutions to investigate such possibilities. In the 1980s libraries noticed that their journal subscription prices were increasing alarmingly. By the early 1990s, many solutions were being explored, with cancellations being significant among them; in one dramatic case, LSU cancelled $650,000 in subscriptions in 1992-93. Other alternatives emerged, however, involving the use of new technologies – such as those that enabled Cornell's digital imaging project – and the increasing availability of the Internet.
One such method of increasing access, Project MUSE, was initiated by Johns Hopkins University Press. Initially, Project Muse was intended to allow electronic access to titles published by Johns Hopkins University Press, but it has expanded to include the “full text of more than 300 journals from 60 different publishing groups worldwide." Another such project, developed by Cornell University Library and influenced by Project Muse, is Project Euclid, an electronic gathering of mathematics and statistics journals. As of 2005 it was delivering “40 journals to libraries and individuals under subscription, hosting, or open access delivery plans." Project Euclid was developed out of the code used to create NCSTRL, “a distributed network of Computer Science technical reports” in Cornell's Computer Science department. It offers an opportunity for “low-cost independent and society journals” to take advantage of the benefits of inclusion in an online database “without sacrificing their intellectual or economic independence or commitment to low subscription prices." Several pricing options are available: Euclid Prime (EP), Euclid Select (ES), Euclid Direct (ED), and Open Access (OA).
Developed with the help of two grants from The Andrew W. Mellon Foundation, Project Euclid, named after ancient mathematician Euclid of Alexandria, launched in 2003. As did their experience developing the digital imaging project, Project Euclid afforded Cornell University Library the opportunity to learn a great deal. Specifically, the library discovered much about functioning as a digital press that it was previously unfamiliar with, such as “marketing or handling subscription requests . . . editorial management procedures and the ability to negotiate contracts with journal owners." While Euclid has been successful thus far – reported as “healthy and growing” in early 2006 – Cornell's heavy investment in the project and the ever-changing nature of the academic journal field where “sustainability is a moving target” has led to the exploration of other publishing avenues.
Another initiative with relevance to the development of DPubS, arXiv, came to Cornell along with its initial developer Paul Ginsparg in 2001. Institutional repositories, which serve as a central database of scholarly work such as preprints and postprints of journal articles, have become increasingly popular: OpenDOAR, an online directory of open access repositories, has shown an increase from 350 to 850 repositories included in its database since mid-2006. Use of arXiv has been described as “intense,” averaging about 4,000 submissions per month in 2005. Though many repositories – including all of those listed in OpenDOAR – are open access, they “have not substituted for traditional publications, and thus have not had a substantial impact on the journals pricing situation." However, the success of open access repositories such as arXiv could indicate a growing willingness on the part of scholars to make use of non-traditional methods of publishing for their work.
Apart from Cornell's own desire to inquire further into unconventional approaches to publishing, there was an additional motivator. One of the results of the release of Project Euclid was interest in the software used to produce it. Cornell decided that they would eventually release this software, renaming it DPubS, but that it needed further development in order to be utilized by others. It was in 2004 that the Pennsylvania State University Libraries became involved – expressing interest in the software that was used to develop Project Euclid – and the first project in developing DPubS was making available the journal Pennsylvania History: A Journal of Mid-Atlantic Studies. This journal has been published since 1934 and is an official publication of the Pennsylvania Historical Association (PHA).
The software was last updated on 2013-04-02.
Goals
Also in 2004, Terry Ehling, the Director of the Center for Innovative Publishing at Cornell University Library addressed four goals for developing the software used to create Project Euclid into DPubS. These goals included: broadening the software's applicability by expanding its flexibility, including improving its ability to be used for monographs and other “non-serial literature”; “provide on-line editorial management services to support ‘peer review’ activities”; further developing “the administrative functionality and interface”; and “provide interoperability with institutional repository systems." Furthermore, the DpubS website states the following as the development goals of Cornell University Library and Penn State University Libraries: “generalizing the platform beyond a single discipline and document format (serials); adding administrative interfaces for non-technical staff; allowing a level of interoperability between DPubS and institutional repository systems, specifically Fedora and DSpace and developing editorial services to support the peer review process.”
Features
After two years of development, DPubS was released in November 2006, also with thanks to a grant of the Andrew W. Mellon Foundation. Its user interface utilizes XML (Extensible Markup Language) and XSLT (eXtensible Style Sheet Language Transformations) which enable a high-level of adjustment for the design of the Web appearance for publications supported by DPubS. Additionally, it has the following features: “scalable, single platform for electronic publishing,” allowing for the publication of several formats from one place; “rich presentation features,” due to the inclusion of XML; “multiple business models,” allowing both publications that are open access and those that are fee-based to utilize the software; “greater exposure and visibility of publications,” due to the use of OAI-MHP 2.0 (Open Access Initiative Metadata Harvesting Protocol) to allow metadata to be harvested from the content supported by DPubS and shared with users through services such as Google Scholar; “administrative management tools for non-technical staff”; “interoperability with institutional repositories” such as Fedora and DSpace (the latter forthcoming as of April 2007); “flexible and extensible handling of file and metadata formats,” allowing the easy use of PDFs, HTML, Microsoft Word files, PowerPoint presentations, etc.; and a “modular architecture allowing easy extension and customization.
Cornell and Penn State seem to have been largely successful in addressing most of the goals stated by Ehling, particularly regarding customizability. As intended, the software used to develop Project Euclid has been expanded to encompass non-journal publications such as books and conference proceedings. Furthermore, the DPubS software can be adapted to be used with other formats. This aspect of DPubS results from it being open source, meaning that the software's code has been made available, enabling programmers to develop additions and modifications of the software for their own and others' purposes. While the administrative tools have been included, the editorial management services will wait for "future releases."
Toward the goal of further development, Cornell University Library and The Pennsylvania State University Libraries has partnered with several institutions that will be using DPubS and providing feedback. As of April 2007, these partners were: Australian National University, Bielefeld University – Germany, University of Kansas, University of Utah, University of Wisconsin–Madison, and Vanderbilt University. Along with Pennsylvania History mentioned above, other journals being supported by DPubS include: Medieval Philosophy and Theology, “a semi-annual, peer-reviewed journal . . . of medieval philosophy, including logic and natural science, and in medieval theology, including Christian, Jewish, and Islamic”; Indonesia, “a semi-annual journal published by the Cornell Southeast Asia Program . . . of Indonesia’s culture, history, government, economy, and society from 1966 to the present”; and Cornell Technical Reports and Papers, “a collection of publications from the Cornell Theory Center, the Cornell Computer Science Department, and other departments and units."
DPubS has been designed with the opinion that "libraries should get involved in publishing." As mentioned above, the traditional model of journal publication and the dissemination of scholarly information has been through titles published by commercial publishers. Over time, the reputations of scholars have become strongly linked with the appearance of their work in these journals to the exclusion of publications outside the commercial realm; some groups of scholars initially reacted to Project Euclid in a "skittish" manner due to concerns over the unfamiliar nature of its model. The creators of DPubS believe that libraries are uniquely positioned to play an important role in altering this status quo. Efforts such as Project Muse, Project Euclid, arXiv, DPubS and other endeavors represent the kind of efforts that can be made by libraries and university presses to combat the challenges rising journal prices have presented to their budgets. Furthermore, due to issues of profitability, an increasing amount of scholarship does not get printed.
DPubS’ potential to contain the “cycle of knowledge creation and dissemination . . . within the academy and its close collaborators” could have a significant impact on academic publishing. DPubS positions itself to “encourage” libraries to take upon a new role with new responsibilities in order to alter some of the regrettable developments in the accessibility of scholarship over the past several decades. The hope is that it will help to increase access through electronic publishing by offering for free software that could easily cost six figures “for the initial licensing."
See also
EPrints
Open Journal Systems – a similar system
OpenACS
References
External links
http://sourceforge.net/projects/dpubs/
DPubS documentation on Cornell Community Wiki
Project Euclid
Academic publishing
Free library and information science software
Publication management software |
1696837 | https://en.wikipedia.org/wiki/Chris%20Wysopal | Chris Wysopal | Chris Wysopal (also known as Weld Pond) is an entrepreneur, computer security expert and co-founder and CTO of Veracode. He was a member of the high-profile hacker think tank the L0pht where he was a vulnerability researcher.
Chris Wysopal was born in 1965 in New Haven, Connecticut, his mother an educator and his father an engineer. He attended Rensselaer Polytechnic Institute in Troy, New York where he received a bachelor's degree in computer and systems engineering in 1987.
Career
He was the seventh member to join the L0pht. His development projects there included Netcat and L0phtCrack for Windows. He was also webmaster/graphic designer for the L0pht website and for Hacker News Network, the first hacker blog. He researched and published security advisories on vulnerabilities in Microsoft Windows, Lotus Domino, Microsoft IIS, and ColdFusion. Weld was one of the seven L0pht members who testified before a Senate committee in 1998 that they could bring down the Internet in 30 minutes. When L0pht was acquired by @stake in 1999 he became the manager of @stake's Research Group and later @stake's Vice President of Research and Development. In 2004 when @stake was acquired by Symantec he became its Director of Development. In 2006 he founded Veracode with Christien Rioux and serves as CTO. In 2017 Veracode was acquired by CA Technology for $614M. Veracode was subsequently spun out and became independent once again by being purchased by Thoma Bravo for $950M. Wysopal continues to serve as CTO.
In 2018 Wysopal joined the Humanyze board of directors.
Wysopal was instrumental in developing industry guidelines for responsible disclosure of software vulnerabilities. He was a contributor to RFPolicy, the first vulnerability disclosure policy. Together with Steve Christey of MITRE, he proposed an IETF RFC titled "Responsible Vulnerability Disclosure Process" in 2002. The process was eventually rejected by the IETF as not within their purview, but it did become the foundation for the Organization for Internet Safety, an industry group bringing together software vendors and security researchers, of which he was a founder. In 2001 he founded the non-profit full-disclosure mailing list VulnWatch, for which he was moderator. In 2003 he testified before a United States House of Representatives subcommittee on the topic of vulnerability research and disclosure.
In 2008 Wysopal was recognized for his achievements in the IT industry by being named one of the 100 Most Influential People in IT by eWeek and selected as one of the InfoWorld CTO 25. In 2010 he was named a SANS Security Thought Leader. In 2012, he began serving on the Black Hat Review Board. He was named one of the Top 25 Disruptors of 2013 by Computer Reseller News. In 2014 he was named one of 5 Security Thought Leaders by SC Magazine.
Patents
U.S. Patent 10,275,600, Assessment and analysis of software security flaws
U.S. Patent 9,672,355, Automated behavioral and static analysis using an instrumented sandbox and machine learning classification for mobile security
U.S. Patent 8,613,080, Assessment and analysis of software security flaws in virtual machines
Publications
Wysopal, Chris; Geer, Dan (August 2013). For Good Measure: Security Debt. ;login: The USENIX Magazine.
Wysopal, Chris (September 2012). Software Security Varies Greatly. Datenschutz und Datensicherheit - DuD.
Wysopal, Chris; Shields, Tyler; Eng, Chris (February 24, 2010). Static Detection of Application Backdoors. Datenschutz und Datensicherheit - DuD.
References
L0pht
People associated with computer security
Rensselaer Polytechnic Institute alumni
Living people
1965 births |
2701329 | https://en.wikipedia.org/wiki/Salsa20 | Salsa20 | Salsa20 and the closely related ChaCha are stream ciphers developed by Daniel J. Bernstein. Salsa20, the original cipher, was designed in 2005, then later submitted to the eSTREAM European Union cryptographic validation process by Bernstein. ChaCha is a modification of Salsa20 published in 2008. It uses a new round function that increases diffusion and increases performance on some architectures.
Both ciphers are built on a pseudorandom function based on add-rotate-XOR (ARX) operations — 32-bit addition, bitwise addition (XOR) and rotation operations. The core function maps a 256-bit key, a 64-bit nonce, and a 64-bit counter to a 512-bit block of the key stream (a Salsa version with a 128-bit key also exists). This gives Salsa20 and ChaCha the unusual advantage that the user can efficiently seek to any position in the key stream in constant time. Salsa20 offers speeds of around 4–14 cycles per byte in software on modern x86 processors, and reasonable hardware performance. It is not patented, and Bernstein has written several public domain implementations optimized for common architectures.
Structure
Internally, the cipher uses bitwise addition ⊕ (exclusive OR), 32-bit addition mod 2^32 ⊞, and constant-distance rotation operations (<<<) on an internal state of sixteen 32-bit words. Using only add-rotate-xor operations avoids the possibility of timing attacks in software implementations. The internal state is made of sixteen 32-bit words arranged as a 4×4 matrix.
The initial state is made up of eight words of key, two words of stream position, two words of nonce (essentially additional stream position bits), and four fixed words:
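The table giving the exact layout is not reproduced here; as a minimal sketch, assuming the standard arrangement from the Salsa20 specification (constants on the main diagonal, key words at positions 1–4 and 11–14, nonce at 6–7 and stream position at 8–9), the state can be set up as follows. The helper name salsa20_init is illustrative. Because the stream position is simply part of the input, seeking to an arbitrary 64-byte block of the key stream only requires setting pos accordingly.
#include <stdint.h>
// A sketch of initializing the sixteen-word Salsa20 state (layout assumed
// from the Salsa20 specification).
void salsa20_init(uint32_t st[16], const uint32_t key[8],
                  const uint32_t nonce[2], uint64_t pos)
{
    st[ 0] = 0x61707865;                       // "expa"
    st[ 1] = key[0]; st[ 2] = key[1]; st[ 3] = key[2]; st[ 4] = key[3];
    st[ 5] = 0x3320646e;                       // "nd 3"
    st[ 6] = nonce[0]; st[ 7] = nonce[1];
    st[ 8] = (uint32_t)pos;                    // stream position (block counter)
    st[ 9] = (uint32_t)(pos >> 32);
    st[10] = 0x79622d32;                       // "2-by"
    st[11] = key[4]; st[12] = key[5]; st[13] = key[6]; st[14] = key[7];
    st[15] = 0x6b206574;                       // "te k"
}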
The constant words spell "expand 32-byte k" in ASCII (i.e. the 4 words are "expa", "nd 3", "2-by", and "te k"). This is an example of a nothing-up-my-sleeve number. The core operation in Salsa20 is the quarter-round QR(a, b, c, d) that takes a four-word input and produces a four-word output:
b ^= (a + d) <<< 7;
c ^= (b + a) <<< 9;
d ^= (c + b) <<< 13;
a ^= (d + c) <<< 18;
Odd-numbered rounds apply QR(a, b, c, d) to each of the four columns in the 4×4 matrix, and even-numbered rounds apply it to each of the four rows. Two consecutive rounds (column-round and row-round) together are called a double-round:
// Odd round
QR( 0, 4, 8, 12) // column 1
QR( 5, 9, 13, 1) // column 2
QR(10, 14, 2, 6) // column 3
QR(15, 3, 7, 11) // column 4
// Even round
QR( 0, 1, 2, 3) // row 1
QR( 5, 6, 7, 4) // row 2
QR(10, 11, 8, 9) // row 3
QR(15, 12, 13, 14) // row 4
An implementation in C/C++ appears below.
#include <stdint.h>
#define ROTL(a,b) (((a) << (b)) | ((a) >> (32 - (b))))
#define QR(a, b, c, d)( \
b ^= ROTL(a + d, 7), \
c ^= ROTL(b + a, 9), \
d ^= ROTL(c + b,13), \
a ^= ROTL(d + c,18))
#define ROUNDS 20
void salsa20_block(uint32_t out[16], uint32_t const in[16])
{
int i;
uint32_t x[16];
for (i = 0; i < 16; ++i)
x[i] = in[i];
// 10 loops × 2 rounds/loop = 20 rounds
for (i = 0; i < ROUNDS; i += 2) {
// Odd round
QR(x[ 0], x[ 4], x[ 8], x[12]); // column 1
QR(x[ 5], x[ 9], x[13], x[ 1]); // column 2
QR(x[10], x[14], x[ 2], x[ 6]); // column 3
QR(x[15], x[ 3], x[ 7], x[11]); // column 4
// Even round
QR(x[ 0], x[ 1], x[ 2], x[ 3]); // row 1
QR(x[ 5], x[ 6], x[ 7], x[ 4]); // row 2
QR(x[10], x[11], x[ 8], x[ 9]); // row 3
QR(x[15], x[12], x[13], x[14]); // row 4
}
for (i = 0; i < 16; ++i)
out[i] = x[i] + in[i];
}
In the last line, the mixed array is added, word by word, to the original array to obtain its 64-byte key stream block. This is important because the mixing rounds on their own are invertible. In other words, applying the reverse operations would produce the original 4×4 matrix, including the key. Adding the mixed array to the original makes it impossible to recover the input. (This same technique is widely used in hash functions from MD4 through SHA-2.)
Salsa20 performs 20 rounds of mixing on its input. However, reduced round variants Salsa20/8 and Salsa20/12 using 8 and 12 rounds respectively have also been introduced. These variants were introduced to complement the original Salsa20, not to replace it, and perform even better in the eSTREAM benchmarks than Salsa20, though with a correspondingly lower security margin.
XSalsa20 with 192-bit nonce
In 2008, Bernstein proposed a variant of Salsa20 with 192-bit nonces called XSalsa20. XSalsa20 is provably secure if Salsa20 is secure, but is more suitable for applications where longer nonces are desired. XSalsa20 feeds the key and the first 128 bits of the nonce into one block of Salsa20 (without the final addition, which may either be omitted, or subtracted after a standard Salsa20 block), and uses 256 bits of the output as the key for standard Salsa20 using the last 64 bits of the nonce and the stream position. Specifically, the 256 bits of output used are those corresponding to the non-secret portions of the input: indexes 0, 5, 10, 15, 6, 7, 8 and 9.
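A minimal sketch of this derivation (sometimes called HSalsa20) follows, reusing the QR macro and the salsa20_init layout sketched earlier; the function name is illustrative. The first 128 bits of the 192-bit nonce occupy the nonce and stream-position words, the rounds are run without the final addition, and the words at the listed indexes become the key for the inner Salsa20 instance.
// Derive the 256-bit subkey used by XSalsa20 (assumes QR and salsa20_init
// from the sketches above).
void xsalsa20_subkey(uint32_t subkey[8], const uint32_t key[8],
                     const uint32_t nonce16[4])   // first 128 bits of nonce
{
    static const int idx[8] = {0, 5, 10, 15, 6, 7, 8, 9};  // non-secret input positions
    uint32_t x[16];
    int i;
    salsa20_init(x, key, nonce16, 0);
    x[8] = nonce16[2];                 // nonce also fills the position words
    x[9] = nonce16[3];
    for (i = 0; i < 20; i += 2) {      // 20 Salsa20 rounds, no final addition
        QR(x[ 0], x[ 4], x[ 8], x[12]); QR(x[ 5], x[ 9], x[13], x[ 1]);
        QR(x[10], x[14], x[ 2], x[ 6]); QR(x[15], x[ 3], x[ 7], x[11]);
        QR(x[ 0], x[ 1], x[ 2], x[ 3]); QR(x[ 5], x[ 6], x[ 7], x[ 4]);
        QR(x[10], x[11], x[ 8], x[ 9]); QR(x[15], x[12], x[13], x[14]);
    }
    for (i = 0; i < 8; ++i)
        subkey[i] = x[idx[i]];
}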
eSTREAM selection of Salsa20
Salsa20 has been selected as a Phase 3 design for Profile 1 (software) by the eSTREAM project, receiving the highest weighted voting score of any Profile 1 algorithm at the end of Phase 2. Salsa20 had previously been selected as Phase 2 Focus design for Profile 1 (software) and as a Phase 2 design for Profile 2 (hardware) by the eSTREAM project, but was not advanced to Phase 3 for Profile 2 because eSTREAM felt that it was probably not a good candidate for extremely resource constrained hardware environments.
Cryptanalysis of Salsa20
There are no published attacks on Salsa20/12 or the full Salsa20/20; the best attack known breaks 8 of the 12 or 20 rounds.
In 2005, Paul Crowley reported an attack on Salsa20/5 with an estimated time complexity of 2^165, and won Bernstein's US$1000 prize for "most interesting Salsa20 cryptanalysis". This attack and all subsequent attacks are based on truncated differential cryptanalysis. In 2006, Fischer, Meier, Berbain, Biasse, and Robshaw reported an attack on Salsa20/6 with estimated time complexity of 2^177, and a related-key attack on Salsa20/7 with estimated time complexity of 2^217.
In 2007, Tsunoo et al. announced a cryptanalysis of Salsa20 which breaks 8 out of 20 rounds to recover the 256-bit secret key in 2^255 operations, using 2^11.37 keystream pairs. However, this attack does not seem to be competitive with the brute force attack.
In 2008, Aumasson, Fischer, Khazaei, Meier, and Rechberger reported a cryptanalytic attack against Salsa20/7 with a time complexity of 2^153, and they reported the first attack against Salsa20/8 with an estimated time complexity of 2^251. This attack makes use of the new concept of probabilistic neutral key bits for probabilistic detection of a truncated differential. The attack can be adapted to break Salsa20/7 with a 128-bit key.
In 2012, the attack by Aumasson et al. was improved by Shi et al. against Salsa20/7 (128-bit key) to a time complexity of 2^109 and Salsa20/8 (256-bit key) to 2^250.
In 2013, Mouha and Preneel published a proof that 15 rounds of Salsa20 was 128-bit secure against differential cryptanalysis. (Specifically, it has no differential characteristic with higher probability than 2^−130, so differential cryptanalysis would be more difficult than 128-bit key exhaustion.)
ChaCha variant
In 2008, Bernstein published the closely related ChaCha family of ciphers, which aim to increase the diffusion per round while achieving the same or slightly better performance. The Aumasson et al. paper also attacks ChaCha, achieving one round fewer: for 256-bit keys, ChaCha6 is attacked with complexity 2^139 and ChaCha7 with complexity 2^248. For 128-bit keys, ChaCha6 is attacked within 2^107, but the paper states that the attack fails to break 128-bit ChaCha7.
ChaCha's initial state is similar to the initial state of Salsa20, however there are some differences. ChaCha's initial state includes a 128-bit constant, a 256-bit key, a 32-bit counter, and a 96-bit nonce, arranged as a 4×4 matrix of 32-bit words. ChaCha also re-arranges some of the words in the initial state:
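The layout table is not reproduced here; as a sketch, assuming the arrangement used by the IETF ChaCha20 variant (constants in words 0-3, key in words 4-11, counter in word 12, nonce in words 13-15), the state can be set up as follows. The helper name chacha_init is illustrative.
#include <stdint.h>
// A sketch of initializing the ChaCha state with a 32-bit counter and a
// 96-bit nonce (layout assumed from the IETF variant).
void chacha_init(uint32_t st[16], const uint32_t key[8],
                 uint32_t counter, const uint32_t nonce[3])
{
    int i;
    st[0] = 0x61707865; st[1] = 0x3320646e;   // "expa", "nd 3"
    st[2] = 0x79622d32; st[3] = 0x6b206574;   // "2-by", "te k"
    for (i = 0; i < 8; ++i)
        st[4 + i] = key[i];
    st[12] = counter;
    st[13] = nonce[0]; st[14] = nonce[1]; st[15] = nonce[2];
}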
The constant is the same as Salsa20 ("expand 32-byte k"). ChaCha replaces the Salsa20 quarter-round QR(a, b, c, d) with:
a += b; d ^= a; d <<<= 16;
c += d; b ^= c; b <<<= 12;
a += b; d ^= a; d <<<= 8;
c += d; b ^= c; b <<<= 7;
Notice that this version updates each word twice, while Salsa20's quarter round updates each word only once. In addition, the ChaCha quarter-round diffuses changes more quickly. On average, after changing 1 input bit the Salsa20 quarter-round will change 8 output bits while ChaCha will change 12.5 output bits.
The ChaCha quarter round has the same number of adds, xors, and bit rotates as the Salsa20 quarter-round, but the fact that two of the rotates are multiples of 8 allows for a small optimization on some architectures including x86. Additionally, the input formatting has been rearranged to support an efficient SSE implementation optimization discovered for Salsa20. Rather than alternating rounds down columns and across rows, they are performed down columns and along diagonals. Like Salsa20, ChaCha arranges the sixteen 32-bit words in a 4×4 matrix. If we index the matrix elements from 0 to 15
then a double round in ChaCha is:
// Odd round
QR(0, 4, 8, 12) // 1st column
QR(1, 5, 9, 13) // 2nd column
QR(2, 6, 10, 14) // 3rd column
QR(3, 7, 11, 15) // 4th column
// Even round
QR(0, 5, 10, 15) // diagonal 1 (main diagonal)
QR(1, 6, 11, 12) // diagonal 2
QR(2, 7, 8, 13) // diagonal 3
QR(3, 4, 9, 14) // diagonal 4
ChaCha20 uses 10 iterations of the double round. An implementation in C/C++ appears below.
#include <stdint.h>
#define ROTL(a,b) (((a) << (b)) | ((a) >> (32 - (b))))
#define QR(a, b, c, d) ( \
a += b, d ^= a, d = ROTL(d,16), \
c += d, b ^= c, b = ROTL(b,12), \
a += b, d ^= a, d = ROTL(d, 8), \
c += d, b ^= c, b = ROTL(b, 7))
#define ROUNDS 20
void chacha_block(uint32_t out[16], uint32_t const in[16])
{
int i;
uint32_t x[16];
for (i = 0; i < 16; ++i)
x[i] = in[i];
// 10 loops × 2 rounds/loop = 20 rounds
for (i = 0; i < ROUNDS; i += 2) {
// Odd round
QR(x[0], x[4], x[ 8], x[12]); // column 0
QR(x[1], x[5], x[ 9], x[13]); // column 1
QR(x[2], x[6], x[10], x[14]); // column 2
QR(x[3], x[7], x[11], x[15]); // column 3
// Even round
QR(x[0], x[5], x[10], x[15]); // diagonal 1 (main diagonal)
QR(x[1], x[6], x[11], x[12]); // diagonal 2
QR(x[2], x[7], x[ 8], x[13]); // diagonal 3
QR(x[3], x[4], x[ 9], x[14]); // diagonal 4
}
for (i = 0; i < 16; ++i)
out[i] = x[i] + in[i];
}
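To make the use of this block function concrete, the following sketch drives it as a stream cipher: it fills the initial state in the layout described above (using the IETF-style 32-bit block counter and 96-bit nonce for concreteness), generates successive 64-byte keystream blocks, and XORs them into a message buffer. This is a minimal illustration rather than a vetted implementation: the helper name chacha20_xor is introduced here for the example, the key and nonce are assumed to be supplied as already little-endian-packed 32-bit words, and the keystream cast assumes a little-endian machine.
#include <stddef.h>
#include <stdint.h>

// chacha_block() is the function defined above.
void chacha_block(uint32_t out[16], uint32_t const in[16]);

// Minimal sketch: encrypt/decrypt msg in place by XORing it with the
// ChaCha20 keystream (32-bit counter, 96-bit nonce layout as described above).
static void chacha20_xor(uint8_t *msg, size_t len, const uint32_t key[8],
                         const uint32_t nonce[3], uint32_t counter)
{
    uint32_t state[16], block[16];

    state[0] = 0x61707865; state[1] = 0x3320646e;   // "expand 32-byte k"
    state[2] = 0x79622d32; state[3] = 0x6b206574;
    for (int i = 0; i < 8; ++i)                     // words 4..11: 256-bit key
        state[4 + i] = key[i];
    state[12] = counter;                            // word 12: block counter
    state[13] = nonce[0];                           // words 13..15: 96-bit nonce
    state[14] = nonce[1];
    state[15] = nonce[2];

    for (size_t off = 0; off < len; off += 64) {
        chacha_block(block, state);                 // one 64-byte keystream block
        state[12]++;                                // advance the block counter
        size_t n = (len - off < 64) ? len - off : 64;
        for (size_t i = 0; i < n; ++i)              // XOR keystream bytes into message
            msg[off + i] ^= ((uint8_t *)block)[i];
    }
}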
ChaCha is the basis of the BLAKE hash function, a finalist in the NIST hash function competition, and of its faster successors BLAKE2 and BLAKE3. BLAKE also defines a variant using sixteen 64-bit words (1024 bits of state), with correspondingly adjusted rotation constants.
XChaCha
Although not announced by Bernstein, the security proof of XSalsa20 extends straightforwardly to an analogous XChaCha cipher. Use the key and the first 128 bits of the nonce (in input words 12 through 15) to form a ChaCha input block, then perform the block operation (omitting the final addition). Output words 0–3 and 12–15 (those words corresponding to non-key words of the input) then form the key used for ordinary ChaCha (with the last 64 bits of nonce and 64 bits of block counter).
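A compact sketch of this subkey derivation (commonly called HChaCha20) is shown below; it reuses the ROTL and QR macros from the ChaCha20 listing above. The function name hchacha20 and its exact signature are illustrative assumptions, not code published by Bernstein.
#include <stdint.h>

// Derive the 256-bit XChaCha subkey from a 256-bit key and the first
// 128 bits of the nonce, as described above: run the 20 ChaCha rounds on
// the usual initial layout, skip the final addition, and keep words 0-3
// and 12-15. Reuses the ROTL/QR macros defined in the listing above.
static void hchacha20(uint32_t subkey[8], const uint32_t key[8],
                      const uint32_t nonce128[4])
{
    uint32_t x[16];

    x[0] = 0x61707865; x[1] = 0x3320646e;           // "expand 32-byte k"
    x[2] = 0x79622d32; x[3] = 0x6b206574;
    for (int i = 0; i < 8; ++i)                     // words 4..11: key
        x[4 + i] = key[i];
    for (int i = 0; i < 4; ++i)                     // words 12..15: first 128 nonce bits
        x[12 + i] = nonce128[i];

    for (int i = 0; i < 20; i += 2) {               // 10 double rounds, no final add
        QR(x[0], x[4], x[ 8], x[12]); QR(x[1], x[5], x[ 9], x[13]);
        QR(x[2], x[6], x[10], x[14]); QR(x[3], x[7], x[11], x[15]);
        QR(x[0], x[5], x[10], x[15]); QR(x[1], x[6], x[11], x[12]);
        QR(x[2], x[7], x[ 8], x[13]); QR(x[3], x[4], x[ 9], x[14]);
    }
    for (int i = 0; i < 4; ++i) {                   // output words 0-3 and 12-15
        subkey[i]     = x[i];
        subkey[4 + i] = x[12 + i];
    }
}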
ChaCha20 adoption
Google selected ChaCha20 along with Bernstein's Poly1305 message authentication code as a replacement for RC4 in TLS, which is used for Internet security. In the process, they proposed a new authenticated encryption construction combining both algorithms, which is called ChaCha20-Poly1305. ChaCha20 and Poly1305 are now also used in the QUIC protocol, which replaces SPDY and is used by HTTP/3.
Shortly after Google's adoption for TLS, both the ChaCha20 and Poly1305 algorithms were also used for a new chacha20-poly1305@openssh.com cipher in OpenSSH. Subsequently, this made it possible for OpenSSH to avoid any dependency on OpenSSL, via a compile-time option.
ChaCha20 is also used for the arc4random random number generator in FreeBSD, OpenBSD, and NetBSD operating systems, instead of the broken RC4, and in DragonFly BSD for the CSPRNG subroutine of the kernel. Starting from version 4.8, the Linux kernel uses the ChaCha20 algorithm to generate data for the nonblocking /dev/urandom device.
A reference implementation of ChaCha20 has been published in RFC 7539. The IETF's implementation modified Bernstein's published algorithm by changing the 64-bit nonce and 64-bit block counter to a 96-bit nonce and 32-bit block counter. The name was not changed when the algorithm was modified, as the change is cryptographically insignificant (both forms give what a cryptographer would recognize as a 128-bit nonce), but the interface change could be a source of confusion for developers. Because of the reduced block counter, the maximum message length that can be safely encrypted by the IETF's variant is 2^32 blocks of 64 bytes (256 GiB). For applications where this is not enough, such as file or disk encryption, the use of the original algorithm with its 64-bit nonce has been proposed.
Use of ChaCha20 in IKE and IPsec has been proposed for standardization in RFC 7634. Proposed standardization of its use in TLS is published as RFC 7905.
ChaCha20 usually offers better performance than the more prevalent Advanced Encryption Standard (AES) algorithm on systems where the CPU does not feature AES acceleration (such as the AES instruction set for x86 processors). As a result, ChaCha20 is sometimes preferred over AES in certain use cases involving mobile devices, which mostly use ARM-based CPUs.
In 2018, RFC 7539 was obsoleted by RFC 8439.
ChaCha20 is the exclusive algorithm used by the WireGuard VPN system, as of protocol version 1.
See also
Speck – an add-rotate-xor cipher developed by the NSA
ChaCha20-Poly1305 – an AEAD scheme combining ChaCha20 with the Poly1305 MAC
Notes
References
External links
Snuffle 2005: the Salsa20 encryption function
Salsa20 specification (PDF)
Salsa20/8 and Salsa20/12 (PDF)
The eSTREAM Project: Salsa20
The ChaCha family of stream ciphers
Salsa20 Usage & Deployment
Implementation and Didactical Visualization of the ChaCha Cipher Family in CrypTool 2
Internet Standards
Stream ciphers
Public-domain software with source code |
19809528 | https://en.wikipedia.org/wiki/Charlie%20Miller%20%28security%20researcher%29 | Charlie Miller (security researcher) | Charles Alfred Miller is an American computer security researcher with Cruise Automation. Prior to his current employment, he spent five years working for the National Security Agency and has worked for Uber.
Education
Miller holds a bachelor's degree in mathematics with a minor in philosophy from what was then called Northeast Missouri State, and received a Ph.D. in mathematics from the University of Notre Dame in 2000. He lives in Wildwood, Missouri.
Security research
Miller was a lead analyst at Independent Security Evaluators, a computer protection consultancy. He has publicly demonstrated many security exploits of Apple products. In 2008, he won a $10,000 cash prize at the hacker conference Pwn2Own in Vancouver, British Columbia, Canada for being the first to find a critical bug in the MacBook Air. In 2009, he won $5,000 for cracking Apple's Safari browser. Also in 2009, he and Collin Mulliner demonstrated an SMS processing vulnerability that allowed for complete compromise of the Apple iPhone and denial-of-service attacks on other phones. In 2011, he found a security hole in the iPhone and iPad whereby an application could contact a remote computer to download new, unapproved software capable of executing arbitrary commands, stealing personal data, or otherwise misusing iOS application functions for malicious purposes. As a proof of concept, Miller created an application called Instastock that was approved by Apple's App Store. He then informed Apple about the security hole, and Apple promptly expelled him from the App Store.
Miller participated in research on discovering security vulnerabilities in NFC (Near Field Communication).
Miller, along with Chris Valasek, is known for remotely hacking a 2014 Jeep Cherokee and controlling the braking, steering, and acceleration of the vehicle.
Publications
iOS Hacker Handbook
The Mac Hacker's Handbook
Fuzzing for Software Security Testing and Quality Assurance
Battery firmware hacking: inside the innards of a smart battery
References
External links
Living people
University of Notre Dame alumni
Computer security specialists
Year of birth missing (living people) |
19453392 | https://en.wikipedia.org/wiki/Egenera | Egenera | Egenera, Inc. is a multinational cloud manager and data center infrastructure automation company with corporate headquarters in Boxborough, Massachusetts in the United States. It is a privately held company with approximately 110 employees. Founded in March 2000, the company was named by Network World as one of the top 10 startups to watch in 2002 and was a winner in the annual "Red Herring 100 North America" award given by Red Herring magazine in 2006.
Egenera maintains overseas headquarters in the United Kingdom, Japan and Hong Kong.
History
Egenera was founded by Vern Brownell in March 2000.
The company launched its first product, the Egenera BladeFrame, in October 2001.
In October 2006, Egenera announced its plan to create a separate line of business in order to make its virtualization management software, called PAN Manager, available under OEM agreement to other server vendors.
In December 2012, Egenera acquired Fort Technologies, a developer of cloud management software.
OEM partners
As of 2013, Egenera has OEM agreements with the following vendors:
Dell
Fujitsu
Hewlett Packard
IBM
NEC
See also
Cloud computing
Storage virtualization
Network virtualization
x86 virtualization
Blade Server
Fabric computing
Unified computing
References
Software companies based in Massachusetts
Virtualization software
Software companies established in 2000
Software companies of the United States
Remote companies |
616119 | https://en.wikipedia.org/wiki/Key%20server%20%28cryptographic%29 | Key server (cryptographic) | In computer security, a key server is a computer that receives and then serves existing cryptographic keys to users or other programs. The users' programs can be running on the same network as the key server or on another networked computer.
The keys distributed by the key server are almost always provided as part of cryptographically protected public key certificates containing not only the key but also 'entity' information about the owner of the key. The certificate is usually in a standard format, such as the OpenPGP public key format, the X.509 certificate format, or the PKCS format. Further, the key is almost always a public key for use with an asymmetric key encryption algorithm.
History
Key servers play an important role in public key cryptography. In public key cryptography an individual is able to generate a key pair, where one of the keys is kept private while the other is distributed publicly. Knowledge of the public key does not compromise the security of public key cryptography. An individual holding the public key of a key pair can use that key to carry out cryptographic operations that allow secret communications with strong authentication of the holder of the matching private key. The need to have the public key of a key pair in order to start communication or verify signatures is a bootstrapping problem. Locating keys on the web or writing to the individual asking them to transmit their public keys can be time consuming and insecure. Key servers act as central repositories to alleviate the need to individually transmit public keys and can act as the root of a chain of trust.
The first web-based PGP keyserver was written for a thesis by Marc Horowitz, while he was studying at MIT. Horowitz's keyserver was called the HKP Keyserver after a web-based OpenPGP HTTP Keyserver Protocol (HKP) it used to allow people to interact with the keyserver. Users were able to upload, download, and search keys either through HKP on TCP port 11371, or through web pages which ran CGI scripts. Before the creation of the HKP Keyserver, keyservers relied on email processing scripts for interaction.
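To illustrate the shape of an HKP exchange, the sketch below issues a plain HTTP GET for the conventional /pks/lookup path on TCP port 11371 and prints whatever the server returns (normally an ASCII-armored key block). The host name keyserver.example.com, the key ID, and the lack of error handling are placeholders for illustration only; real keyservers, options, and responses vary.
#include <netdb.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

// Rough sketch of an HKP key lookup: an ordinary HTTP GET on TCP port 11371.
// The server name and search string below are hypothetical placeholders.
int main(void)
{
    const char *host  = "keyserver.example.com";
    const char *query = "0x12345678";            // key ID or user ID to look up

    struct addrinfo hints = {0}, *res;
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo(host, "11371", &hints, &res) != 0)
        return 1;

    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) != 0)
        return 1;

    char req[512];
    snprintf(req, sizeof req,
             "GET /pks/lookup?op=get&search=%s HTTP/1.0\r\nHost: %s\r\n\r\n",
             query, host);
    (void)write(fd, req, strlen(req));

    char buf[4096];
    ssize_t n;
    while ((n = read(fd, buf, sizeof buf)) > 0)   // response body carries the
        fwrite(buf, 1, (size_t)n, stdout);        // ASCII-armored public key
    close(fd);
    freeaddrinfo(res);
    return 0;
}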
A separate key server, known as the PGP Certificate Server, was developed by PGP, Inc. and was used as the software (through version 2.5.x for the server) for the default key server in PGP through version 8.x (for the client software), keyserver.pgp.com. Network Associates was granted a patent co-authored by Jon Callas (United States Patent 6336186) on the key server concept.
To replace the aging Certificate Server, an LDAP-based key server was redesigned at Network Associates in part by Randy Harmon and Len Sassaman, called PGP Keyserver 7. With the release of PGP 6.0, LDAP was the preferred key server interface for Network Associates’ PGP versions. This LDAP and LDAPS key server (which also spoke HKP for backwards compatibility, though the protocol was (arguably correctly) referred to as “HTTP” or “HTTPS”) also formed the basis for the PGP Administration tools for private key servers in corporate settings, along with a schema for Netscape Directory Server.
PGP Keyserver 7 was later replaced by the new PGP Corporation PGP Global Directory, which allows PGP keys to be published and downloaded using HTTPS or LDAP.
Public versus private keyservers
Many publicly accessible key servers, located around the world, are computers which store and provide OpenPGP keys over the Internet for users of that cryptosystem. In this instance, the computers can be, and mostly are, run by individuals as a pro bono service, facilitating the web of trust model PGP uses.
Several publicly accessible S/MIME key servers are available to publish or retrieve certificates used with the S/MIME cryptosystem.
There are also multiple proprietary public key infrastructure systems which maintain key servers for their users; those may be private or public, and only the participating users are likely to be aware of those keyservers at all.
Privacy concerns
For many individuals, the purpose of using cryptography is to obtain a higher level of privacy in personal interactions and relationships. It has been pointed out that allowing a public key to be uploaded in a key server when using decentralized web of trust based cryptographic systems, like PGP, may reveal a good deal of information that an individual may wish to have kept private. Since PGP relies on signatures on an individual's public key to determine the authenticity of that key, potential relationships can be revealed by analyzing the signers of a given key. In this way, models of entire social networks can be developed.
Problems with keyservers
The OpenPGP keyservers have suffered from a few problems since their development in the 1990s. Once a public key has been uploaded, it was purposefully made difficult to remove, as servers auto-synchronize with each other (this was done in order to resist government censorship). Some users stop using their public keys for various reasons, such as when they forget their passphrase, or if their private key is compromised or lost. In those cases, it was hard to delete a public key from the server, and even if it were deleted, someone else could upload a fresh copy of the same public key. This leads to an accumulation of old fossil public keys that never go away, a form of "keyserver plaque". As a consequence, anyone can upload a bogus public key to the keyserver, bearing the name of a person who in fact does not own that key, or, even worse, exploit this as a vulnerability, as in the certificate spamming attack.
The keyservers had no way to check whether a key was legitimate (i.e., whether it actually belonged to its claimed owner).
To solve these problems, PGP Corp developed a new generation of key server, called the PGP Global Directory. This keyserver sent an email confirmation request to the putative key owner, asking that person to confirm that the key in question is theirs. If they confirm it, the PGP Global Directory accepts the key. This can be renewed periodically, to prevent the accumulation of keyserver plaque. The result is a higher quality collection of public keys, and each key has been vetted by email with the key's apparent owner. But as a consequence, another problem arises: because the PGP Global Directory allows key account maintenance and verifies only by email, not cryptographically, anybody with access to the email account could, for example, delete a key and upload a bogus one.
The last IETF draft for HKP also defines a distributed key server network, based on DNS SRV records: to find the key for an address at example.com, one can ask for it by requesting example.com's key server.
Keyserver examples
These are some keyservers that are often used for looking up keys with gpg --recv-keys. These can be queried via https:// (HTTPS) or hkps:// (HKP over TLS) respectively.
keys.openpgp.org
pgp.mit.edu
keyring.debian.org
keyserver.ubuntu.com
attester.flowcrypt.com
zimmermann.mayfirst.org
pgp.surf.nl
See also
Lightweight Directory Access Protocol
GnuPG
References
External links
The OpenPGP HTTP Keyserver Protocol (HKP) (March 2003)
- an OpenPGP key server software package distributed under a BSD-style license. It has largely been replaced by SKS and Hockeypuck.
Synchronizing Key Server (SKS) - an OpenPGP key server software package distributed under the GPL.
Hockeypuck - a synchronising OpenPGP keyserver software package distributed under the AGPL.
Hagrid - a non-synchronising, verifying OpenPGP keyserver software package distributed under the AGPL.
PGP Global Directory hosted by the PGP Corporation.
Key management |
730229 | https://en.wikipedia.org/wiki/Digital%20gold%20currency | Digital gold currency | Digital gold currency (or DGC) is a form of electronic money (or digital currency) based on mass units of gold. It is a kind of representative money, like US paper gold certificates during the period (1873 to 1933) in which they were exchangeable for gold on demand. The typical unit of account for such currency is linked to grams or troy ounces of gold, although other units such as the gold dinar are sometimes used. DGCs are backed by gold through unallocated or allocated gold storage.
Digital gold currencies are issued by a number of companies, each of which provides a system that enables users to pay each other in units that hold the same value as gold bullion. These competing providers issue a type of independent currency.
Features
Universal currency
Proponents claim that DGC offers a truly global and borderless world currency system which is independent of exchange rate variations and political manipulation. Gold, silver, platinum and palladium each have recognized international currency codes under ISO 4217.
Asset protection
Unlike fractional-reserve banking, DGCs hold 100% of clients' funds in reserve as gold, silver, and/or platinum, which can be exchanged via digital certificates. Proponents of DGC systems say that deposits are protected against inflation, devaluation and other economic risks inherent in fiat currencies. These risks include the monetary policy of countries or territories, which are said by proponents to be harmful to the value of paper currency.
Bullion investing
DGCs can be used to buy, hold, and sell precious metals, but providers do not promote them as an "investment", as this implies an anticipated return.
Exchanging national currency
Some providers do not sell DGC directly to clients. For those DGCs, e-currency must be bought and sold via a digital currency exchanger.
Currency exchangers accept payment in national currencies by a variety of methods, including Bank Wire, Direct Deposit, Cheque, Money Order. Some exchangers also sell and fund pre-paid debit cards to make it easier for their clientele to convert DGC into an easily spendable form of national currency.
DGCs are known as private currency as they are not issued by governments.
Non-reversible transactions
Unlike the credit card industry, digital gold currency issuers generally do not have services to dispute or reverse charges. So, reversing transactions, even in case of a legitimate error, unauthorized use, or failure of a vendor to supply goods is difficult, if not impossible. This means that using digital gold currency is more akin to a cash transaction, while PayPal transfers, for example, could be considered more similar to credit card transactions.
The advantage of this arrangement is that the operating costs of the digital currency system are significantly reduced because there are few payment disputes to settle. It also allows digital gold currency transactions to clear instantly, making the funds immediately available to the recipient; credit cards, checks, ACH, and other reversible payment methods typically take 72 hours or more to clear.
Risks
As with all financial media, there are several types of risk inherent to the use of DGCs: management risk, political risk, data security and exchange risk.
Management and political risks
DGCs, like all financial institutions and public securities, have a layer of risk in the form of the management of the issuing institution. Controls aimed to limit management risk are called "governance".
All other DGC providers operate under self-regulation. DGC providers are not banks and therefore not subject to many bank regulations that pertain to fractional reserve lending as they do not engage in lending. However, DGCs do provide a method for transferring currency from one person to another, and therefore may fall under regulations pertaining to money transmitting in various jurisdictions.
The Global Digital Currency Association (GDCA), which was founded in 2002, is a non-profit association of online currency operators, exchangers, merchants and users. The GDCA is an example of the DGC industry's attempt at self-regulation. On their website they claim their goal is to "further the interests of the industry as a whole and help with fighting fraud and other illegal activities, arbitrate disputes and act as escrow agent when and where required." Of the one-time DGC providers, Pecunix (gone, see below), Liberty Reserve (shut down for money laundering in 2013), and eight others became members of the association. It costs one gram of gold to file a complaint if you are not a member, and the list of filable complaints is not exhaustive. Their domain name is registered anonymously through Domains by Proxy (see whois).
OS-Gold, Standard Reserve and INTGold
Several companies claiming to be Digital Gold Currencies sprang up and failed between 1999 and 2004, such as OS-Gold, Standard Reserve and INTGold. All these companies failed because the principals diverted deposits for other purposes instead of holding them in the form of gold. In each of these cases, account holders lost several million dollars worth of gold when the "institution" failed.
e-gold
e-gold was a digital gold currency founded in 1996. A legal case was brought against e-gold in April 2007 that included violations of 18 U.S. Code § 1960 (Prohibition of unlicensed money transmitting businesses). e-gold vigorously contested the § 1960 charges brought against it in April 2007 for more than a year. In July 2008, following a ruling from the court that effectively enshrined in case law the Treasury Department's expansion of the definition of "money transmitter", e-gold entered into a plea agreement that detailed actions required to bring the companies into compliance with laws and regulations governing operation of a money transmitting business. Although e-gold complied with all other terms of its plea agreement, it was not able to obtain money transmitting licenses due to its guilty plea. Since returning value to customers could constitute money transmitting without a license, e-gold entered into an agreement in 2010 with the US Government to enable e-gold account holders to claim the "monetized value" of their accounts, collectively valued in excess of 90 million US Dollars.
1mdc
1mdc was a digital gold currency backed by e-gold rather than by physical gold. On April 27, 2007, a US court ordered e-gold to freeze or block the e-gold accounts 1mdc used to back the digital gold currency it issued. Before the year was out, the 1mdc website was no longer accessible.
e-Bullion
E-Bullion was a digital-gold currency exchange that had risen, then become defunct around 2008.
In August 2008, James Fayed, the owner and chief executive official of the E-Bullion Company, was taken into United States Federal custody to face felony charges of conducting unlicensed money transactions and the murder of his business partner. Shortly thereafter, the website ceased to be available. As a consequence of these charges, by January 2010 the U.S. Government had seized all of the assets of e-Bullion, resulting in the complete closure of the company. In June 2011 a California jury found Fayed guilty of murder and sentenced him to death.
Pecunix
Pecunix was a gold based digital currency (or e-currency) in which accounts had balances in GAU (gold grams).
Pecunix was founded by Simon "Sidd" Davis in 2002, and was registered and incorporated in Panama. All gold bullion was originally stored with Mat Securitas Express AG in Zürich, Switzerland, but in 2008 the Pecunix directors transferred the bullion to an undisclosed location.
In a 2012 interview with DGCMagazine, Mr. Davis described the development of the Voucher-Safe software and peer-to-peer network for the exchange of digital currencies. In early 2014, Pecunix announced they would be replacing their Pecunix Payments system with the open source Voucher-Safe system and PX-Gold. By the end of 2014, all legitimate digital currency exchangers had ceased dealing with Pecunix. In early 2015, Pecunix disabled the log-in feature of their website, thereby preventing all users from accessing their accounts. A statement on the Pecunix website claimed that this was a temporary change "due to new management and restructuring", but access was never restored. The P2P Voucher Payment System became fully operational in August 2015, but PX-Gold never came into existence. Account holders never recovered their funds.
Data security
Digital gold systems are completely dependent on electronic storage and transmission of account ownership information. Therefore, the security of a given digital currency account is dependent upon the security of the issuer as well as the security of the accountholder's computer.
While the digital gold issuers employ data security experts to protect their systems, the average accountholder's computer is poorly protected against malware (trojans, worms and viruses) that can be used to intercept information used to access the user's DGC account. Therefore, the most common attacks on digital currency systems are directed against accountholders' computers by the use of malicious spam, phishing and other methods.
Issuers have taken quite different approaches to this problem. E-gold basically places the entire responsibility on the user, and employs a user-name and password authentication system that is weak and highly vulnerable to interception by malware (though it is the most common authentication method used by online banks). The "not our problem" approach to user security has contributed negatively to e-gold's public image, as more than a few e-gold accounts have been hacked and swept clean by attackers.
e-Bullion offers account holders a "Cryptocard" security token that changes the passphrase with each logon, but charges the account holder US$99.50 for the token. E-bullion does not require customers to use the Cryptocard, so account holders who choose not to get one may suffer from the same security issues as e-gold customers.
Pecunix devised a unique rotating key system that provides many of the benefits of a security token without requiring the user to buy one. Pecunix also supports the use of PGP signatures to access an account, which is probably the strongest of all authentication methods.
Exchange risk
Digital gold currency is a form of representative money as it directly represents gold metal on deposit or in custody. This depends on the issuer. Most issuers have the gold on deposit - i.e., the issuer will redeem the digital currency obligation with physical metal. Just as the exchange rates of national currencies fluctuate against each other, the exchange rates of DGCs fluctuate against national currencies, which is reflected by the price of gold in a particular currency. This creates exchange risk for any account holder, in the same way one would experience exchange risk by holding a bank account in a foreign currency.
Some DGC holders make use of the digital currency for daily monetary transactions, even though most of their normal income and expenses are denominated in the national currency of their home country. Fluctuations in the value of gold against their national currency can create some confusion and difficulty for new users as they see the "value" of their DGC account fluctuate in terms of their native currency.
In contrast to exchange risk, caused by gold's fluctuation against national currency, the purchasing power of gold (and therefore DGCs) is measured by its fluctuation against other commodities, goods and services. Since gold has historically been the refuge of choice in times of inflation or economic hardship, the purchasing power of gold becomes stronger during times of negative sentiment in the markets. Due to this speculative interference, there are times when purchasing power has also declined. For example, in 2007–2008, gold volatility closely tracked the run-up in oil prices.
Providers
Comparison of operating DGCs:
Criticisms
DGC providers and exchangers have been accused of being a medium for fraudulent high-yield investment program (HYIP) schemes. In January 2006, BusinessWeek reported that ShadowCrew, an online gang, used the e-gold system in a massive identity theft and fraud scheme. Traditional banks are also used frequently for such fraud. Allegations that e-gold is a safe medium for crime and fraud are strongly denied by its Chairman and founder, Dr. Douglas Jackson. Further, it can be argued that such problems lay with the source of the information or monies, rather than the location of storage of such ill-gotten gains. In other words, it would be difficult to claim the bank as villain when the criminal activity occurred by other parties away from the storage location.
Many DGC providers do not disclose the amount of bullion stored (see table), or do not allow independent external bullion audits, raising concerns that such companies do not maintain a 100% reserve ratio, or that their currency is entirely virtual and not backed by physical gold at all.
Due to increased compliance requirements for payment service providers, Jersey-based GoldMoney decided to suspend its DGC service as of January 21, 2012.
Cultural references
The novel Cryptonomicon by Neal Stephenson uses the idea of a gold-based digital currency in combination with strong cryptography.
The novel Alongside Night by J. Neil Schulman features several types of competing DGCs.
The novel Molon Labe! by Kenneth W. Royce features a gold based digital accounting system very similar to DGCs.
The novel Minerva by Robert P. Murphy features DGCs prominently.
The novel The Cryptographer by Tobias Hill is centered around Soft Gold, a DGC.
The novel The Way to Freedom by Carl Kyler is about gold money and digital gold currencies.
See also
Digital currency exchanger
Full-reserve banking
Gold as an investment
Gold exchange-traded fund
Gold standard
Silver as an investment
Gold to Go (Automated banking machine)
Vaulted gold
Cryptocurrency
Bitcoin
References |
14559427 | https://en.wikipedia.org/wiki/File%20URI%20scheme | File URI scheme | The file URI scheme is a URI scheme defined in RFC 8089, typically used to retrieve files from within one's own computer.
Previously the file URI scheme was specified in RFC 1630 and RFC 1738. The Internet Engineering Task Force (IETF) published RFC 8089, updating the latter RFC, with "a syntax based on the generic syntax of RFC 3986 that is compatible with most existing usages."
Format
A file URI takes the form of
file://host/path
where host is the fully qualified domain name of the system on which the path is accessible, and path is a hierarchical directory path of the form directory/directory/.../name. If host is omitted, it is taken to be "localhost", the machine from which the URL is being interpreted. Note that when omitting host, the slash is not omitted (while "file:///foo.txt" is valid, "file://foo.txt" is not, although some interpreters manage to handle the latter).
RFC 3986 includes additional information about the treatment of ".." and "." segments in URIs.
How many slashes?
The // after the file: denotes that either a hostname or the literal term localhost will follow, although this part may be omitted entirely, or may contain an empty hostname.
The single slash between host and path denotes the start of the local-path part of the URI and must be present.
A valid file URI must therefore begin with either file:/path (no hostname), file:///path (empty hostname), or file://hostname/path.
file://path (i.e. two slashes, without a hostname) is never correct, but is often used.
Further slashes in path separate directory names in a hierarchical system of directories and subdirectories. In this usage, the slash is a general, system-independent way of separating the parts, and in a particular host system it might be used as such in any pathname (as in Unix systems).
There are two ways that Windows UNC filenames (such as \\server\folder\data.xml) can be represented. These are both described in RFC 8089, Appendix E as "non-standard". The first way (called here the 2-slash format) is to represent the server name using the Authority part of the URI, which then becomes file://server/folder/data.xml. The second way (called here the 4-slash format) is to represent the server name as part of the Path component, so the URI becomes file:////server/folder/data.xml. Both forms are actively used. Microsoft .NET (for example, the method new Uri(path)) generally uses the 2-slash form; Java (for example, the method new URI(path)) generally uses the 4-slash form. Either form allows the most common operations on URIs (resolving relative URIs, and dereferencing to obtain a connection to the remote file) to be used successfully. However, because these URIs are non-standard, some less common operations fail: an example is the normalize operation (defined in RFC 3986 and implemented in the Java java.net.URI.normalize() method) which reduces file:////server/folder/data.xml to the unusable form file:/server/folder/data.xml.
Examples
Unix
Here are two Unix examples pointing to the same /etc/fstab file:
file://localhost/etc/fstab
file:///etc/fstab
Windows
Here are some examples which may be accepted by some applications on Windows systems, referring to the same, local file c:\WINDOWS\clock.avi
file://localhost/c:/WINDOWS/clock.avi
file:///c:/WINDOWS/clock.avi
Here is the URI as understood by the Windows Shell API:
file:///c:/WINDOWS/clock.avi
Note that the drive letter followed by a colon and slash is part of the acceptable file URI.
Implementations
Windows
On Microsoft Windows systems, the normal colon (:) after a device letter has sometimes been replaced by a vertical bar (|) in file URLs. This reflected the original URL syntax, which made the colon a reserved character in a path part.
Since Internet Explorer 4, file URIs have been standardized on Windows, and should follow the following scheme. This applies to all applications which use URLMON or SHLWAPI for parsing, fetching or binding to URIs. To convert a path to a URL, use UrlCreateFromPath, and to convert a URL to a path, use PathCreateFromUrl.
To access a file "the file.txt", the following might be used.
For a network location:
file://hostname/path/to/the%20file.txt
Or for a local file, the hostname is omitted, but the slash is not (note the third slash):
file:///c:/path/to/the%20file.txt
This is not the same as providing the string "localhost" or the dot "." in place of the hostname. The string "localhost" will attempt to access the file as UNC path \\localhost\c:\path\to\the file.txt, which will not work since the colon is not allowed in a share name. The dot "." results in the string being passed as \\.\c:\path\to\the file.txt, which will work for local files, but not shares on the local system. For example file://./sharename/path/to/the%20file.txt will not work, because it will result in sharename being interpreted as part of the DOSDEVICES namespace, not as a network share.
The following outline roughly describes the requirements.
The colon should be used, and should not be replaced with a vertical bar for Internet Explorer.
Forward slashes should be used to delimit paths.
Characters such as the hash (#) or question mark (?) which are part of the filename should be percent-encoded.
Characters which are not allowed in URIs, but which are allowed in filenames, must also be percent-encoded. For example, any of "{}`^ " and all control characters. In the example above, the space in the filename is encoded as %20.
Characters which are allowed in both URIs and filenames must NOT be percent-encoded.
Must not use legacy ACP encodings. (ACP code pages are specified by DOS CHCP or Windows Control Panel language setting.)
Unicode characters outside of the ASCII range must be UTF-8 encoded, and those UTF-8 encodings must be percent-encoded.
Use the provided functions if possible. If you must create a URL programmatically and cannot access SHLWAPI.dll (for example from script, or another programming environment where the equivalent functions are not available) the above outline will help.
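As a rough illustration of the round trip through the SHLWAPI helpers mentioned above, the sketch below converts a local path to a file URL and back. It assumes a Windows build linking shlwapi.lib; the path shown is only an example, and error handling is reduced to HRESULT checks.
#include <windows.h>
#include <shlwapi.h>
#include <stdio.h>
#include <wchar.h>
#pragma comment(lib, "shlwapi.lib")

// Sketch: convert a Windows path to a file: URL and back using the SHLWAPI
// functions described above. The path below is illustrative only.
int main(void)
{
    wchar_t url[2084];
    DWORD cchUrl = (DWORD)(sizeof(url) / sizeof(url[0]));
    if (SUCCEEDED(UrlCreateFromPathW(L"C:\\path\\to\\the file.txt", url, &cchUrl, 0)))
        wprintf(L"%ls\n", url);    // expected: file:///C:/path/to/the%20file.txt

    wchar_t path[MAX_PATH];
    DWORD cchPath = MAX_PATH;
    if (SUCCEEDED(PathCreateFromUrlW(url, path, &cchPath, 0)))
        wprintf(L"%ls\n", path);   // back to C:\path\to\the file.txt
    return 0;
}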
Legacy URLs
To aid the installed base of legacy applications on Win32 PathCreateFromUrl recognizes certain URLs which do not meet these criteria, and treats them uniformly. These are called "legacy" file URLs as opposed to "healthy" file URLs.
In the past, a variety of other applications have used other systems. Some added an additional two slashes. For example, UNC path \\remotehost\share\dir\file.txt would become file:////remotehost/share/dir/file.txt instead of the "healthy" file://remotehost/share/dir/file.txt.
Web pages
File URLs are rarely used in Web pages on the public Internet, since they imply that a file exists on the designated host. The host specifier can be used to retrieve a file from an external source, although no specific file-retrieval protocol is specified, and using it should result in a message informing the user that no mechanism to access that machine is available.
References
Internet Standards
Identifiers
URI schemes |
10952182 | https://en.wikipedia.org/wiki/Lift%20%28Radiohead%20song%29 | Lift (Radiohead song) | "Lift" is a song by the English rock band Radiohead released in 2017. It was first performed in 1996; bootleg recordings were widely circulated, and it became a fan favourite. Radiohead recorded versions of "Lift" during the sessions for their third album, OK Computer (1997), but abandoned it. Guitarist Ed O'Brien said the band had felt pressured by its commercial potential, and drummer Philip Selway said it did not represent what Radiohead wanted to say at the time.
Critics described "Lift" as anthemic and "Britpop-like". In 2017, Radiohead released a version recorded during the OK Computer sessions on the reissue OKNOTOK 1997 2017, followed by a music video. It received positive reviews, though some critics found it inferior to the bootlegged performances. Further versions recorded during the OK Computer period were released on the 2019 compilation MiniDiscs [Hacked] and received more positive reviews.
History
Radiohead first performed "Lift" on March 14, 1996, at the Troubadour in West Hollywood. They performed it over 30 times that year while supporting Alanis Morissette on her Jagged Little Pill tour. Journalists noted "Lift" as a highlight and possible future single. According to guitarist Ed O'Brien, the audience responded warmly to the song: "Suddenly you'd see them get up and start grooving. It had this infectiousness." A bootleg recording was widely circulated, and "Lift" became a fan favourite.
Radiohead recorded "Lift" during the sessions for their third album, OK Computer (1997), but it went unreleased. According to drummer Philip Selway, the band felt it was "fey" and not "what we wanted to say about ourselves as a band at the time". In 1999, O'Brien dismissed it as "bogshite", and said the band were "very happy to leave [it] off the album ... There wasn't any stage where it was a key track for any of us." Asked why the song could have worked in live performance but not in the studio, O'Brien said Radiohead had not worked hard on it, and that if a song did not come together in the "first five times" of playing it through, they would move onto another track. Singer Thom Yorke said the band had settled on playing it in "a certain way that didn't work", and that it became impossible to rearrange. In 2017, O'Brien said that Radiohead had felt pressured by the song's commercial potential:If that song had been on that album, it would've taken us to a different place, and probably we'd have sold a lot more records—if we'd done it right. And everyone was saying this. And I think we subconsciously killed it. If OK Computer had been like a Jagged Little Pill, it would've killed us. But "Lift" had this magic about it. But when we got to the studio and did it, it felt like having a gun to your head. There was so much pressure.
In 2002, Radiohead performed "Lift" in a slower, more restrained arrangement, which Pitchfork described as "a somber, almost queasy affair". O'Brien and guitarist Jonny Greenwood later dismissed this version as inferior. In 2003, O'Brien said: "The spirit of the song was there in '96 ... And we're not in that place at the moment." In 2015, Greenwood suggested that Radiohead had worked on "Lift" again, describing it as a "management favourite". He likened the situation to "Nude", a song released on Radiohead's seventh album In Rainbows (2007) but written years earlier.
"Lift" remained a fan favourite. Pitchfork wrote that it came to "hold an important place in Radiohead lore", and according to Rolling Stone it was, for decades, the "great white whale" for Radiohead fans.
Release
In June 2017, Radiohead released "Lift" on the OK Computer reissue OKNOTOK 1997 2017, alongside two other previously unreleased tracks: "I Promise" and "Man of War". The version of "Lift" was recorded at Chipping Norton Recording Studios in February 1996, while the band were recording demos for OK Computer. In 2019, hours of recordings made during the OK Computer sessions, including more versions of "Lift", were leaked online; in response, Radiohead released the recordings as the compilation MiniDiscs [Hacked].
Composition
Spin described "Lift" as a Britpop-like ballad, and Rolling Stone described it as "one of the last vestiges of [Radiohead's] anthemic, Britpop hooks before the band embarked on a darker path with OK Computer". According to Pitchfork, the song is "strummy and steadily building, with yearning vocals".
The lyrics describe a man who has been rescued from a malfunctioning lift. Pitchfork likened its themes to the OK Computer single "No Surprises", and interpreted the lyric "Today is the first day of the rest of your days" as "a death sentence ... the hapless soul inside it is doomed to expire soundlessly in the intestines of some soulless corporate edifice".
Music video
On 12 September 2017, Radiohead released a music video for "Lift" directed by Oscar Hudson, featuring Yorke taking an unusual journey in a lift. The video features cameos from Yorke's girlfriend and his daughter, and references older Radiohead videos, with appearances from characters from the "Paranoid Android" and "Karma Police" videos. Pitchfork named it the 14th best music video of 2017.
Reception
Jamie Atkins, reviewing OKNOTOK for Record Collector, wrote that "Lift" was "an undeniably brilliant alt rock song, with surprising echoes of the grandstanding, otherworldly melancholy of prime Smashing Pumpkins". Guardian critic Alexis Petridis said it had "an immense, air-punch-inducing chorus", and that it would have been a "huge" hit had Radiohead released it. Pitchfork critic Jayson Greene described it as "a lovely, weightless strummer of a song".
Several critics felt the version released on OKNOTOK was lacking. Rolling Stone wrote that it was "restrained, decelerated and boasting an uncharacteristically blasé Yorke vocal take" and did not match the "magnitude" of the live bootlegs. Spin critic Andy Cush felt it was "strangely neutered, with drums that patter instead of exploding with energy", and that the widely circulated 1996 bootleg remained "canonical". Fellow Spin writer Winston Cook-Wilson agreed that the bootlegs were "a bit more staggering", but wrote that this was "a testament to the band's remarkable pop sense at the time – an inclination they, for their own neurotic reasons, quickly moved to complicate or subvert".
Following the 2019 release of MiniDiscs [Hacked], Pitchfork critics Greene and Jeremy D Larson wrote that it had a superior version of "Lift": "It's not mixed very carefully, but it sounds scrappy and untamed, like the band is pushing it into the red unselfconsciously. It lives up to the myth." Rolling Stone considered this version the "crown jewel" of the release and "well worth" the price of purchase. The Guardian wrote that this "satisfying" version would likely have pleased EMI had Radiohead released it as the first OK Computer single, but that it was "ultimately a conservative song and feels like a path the band were right to fork from".
Personnel
Radiohead
Colin Greenwood
Jonny Greenwood
Ed O'Brien
Philip Selway
Thom Yorke
Additional musicians
Chris Blair – mastering
Nigel Godrich – production, engineering
Jim Warren – production, engineering
References
2017 songs
Radiohead songs
Song recordings produced by Nigel Godrich
Songs written by Thom Yorke
Songs written by Jonny Greenwood
Songs written by Colin Greenwood
Songs written by Ed O'Brien
Songs written by Philip Selway |
5140862 | https://en.wikipedia.org/wiki/Monaco%20%28typeface%29 | Monaco (typeface) | Monaco is a monospaced sans-serif typeface designed by Susan Kare and Kris Holmes. It ships with macOS and was already present with all previous versions of the Mac operating system. Characters are distinct, and it is difficult to confuse 0 (figure zero) and O (uppercase O), or 1 (figure one), | (vertical bar), I (uppercase i) and l (lowercase L). A unique feature of the font is the high curvature of its parentheses as well as the width of its square brackets, the result of these being that an empty pair of parentheses or square brackets will strongly resemble a circle or square, respectively.
Monaco has been released in at least three forms. The original was a bitmap monospace font that still appears in the ROMs of even New World Macs, and is still available in recent macOS releases (size 9, with disabled antialiasing). The second is the outline form, loosely similar to Lucida Mono and created as a TrueType font for System 6 and 7; this is the standard font used for all other sizes. There was briefly a third known as MPW, since it was designed to be used with the Macintosh Programmer's Workshop IDE; it was essentially a straight conversion of the bitmap font into an outline font with the addition of some of the same disambiguation features as were added to the TrueType Monaco.
The original Monaco 9 point bitmap font was designed so that when a Compact Macintosh window was displayed full screen, such as for a terminal emulator program, it would result in a standard text user interface display of 80 columns by 25 lines.
With the August 2009 release of Mac OS X 10.6 "Snow Leopard", Menlo was introduced as the default monospaced font in Terminal and Xcode instead of Monaco. However, Monaco remains a part of macOS. Monaco is the default font in the current Python IDLE when used on a Mac running OS X El Capitan.
See also
Apple typography
ProFont
References
External links
Monospaced typefaces
Sans-serif typefaces
Apple Inc. typefaces
Macintosh operating systems
MacOS
Typefaces and fonts introduced in 1983
Typefaces designed by Susan Kare
Typefaces designed by Kris Holmes |
41259265 | https://en.wikipedia.org/wiki/Samsung%20YP-R0 | Samsung YP-R0 | The Samsung YP-R0 (also known as Samsung R0 worldwide or Yepp R0 in Korea or Samsung R'PLAY in France) is a portable media player made by Samsung, leaked on August 10, 2009 and first released at the end of October 2009 in Russia. It was developed along with the YP-R1, with which it shares several specifications (similar aluminum design, same Linux kernel and SoC).
The R0 is available in three different Flash memory capacities: 4 GB, 8 GB, and 16 GB. It comes in three different colors: black, silver and pink. Storage is expandable via a microSDHC slot with capacity up to 32 GB, and unofficially to 64GB or more via FAT32 formatted SDXC cards. It features an aluminum case, a 2.6 inch TFT LCD display with a resolution of 240 by 320 pixels, RDS FM tuner, tactile buttons and microUSB connector. Several EQ and sound effects are available through Samsung's DNSe 3.0 sound engine.
Media support
Audio codecs: MP3, WMA, WAV, OGG Vorbis, AAC-LC/Plus and FLAC.
Video codecs: DivX, Xvid, MPEG-4, H.264, WMV9 in AVI/SVI/MP4/WMV/ASF/MOV containers. Video files up to resolution 720x480 are natively supported so in most cases converting is not necessary.
Picture formats: JPG, BMP, PNG, GIF
Other: TXT files
Operation
When its USB mode is set to MSC, the Samsung R0 works as a drag-and-drop USB mass storage device compatible with Microsoft Windows, Linux and Mac OS X. The player can also function as an MTP device (usable with Windows Media Player 10 and later) when USB mode is set to MTP. Unlike many Samsung players, the R0 features a standard microUSB connector. It also has 9 physical keys (power/hold, back, menu, user button, select, up, down, left, right) but neither a dedicated hold switch nor dedicated volume buttons. The R0 only powers fully off after several hours of not being used or when the reset hole is pressed; the rest of the time it only switches to a sleep mode.
Samsung claims up to 30 hours of music playback (with MP3 128 kbit/s files, volume level 15, normal sound mode and display off) and 6 hours of video playback (SVI, brightness 3, volume level 15, normal sound mode).
Software
No additional CD is shipped with the R0, but the optional software EmoDio can be found in the device's internal memory or can be downloaded from the official Samsung website. EmoDio (discontinued, now replaced by Kies) is able to sync one's library with the R0, convert video files, manage playlists, rip CDs, etc.
Samsung original firmware
Samsung released 10 firmware revisions, from v1.03 to the latest one, v1.25, released on May 11, 2011.
Firmware updates were released mainly to fix bugs rather than to add or improve features; only UCI (User Created Interface, i.e. main menu customization) support and premium fonts (in Korea only) were added via new firmware.
The YP-R0 is infamous for its many bugs, especially the library update bug: when new files are added to the internal memory of the player, it automatically updates the media library after being disconnected from the computer. Due to certain audio files, the updating process sometimes hangs, bricking the R0. The device is then no longer recognized by the computer and can no longer start up. Samsung released several firmware updates to address this bug but never managed to find all of its root causes. As a result, the issue may occur even with the latest firmware v1.25. A preventive solution is to install a modded firmware or Rockbox (see below); otherwise the user has to send the device back to the after-sales service center or use the leaked recovery tool.
Unlike older Samsung players, which had a specific firmware for every region, on the R0 the region code can be changed independently of the firmware (meaning that the firmware is the same worldwide). Each region code (KR, EU, FR, RU, US, etc.) has its own specific features, such as RDS support, lyrics support, and an MSC-only or MSC/MTP setting.
Alternative firmware
On July 7, 2011 the first modded firmware v2.00 based on the official firmware v1.25 was released by an Italian developer. Modded firmware releases v2.10 and v2.20 came out later. Compared to the official firmware v1.25, they add a "Safe Mode" (the device can be connected to the computer even when bricked), a DRK (Device Rescue Kit) to unbrick the device, a CPU downclocking tool to save battery life and the possibility to customize the resources (pictures, sounds, fonts, language) of the firmware.
Since 2011 the alternative free and open-source firmware Rockbox can be installed on the YP-R0. Unlike most Rockbox targets, the YP-R0 port is not a native port: it runs as an application on top of the original Linux kernel used by Samsung in the official firmware. That makes development easier, but on the downside Rockbox boots up more slowly than on the usual native targets.
As of December 2013, the YP-R0 is an unstable port according to the official Rockbox classification. In practice it runs well enough for daily use but cannot yet be installed via Rockbox Utility. The user has to compile the Rockbox bootloader themselves or use a pre-built modded firmware including the Rockbox bootloader, such as modded firmware v2.51 (dualboot) or lightROM 4.6 (Rockbox only).
Unlike the Samsung firmware, Rockbox merges the internal memory and the microSD card into a single tag library, and among other things supports more audio codecs and gapless playback.
See also
Samsung YEPP
Samsung Galaxy Player
Samsung Electronics
References
External links
Samsung YP-R0 product page on the official Samsung US website
Samsung YP-R0 subforum on anythingbutipod.com
Samsung Open Source Center
Digital audio players
Portable media players |
48547307 | https://en.wikipedia.org/wiki/Help%20desk%20software | Help desk software | Help desk software refers to a computer program that enables customer-care operators to keep track of user requests and deal with other customer-care-related issues. It is what makes customer-care service efficient and enterprising.
Generally, help desk software is part of an umbrella category called the service desk, which includes asset management and IT service management. Oftentimes, the two terms are used interchangeably. Nevertheless, help desk software specifically refers to the system that addresses customer queries.
History
The history of help desk software dates back to the 20th century when businesses relied mostly on face-to-face interaction to resolve customer issues. Customers had to visit a company’s store or office with the product to get their problems solved.
With the invention of the telephone in 1876 and the telephone switchboard in the 1890s, the help desk gained a new channel: customers were able to reach their company and voice their problems over the phone. During much of the 20th century, companies mainly used equipment such as dictation machines, typewriters, and dumb terminals with access to a mainframe computer to address customer issues. The earliest use of computers for customer service was done through mainframe software: customers would submit paper forms or communicate their issues by phone to customer service agents, who would then seek ways to handle them.
In the 1960s, companies began to set up call centers and also train staff to receive and handle customer inquiries in an organized and efficient manner. This was the era of interactive voice response (IVR), which significantly improved telephone customer service. Later on, desktop PCs and email further improved help desk systems: customers could communicate their problems by email, bypassing paper forms, and help desk agents could provide status updates and resolutions by email as well.
Meanwhile, dedicated help desk systems began to appear in the 1980s as the internet started to become available for public use. Many companies started outsourcing their customer service departments, which led to the widespread use of email and live chat systems in the 1990s. This development enabled several US companies to outsource their help desks to low-cost countries like India and the Philippines.
In the 2000s, companies began to use diverse kinds of software packages to deal with customer-care issues. This led to the massive production of different kinds of help desk software programs across the internet and the world at large. In recent times, the internet and networked systems have made help desk software more interactive and participatory for customers and agents. Customers can now submit and track their issues more easily.
Customer service and help desk software systems have become increasingly popular in recent times. According to a recent report, there is a massive increase in sales of customer relationship management (CRM) software, which includes help desk software across the globe.
Basic characteristics
Help desk software automates customer services in diverse ways. It typically consists of at least three parts. These include Ticket Management, Automation Suite, and Reporting/Optimization.
Help desk software has a point of contact for customers to send their queries and a ticketing system that tracks and organizes issues for faster resolution. It may also have a feature that aggregates and organizes queries and answers into a knowledge base, such as FAQs or guide articles. It may accommodate multiple points of contact; a working dashboard; and an analytics section. It may also have a feature that allows agents to escalate issues to a higher level.
More advanced help desk applications feature online chat, insights and analytics, automated processes, multiple contact channels, reporting tools, collaboration tools, and a CRM feature.
Benefits
The following benefits are typically associated with help desk software:
A business that uses webmail for support tends to resolve customer support issues more quickly and to see an increase in support productivity when it switches to help desk software.
Help desk software automates tasks such as ticket categorization and prioritization, ticket routing, alerts and notifications, and ticket status management. With the right solution, the workload is reduced because tasks such as issue tracking, assignment, and ticket management can be handled automatically.
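As a rough illustration of this kind of rule-driven automation, the sketch below routes new tickets by keyword. The field names, keywords, priorities, and team names are illustrative assumptions, not the schema of any particular product.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Ticket:
    subject: str
    body: str
    category: str = "general"
    priority: str = "normal"
    assignee: Optional[str] = None

# Hypothetical keyword-based rules: (keyword, category, priority, team).
RULES = [
    ("outage", "incident", "urgent", "on-call-team"),
    ("refund", "billing", "high", "billing-team"),
    ("password", "account", "normal", "support-tier1"),
]

def triage(ticket: Ticket) -> Ticket:
    """Categorize, prioritize, and route a ticket automatically."""
    text = (ticket.subject + " " + ticket.body).lower()
    for keyword, category, priority, team in RULES:
        if keyword in text:
            ticket.category, ticket.priority, ticket.assignee = category, priority, team
            break
    return ticket

t = triage(Ticket("Site outage", "The checkout page returns errors"))
print(t.category, t.priority, t.assignee)  # incident urgent on-call-team
```

Real products typically layer escalation rules, service-level timers, and notifications on top of this basic categorize-and-route step.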
Some cloud-based help desk software has built-in security features, such as HIPAA compliance for handling US health care information, or GDPR compliance for accepting requests from persons located in the European Union.
Disadvantages
There are some disadvantages related to help desk software as well, mainly:
Many help desk software platforms have expensive upfront costs as well as time-consuming implementation periods, which can significantly drain company resources. While most offer a trial option, effectively trialing software is difficult and time-consuming in a large organization. It's also difficult to evaluate the software with a full volume of tickets and staff in a short period of time.
On-premises help desks can have costs associated with maintenance, upgrades, and scheduled downtime of servers, which are borne by the customer, not the help desk software provider.
Cloud-based help desks can incur higher costs over longer periods of time, and they depend on an Internet connection: unexpected disruptions in connectivity can make the service partially or entirely unavailable to users.
See also
Customer service
Customer support
Help desk
Issue tracking system
Technical support
References
Customer relationship management software
Business software |
54784150 | https://en.wikipedia.org/wiki/David%20A.%20Klarner | David A. Klarner | David Anthony Klarner (October 10, 1940March 20, 1999) was an American mathematician, author, and educator. He is known for his work in combinatorial enumeration, polyominoes, and box-packing.
Klarner was a friend and correspondent of mathematics popularizer Martin Gardner and frequently made contributions to Gardner's Mathematical Games column in Scientific American. He edited a book honoring Gardner on the occasion of his 65th birthday. Gardner in turn dedicated his twelfth collection of mathematical games columns to Klarner.
Beginning in 1969 Klarner made significant contributions to the theory of combinatorial enumeration, especially focusing on polyominoes and box-packing. Working with Ronald L. Rivest, he found upper bounds on the number of n-ominoes. Klarner's theorem states that an m-by-n rectangle can be packed with 1-by-x rectangles if and only if x divides m or x divides n.
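As a small illustration of the theorem (not code from Klarner's own work), the packing condition can be checked directly:

```python
def can_pack(m: int, n: int, x: int) -> bool:
    """Klarner's theorem: an m-by-n rectangle can be tiled by 1-by-x
    rectangles exactly when x divides m or x divides n."""
    return m % x == 0 or n % x == 0

print(can_pack(6, 7, 3))    # True: 3 divides 6
print(can_pack(10, 14, 4))  # False: 4 divides neither side, even though it divides the area
```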
He also published important results in group theory and number theory, in particular working on the Collatz conjecture (sometimes called the 3x + 1 problem). The Klarner-Rado sequence is named after Klarner and Richard Rado.
Biography
Klarner was born in Fort Bragg, California, and spent his childhood in Napa, California. He married Kara Lynn Klarner in 1961. Their son Carl Eoin Klarner was born on April 21, 1969.
Klarner did his undergraduate work at Humboldt State University (1960–63), got his Ph.D. at the University of Alberta (1963–66), and did post-doctoral work at McMaster University in Hamilton, Ontario (1966–68). He also did post-doctoral work at Eindhoven University of Technology in the Netherlands (1968-1970), at the University of Reading in England working with Richard Rado (1970–71), and at Stanford University (1971–73). He served as an assistant professor at Binghamton University (1973–79) and was a visiting professor at Humboldt State University in California (1979–80). He returned to Eindhoven as a professor (1980–81), and to Binghamton (1981–82). From 1982 to 1996 he was a professor of computer science at the University of Nebraska, at Lincoln, with a one-year break at Eindhoven in academic year 1991–92. He retired to Eureka, California in 1997 and died there in 1999.
He was a frequent contributor to recreational mathematics and worked with many key mathematics popularizers including Ronald L. Rivest, John H. Conway, Richard K. Guy, Donald Coxeter, Ronald Graham, and Donald Knuth.
Organizations and awards
Klarner was a member of the Association for Computing Machinery, the American Mathematical Society, the Mathematical Association of America, and the Fibonacci Association. He was awarded a National Science Foundation Fellowship Award in mathematics in 1963. In 1986 Klarner received a University of Nebraska-Lincoln Distinguished Teaching Award in Computer Science.
The David A. Klarner Fellowship for Computer Science was set up after Klarner's death by Spyros Magliveras, a fellow professor in computer science at UNL.
Bibliography
Asymptotically Optimal Box Packing Theorems: Klarner systems by Michael Reid, Department of Mathematics, University of Central Florida, June, 2008
A Lifetime of Puzzles edited by Erik D. Demaine, Martin L. Demaine, Tom Rodgers; pp. 221–225: Satterfield's Tomb, a puzzle by David A. Klarner and Wade Satterfield;
Selected publications
Books
The Mathematical Gardner (editor), Boston: Prindle, Weber & Schmidt; Belmont, Calif.: Wadsworth International (electronic book)
Papers
Polyominoes by Gill Barequet, Solomon W. Golomb, and David A. Klarner, December 2016
The number of tilings of a block with blocks (with F. S. S. Magliveras), European Journal of Combinatorics: Volume 9 Issue 4, July 1988
The number of tiered posets modulo six Discrete Mathematics, Vol. 62, Issue 3, pp. 295–297, December 1986
Asymptotics for coefficients of algebraic functions (with Patricia Woodworth), Aequationes Mathematicae, Volume 23, Issue 1, pp. 236–241, December 1981
An algorithm to determine when certain sets have 0-density Journal of Algorithms, Vol. 2, Issue 1, Pages 31–43, March 1981
Some remarks on the Cayley-Hamilton theorem American Mathematical Monthly, Vol. 83, No. 5, pp. 367–369, May, 1976
Asymptotic bounds for the number of convex n-ominoes (with Ronald L. Rivest), Discrete Mathematics, Vol. 8, Issue 1, pp. 31–40, March 1974
A finite basis theorem revisited Stanford University: Computer Science Department, April 1973
The number of SDR's in certain regular systems Stanford University: Computer Science Department, April 1973
Selected combinatorial research problems (with Václav Chvátal and Donald E. Knuth), Stanford University: Computer Science Department, June 1972
Sets generated by iteration of a linear operation Stanford University: Computer Science Department, March 1972
Linear Combinations of Sets of Consecutive Integers (with Richard Rado), Stanford University: Computer Science Department, March 1972
Packing a rectangle with congruent n-ominoes Journal of Combinatorial Theory, Vol. 7, Issue 2, Pages 107–115, September 1969
Packing boxes with congruent figures (with F. Göbel), Indagationes Mathematicae 31, pp. 465–472, MR 40 #6362, 1969
Some Results Concerning Polyominoes Fibonacci Quarterly, 3, pp. 9–20, February 1965
References
External links
David A. Klarner fonds University of Calgary Special Collections
Mathematics popularizers
Recreational mathematicians
20th-century American mathematicians
California State Polytechnic University, Humboldt alumni
University of Alberta alumni
McMaster University alumni
Eindhoven University of Technology faculty
Binghamton University faculty
University of Calgary faculty
University of Nebraska faculty
Number theorists
Combinatorial game theorists
1940 births
1999 deaths
People from Fort Bragg, California
Writers from California
Mathematicians from California |
28669204 | https://en.wikipedia.org/wiki/Universal%20Time-Sharing%20System | Universal Time-Sharing System | The Universal Time-Sharing System (UTS) is a discontinued operating system for the XDS Sigma series of computers, succeeding Batch Processing Monitor (BPM)/Batch Time-Sharing Monitor (BTM). UTS was announced in 1966, but because of delays did not actually ship until 1971. It was designed to provide multi-programming services for online (interactive) user programs in addition to batch-mode production jobs, symbiont (spooled) I/O, and critical real-time processes. System Daemons, called "ghost jobs" were used to run monitor code in user space. The final release, D00, shipped in January, 1973. It was succeeded by the CP-V operating system, which combined UTS with the heavily batch-oriented Xerox Operating System (XOS).
CP-V
The CP-V (pronounced sea-pea-five) operating system, the compatible successor to UTS, was released in August 1973. CP-V supported the same CPUs as UTS plus the Xerox 560. CP-V offers "single-stream and multiprogrammed batch; timesharing; and the remote processing mode, including intelligent remote batch." Realtime processing was added in release B00 in April 1974, and transaction processing in release C00 in November 1974.
CP-V versions C00 and F00, and Telefile's TCP-V version I00, still run on a Sigma emulator developed in 1997.
CP-R
CP-R (Control Program for Real-Time) is a discontinued realtime operating system for Xerox 550 and Sigma 9 computer systems. CP-R supports three types of tasks: Foreground Primary Tasks, Foreground Secondary Tasks, and Batch Tasks.
CP-6
In 1975, Xerox decided to exit the computer business which it had purchased from Scientific Data Systems in 1969. Honeywell offered to purchase Xerox Data Systems, initially to provide field service support to the existing customer base.
The CP-6 system including OS and program products was developed, beginning in 1976, by Honeywell to convert Xerox CP-V users to run on Honeywell equipment. The first beta site was installed at Carleton University in Ottawa Canada in June 1979, and three other sites were installed before the end of 1979.
Support for CP-6 was transferred to ACTC in Canada in 1993. CP-6 systems continued to run for many years in the US, Canada, Sweden, the UK, and Germany. The final system shut down was at Carleton University in 2005.
CP-6 and its accomplishments, its developers, and its customers are commemorated with a plaque on the community wall at the Computer History Museum in Mountain View, California.
Software
CP-V software as of release B00, 1974. CP-V was supported by the CP-6 team at the Honeywell Los Angeles Development Center (LADC) from 1977 onward.
Bundled Software
TEL – Terminal Executive Language.
EASY – Simple interactive environment for FORTRAN and BASIC programs and data files.
CCI – Control Command (or Card) Interpreter. The batch counterpart of TEL.
BATCH – Submit jobstream to batch queue.
PCL – Peripheral Conversion Language (pronounced "pickle"). Data file device to device copy.
EDIT – Line Editor.
LINK – One-pass linking loader.
LOAD – Two-pass overlay loader.
DELTA – Instruction-level debugger.
SORT/MERGE.
Extended FORTRAN IV.
FDP – FORTRAN Debug Package.
META-SYMBOL – Macro assembler.
BASIC.
FLAG – Load-and-go FORTRAN compatible with IBM Fortran-H.
ANS COBOL.
COBOL On-Line debugger.
APL.
SL-1 – Simulation Language.
IBM 1400 Series Simulator.
SYSGEN – System Generation.
DEFCOM – Export external definitions from a load module.
SYMCON – Manipulate symbols in a load module.
ANALYZE – System dump analyzer.
Separately Priced Software
MANAGE – A generalized file management and reporting tool.
EDMS – Database Management System.
GPDS – General Purpose Discrete Simulator.
CIRC – Electronic Circuit Analysis.
Contributed Software
Xerox maintained a library of other Xerox and user-written software from the EXCHANGE user group.
References
Further reading
Bryan, G. Edward, "Not All Programmers Are Created Equal --Redux," 2012 IEEE Aerospace Conference Proceedings, March 2012
P.A. Crisman and Bryan, G. Edward, "Management of Software Development for CP 6 at LADC", Proceedings of the Fifth Annual Honeywell International Software Conference, March 1981.
Bryan, G. Edward, "CP-6: Quality and Productivity Measures in the 15-Year Life Cycle of an Operating System," Software Quality Journal 2, 129–144, June 1993.
Frost, Bruce, “APL and I-D-S/II APL access to large databases,” APL '83 Proceedings of the international conference on APL, pages 103–107.
Fielding, Roy T., "An Empirical Microanalysis of Software Failure Data from a 12-Year Software Maintenance Process," Masters thesis, University of California Irvine, 1992
External links
UTS Documentation at Bitsavers
CP-V Documentation at Bitsavers
CP-R Documentation at Bitsavers
The COMPUTER That Will Not Die: The SDS Sigma 7
A working Sigma 9 running CP-V at Living Computers: Museum + Labs: request a login
Time-sharing operating systems
Discontinued operating systems
Proprietary operating systems
Xerox computers |
278017 | https://en.wikipedia.org/wiki/PSK31 | PSK31 | PSK31 or "Phase Shift Keying, 31 Baud", also BPSK31 and QPSK31, is a popular computer-sound card-generated radioteletype mode, used primarily by amateur radio operators to conduct real-time keyboard-to-keyboard chat, most often using frequencies in the high frequency amateur radio bands (near-shortwave). PSK31 is distinguished from other digital modes in that it is specifically tuned to have a data rate close to typing speed, and has an extremely narrow bandwidth, allowing many conversations in the same bandwidth as a single voice channel. This narrow bandwidth makes better use of the RF energy in a very narrow space thus allowing relatively low-power equipment (5 watts) to communicate globally using the same skywave propagation used by shortwave radio stations.
History
PSK31 was developed and named by English amateur radio operator Peter Martinez (call sign G3PLX) and introduced to the wider amateur radio community in December 1998.
The 31 baud BPSK modulation system used in PSK31 was introduced by Pawel Jalocha (SP9VRC) in his SLOWBPSK program written for Motorola's EVM radio. Instead of the traditional frequency-shift keying, the information is transmitted by patterns of polarity-reversals (sometimes called 180-degree phase shifts). PSK31 was enthusiastically received, and its usage spread like wildfire worldwide, lending a new popularity and tone to the on-air conduct of digital communications. Due to the efficiency of the mode, it became, and still remains, especially popular with operators whose circumstances do not permit the installation of large antenna systems, the use of high power, or both.
Use and implementation
A PSK31 operator typically uses a single-sideband (SSB) transceiver connected to the sound card of a computer running PSK31 software. When the operator enters a message for transmission, the software produces an audio tone that sounds, to the human ear, like a continuous whistle with a slight warble. This sound is then fed through either a microphone jack (using an intermediate resistive attenuator to reduce the sound card's output power to microphone levels) or an auxiliary connection into the transceiver, from which it is transmitted.
From the perspective of the transmitter, the sound amounts to little more than somebody whistling into the microphone. However, the software rapidly shifts the phase of the audio signal between two states (hence the name "phase-shift keying"), forming the character codes. These phase shifts serve the same function as the two tones used in traditional RTTY and similar systems.
To decode PSK31, the audio whistle received from the transceiver's headphone output is fed into a computer sound card's audio input, and software decodes it. The software displays the decoded text.
Because PSK31 was developed for use through a computer's sound card, many programs have since been created to use the same technology for other modes, such as RTTY, Hellschreiber, and Olivia MFSK. So, once it has been set up to run PSK31, a computer can be used to explore a variety of digital message transmission modes.
Aside from a standard radio transceiver and a computer with a sound card, very little equipment is required to use PSK31. Normally, an older computer and a few cables will suffice, and many PSK31 software applications are free. Many operators now use a commercially available interface/modem device (or "nomic") between their computers and radios. These devices incorporate the necessary impedance matching and sound level adjustment to permit the sound card output to be injected into the microphone input, send the radio's audio output to the sound card input, and handle the radio's transmit-receive switching.
Soundcard-to-radio interfaces typically use isolation transformers on both the send and receive audio paths to eliminate hum caused by ground loops. Recently introduced interfaces also incorporate their own sound card and can be powered and run from the computer via a single USB connection.
Resistance to interference
Like other narrow band digital modes, PSK31 can often overcome interference and poor propagation conditions in situations where voice or other methods of communication fail. However, PSK31 was designed only for leisure use by amateurs, and due to its relatively slow speed and limited error control, is not suitable for transmitting large blocks of data or text, or critical data requiring high immunity from errors.
PSK31 works well over propagation paths that preserve phase, and resists fading (QSB) well. However, it can be adversely affected by propagation modes—such as transpolar paths—where auroral "flutter" or multipathing can disrupt the signal phase continuity. In such cases the use of QPSK (see below) is often beneficial.
Some software supports PSK10 and PSK05 variants, running at 10 baud and 5 baud, respectively. These slower speeds sacrifice throughput to provide greater resistance to noise and other interference. Conversely, PSK63 is increasingly used for faster exchanges, especially during amateur radio contest operating.
Technical information
PSK31 is typically created by software that generates an amplitude- and phase-modulated waveform that is converted to an audio frequency analog signal by a sound card. In the most-commonly-used variant, BPSK31, binary information is transmitted by either imparting a 180-degree phase shift (a binary "zero") or no phase shift (a binary "one") in each 32ms symbol interval. The 180-degree phase shift for a "zero" bit code occurs at a null amplitude.
As shown in the figure, a cosine filter is used to smooth the rise and fall times of the audio waveform and eliminate key clicks. All subsequent amplification of the signal must be linear to preserve the modulation waveform and ensure minimum occupied bandwidth. In practice, this means limiting the transmit audio volume to below the level where the transmitter generates Automatic Level Control (ALC) feedback and disabling any audio compression or speech processing.
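A simplified synthesis of such a signal is sketched below. The 1 kHz audio tone is an arbitrary choice and the envelope shaping is only schematic, so this illustrates the idea rather than reproducing any particular PSK31 program.

```python
import numpy as np

FS = 8000            # sound-card sample rate (Hz)
BAUD = 31.25         # PSK31 symbol rate: 8000 Hz / 256
SPS = int(FS / BAUD) # 256 samples per 32 ms symbol
TONE = 1000.0        # assumed audio tone frequency (Hz)

def bpsk31_waveform(bits):
    """Illustrative BPSK31 synthesis: a '0' bit reverses the carrier phase at a
    symbol boundary, and cosine-shaped amplitude ramps place every reversal at
    an amplitude null, keeping the occupied bandwidth narrow."""
    half = SPS // 2
    up = 0.5 * (1 - np.cos(np.pi * np.arange(half) / half))  # 0 -> 1 ramp
    down = up[::-1]                                          # 1 -> 0 ramp
    polarity, out = 1.0, []
    for i, bit in enumerate(bits):
        if bit == 0:
            polarity = -polarity                    # 180-degree phase reversal
        env = np.ones(SPS)
        if bit == 0:                                # reversal at the start of this symbol
            env[:half] = up
        if i + 1 < len(bits) and bits[i + 1] == 0:  # reversal at the end of this symbol
            env[half:] = down
        t = (np.arange(SPS) + i * SPS) / FS
        out.append(polarity * env * np.cos(2 * np.pi * TONE * t))
    return np.concatenate(out)

audio = bpsk31_waveform([1, 0, 1, 1, 0, 0, 1])  # samples that would be fed to the sound card
```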
The Varicode is a kind of Fibonacci code in which the boundaries between character codes are marked by two or more consecutive zeros. Because no character code contains two consecutive zeros, the software can easily identify the boundaries between characters, regardless of each character's length. The idle sequence, sent when an operator is not typing, is a continuous sequence of phase-shifts, which does not print on the screen. Martinez arranged the character alphabet so that, as in Morse code, the more frequently occurring characters have the shortest encodings, while rarer characters use longer encodings. He named this encoding scheme "varicode".
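A tiny sketch of the decoding rule this property allows; the few table entries shown are an illustrative subset and should be checked against the published Varicode table.

```python
# Illustrative subset of the Varicode table (verify against the published table).
VARICODE = {"1": " ", "11": "e", "101": "t", "111": "o"}

def decode_varicode(bitstream: str) -> str:
    """Split a received bit stream on the '00' inter-character gaps; since no
    character code contains two consecutive zeros, the gaps unambiguously
    mark character boundaries."""
    return "".join(VARICODE.get(code, "?") for code in bitstream.split("00") if code)

print(decode_varicode("11" + "00" + "101" + "00" + "111"))  # -> "eto"
```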
PSK31's symbol rate of 31.25 Hz was chosen because a normal typing speed of about 50 words per minute requires a bit rate of about 32 bits per second, and specifically because 31.25 Hz could easily be derived from the 8 kHz sample rate used in many DSP systems, including those used in the computer sound cards commonly used for PSK31 operation (31.25 Hz is 8 kHz divided by 256, and so can be derived from 8 kHz by halving the frequency eight times in succession).
BPSK31 and QPSK31 variants
Colloquial usage of the term 'PSK31' in amateur radio usually implies the use of the most commonly used variant of PSK31: binary phase shift keying (BPSK). The BPSK variant of PSK31 uses no error control. QPSK31, the variant based on quadrature phase shift keying (QPSK), uses four phases instead of two. It is simple to switch from BPSK to QPSK if difficulties arise during a contact; the mode has the same number of symbols per second, and hence the same bandwidth as the BPSK variant. In a coherent receiver, the bit error probability of QPSK is the same as for BPSK operating at the same power, making QPSK31 the generally preferable mode from a robustness, and thus reach, point of view.
Using four instead of two constellation points provides twice the physical layer bit rate, which allows addition of redundant information to provide a degree of forward error correction. When QPSK is used, after encoding into varicode, the bits of the binary data signal are subjected to a rate-1/2 channel code, which means that for every information bit, two code bits are calculated and transmitted. For that, a convolutional code with constraint length 5 (i.e. the last five bits from the input are incorporated to select two output bits per input bit) is used.
The resulting bits are mapped to a quaternary set of phases. At the receiver, a decoder for the convolutional code needs to be used, typically the Viterbi Algorithm, which is able to reconstruct the most likely sent sequence, even if multiple symbols were received incorrectly. Optimal decoding must take into account the same constraint length of information bits as encoding, yielding a 5-symbol decoding delay, which corresponds to 160 ms of delay.
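The encoding side of such a scheme can be sketched in a few lines. The rate-1/2, constraint-length-5 structure matches the description above, but the particular generator taps below are illustrative and are not necessarily those PSK31 actually uses.

```python
# Generator taps for a rate-1/2, constraint-length-5 convolutional code.
G1, G2 = 0b11101, 0b10011   # illustrative polynomials

def conv_encode(bits):
    """Produce two code bits per information bit, each a parity computed
    over the last five input bits (constraint length 5)."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b11111        # keep the five most recent bits
        out.append(bin(state & G1).count("1") % 2)  # parity against tap set 1
        out.append(bin(state & G2).count("1") % 2)  # parity against tap set 2
    return out

print(conv_encode([1, 0, 1, 1]))  # 8 code bits for 4 information bits
```

At the receiver, a Viterbi decoder searches for the information sequence whose encoding best matches the received code bits, which is what allows occasional symbol errors to be corrected.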
Spectrum efficiency compared to other modes
PSK31's efficiency and narrow bandwidth make it highly suitable for low-power and crowded-band operation. PSK31 contacts can be conducted at less than 100 Hz separation, so with disciplined operation at least twenty simultaneous PSK31 contacts can be carried out side-by-side in the 2.5 kHz bandwidth required for just one SSB voice contact.
Common frequencies
The following amateur radio frequencies are commonly used for transmitting and receiving PSK31 signals. They normally occupy the lower edge of each band's digital modes section. PSK31 operators generally use upper sideband (USB), even on frequencies below 10 MHz where the convention normally calls for lower sideband. This is because (a) signals then spread upwards into the digimode section from the "base" frequency, and (b) using QPSK requires both stations to use the same sideband.
* Current usage as of 2010, based on observation, is centered on 7,070.15 and 21,070.15. 7,035.15 is commonly used in Region 2 as of 2012. There is no authoritative list, as the frequencies are determined by common convention.
** PSK has moved from 18.100 to 18.097 due to FT8 use of the 18.100 frequency as of November, 2019.
The IARU Region 1 Bandplan was revised in March 2009 to reflect the expanded 40 meter band. The CW-only section within Europe, Africa, the Middle East and the former USSR is now 7.000 to 7.040. Region 2 - The Americas - followed in September 2013. Region 3 - South Asia and Australasia - has not yet synchronised its bandplan with Regions 1 and 2.
References
Further reading
Martinez, Peter. PSK31: A new radio-teletype mode with a traditional philosophy (PDF) (November 1998).
Meltz, Steve "The New HF Digital Modes - PSK31", QST, April, 1999, pp. 50-51
Martinez, Peter. "PSK31: A New Radio-Teletype Mode". RadCom, December 1998, updated February 1999
External links
The "Official" PSK31 Page
PSK31 Setup and Operation | a PSK31 guide
PSK31 email discussion list with contests, app reviews, and more
Quantized radio modulation modes
Packet radio |
52960125 | https://en.wikipedia.org/wiki/Emigma | Emigma | EMIGMA is a geophysics interpretation software platform developed by Petros Eikon Incorporated for data processing, simulation, inversion and imaging as well as other associated tasks. The software focuses on non-seismic applications and operates only on the Windows operating system.
It supports files standard to the industry, instrument native formats as well as files used by other software in the industry such as AutoCAD, Google Earth and Oasis montaj.
There is a free version of EMIGMA called EMIGMA Basic developed to allow viewing of databases created by licensed users. It does not allow data simulation nor modeling nor data import.
The software is utilized by geoscientists for exploration and delineation purposes in mining, oil and gas, and groundwater, as well as by hydrologists, environmental engineers, archaeologists and academic institutions for research purposes. Principal contributors to the software are R. W. Groom, H. Wu, E. Vassilenko, R. Jia, C. Ottay and C. Alvarez.
EMIGMA tools
Forward simulation of geophysical models
These applications were the initial motivation for the platform and are still given attention in new releases.
Geological models can be simulated for a variety of geophysical measurement systems such as conventional dipole-dipole, time domain electromagnetics (TEM), magnetotellurics (MT), magnetic, gravity, resistivity and induced polarization systems. Surveys can be airborne, ground, down a hole, crosshole, underwater or on the water. A survey is defined by properties related to a transmitter, a receiver and other system properties. The system and survey parameters are stored with the input data, freeing the user from continually specifying these parameters for every model. A simulation calculates the synthetic measurements that the model would produce at the receiver. Early versions of EMIGMA could simulate the responses of 3D blocks, thin plates and a many-layered earth model. Simulation algorithms now include one for a sphere model, alternate algorithms for thin plates and various algorithms for 3D prisms and polyhedra.
Blocks and polyhedra components of a model are simulated by algorithms based on the LN approximation.
When compared with a real-world electromagnetic system, simulation results for a thin plate have been found to agree in some situations. One case study required other algorithms for the initial analysis of data due to EMIGMA's complexity; EMIGMA was then used when the limitations of the other software were reached. EMIGMA is the only commercial EM modelling tool that can model a thick prism and complex polyhedra as well as thin plates. Another advantage is the ability to simulate the response of multiple types of targets on more than one profile.
Inversion of geophysical data
A model response can be simulated and compared to a measured response, the model adjusted by the user, and the process repeated. Another approach, which is often taken, is to make this cycle of forward simulation and model adjustment automatic. After enough iterations, a model can be found whose response matches the measured response to within a limit specified by the user. This is termed inversion.
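In outline, and independent of any particular EMIGMA algorithm, such an automated loop can be sketched as follows; the forward model, step size, and convergence limit here are placeholders.

```python
import numpy as np

def invert(forward, observed, model0, tol=1e-6, step=0.1, max_iter=1000):
    """Generic iterative inversion sketch: simulate the model response, compare
    it with the measured data, and nudge the parameters downhill until the
    misfit falls below a user-specified limit."""
    model = np.asarray(model0, dtype=float)
    h = 1e-6                                    # finite-difference step
    for _ in range(max_iter):
        residual = forward(model) - observed
        misfit = float(np.mean(residual ** 2))
        if misfit < tol:
            break
        grad = np.zeros_like(model)             # finite-difference gradient of the misfit
        for i in range(model.size):
            bumped = model.copy()
            bumped[i] += h
            grad[i] = (np.mean((forward(bumped) - observed) ** 2) - misfit) / h
        model -= step * grad                    # steepest-descent update
    return model

# Toy usage with a made-up linear "forward model"; prints approximately [0.5, -1.0].
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
data = A @ np.array([0.5, -1.0])
print(invert(lambda m: A @ m, data, model0=[0.0, 0.0]))
```

Production codes replace the simple steepest-descent update with more robust schemes, as noted below for EMIGMA's move to a trust-region technique.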
Petros Eikon has been developing inversion processes for almost two decades. Initial inversion procedures provided one-dimensional (1D) models for frequency-domain electromagnetic data, both controlled-source and natural-field, for ground and airborne surveys. Later, capabilities for 3D inversion were added.
1D inversion determines the model for a single station. It is available for FEM, TEM, MT, CSAMT and Resistivity data. The process can be repeated for each available station to produce what are termed inversion sections.
3D inversion determines the properties of a model in the form of a network of 3d cells. This tool is available for magnetic, gravity, MT, CSEM, CSAMT and Resistivity data. Petros Eikon has moved from standard steepest descent inversion techniques to a Trust Region technique.
3D visualization
The design of a survey, geological model and data can be displayed in 3D. The geometry and parameters of model structures can be edited in 3D space. Measured and synthetic data can be viewed in different formats including vectors, lines, surfaces and contours in association with the models. Results from inversion tools can be displayed as a volume.
2D plotter
A plotter designed to analyze geophysical data. Data can be displayed on a 2D axis as a function of time, frequency, position or tx-rx separation. Measured data can be compared with simulated and inverted data by displaying multiple plots on the same axis or calculating a residual plot. Data can be converted to different properties such as apparent resistivity.
Survey editor
The design of a survey is displayed on a two-dimensional X-Y (North/South) display, including transmitters and data stations. Data stations and models can be interactively edited. Files from mapping software can be imported to display the survey overlaid on a map. Projections of models can also be displayed. The application allows export to GIS graphic formats.
Gridding
Data can be interpolated into a multi-dimensional grid to allow viewing of maps of, for example, multiple time windows or multiple transmitter-receiver settings. Grid cells need not be square but may be rectangular to correspond to different spatial densities of stations and lines.
Data can be interpolated to a defined grid and viewed in Grid Presentation and a 3D contour application. Grid Presentation also supports map overlays from other mapping software as well as export to all the common geophysical mapping software.
Other tools
The data spreadsheet displays survey data in a spreadsheet format. Data can be edited. PseudoShow displays data from a series of points as a cross section by assigning tx-rx separation, frequency or time values to depth. The tool calculates resistivity for frequency domain and helicopter data collected at different transmitter frequencies. Results can be displayed in the CDI viewer that also displays 1D inversion results. The poly generator creates synthetic topography and complex anomalies for modeling. Models can also be imported from CAD applications. FFT processing is available for gravity and magnetic data including derivative generation, windowing and upward/downward continuation. Other tools provide features such as digital and spatial data filtering as well as survey editing.
Version history
EMIGMA 1
Released in 1994. DOS application to simulate EM responses of a thin-sheet.
EMIGMA 5
Released in 1997. WINDOWS 95/NT application. Simulation of geophysical models for various EM systems such as surface and borehole TDEM, airborne and ground FDEM, IP/Resistivity, Magnetotellurics, and CSAMT as a controlled source application. The earliest commercial example of a 3D modeling CSEM application. Application included plotting and visualization capabilities.
EMIGMA 6
This version featured forward simulation, 2d and 3d plotting, contouring, a pseudosection tool, 1d inversion of FEM, MT and CSAMT data and 3d inversion of magnetic data. This design has since been renamed and is now sold as GeoTutor for educational purposes. GeoTutor is now in its 5th version as GeoTutor 5.
EMIGMA 7
Released in 2000, EMIGMA 7 changed the manner in which data was stored: the basic data structure was changed from ASCII text files to a full relational database. With a database structure it was now possible to add many associated tools, such as a range of filtering and editing tools.
New geophysical features added included source conductivity depth imaging, 1d TEM inversion, Euler deconvolution, FFT tools for magnetic and gravity data, 3d resistivity inversion and magnetization vector inversion.
EMIGMA 8
Full compatibility with Windows Vista was added to EMIGMA with the April 2008 release of version 8.
Support was added for new data collection instruments. Other new features include the freespace eikplate simulation algorithm, inversion tools for MT and CSAMT, and more efficient inversion algorithms.
EMIGMA 9
EMIGMA 9 was released in October 2015. A new Fortran compiler was used to rebuild the code for numerical algorithms such as 3D magnetic and gravity inversion, data interpolation and freespace plate simulation, increasing the scale of problems that could be processed and increasing speed five-fold. New features were also added to the 1D TEM inversion tool and to IP modeling.
References
Microsoft Windows
Science software
Proprietary software |
6244738 | https://en.wikipedia.org/wiki/Submarine%20Command%20System | Submarine Command System | SMCS, the Submarine Command System, was first created for the Royal Navy of the United Kingdom's Vanguard-class submarines as a tactical information system and a torpedo weapon control system. Versions have now also been installed on all active Royal Navy submarine classes.
Initial Phase: SMCS for Vanguard class
With the decision in 1983 to build a new class of submarine to carry the Trident missile system, the UK Ministry of Defence (MoD) ran an open competition for the command system. Up to that point all Royal Navy (RN) ships and submarines had command systems built by Ferranti using custom-built electronics and specialised proprietary processors. In a departure from previous practice, which had favoured 'preferred contractor' policies, the competition was won by a new company called Gresham-CAP, leading a consortium of Gresham-Lion (now part of Ultra Electronics plc) and CAP Scientific.
The consortium proposed a novel distributed processing system based on commercial off-the-shelf (COTS) processors, with a modular software architecture largely written in the Ada programming language. Each set of Initial Phase SMCS equipment has multiple computer nodes. At the centre of the system there is an Input/Output Node (which provides interfaces to weapons and sensors) and a Central Services Node (which holds fast numeric processors). Each central node is duplicated to create a fault-tolerant system which is dual modular redundant. The Human-Computer Interface is provided by Multi Function Consoles and some additional terminals. The dual redundant central nodes are linked to each other and to the consoles via a dual redundant fibre optic LAN.
In the Initial Phase equipment fitted to the Vanguard-class boats, most processing is done by Intel 80386 single-board computers, each with its own Ada run-time environment. CAP Scientific created a complex layer of middleware to link the many processors together. At the time, SMCS was the largest Ada project yet seen. As a pioneering user of Ada, the SMCS project encountered many teething problems with the large-scale use of Ada compilers, Ada development tools, and the special characteristics of the early dialect of the Ada programming language, later known as Ada 83.
Second Phase: SMCS for Swiftsure and Trafalgar class
By 1991, CAP Scientific was part of Sema Group and the SMCS project was owned by BAeSEMA, a joint venture between Sema Group and British Aerospace. Once SMCS was proven to work on Vanguard boats, it was proposed in the early 1990s to extend its use to the Swiftsure and Trafalgar classes, as part of an improvement programme for these vessels. There was a commercial desire for yet further adoption of COTS technology. The consensus was to port SMCS to some form of UNIX. Sema Group, with considerable experience both of real-time systems and of commercial UNIX, had concerns about the technological feasibility of this port. The essence of the problem was the need to map the Ada tasking environment to the run-time model of UNIX processes in a way which preserved SMCS' real-time characteristics enough to maintain dependability. A team from BAeSEMA, led by Ray Foulkes, conducted thorough research into possible alternatives to the distributed Ada architecture used in the Initial phase. After extensive investigation of the run-time behaviour of different UNIX variants, and of the code generated by different Ada compilers, the project selected the Solaris operating system running on SPARC computers, which could now be procured as COTS single-board computers.
To limit risk, only the consoles were converted to Solaris on SPARC in this phase. The central nodes were kept in the same form as the Initial Phase equipment. The benefit was that there was no need to implement the dual modular redundancy scheme on Solaris at this stage. However, the project had to manage some additional issues arising from mixed Intel/SPARC working, such as endianness (since the Intel architecture is little-endian and SPARC is big-endian).
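For illustration only (generic Python, not SMCS code), the same 32-bit value serialized in the two byte orders:

```python
import struct

value = 0x12345678
print(struct.pack("<I", value).hex())  # little-endian (Intel): 78563412
print(struct.pack(">I", value).hex())  # big-endian (SPARC):    12345678
```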
A detailed and generally accurate independent analysis of these stages in the development of SMCS was made in 1998.
Third Phase: ACMS for Astute class
After being successfully deployed on the Swiftsure and Trafalgar submarines in the mixed Intel/SPARC configuration, the architecture was further revised for the UK's new attack boats, the Astute class.
The Astute Combat Management System (ACMS) combines SMCS with several other sub-systems. For ACMS, the Central Nodes have also been converted to SPARC computers. The dual redundant architecture, both of central nodes and of LAN connections, remains a key feature. There are about twice as many consoles as provided in earlier versions of SMCS. This phase of SMCS is an all-UNIX solution running Solaris on multiple SPARC nodes, with built-in dual redundancy.
Submarine Command System Next Generation
Controversy about system architecture
By 2000, Sema Group had sold its interest in BAeSEMA, and the SMCS project was now fully owned by BAE Systems. In its last major Defence Review, as reported in 2003, the UK Parliament agreed numerous improvements for RN submarines, but no changes to the Vanguard boats or the Trident missile system. It was expected that the SMCS equipment, supplied and maintained under a support contract with Ultra Electronics, would last out the service life of the Vanguard fleet. The programmes in place for other submarine improvements were mainly for new sonar equipment, and had been reviewed and approved by the UK's parliament.
For a brief period, the SMCS project came under the ownership of Alenia Marconi Systems, a joint venture of BAE Systems. In 2002, it was proposed to convert SMCS to run on standard PC x86 hardware, albeit in rugged industrialised form, for naval command systems. The SMCS project started to develop SMCS-NG ("Next Generation") as SMCS running on PC hardware. The plan was to convert the SMCS infrastructure and applications to run on the Microsoft Windows operating system.
However, some software engineers had misgivings. In April 2002, Bill Gates, appearing in his capacity as Microsoft's Chief Software Architect, gave sworn testimony to the US Courts. Gates' testimony included statements that Microsoft Windows was indissoluble and could not be created in cut-down form. Paragraphs 207 to 223 of Gates' testimony indicated that Windows had an entangled monolithic structure, rather than a structure organised in modular fashion. Assuming Gates' testimony to be true, these 'pro-UNIX' engineers felt that open-source UNIX, rather than Microsoft Windows, should be used as the foundation of future naval command systems and circulated their concerns within the company.
SMCS-NG as first deployment of "Windows for Warships"
Despite the concerns of some engineers, SMCS-NG was created as a port to Microsoft Windows of the SMCS infrastructure and applications, a move which some commentators have termed "Windows for Warships". The UK's Defence Ministry later gave assurances, through questions in the UK parliament, that this is a low risk use of Microsoft Windows. However, some other suppliers have taken a different path. The consoles for the new Sonar 2076 supplied by Thales Underwater Systems for the Astute class submarines, and which may be retro-fitted to other classes, are built as PCs running Linux rather than Windows.
Having developed SMCS-NG as an internal project, BAE Systems independently proposed to the MoD that the original SMCS equipment be replaced by its own, newer, version. After sea trials, the MoD awarded contracts to BAE Systems for refit of SMCS-NG into most RN submarines, including the Vanguard fleet. Although the Defence Minister Adam Ingram told the UK Parliament in October 2004 that no decision had been made about conversion of the Vanguard fleet to run SMCS-NG, the MoD placed the contracts the following month. By December 2008, all of the active Royal Navy submarines had been retrofitted with SMCS-NG.
Unlike with previous versions of SMCS, the software is supplied as a single-fit release which is intended to be configured for the sensor and weapon fit of each submarine.
Footnotes
External links
Submarine Command System Next Generation
Computer systems
Trident (UK nuclear programme) |
44331841 | https://en.wikipedia.org/wiki/YubiKey | YubiKey | The YubiKey is a hardware authentication device manufactured by Yubico to protect access to computers, networks, and online services that supports one-time passwords (OTP), public-key cryptography, and authentication, and the Universal 2nd Factor (U2F) and FIDO2 protocols developed by the FIDO Alliance. It allows users to securely log into their accounts by emitting one-time passwords or using a FIDO-based public/private key pair generated by the device. YubiKey also allows for storing static passwords for use at sites that do not support one-time passwords. Google, Microsoft, and Facebook use YubiKey devices to secure employee accounts as well as end user accounts. Some password managers support YubiKey. Yubico also manufactures the Security Key, a similar lower cost device with only FIDO/U2F support.
The YubiKey implements the HMAC-based One-time Password Algorithm (HOTP) and the Time-based One-time Password Algorithm (TOTP), and identifies itself as a keyboard that delivers the one-time password over the USB HID protocol. A YubiKey can also present itself as an OpenPGP card using 1024, 2048, 3072 and 4096-bit RSA (for key sizes over 2048 bits, GnuPG version 2.0 or higher is required) and elliptic curve cryptography (ECC) p256 and p384, allowing users to sign, encrypt and decrypt messages without exposing the private keys to the outside world. Also supported is the PKCS#11 standard to emulate a PIV smart card. This feature allows for code signing of Docker images as well as certificate-based authentication for Microsoft Active Directory and SSH.
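The HOTP scheme mentioned above (RFC 4226) can be sketched in a few lines of generic code; this illustrates the algorithm itself, not Yubico's firmware. TOTP is the same computation with the counter replaced by the current 30-second time step.

```python
import hashlib, hmac, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226): HMAC-SHA-1 over the counter,
    dynamically truncated to a short decimal code."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(hotp(b"12345678901234567890", 0))  # RFC 4226 test vector: 755224
```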
Founded in 2007 by CEO Stina Ehrensvärd, Yubico is a private company with offices in Palo Alto, Seattle, and Stockholm. Yubico CTO, Jakob Ehrensvärd, is the lead author of the original strong authentication specification that became known as Universal 2nd Factor (U2F).
YubiKey released the YubiKey 5 series in 2018 which adds support for FIDO2.
History
Yubico was founded in 2007 and began offering a Pilot Box for developers in November of that year. The original YubiKey product was shown at the annual RSA Conference in April 2008, and a more robust YubiKey II model was launched in 2009. The name "YubiKey" derives from the Japanese word for finger.
YubiKey II and later models have two "slots" available, for storing two distinct configurations with separate AES secrets and other settings. When authenticating, the first slot is used by briefly pressing the button on the device, while the second slot is used by holding the button for 2 to 5 seconds.
In 2010, Yubico began offering the YubiKey OATH and YubiKey RFID models. The YubiKey OATH added the ability to generate 6- and 8-character one-time passwords using protocols from the Initiative for Open Authentication (OATH), in addition to the 32-character passwords used by Yubico's own OTP authentication scheme. The YubiKey RFID model included the OATH capability plus also included a MIFARE Classic 1k radio-frequency identification chip, though that was a separate device within the package that could not be configured with the normal Yubico software over a USB connection.
Yubico announced the YubiKey Nano in February 2012, a miniaturized version of the standard YubiKey which was designed so it would fit almost entirely inside a USB port and only expose a small touch pad for the button. Most later models of the YubiKey have also been available in both standard and "nano" sizes.
2012 also saw the introduction of the YubiKey Neo, which improved upon the previous YubiKey RFID product by implementing near-field communication (NFC) technology and integrating it with the USB side of the device. The YubiKey Neo (and Neo-n, a "nano" version of the device) are able to transmit one-time passwords to NFC readers as part of a configurable URL contained in a NFC Data Exchange Format (NDEF) message. The Neo is also able to communicate using the CCID smart-card protocol in addition to USB HID (human interface device) keyboard emulation. The CCID mode is used for PIV smart card and OpenPGP support, while USB HID is used for the one-time password authentication schemes.
In 2014, the YubiKey Neo was updated with FIDO Universal 2nd Factor (U2F) support. Later that year, Yubico released the FIDO U2F Security Key, which specifically included U2F support but none of the other one-time password, static password, smart card, or NFC features of previous YubiKeys. At launch, it was correspondingly sold at a lower price point of just $18, compared to $25 for the YubiKey Standard ($40 for the Nano version), and $50 for the YubiKey Neo ($60 for Neo-n). Some of the pre-release devices issued by Google during FIDO/U2F development reported themselves as "Yubico WinUSB Gnubby (gnubby1)".
In April 2015, the company launched the YubiKey Edge in both standard and nano form factors. This slotted in between the Neo and FIDO U2F products feature-wise, as it was designed to handle OTP and U2F authentication, but did not include smart card or NFC support.
The YubiKey 4 family of devices was first launched in November 2015, with USB-A models in both standard and nano sizes. The YubiKey 4 includes most features of the YubiKey Neo, including increasing the allowed OpenPGP key size to 4096 bits (vs. the previous 2048), but dropped the NFC capability of the Neo.
At CES 2017, Yubico announced an expansion of the YubiKey 4 series to support a new USB-C design. The YubiKey 4C was released on February 13, 2017. On Android OS over the USB-C connection, only the one-time password feature is supported by the Android OS and YubiKey, with other features not currently supported including Universal 2nd Factor (U2F). A 4C Nano version became available in September 2017.
In April 2018, the company brought out the Security Key by Yubico, their first device to implement the new FIDO2 authentication protocols, WebAuthn (which reached W3C Candidate Recommendation status in March) and Client to Authenticator Protocol (CTAP, still under development as of May 2018). At launch, the device is only available in the "standard" form factor with a USB-A connector. Like the previous FIDO U2F Security Key, it is blue in color and uses a key icon on its button. It is distinguished by a number "2" etched into the plastic between the button and the keyring hole. It is also less expensive than the YubiKey Neo and YubiKey 4 models, costing $20 per unit at launch because it lacks the OTP and smart card features of those previous devices, though it retains FIDO U2F capability.
Product features
The YubiKey product line comprises the following models and production periods: YubiKey VIP (2011–2017), YubiKey Plus (2014–2015), YubiKey Nano (2012–2016), YubiKey NEO-n (2014–2016), YubiKey 4 Nano (2016–2017), YubiKey Edge-n (2015–2016), YubiKey Standard (2014–2016), YubiHSM 1 (2015–2017), FIDO U2F Security Key (2013–2018), Security Key by Yubico (2018–2020), YubiKey NEO (2012–2018), YubiKey 4C Nano (2017–2018), YubiKey 4C (2017–2018), YubiKey 4 Nano (2015–2018), YubiKey 4 (2015–2018), YubiKey C Nano FIPS (2018–present), YubiKey C FIPS (2018–present), YubiKey Nano FIPS (2018–present), YubiKey FIPS (2018–present), YubiHSM 2 (2017–present), Security Key NFC by Yubico (2019–present), YubiKey 5C Nano (2018–present), YubiKey 5C (2018–present), YubiKey 5 Nano (2018–present), YubiKey 5 NFC (2018–present), YubiKey 5Ci (2019–present), and YubiKey 5C NFC (2020–present). Support for individual features (Yubico OTP, OATH one-time passwords, static passwords, smart card functions, U2F, FIDO2 and NFC) varies by model and generation.
ModHex
When being used for one-time passwords and stored static passwords, the YubiKey emits characters using a modified hexadecimal alphabet which is intended to be as independent of system keyboard settings as possible. This alphabet, referred to as ModHex or Modified Hexadecimal, consists of the characters "cbdefghijklnrtuv", corresponding to the hexadecimal digits "0123456789abcdef". Due to YubiKeys using raw keyboard scan codes in USB HID mode, there can be problems when using the devices on computers that are set up with different keyboard layouts, such as Dvorak. It is recommended to use operating system features to temporarily switch to a standard US keyboard layout (or similar) when using one-time passwords; alternatively, YubiKey Neo and later devices can be configured with alternate scan codes to match layouts that are not compatible with the ModHex character set.
U2F authentication in YubiKeys and Security Keys bypasses this problem by using the alternate U2FHID protocol, which sends and receives raw binary messages instead of keyboard scan codes. CCID mode acts as a smart card reader, which does not use HID protocols at all.
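A small sketch of the ModHex mapping described above; the alphabet is as documented, while the helper functions are only illustrative.

```python
MODHEX = "cbdefghijklnrtuv"   # ModHex digits standing in for hex 0123456789abcdef

def to_modhex(data: bytes) -> str:
    """Encode bytes as ModHex, the keyboard-layout-tolerant alphabet YubiKeys type out."""
    return "".join(MODHEX[b >> 4] + MODHEX[b & 0x0F] for b in data)

def from_modhex(text: str) -> bytes:
    """Decode a ModHex string back to bytes."""
    nibbles = [MODHEX.index(c) for c in text.lower()]
    return bytes((hi << 4) | lo for hi, lo in zip(nibbles[::2], nibbles[1::2]))

print(to_modhex(b"\x01\x2a"))        # cbdl
print(from_modhex("cbdl").hex())     # 012a
```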
Security issues
YubiKey 4 closed-sourcing concerns
In an example of security through obscurity, most of the code that runs on a YubiKey is closed source. While Yubico has released some code for industry-standard functionality such as PGP and HOTP, it was disclosed that, as of the 4th generation of the product, this is not the same code that new units ship with. Because new units are permanently firmware-locked at the factory, it is not possible to compile the open-source code and load it on the device manually; a user must trust that the code on a new key is authentic and secure.
Code for other functionality such as U2F, PIV and Modhex is entirely closed source.
On May 16, 2016, Yubico CTO Jakob Ehrensvärd responded to the open-source community's concerns with a blog post saying that "we, as a product company, have taken a clear stand against implementations based on off-the-shelf components and further believe that something like a commercial-grade AVR or ARM controller is unfit to be used in a security product."
Techdirt founder Mike Masnick strongly criticized this decision, saying "Encryption is tricky. There are almost always vulnerabilities and bugs -- a point we've been making a lot lately. But the best way to fix those tends to be getting as many knowledgeable eyes on the code as possible. And that's not possible when it's closed source."
ROCA vulnerability in certain YubiKey 4, 4C, and 4 Nano devices
In October 2017, security researchers found a vulnerability (known as ROCA) in the implementation of RSA keypair generation in a cryptographic library used by a large number of Infineon security chips, as used in a wide range of security keys and security token products (including YubiKey). The vulnerability allows an attacker to reconstruct the private key by using the public key. All YubiKey 4, YubiKey 4C, and YubiKey 4 Nano devices within the revisions 4.2.6 to 4.3.4 were affected by this vulnerability. Yubico remedied this issue in all shipping YubiKey 4 devices by switching to a different key generation function and offered free replacements for any affected keys. The replacement offer ended on March 31, 2019. In some cases the issue can be bypassed by generating new keys outside of the YubiKey and importing them onto the device.
OTP Password Protection on YubiKey NEO
In January 2018, Yubico disclosed a moderate vulnerability where password protection for the OTP functionality on the YubiKey NEO could be bypassed under certain conditions. The issue was corrected as of firmware version 3.5.0 and Yubico offered free replacement keys to any user claiming to be affected.
Reduced initial randomness on certain FIPS series devices
In June 2019, Yubico released a security advisory reporting reduced randomness in FIPS-certified devices with firmware version 4.4.2 and 4.4.4, shortly after power-up (there is no version 4.4.3). Security keys with reduced randomness may leave keys more easily discovered and compromised than expected. The issue affected the FIPS series only, and then only certain scenarios, although FIPS ECDSA usage was "at higher risk". The company offered free replacements for any affected keys.
Social activism
Yubico provided 500 YubiKeys to protesters during the 2019–2020 Hong Kong protests. The company states the decision is based on their mission to protect vulnerable Internet users, and works with free speech supporters.
See also
OpenPGP card
References
External links
YubiKey 5 comparison table
YubiKey FIPS comparison table
2007 establishments in California
Authentication methods
Companies based in Palo Alto, California
Companies based in Seattle
Companies based in Stockholm
Computer access control
Computer companies established in 2007
Cryptographic hardware
Technology companies based in the San Francisco Bay Area |
405515 | https://en.wikipedia.org/wiki/Edward%20Fredkin | Edward Fredkin | Edward Fredkin (born October 2, 1934) is a distinguished career professor at Carnegie Mellon University (CMU), and an early pioneer of digital physics.
Fredkin's primary contributions include work on reversible computing and cellular automata. While Konrad Zuse's book, Calculating Space (1969), mentioned the importance of reversible computation, the Fredkin gate represented the essential breakthrough. In recent work, he uses the term digital philosophy (DP).
During his career, Fredkin has been a professor of computer science at the Massachusetts Institute of Technology, a Fairchild Distinguished Scholar at Caltech, and Research Professor of Physics at Boston University.
Early life and education
At age 19, Fredkin left California Institute of Technology (Caltech) after a year to join the United States Air Force (USAF) to become a fighter pilot.
Career
Fredkin has worked with a number of companies in the computer field and has held academic positions at a number of universities. He is a computer programmer, a pilot, an advisor to businesses and governments, and a physicist. His main interests concern digital computer-like models of basic processes in physics.
Fredkin's initial focus was physics; however, he became involved with computers in 1956 when he was sent by the Air Force, where he had trained as a jet pilot, to the MIT Lincoln Laboratory. On completing his service in 1958, Fredkin was hired by J. C. R. Licklider to work at the research firm, Bolt Beranek & Newman (BBN). After seeing the PDP-1 computer prototype at the Eastern Joint Computer Conference in Boston, in December 1959, Fredkin recommended that BBN purchase the very first PDP-1 to support research projects at BBN. The new hardware was initially delivered with no software whatsoever.
Fredkin wrote a PDP-1 assembler language called FRAP (Free of Rules Assembly Program, also sometimes called Fredkin's Assembly Program), and its first operating system (OS). He organized and founded the Digital Equipment Computer Users' Society (DECUS) in 1961, and participated in its early projects. Working directly with Ben Gurley, the designer of the PDP-1, Fredkin designed significant modifications to the hardware to support time-sharing via the BBN Time-Sharing System. He invented and designed the first modern interrupt system, which Digital called the "Sequence Break". He went on to become a contributor in the field of Artificial Intelligence (AI).
In 1962, he founded Information International, Inc., an early computer technology company which developed high-precision digital-to-film scanners, as well as other leading-edge hardware.
In 1968, Fredkin returned to academia, starting at the Massachusetts Institute of Technology (MIT) as a full professor. From 1971 to 1974, Fredkin was the Director of Project MAC at MIT. (Project MAC was renamed the MIT Laboratory for Computer Science in 1976.) He spent a year at Caltech as a Fairchild Distinguished Scholar, working with Nobel Prize-winning physicist Richard Feynman, and was a Professor of Physics at Boston University for six years.
Fredkin has had formal and informal associations with Carnegie Mellon University (CMU) over several decades. His current academic interests are in the area of digital mechanics, which is the study of discrete models of fundamental processes in physics. Fredkin has been a Distinguished Career Professor of Computer Science at CMU and a Visiting Scientist at the MIT Media Laboratory; he is currently Distinguished Career Professor of Robotics at CMU.
Fredkin has served as the founder or CEO of a diverse set of companies, including Information International, Three Rivers Computer Corporation, New England Television Corporation (owner of Boston's then CBS affiliate WNEV on channel 7), and The Reliable Water Company (manufacturer of advanced sea water desalination plants).
Fredkin has been broadly interested in computation: hardware and software. He is the inventor of the trie data structure, radio transponders for vehicle identification, the concept of computer navigation for automobiles, the Fredkin gate, and the Billiard-Ball Computer Model for reversible computing. He has also been involved in computer vision, chess, and other areas of Artificial Intelligence research.
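To illustrate one of these inventions, a trie stores strings by sharing common prefixes: each node maps a character to a child node, so lookup cost grows with the length of the key rather than with the number of keys stored. A minimal sketch in Python (a modern rendering, not Fredkin's original formulation):

class Trie:
    """Minimal prefix tree supporting insertion and exact-match lookup."""
    def __init__(self):
        self.children = {}    # maps a single character to a child Trie node
        self.is_word = False  # marks the end of a stored key

    def insert(self, word):
        node = self
        for ch in word:
            node = node.children.setdefault(ch, Trie())
        node.is_word = True

    def contains(self, word):
        node = self
        for ch in word:
            node = node.children.get(ch)
            if node is None:
                return False
        return node.is_word

t = Trie()
t.insert("fred")
t.insert("fredkin")
assert t.contains("fred") and not t.contains("fre")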
Fredkin also worked at the intersection of theoretical issues in the physics of computation and computational models of physics. He invented the SALT cellular automata family; Dan Miller designed and programmed the Busy Boxes implementation of SALT, with assistance from Suresh Kumar Devanathan. The early SALT models are 2+1-dimensional, quasi-physical, reversible, universal cellular automata that are second order in time and follow rules that model CPT reversibility.
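SALT itself is not reproduced here, but the "second order in time" construction mentioned above can be sketched generically: each cell's next state is some function of its current neighborhood XORed with the cell's previous state, which makes the rule reversible by construction, since stepping with the two time slices swapped walks the history backwards. A toy one-dimensional Python version, purely illustrative and not the SALT rule set:

def step(previous, current):
    """One update of a second-order reversible 1-D binary cellular automaton.
    next_state[i] = f(current neighborhood of i) XOR previous[i]."""
    n = len(current)
    def f(i):  # illustrative local rule: parity of the three-cell neighborhood
        return current[(i - 1) % n] ^ current[i] ^ current[(i + 1) % n]
    return current, [f(i) ^ previous[i] for i in range(n)]

prev, cur = [0, 0, 0, 0, 0], [0, 0, 1, 0, 0]
a, b = step(prev, cur)        # advance one step
x, y = step(b, a)             # step again with the time slices swapped
assert (x, y) == (cur, prev)  # the earlier configuration is recovered exactly

Whatever local rule f is chosen, the XOR with the previous time slice guarantees reversibility, which is why second-order schemes are a convenient way to build information-preserving models of physics.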
Fredkin's version of digital philosophy
Digital philosophy (DP) is one type of digital physics/pancomputationalism, a school of philosophy which claims that all the physical processes of nature are forms of computation or information processing at the most fundamental level of reality. Pancomputationalism is related to several larger schools of philosophy: atomism, determinism, mechanism, monism, naturalism, philosophical realism, reductionism, and scientific empiricism.
Pancomputationalists believe that biology reduces to chemistry which reduces to physics which reduces to the computation of information. Fredkin's career and achievements have much of their motivation in digital philosophy, a particular type of pancomputationalism described in Fredkin's papers: "Introduction to Digital Philosophy", "On the Soul", "Finite Nature", "A New Cosmogony", and "Digital Mechanics".
Fredkin's digital philosophy contains several fundamental ideas:
Everything in physics and physical reality must have a digital informational representation.
All changes in physical nature are consequences of digital informational processes.
Nature is finite and digital.
The traditional Judaeo-Christian concept of the soul has a counterpart in a static/dynamic soul defined in terms of digital philosophy.
Recent Projects
PDP-1 Restoration Project
Fredkin chaired the PDP-1 Restoration Project, which was able to restore and reactivate the Computer History Museum's PDP-1 computer after seven months of work.
Awards and honors
In 1984, Fredkin was awarded the Carnegie Mellon University Dickson Prize in Science, given annually to the person who has been judged to have made the most progress in a scientific field in the United States during that year. In 1999, CMU established the Fredkin professorship.
Cultural references
A layman's profile of Fredkin, along with a readable explanation of some of his theories, can be found in the first part of Three Scientists and Their Gods by Robert Wright (1988). The section of the book covering Fredkin was excerpted in The Atlantic Monthly in April 1988.
According to biographer Robert Wright, the character Stephen Falken in the film WarGames was modeled after Fredkin.
Further reading
Robert Wright, "Did the Universe Just Happen?", Atlantic Monthly, April 1988 – article contains extensive biographical content on Fredkin
See also
Fredkin's paradox
References
External links
Digital Philosophy.org
Did the Universe Just Happen? The Atlantic Monthly, by Robert Wright, 1988.
Two-state, Reversible, Universal Cellular Automata in Three Dimensions by Edward Fredkin
Information International, Inc.
American computer scientists
American philosophers
Ontologists
Cellular automatists
1934 births
Living people
California Institute of Technology alumni
Quantum information scientists |
8803454 | https://en.wikipedia.org/wiki/Capability-based%20addressing | Capability-based addressing | In computer science, capability-based addressing is a scheme used by some computers to control access to memory as an efficient implementation of capability-based security. Under a capability-based addressing scheme, pointers are replaced by protected objects (called capabilities) that can be created only through the use of privileged instructions which may be executed only by either the kernel or some other privileged process authorised to do so. This effectively allows the kernel to control which processes may access which objects in memory, without the need to use separate address spaces and therefore without requiring a context switch when an access occurs.
Practical implementations
Two techniques are available for implementation:
Require capabilities to be stored in a particular area of memory that cannot be written to by the process that will use them. For example, the Plessey System 250 required that all capabilities be stored in capability-list segments.
Extend memory with an additional bit, writable only in supervisor mode, that indicates that a particular location is a capability. This is a generalization of the use of tag bits to protect segment descriptors in the Burroughs large systems, and it was used to protect capabilities in the IBM System/38.
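As a loose illustration of the first technique (a sketch of the general idea, not of any particular machine's design), user code can be restricted to opaque handles into a kernel-held capability list, so that capabilities are minted and dereferenced only by privileged code. A small Python model:

class Kernel:
    """Toy capability list: user code holds opaque handles, never raw references."""
    def __init__(self):
        self._capabilities = {}   # handle -> (object, rights); kernel-private storage
        self._next_handle = 0

    def mint(self, obj, rights):
        """Privileged operation: create a capability granting 'rights' on obj."""
        handle = self._next_handle
        self._capabilities[handle] = (obj, frozenset(rights))
        self._next_handle += 1
        return handle             # the only thing user code ever sees

    def invoke(self, handle, right):
        """Check the capability and its rights before handing back the object."""
        if handle not in self._capabilities:
            raise PermissionError("unknown capability")
        obj, rights = self._capabilities[handle]
        if right not in rights:
            raise PermissionError("capability lacks the '%s' right" % right)
        return obj

kernel = Kernel()
h = kernel.mint(bytearray(16), rights={"read"})
kernel.invoke(h, "read")      # allowed
# kernel.invoke(h, "write")   # would raise PermissionError

Because user code cannot forge or modify entries in the kernel-private table, access control follows from possession of a handle plus the rights recorded against it, without needing a separate address space per process.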
The designers of the System/38's descendant systems, including AS/400 and IBM i, removed capability-based addressing. The reason given for this decision is that they could find no way to revoke capabilities (although patterns for implementing revocation in capability systems had been published as early as 1974, even before the introduction of System/38).
Chronology of systems adopting capability-based addressing
1969: System 250 – Plessey Company
1970–77: CAP computer – University of Cambridge Computer Laboratory
1978: System/38 – IBM
1980: Flex machine – Royal Signals and Radar Establishment (RSRE) Malvern
1981: Intel iAPX 432 – Intel
2014: CHERI
2020: CHEx86
Notes
References
Viktors Berstis, Security and protection of data in the IBM System/38, Proceedings of the 7th annual symposium on Computer Architecture, p. 245-252, May 6–08, 1980, La Baule, United States
W. David Sincoskie, David J. Farber: SODS/OS: Distributed Operating System for the IBM Series/1. Operating Systems Review 14(3): 46-54 (July 1980)
G. J. Myers, B. R. S. Buckingham, A hardware implementation of capability-based addressing, ACM SIGOPS Operating Systems Review, v.14 n.4, p. 13-25, October 1980
Houdek, M. E., Soltis, F. G., and Hoffman, R. L. 1981. IBM System/38 support for capability-based addressing. In Proceedings of the 8th ACM International Symposium on Computer Architecture. ACM/IEEE, pp. 341–348.
The Cambridge CAP Computer, Levy, 1988
Plessey System 250, a commercial Capability solution, Hank Levy, 1988
G. D. Buzzard, T. N. Mudge (1983) Object-based Computer Systems and the Ada Programming Language . The University of Michigan – Computer Research Laboratory and Robotics Research Laboratory Department of Electrical and Computer Engineering
External links
Memory management
Operating system security |
21029840 | https://en.wikipedia.org/wiki/Mindomo | Mindomo | Mindomo is a versatile freemium collaborative mind mapping, concept mapping and outlining tool developed by Expert Software Applications. It can be used to develop ideas and interactively brainstorm, with features including sharing, collaboration, task management, presentation and interactive web publication.
The online version of Mindomo is available through any browser. There are also offline desktop versions for Windows, Linux and Mac, and app versions for both Android and iOS. Registered users can create and collaborate in real-time on mind maps, while unregistered users can view the maps shared with them. The software also provides ways to create presentations and mind map assignments.
History
In 2006, Expert Software Applications began development of a mind mapping tool called Mindomo using the Adobe Flex development kit, which is based on the Adobe Flash platform.
In 2007, the Mindomo web app was launched. Its user interface used a ribbon consistent with Office 2007.
In 2008, the Mindomo desktop version was released; it required installing the Adobe AIR platform.
In 2010, real-time collaboration was introduced.
In 2011, Mindomo version 6.0 was released, dropping the Office-like ribbon and introducing a minimalist interface without compromising functionality.
In 2011, Mindomo Presenter was released.
In 2012, Mindomo apps for iPad and Android were released.
In 2014, the Mindomo web app shifted from Flash to HTML5.
In 2016, Mindomo Desktop 8.0 was released, offering an interface consistent with the web version and no longer requiring Adobe AIR.
In 2018, Gantt chart views were introduced.
In 2020, the Mindomo Android app was redesigned from the ground up to support vertical screens with a minimalist interface.
In 2021, the Mindomo web version received a major interface redesign, alongside the release of Mindomo desktop version 10.0 with a minimalist, modern graphical interface.
Features
Mindomo has highly customizable nodes: users can insert videos, audio, images, notes, emoji, hyperlinks and attachments into any node. Users can collaborate in real time and can choose to publish their work online. Mindomo also has a presentation mode that allows users to travel through a mind map with zooms and pans, and interactive versions of mind maps, as well as static images of them, can be published.
Mindomo can be used from any standard web browser, or by installing the desktop, iPad and Android applications. As a freemium software, Mindomo offers its basic services for free, and charges for premium features, such as downloading in certain formats and uploading documents.
For the educational sector, Mindomo offers student assignments for teachers, integration with many learning-management systems, and allows students to include videos, images and external files to support their ideas.
Mindomo offers teachers an assignment tool that allows them to send an online request inviting students to access an incomplete mind map about a specific lesson. Students work in groups, exploring their broad topic questions, and their changes are sent to the teacher as notifications, to which the teacher can respond with feedback via Mindomo's comment feature. Students are then encouraged to turn their mind map into a presentation using Mindomo Presenter, explaining their research process step by step. Mindomo also allows a mind map to be printed on paper, annotated, and distributed to other students for peer review.
Mindomo's business features include task assignment for team members, commenting, and projects planning with Gantt chart views. Team members can share mind maps and make their own updates. Individuals can be assigned to each business function, and subtopics that need focus can be prioritized.
Michael Stratton, author of "The Effective Project Manager", used Mindomo for examples of using mind maps in project management.
Reception
When Mindomo launched in 2007, Chuck Frey, author of "The Mind Mapping Software Blog", wrote, "Mindomo sets a new standard for web-based mind mapping tools with features that rivals many desktop mind maps." In 2014, Mindomo was available to all public schools in Ontario, Canada, as it was approved by the Ontario Software Acquisition Program Advisory Committee (OSAPAC), which advises the Canadian Ministry of Education on the acquisition of provincial licenses for publicly-funded schools in Ontario.
In 2019 Mindomo won PC Magazine's Editors' Choice award as "best mind mapping tool", citing how it coupled mind-mapping with the social aspects of knowledge management. According to Mindomo's website, there are more than six million users of Mindomo worldwide.
References
Further reading
External links
Mindomo Website
Hands-On with Mindomo: Maps and Charts Made Easy
Expert Software Applications Website
Mind-mapping software
Note-taking software
Presentation software
Task management software
Project management software
Collaborative software |
2987970 | https://en.wikipedia.org/wiki/List%20of%20Apple%20II%20application%20software | List of Apple II application software | Following is a List of Apple II applications including utilities and development tools.
0–9
3D Art Graphics - 3D computer graphics software, a set of 3D computer graphics effects, written by Kazumasa Mitazawa and released in June 1978
A
A2Command - Norton Commander style file manager
ADTPro - telecom
Apple Writer - word processor
AppleWorks - integrated word processor, spreadsheet, and database suite (II & GS)
ASCII Express - telecom
B
Bank Street Writer - word processor
C
CatFur - file transfer / chat software for the APPLE-CAT modem
Cattlecar Galactica - Super Hi-Res Chess in its later, expanded version
Contiki - 8-bit text web browser
Copy II+ - copy and disk utilities
Crossword Magic - Given clues and answers, software automatically arranges the answers into a crossword grid.
D
Dalton Disk Disintegrator - disk archiver
Davex - Unix type shell
Dazzle Draw - bitmap graphics editor
Design Your Own Home - home design (GS)
Disk Muncher - disk copy
Diversi Copy - disk copy (GS)
DOS.MASTER - DOS 3.3 -> ProDOS utility
E
Edisoft - text editor
EasyMailer
EasyWriter
F
Fantavision - vector graphics animation package
G
GEOS - integrated office suite
GNO/ME - Unix type shell (GS)
GraphicEdge - business graphics for AppleWorks spreadsheets (II & GS & Mac)
Great American Probability Machine - first full-screen Apple II animations
L
Lock Smith - copy and disk utilities
Logo - easy educational graphic programming language
M
Magic Window - one of the most popular Apple II word processors by Artsci
Merlin 8 & 16 - assembler (II & GS)
Micro-DYNAMO - simulation software to build system dynamics models
MouseWrite and MouseWrite II - first mouse based word processor for Apple II (II & GS)
O
Omnis I,II, and III - database/file manager (II & GS)
ORCA - program language suite (II & GS)
P
Point2Point - computer to computer communications program for chat and file transmission (II)
PrintShop - sign, banner, and card maker (II & GS)
ProSel - disk and file utilities (II & GS)
ProTERM - telecom program and text editor
PublishIT - desktop publishing (versions 1-4)
R
Rendezvous - shuttle orbital simulation game
S
ShrinkIt - disk and file compressor and archiver (II & GS)
Spectrum Internet Suite - Internet tools and web browser (GS)
Super Hi-Res Chess - early game aimed at programmers and "power users"
SynthLAB - music composing software
T
TellStar - astronomy
Twilight II - Apple IIGS screensaver (GS)
V
VisiCalc - spreadsheet
W
Word Juggler - word processor
WordPerfect - word processor
WordStar - word processor
Z
Z-Link - telecom
Zardax - word processor
ZBASIC - language - Zedcor Systems
References
Apple II application software |
16238637 | https://en.wikipedia.org/wiki/Mark%20A.%20O%27Neill | Mark A. O'Neill | Mark A. O'Neill (born 3 November 1959) is an English computational biologist with interests in artificial intelligence, systems biology, complex systems and image analysis. He is the creator and lead programmer on a number of computational projects including the Digital Automated Identification SYstem (DAISY) for automated species identification and PUPS P3, an organic computing environment for Linux.
Education
O'Neill was educated at The King's School, Grantham, Sheffield University and University College London.
Research interests
O'Neill's interests lie at the interface of biology and computing. He has worked in the areas of artificial life and biologically inspired computing. In particular, he has attempted to answer the question "can one create software agents which are capable of carrying a useful computational payload which respond to their environment with the flexibility of a living organism?"
He has also investigated how computational methods may be used to analyze biological and quasi biological systems for example: ecosystems and economies.
O'Neill is also interested in ethology, especially the emergent social ecosystems which occur as a result of social networking on the internet. His recent projects include the use of artificial intelligence techniques to look at complex socio-economic data.
On the computer science front, O'Neill continues to develop and contribute to a number of other open source and commercial software projects and is involved in the design of cluster/parallel computer hardware via his company, Tumbling Dice Ltd. Long-running projects include DAISY; PUPS P3, an organic computing environment for Linux; Cryopid, a Linux process freezer; the Mensor digital terrain model generation system; and RanaVision, a vision-based motion detection system. He has also worked with public-domain agent-based social interaction models such as Sugarscape, and with artificial life simulators, for example physis, which is a development of Tierra.
O'Neill has been a keen naturalist since childhood. In addition to his interests in complex systems and computer science, he is a member of the Royal Entomological Society and an expert in the rearing and ecology of hawk moths. He is also currently convenor of the Electronic and Computing Technology Special Interest Group (SIG) for the Royal Entomological Society.
He is also interested in the use of precision agriculture methodologies to monitor agri-ecosystems, and has been an active participant in a series of projects looking at the automatic tracking of bumblebees, and other insects using vision, and using both network analysis and remote sensing techniques to monitor ecosystem health. Latterly, he has become interested in applying these techniques in the commercial sphere to look at issues of corporate responsibility and sustainability in industries like mining and agriculture which have significant ecological footprints.
He has also been involved in both computational neuroscience and systems biology, the former association resulting in many papers written while he was working at Oxford University. Work in the latter area led to the successful flotation in 2007 of a systems biology company, e-Therapeutics, where O'Neill was a senior scientist; he assisted with the establishment of the company and was named in a number of seminal patents.
O'Neill is a fellow of the British Computer Society, the Institute of Engineering and Technology, and the Royal Astronomical Society. He is also a chartered engineer, a chartered IT professional and a member of the Institute of Directors. He was one of the recipients of the BCS Award for Computing Technology in 1992.
Publications
References
External links
Tumbling Dice
e-Therapeutics
20th-century British biologists
21st-century British biologists
Artificial intelligence researchers
English computer scientists
Systems biologists
Computational biologists
Fellows of the Royal Entomological Society
Fellows of the Royal Astronomical Society
Fellows of the British Computer Society
Living people
1959 births
People educated at The King's School, Grantham |
3540442 | https://en.wikipedia.org/wiki/Apple%20Inc.%20litigation | Apple Inc. litigation | The multinational technology corporation Apple Inc. has been a participant in various legal proceedings and claims since it began operation and, like its competitors and peers, engages in litigation in its normal course of business for a variety of reasons. In particular, Apple is known for and promotes itself as actively and aggressively enforcing its intellectual property interests.
From the 1980s to the present, Apple has been plaintiff or defendant in civil actions in the United States and other countries. Some of these actions have determined significant case law for the information technology industry and many have captured the attention of the public and media. Apple's litigation generally involves intellectual property disputes, but the company has also been a party in lawsuits that include antitrust claims, consumer actions, commercial unfair trade practice suits, defamation claims, and corporate espionage, among other matters.
Background
Apple is a member of the Business Software Alliance (BSA), whose principal activity is trying to stop copyright infringement of software produced by BSA members; Apple treats all its intellectual property as a business asset, engaging in litigation as one method among many to police its assets and to respond to claims by others against it. Apple's portfolio of intellectual property is broad enough, for trademarks alone, to encompass several pages of the company's web site and, in April 2012, it listed 176 general business trademarks, 79 service marks, 7 trademarks related to NeXT products and services, and 2 trademarks related to FileMaker. Apple claims copyright interests in multiple products and processes and owns and licenses patents of various types as well and, while it states it generally does not license its patent portfolio, it does work with third parties having an interest in product interoperability. Steve Jobs alone was a named inventor on over 300 design and utility patents. Between January 2008 and May 2010, Apple Inc. filed more than 350 cases with the U.S. Patent and Trademark office (USPTO) alone, most in opposition to or taking exception to others' use of the terms "apple", "pod", and "safari"; those cases include sellers of apples (the fruit), as well as many others' less unassuming use of the term "apple".
Antitrust
Apple iPod, iTunes antitrust litigation
The case In re Apple iPod iTunes Antitrust Litigation was filed as a class action in 2005 claiming Apple violated the U.S. antitrust statutes in operating a music-downloading monopoly that it created by changing its software design to the proprietary FairPlay encoding in 2004, resulting in other vendors' music files being incompatible with and thus inoperable on the iPod. The suit initially alleged that five days after RealNetworks released its Harmony technology in 2004, making its music playable on iPods, Apple changed its software such that the RealNetworks music would no longer play on iPods. The claims of Apple's changes to its encoding and its refusal to license FairPlay technology to other companies were dismissed by the court in 2009, but the allegation of Apple's monopoly on the iPod's music download capabilities between 2004 and 2009 remained as of July 2012. In March 2011, Bloomberg reported that, after a related 3-year inquiry by the Competition Commission, Apple agreed in 2008 to lower its prices on iTunes tracks sold in the United Kingdom and that Steve Jobs had been directed by the court in March 2011 to make himself available to be deposed on Apple's FairPlay changes as they relate to the plaintiffs' monopolization claim.
Apple and AT&T Mobility antitrust class action
In October 2007 (four months after the iPhone was introduced), Paul Holman and Lucy Rivello filed a class action lawsuit (numbered C07-05152) in the Northern District of California. The lawsuit referenced Apple's SIM lock on the iPhone and Apple's (at the time) complete ban on third-party apps, and alleged that the 1.1.1 software update was "expressly designed" to disable unapproved SIM cards and apps. The lawsuit said that this was an unfair, unlawful, and fraudulent business practice (see False advertising) under California's Unfair Competition Law; that the combination of AT&T Mobility and Apple was to reduce competition and cause a monopoly in violation of California's antitrust law and the Sherman Antitrust Act; and that this disabling was a violation of the Consumer Fraud and Abuse Act.
Shortly after this initial filing, other lawsuits were filed, and these were consolidated with the original Holman suit, bringing in additional plaintiffs and complaints: Timothy Smith, et al., v. Apple, Inc. et al., No. C 07-05662 RMW, adding complaints related to ringtones, and Kliegerman v. Apple, Inc., No. C 08-948, bringing in allegations under the federal Magnuson–Moss Warranty Act. The combined case title was changed to "In Re Apple & AT&TM Anti-Trust Litigation." The court appointed lead counsel from the various plaintiffs' lawyers, and several versions of a combined complaint were filed.
In October 2008, the court denied the defendants' motions to dismiss the case on the federal claims and granted their motions to dismiss the state unfair trade practice claims except in California, New York, and Washington, but gave the plaintiffs leave to amend those claims. In December 2011, the district court granted Apple and AT&T's motions to compel arbitration, following the Supreme Court decision in AT&T Mobility v. Concepcion, and decertified the class; in April 2012 the Ninth Circuit denied plaintiffs permission to appeal.
In December 2011, immediately after class decertification of the previous case, a new group of plaintiffs led by Robert Pepper won the race to the courthouse by filing a complaint in the Northern District, which was combined with some slightly later filers and titled "In re Apple iPhone Antitrust Litigation", case 11-cv-06714-YGR. The new case is essentially the same but is filed only against Apple, not AT&T Mobility. In late 2013, the various parts of the case were dismissed by the district court. The parts relating to SIM locking were rejected because AT&T was not a party and the plaintiffs were not willing to add AT&T. The remaining claim, in its final version, was that Apple monopolised the market for iPhone applications and that the plaintiffs were damaged by paying Apple's 30% commission for paid applications in the App Store, which the court rejected saying that the commission was "a cost passed-on to consumers by independent software developers", not paid by the consumers directly, and so the plaintiffs did not have standing under the Illinois Brick doctrine.
The plaintiffs appealed to the Ninth District, which reversed the District Court's dismissal. The Ninth Circuit asked the question that in light of Illinois Brick, if Apple was to be treated as a manufacturer or producer, in which case the class did not have standing to sue, or if they were a distributor, in which case the class could sue for damages.
Apple appealed the case to the Supreme Court of the United States, which agreed to hear the case, Apple Inc. v. Pepper in its 2018 term. The Supreme Court upheld the Ninth Circuit's ruling in May 2019, in that the class did have standing to litigate Apple for antitrust concerns.
European antitrust investigation
In 2008, Apple agreed to cut the price UK consumers pay to download music for their iPods after a formal complaint to the European Commission from the UK consumer group Which? demonstrated higher prices in UK for the same iTunes songs sold elsewhere in the European Union (EU). The Commission began an antitrust investigation in 2007 of Apple's business practices after the complaint was made, but ultimately the Commission probe found no agreements between Apple and major record labels on how iTunes is run in Europe, only that Apple had been paying higher wholesale prices to UK music labels and was passing the cost along to UK customers.
eBook price-fixing lawsuit
In April 2012, the U.S. Justice Department (DOJ) and 33 U.S. states brought a civil antitrust action against Apple, HarperCollins, Macmillan Publishers, Penguin Books, Simon & Schuster, and Hachette Book Group, Inc., alleging violations of the Sherman Act. The suit was filed in the Southern District of New York and alleges the defendants conspired to restrain retail price competition in the sale of e-books because they viewed Amazon's price discounting as a substantial challenge to their traditional business model. Regarding Apple in particular, the federal complaint alleged that "Apple facilitated the Publisher Defendants' collective effort to end retail price competition by coordinating their transition to an agency model across all retailers. Apple clearly understood that its participation in this scheme would result in higher prices to consumers." In such an agency-model, publishers set prices rather than sellers. Fifteen states and Puerto Rico also filed a companion federal case in Austin, Texas, against Apple, Penguin, Simon & Schuster and Macmillan. In the same month, HarperCollins, Hachette and Simon & Schuster settled with both the DOJ and the state attorneys general, with HarperCollins and Hachette agreeing to pay Texas and Connecticut $52 million in consumer restitution, leaving Apple, Penguin, and Macmillan as remaining defendants. As of July 2012, the case was still in the discovery stage of litigation. On July 10, 2013, District Court Judge Denise Cote in Manhattan found Apple Inc. guilty of the violation of federal antitrust law, citing "compelling evidence" that Apple played a "central role" in a conspiracy with publishers to eliminate retail competition and the prices of e-books.
High-Tech Employee Antitrust Litigation
In 2014, Apple settled out of court both an antitrust lawsuit and a related class-action suit regarding agreements not to cold-call employees of other companies.
iOS Fees Litigation
A class-action lawsuit was filed in the U.S. District Court for the Northern District of California by iOS app developers, alleging that Apple abuses its control of the iOS App Store to require its 30% revenue cut and its developer fee. The developers are represented by the same law firm that won the earlier eBook price-fixing case.
Epic Games lawsuit
On August 13, 2020, Epic Games filed a lawsuit against Apple as well as Google for antitrust violations and anti-competitive behavior. Epic, which had long since challenged the 30% revenue share that Apple, Google, and other digital storefronts took, had introduced a payment option in Fortnite Battle Royale that day that allowed users to buy microtransactions directly from Epic at a discounted rate. Apple immediately removed Fortnite from their storefronts for violating their policies as apps are not allowed to bypass the App Store payment system; Google also removed the game for similar reasons from the Play Store. Epic subsequently filed the lawsuits against both companies after the game was pulled. The federal district judge issued a ruling in September 2021 that cleared Apple on nine of ten counts related to anti-trust charges Epic had raised, but did find that Apple's anti-steering provision violated California's anti-competition laws. The court issued a permanent injunction against Apple preventing them from blocking developers from including links in App Store apps to third-party payment systems or collecting information within apps to notify users about these systems.
Consumer class actions
Technical support class action
From 1993 to 1996, Apple developed a marketing strategy that promised free and unlimited live-telephone support on certain products for as long as the original purchaser owned those products; by 1997, however, changes in Apple's AppleCare support policy led Apple to rescind the offer, resulting in a consumer class action lawsuit for breach of contract. Apple denied wrongdoing but, in settlement of the claims, Apple ultimately reinstated the telephone support for the duration of original ownership of the otherwise obsolete products and customers affected by the change were given a limited reimbursement if they had been refused telephone support, had been charged per incident, or had incurred third party support charges.
iPod battery life class action
In 2004 and 2005, two state-level class action suits were filed against Apple in New York and California alleging the first, second, and third generation iPod music players sold prior to May 2004 did not have the battery life represented and/or that the battery's capacity to take and hold a charge substantially diminished over time. Rather than litigate these claims, Apple entered into a settlement agreement in August 2005 after a fairness hearing in the California action, with the settlement terms designed to end the New York action as well. An appeal followed the California court's approval of the settlement but the appellate court upheld the settlement in December 2005. Eligible members of the class were entitled to extended warranties, store credit, cash compensation, or battery replacement, and some incentive payments, with all unfiled claims expiring after September 2005. Apple agreed to pay all costs of the litigation, including incentive payments to the class members and the plaintiffs' attorney fees, but admitted no fault. In 2006 Apple Canada, Inc., also settled several similar Canadian class action suits alleging misrepresentations by Apple regarding iPod battery life.
iPad and iPhone privacy issue class action
In December 2010, two separate groups of iPhone and iPad users sued Apple, alleging that certain software applications were passing personal user information to third-party advertisers without the users' consent. The individual cases were consolidated in the U.S. District Court for the Northern District of California, San Jose division, under the title In Re iPhone Application Litigation, and further defendants were added to the action. The complainants petitioned the court for a ban on the "passing of user information without consent and monetary compensation," claimed damages for breach of privacy, and sought redress for other enumerated claims. Press reports stated that in April 2011, Apple agreed to amend its developer agreement to stop this from happening "except for information directly necessary for the functionality of the apps"; however, the suit alleged that Apple took no steps to do this or enforce it "in any meaningful way due to criticism from advertising networks".
The Associated Press reported a pending congressional inquiry into the matter, with United States Congress members stating that commercial storage and usage of location information without a consumer's express consent is illegal under current law, but Apple defended its use of customer tracking in a letter released May 9, 2011, by the House of Representatives. National Public Radio's senior director of technology published an article examining the data collected by his own iPhone, showing examples of the data collected and maps correlating the data. Separately, digital forensics researchers reported they regularly use the data collected from Apple mobile devices in working with law enforcement officials investigating crimes and have been doing so since at least mid-2010. In contrast with earlier statements, Apple revealed in a hearing with the U.S. Senate Judiciary Committee that a "software bug" caused iPhones to continue to send anonymous location data to the company's servers, even when location services on the device were turned off.
In September 2011, the District Court granted Apple's motion to dismiss for lack of Article III standing and failure to state a claim, but gave the plaintiffs leave to amend their complaint, thereby not shutting out the claims permanently. The court ruled that without a showing of legal damages compensable under current law, the plaintiffs had not shown they sustained injury in fact by the defendants' actions. The problem facing the plaintiffs is the current state of electronic privacy law, the issue being that there is no national privacy law that provides for compensatory damages for breach of privacy, and this is the same issue faced by victims of data breaches, as breaches, per se, sustain no legal damages without a showing of actual and measurable harm such as monetary loss. Under U.S. law as of July 2012, it is only when a data breach results in actual loss as defined by applicable law that compensable damages arise. The case remained on the California court's docket as of July 2012.
iTunes price-switching class action
In June 2009, a group of consumers filed the class action suits Owens v. Apple, Inc. and Johnson v. Apple Inc. against Apple on behalf of American individuals who purchased iTunes gift cards and who were then unable to use the cards to purchase iTunes music at the price advertised on the card because Apple raised the price of the music after it sold the cards to consumers. The Johnson case absorbed the Owens case and was settled on February 10, 2012, with payments to be made to consumers by Apple. The Owens complaint alleged that Apple wrongfully marketed, distributed, and sold iTunes gift cards and songs through its online iTunes store, while representing that consumers could use the gift cards to purchase songs for US$0.99 a song and then, after such gift cards were purchased, raised the price on certain songs to $1.29 on April 7, 2009. The lawsuit's allegations included that Apple's conduct constituted breach of contract, violated the state consumer fraud statute, and violated consumer protection statutes of other states. The plaintiffs sought a $.30 refund remedy for each song that class members purchased using a $.99 iTunes card for which they were charged $1.29, plus their attorneys' fees and costs. Apple mounted a vigorous defense and sought to dismiss the suit but lost its motion in December 2009. Individuals are part of the class of plaintiffs if they are U.S. residents who purchased or received an iTunes Gift Card on which the card itself or its packaging contained language to the effect that songs were priced at $0.99 and who used the card to purchase one or more $1.29 songs from the iTunes Store on or before May 10, 2010. The settlement provides class members with an iTunes Store credit of $3.25 if an online claim form was submitted on or before September 24, 2012.
Macbook MagSafe power adapter class action
Apple settled a U.S. class action in 2011 regarding the older T-shaped MagSafe power adapters. Apple agreed to replace the adapters with newer adapters, and to compensate customers who were forced to buy replacement adapters.
In-app purchases class action
In 2011, five parents filed a class action suit against Apple over "in-app" purchases, which are purchases that can be made within applications ("apps"). The parents contended that Apple had not disclosed that "free" apps intended for use by children had the potential to rack up fees without the parents' knowledge. Potentially 23 million customers could make up the class. Apple offered a settlement option for customers who had incurred fees in excess of $30, and the class action settled for $100 million. The Federal Trade Commission (FTC) also investigated similar claims in 2011; the FTC's action led to a payout of $32.5 million in February 2014.
A similar case was filed by a parent in March 2014 against Google.
iPhone slowdown class action
Apple was accused of intentionally slowing down older iPhone models by adjusting their operating systems in order to encourage users to buy new products. The company confirmed the slowdowns but said they were due exclusively to the declining performance of aging lithium-ion batteries. Nevertheless, users were forced to spend extra on battery replacements to restore their phones' former speed. After a class-action lawsuit was filed in 2017 and lengthy litigation followed, Apple agreed in 2020 to pay compensation of $500 million (about $25 for each affected user).
Trade practice
Resellers v. Apple
In 2004, independent Apple resellers filed a lawsuit against Apple alleging the company used misleading advertising and unfair business practices that harmed the resellers' sales while boosting Apple-owned outlets, in effect favoring its own outlets over those of its resellers. The lawsuit claimed that Apple favored company-owned stores by providing significant discounts unavailable to independent dealers. The complaint alleged Apple's acts in favoring its own stores constituted breach of contract, false advertising, fraud, trade libel, defamation, and intentional interference with prospective economic advantage. Apple ultimately reached settlements with all of the plaintiffs, including the bankruptcy trustee for one reseller that failed, while the former principal of that company appealed the bankruptcy court's approval of the settlement.
Defamation
Libel dispute with Carl Sagan
In 1994, engineers at Apple Computer code-named the mid-level Power Macintosh 7100 "Carl Sagan" after the popular astronomer in the hope that Apple would make "billions and billions" with the sale of the computer. Apple used the name only internally, but after the name was publicized in a 1993 issue of MacWeek, Sagan was concerned that it would become a product endorsement and sent Apple a cease-and-desist letter. Apple complied, but its engineers retaliated by changing the internal codename to "BHA" for "Butt-Head Astronomer".
Sagan then sued Apple for libel in federal court. The court granted Apple's motion to dismiss Sagan's claims and opined in dicta that a reader aware of the context would understand Apple was "clearly attempting to retaliate in a humorous and satirical way", and that "It strains reason to conclude that Defendant was attempting to criticize Plaintiff's reputation or competency as an astronomer. One does not seriously attack the expertise of a scientist using the undefined phrase 'butt-head'." Sagan then sued for Apple's original use of his name and likeness, but again lost and appealed that ruling. In November 1995, Apple and Sagan reached an out-of-court settlement and Apple's office of trademarks and patents released a conciliatory statement that "Apple has always had great respect for Dr. Sagan. It was never Apple's intention to cause Dr. Sagan or his family any embarrassment or concern". Apple's third and final code name for the project was "LaW", short for "Lawyers are Wimps".
Trademarks, copyrights, and patents
Trademark
Apple Corps
For nearly 30 years Apple Corps (The Beatles-founded record label and holding company) and Apple Inc. (then Apple Computer) litigated a dispute involving the use of the name "Apple" as a trademark and its association with music. In 1978, Apple Corps filed suit against Apple Computer for trademark infringement and the parties settled in 1981 with Apple Computer paying an undisclosed amount to Apple Corps, later revealed to be $80,000. A primary condition of the settlement was that Apple Computer agreed to stay out of the music business. In 1991, after Apple introduced the Apple IIgs with an Ensoniq music synthesizer chip, Apple Corps alleged the product to be in violation of the terms of their settlement. The parties then reached another settlement agreement and Apple paid Apple Corps around $26.5 million, with Apple agreeing it would not package, sell, or distribute physical music materials.
In September 2003, Apple Corps again sued Apple Computer alleging Apple Computer had breached the settlement once more, this time for introducing iTunes and the iPod. Apple Corps alleged Apple Computer's introduction of the music-playing products with the iTunes Music Store violated the terms of the previous agreement in which Apple agreed not to distribute music. The trial opened on March 29, 2006, in the UK and ended on May 8, 2006, with the court issuing judgment in favor of Apple Computer. "[I] find no breach of the trademark agreement has been demonstrated," the presiding Justice Mann said.
On February 5, 2007, Apple Inc. and Apple Corps announced another settlement of their trademark dispute, agreeing that Apple Inc. would own all of the trademarks related to 'Apple' and would license certain of those trademarks back to Apple Corps for its continued use. The settlement ended the ongoing trademark lawsuit between the companies, with each party bearing its own legal costs, and Apple Inc. continuing to use the Apple name and logos on iTunes. The settlement's full terms were confidential.
Swatch Group
In April 2019, a Swiss court ruled against Apple's claim that the "Tick Different" slogan employed by watchmaker Swatch Group had infringed on Apple's "Think Different" advertising campaign, which ran from 1997 until 2002. Swatch contended that Apple's campaign was not well known enough in Switzerland to warrant protection, and the Federal Administrative Court concluded that Apple had failed to produce sufficient documentation to support its claim.
Domain name disputes
appleimac.com
In an early domain name dispute, two months before announcing the iMac in July 1998, Apple sued then-teenager Abdul Traya. Traya had registered the domain name appleimac.com in an attempt to draw attention to the web-hosting business he ran out of his parents' basement; a note on his site stated that his plan was to "generate traffic to our servers and try to put the domain to sale." After a legal dispute lasting for nearly a year, Apple settled out of court, paying Traya's legal fees and giving him a 'token payment' in exchange for the domain name.
itunes.co.uk
The Apple-Cohen dispute was a cybersquatting case where a top-level domain registrar's decision differed from prior decisions by awarding a domain name to a subsequent registrant (Apple), rather than to the prior registrant (Cohen). As the decision recounts, in November 2000, Benjamin Cohen of CyberBritain registered the domain name itunes.co.uk. The domain initially pointed to skipmusic.com, and then to cyberbritain.com, and was then inoperative for some time. Apple applied for a UK trademark for iTunes in October 2000 which was granted in March 2001, and then launched its UK iTunes music store service in 2004. Afterward, Cohen reactivated his registered domain name, redirecting it to iTunes' then-rival, Napster; later Cohen forwarded the domain name to his CyberBritain's cash back/rewards website.
In 2005, Apple took the matter to the Dispute Resolution Service operated by .uk domain name registry Nominet UK (the DRS), claiming that Apple had trademark rights in the name "iTunes" and that the use of the domain name by Cohen's company was abusive (these being the two tests under the DRS rules for prevailing in a matter where the complaint related only to the later use of a trademarked name). The dispute was unresolved at the free mediation stage and so Apple paid for an independent expert to decide the case; the expert decided the dispute in Apple's favor.
Cohen thereafter launched a media offensive claiming the DRS was biased in favor of large businesses and made frequent threats of lawsuits against Nominet. Cohen stated he believed that the DRS system was unfair for a number of reasons and would seek redress against Nominet with the High Court via judicial review. Nominet stated that Cohen should appeal the case via the appeal process in the DRS. Cohen refused and, after several months, instead issued proceedings for judicial review. The High Court at first instance rejected Cohen's case in August 2005, noting that Cohen's company, Cyberbritain Group Ltd., should have used the appeal process forming part of Nominet's domain resolution service. Afterward, Cohen's company asked for a rehearing and, as that case progressed, the domain name was in the interim transferred to Apple in accord with the expert's decision and thereafter pointed to the Apple music site. In November 2005, Cohen dropped all legal action against Apple.
Cisco Systems: iPhone mark
In 2006, Cisco Systems and Apple negotiated over allowing Apple rights to use Cisco's Linksys iPhone trademark, but the negotiations stalled when Cisco pushed for the two products to be interoperable. Following the public unveiling of the Apple iPhone at the 2007 Macworld Expo, Cisco filed a lawsuit against Apple in January 2007, alleging Apple's iPhone name infringed on Cisco's iPhone trademark. Cisco alleged that Apple created a front company subsequent to their negotiations to try to acquire the rights another way, while Apple countered that there would be no likelihood of confusion between the two products, because Apple's iPhone product was the first cell phone with such a name, while Cisco's iPhone was a VoIP phone. Bloomberg reported Cisco's iPhone as a product marketed for less than $100 and part of the Linksys home routers, enabling internet-based calls through Skype and Yahoo! Messenger, and contrasted it with Apple's iPhone as a mobile phone which sold for around $600. In February 2007, Cisco and Apple announced an agreement under which both companies would be allowed to use the iPhone name worldwide.
Sector Labs: use of Pod
In March 2007, Apple opposed a trademark application by startup Sector Labs, which sought to register "Video Pod" as a mark identifying goods associated with a video projector product. Apple argued that the proposed mark was merely "descriptive" and should be denied because the registration would cause a likelihood of confusion with Apple's pre-existing "iPod" marks. In March 2012, the U.S. Trademark Trial and Appeal Board (TTAB) ruled in Apple's favor and denied Sector Labs' registration, finding that the "iPod" mark was "famous" and therefore entitled to broad protection under U.S. trademark law.
New York City "GreeNYC" logo
In January 2008, Apple filed an opposition with the U.S. Trademark Trial and Appeal Board against New York City's (NYC) trademark application for the "Big Apple" logo for NYC's GreeNYC initiative, by designer Blake E. Marquis. NYC originally filed for its trademark: "a stylized apple design" for "[e]ducation services, namely, providing public service announcements on policies and practices of the City of New York in the field of environmentally sustainable growth" in May 2007, with an amendment filed in June 2007. The TTAB's Notice of Publication was published in September 2007 and Apple filed an opposition with the TTAB the following January, claiming a likelihood of confusion. In June 2008, NYC filed a motion to amend its application to delete the leaf element from its design, leaving the stem, and the TTAB dismissed Apple's opposition and counterclaims in accordance with the parties' stipulation in July 2008. In November 2011, the TTAB issued NYC's trademark registration.
Victoria School of Business and Technology
In September 2008, Apple sent a cease and desist letter to the Victoria School of Business and Technology in Saanich, British Columbia, claiming the school's logo infringed Apple's trademark rights and that the school's logo falsely suggested Apple had authorized the school's activities. The logo in question featured the outline of an apple and a leaf, although the design incorporated a mountain, had three bumps on top of the apple instead of the two used by Apple, and had no bite out of the apple, unlike Apple's logo. In April 2011, the school reported it had settled its 3-year dispute with Apple, was launching a new logo under a new name, Q College, and was expanding its operations. The settlement's full terms were undisclosed.
Woolworths Limited logo
In October 2009, Apple disputed a trademark application by Woolworths Limited in Australia over the new logo for its supermarket chain Woolworths Supermarkets, a stylised "W", similar in shape to an apple. Apple reportedly took objection to the breadth of Woolworths' application, which would allow it to brand products, including consumer electronics, with the logo. In April 2011, Woolworths amended its trademark application to remove various goods and services, such as "apparatus for recording, transmission or reproduction of sound or images" and Apple withdrew its opposition, allowing the trademark to proceed to registration. In August 2011 Woolworths introduced a shopping app for the iPhone, and, as of January 2019 continues to use the logo, including on the face of its iPhone app. The Woolworths smartphone app is also available on Apple's App Store where the logo is featured prominently; Apple closely manages its App Store offerings.
Apple v. DOPi: lower-case i use
In March 2010, an Australian Trademarks tribunal denied Apple's attempt to prevent a small company from trademarking the name DOPi for use on its laptop bags and cases for Apple products. Apple argued that the DOPi name — which is iPod spelled backwards — is too similar to its own product's name, the iPod.
Proview: iPad trademark
In 2006, Apple secured Taiwanese rights to the iPad mark from the Taiwanese company Proview Electronics; in China the iPad mark was still owned by the subsidiary of Proview Electronics, Shenzhen company Proview Technology, as of April 2012. Proview Technology sued Apple over the rights to the mark in China in 2011; Apple counter-sued but lost and then appealed, with the case before the Xicheng district court, where Proview claimed $1.6 billion USD in damages. Apple paid Proview approximately $53,000 – $55,000 for the mark in 2009. In February 2012, Proview sued Apple in the Santa Clara Superior Court, alleging several permutations of fraud (intentional misrepresentation, concealment, inducement) and unfair competition. Apple paid $60 million to Proview to end the dispute in a court-mediated settlement in the Higher People's Court of Guangdong province; the U.S. case was thrown out of court.
Amazon "App Store"
In 2011, Apple filed suit against Amazon.com alleging trademark infringement, unfair competition, and dilution under the Lanham Act and related California state law over Amazon's use of the "App Store" phrase relating to Amazon's "Amazon Appstore Developer Portal" and Amazon's alleged other similar uses of the phrase. In its complaint, Apple did not refer to "apps" as a common name, but described its applications store as a place consumers license "software programs or products"; Amazon countered in its answer that "app store" is a common phrase meaning a "place to buy apps". Reuters reported that Microsoft was opposing Apple's attempted registration of the phrase as a trademark and that part of the matter was before the Trademark Trial and Appeal Board (TTAB). Apple motioned the court for a preliminary injunction to bar Amazon from using the "App Store" name but, in July 2011, U.S. District Judge Phyllis Hamilton, presiding over Apple's case against Amazon, denied Apple's motion. In July 2012, the case was still in the discovery stage of litigation.
In January 2013, Apple's claims were rejected by a US District judge, who argued that the company presented no evidence that Amazon had "[attempted] to mimic Apple's site or advertising", or communicated that its service "possesses the characteristics and qualities that the public has come to expect from the Apple APP STORE and/or Apple products". In July 2013, Apple dropped the lawsuit.
Trade secrets
Apple v. Does
Ultimately decided under the title O'Grady v. Superior Court, the suit filed by Apple against unnamed bloggers raised the issue for the first time of whether bloggers hold the same protections against revealing sources that journalists have. In November 2004, three popular weblog sites featuring Apple rumors publicly revealed information about two unreleased Apple products, the Mac mini and an as yet unreleased product code-named Asteroid, also known as Project Q97. Apple subpoenaed three sites to force them to identify their confidential sources: Apple Insider, Power Page, and, separately, Think Secret, which did no original reporting on the case and thus had no sources to reveal. In February 2005, a trial court in California decided that website operators do not have the same shield law protection as do other journalists. The journalists appealed and, in May 2006, the California Court of Appeal reversed the trial court's decision, ruling that activities in question were covered by the shield law.
Apple v. Think Secret
In Apple Computer v. DePlume, a case illustrating one of Apple's methods of protecting its claims in trade secrets, Apple sued Think Secret's parent company, the dePlume Organization LLC, and Think Secret's editor in January 2005, alleging misappropriation of trade secrets with regard to Think Secret's stories on a "headless iMac" and new version of iWork. In response, DePlume filed a motion to dismiss the case based on First Amendment grounds under California's state Anti-SLAPP statute, a law designed to dispense with meritless legal claims attempting to silence valid exercises of freedom of speech. In late 2007, Think Secret announced "Apple and Think Secret have settled their lawsuit, reaching an agreement that results in a positive solution for both sides. As part of the confidential settlement, no sources were revealed and Think Secret will no longer be published".
Copyright
Apple v. Franklin
Apple v. Franklin established the fundamental basis of copyright of computer software, even if it was provided only as object code or in firmware. In 1982, Apple filed a lawsuit against Franklin Computer Corp., alleging that Franklin's ACE 100 personal computer used illegal copies of the Apple II's operating system and ROM. The case was decided in Franklin's favor but reversed by the Court of Appeals for the Third Circuit.
Object code cases and conflicts of law
Apple's litigation over object code contributed to the development of contemporary copyright law because the company's object code cases brought different results in different courts, creating a conflict of laws that resulted in international litigation. In the 1980s, Apple litigated two copyright cases with central issues that included the question of whether object code (as contrasted with source code) of a computer program is subject to copyright laws. A third case in which Apple was not a party but that involved the Apple decisions followed in New Zealand. The specific cases were Computer Edge Pty. Ltd. v Apple Computer Inc. (1986, Australia) ("Computer Edge"), Apple Computer Inc. v Mackintosh Computers Ltd., (Canada, 1987) ("Apple v. Mackintosh"), and IBM v. Computer Imports Ltd. ("IBM v. Computer Imports"), (New Zealand, 1989).
In the Computer Edge case, the Australian court decided against the then-prevailing opinions in other courts (the U.K., Canada, South Africa, and the U.S.) and ruled object code was not copyrightable, while the Supreme Court of Canada in Apple v. Mackintosh reversed its earlier decisions and ruled that because object code was a translation of source code and embodied in a silicon chip, it was therefore a translation of an original literary work expressed in a material form and unauthorized reproduction of the object code was therefore an infringement of copyright. The Canadian court opined that programs within ROM silicon chips are protected under the Copyright Act of Canada and the conversion from the source code into object code is a form of translation. It further held that such translation does not include the expression of an idea in another form, but rather only applies to the expression of an idea in another language, and that a translation has a one-to-one correspondence between works that are expressed in two different languages.
In these conflict of laws cases, Apple met with conflicting international judicial opinions: an Australian court decision conflicted with a Canadian court decision on the copyrightability of object code. In IBM v. Computer Imports, the High Court of New Zealand then considered these prior decisions and sided with the Canadian decision in ruling that, although object code is not an original literary work in its own right, it is a reproduction of source code in material form and therefore an infringement of copyright takes place if it is copied without the authorization of the copyright owner. Such legal conflicts affected not only Apple, but all other software companies as well, and the conflicts remained unresolved until the creation of an international legal regime embodied in further changes to national copyright laws, which ultimately made object code subject to copyright law. These revisions of law in favor of making object code subject to copyright law are still controversial. The revisions also form the technical underpinnings (via the Digital Millennium Copyright Act (DMCA) and the Electronic Communications Privacy Act) for the legal notions of electronic privacy violation and computer trespass, as well as the further development of anti-hacking law-making such as the Patriot Act and the Convention on Cybercrime.
Apple v. Microsoft and Hewlett-Packard
In 1988, after the introduction of Microsoft's Windows 2.0, Apple filed a lawsuit against Microsoft and Hewlett-Packard alleging that Microsoft Windows and HP's NewWave violated Apple's copyrights in the Macintosh user interface. Cited, among other things, was the use of overlapping and resizable windows in Windows 2.0. The case was one of the "look and feel" copyright lawsuits of the 1980s. After several years in court, Apple's claims against Microsoft were dismissed, primarily because of a license John Sculley had negotiated with Bill Gates for Windows 1.0. The decision was upheld on appeal in 1994, but legal disputes on the topic continued until 1997, when the two companies reached a wide-ranging agreement that included Microsoft buying non-voting Apple stock.
Xerox v. Apple Computer
Xerox Corp. v. Apple Computer was a 1989 case where Xerox sued Apple over its graphical user interface (GUI) copyrights. A federal district court dismissed Xerox's claims without addressing whether Apple's GUI infringed Xerox's.
OdioWorks v. Apple
The OdioWorks case was one of the first high-profile cases illustrating Apple's attempts to employ federal police power in its litigation practices by invoking the anti-circumvention provisions of the Digital Millennium Copyright Act (DMCA) as a means of shielding its intellectual property from reverse engineering. In November 2008, Apple sent a cease-and-desist letter to BluWiki, a non-commercial wiki provider, alleging that BluWiki infringed Apple's copyrights by publishing a discussion of how to make the latest iPods interoperate with other software and that, in doing so, it violated the DMCA. In April 2009, OdioWorks, the operator of BluWiki, backed by the Electronic Frontier Foundation (EFF), defensively sued Apple seeking a declaration of non-infringement and non-circumvention. In July 2009, Apple ceased claiming infringement, stating it was "withdrawing [Apple's] takedown notifications" and that "Apple no longer has, nor will it have in the future, any objection to the publication of the itunesDB Pages which are the subject of the OdioWorks complaint". After Apple withdrew its complaint and cited code obsolescence as a contributing factor in its decision, BluWiki republished its discussion of the issue. The EFF noted, "While we are glad that Apple retracted its baseless legal threats, we are disappointed that it only came after 7 months of censorship and a lawsuit".
Apple v. Corellium
In 2019, Apple sued security start-up Corellium for creating the first software product to simulate a virtual iPhone. The product was created to help users research security issues in iOS. Apple's lawsuit argued that Corellium's product would be dangerous in the wrong hands because it would make it easier for hackers to discover exploits, and claimed that Corellium sold its product indiscriminately, even to potential competitors of Apple.
The judge ruled in favor of Corellium, concluding that the company used a thorough vetting process for clients and that the product was not intended to compete with Apple or diminish the security of iOS. He also described Apple's claim as "puzzling, if not disingenuous".
Trade dress
GEM "look and feel" suit
Prevailing in an early copyright infringement suit in the mid-1980s, Apple forced Digital Research to alter basic components of its Graphics Environment Manager ("GEM"), which was almost a direct copy of the Macintosh's graphical user interface (GUI), or "look and feel". Features Digital Research removed from GEM as a result of the lawsuit included disk drive icons on the desktop, movable and resizable windows in the file manager, shading in the title bars, and window open/close animations. In addition, visual elements including the scrollbar thumbs and the window close button were changed to be less similar to those in the Mac GUI.
Apple v. eMachines
In 1999, Apple successfully sued eMachines, whose eOne too closely resembled the trade dress of the then-new iMac. The eOne was taken off the market, and eMachines lost the ability to sell it as intended. In eMachines' EDGAR statement for May 1, 2001, the company stated that its "net loss for the first quarter of 2001 was $31.1 million, or $0.21 per share, compared to a loss of $11.9 million, or $0.13 per share, in the first quarter of 2000", and that these results "reflect the substantial discounts and incentives that we gave to retailers to enable liquidation of product inventories".
Patent
Creative Technology v. Apple, Inc. (menu structure)
In a dispute illustrating the nature of claims, defenses, and counterclaims for patent infringement based on arguments of prior art and first to file, rival digital music player maker Creative Technology sued Apple in May 2006, alleging that Apple infringed Creative's "Zen" patent covering the menuing structure of an MP3 player. Creative claimed it began using its menuing method on its Nomad players in September 2000, approximately a year before Apple's first iPod release in October 2001. Creative, a Singapore-based consumer electronics group, also filed a trade complaint with the United States International Trade Commission (ITC) against Apple. Creative asked for a court injunction to block the import and sale of Apple's iPod and iPod nano in the United States and for money damages for past sales. Apple filed a countersuit against Creative on similar grounds.
In August 2006, Apple and Creative settled the suit with Apple agreeing to pay Creative $100 million USD for the right to implement Creative's method of sorting songs on the iPod. The settlement effectively ended the patent dispute and five other pending lawsuits between the two companies. Creative also secured an agreement to participate in the "Made for iPod" program by producing accessories for the iPod.
Typhoon Touch Technologies (touch screen)
In June 2008, Apple was named among others as a defendant in a suit brought by plaintiff Typhoon Touch Technologies in the federal U.S. District Court for the Eastern District of Texas alleging patent infringement in portable touch screen technology. The suit illustrated the vagaries of litigating patent licensing and royalty collection issues in the commercial exploitation of intellectual property rights. Ultimately, Typhoon could not prevail against patent defense arguments of prior art and obviousness and earned itself a reputation as a patent troll. Typhoon acquired two pre-existing patents (filed in 1993 and 1994 and issued in 1995 and 1997) in mid-2007 for $350,000 plus a percentage of collected licensing fees. The patents had languished for some time and were not being policed; shortly after Typhoon acquired the patents, it began enforcement by bringing suit against exploiters of the technology who had not paid licensing fees. Typhoon was successful in its patent infringement suits against some small companies, and then expanded its litigation to go after larger ones. Typhoon alleged that Apple and others used its patented technology inventions without permission. Typhoon originally filed the suit in December 2007 against Dell after settling with some smaller companies but, in mid-2008, amended its complaint to add Apple, Fujitsu, Toshiba, Lenovo, Panasonic, HTC, Palm, Samsung, Nokia, and LG. In 2010, Apple settled with Typhoon for an undisclosed sum and was then dismissed from the litigation as of September 2010. The other large companies were able to rebuff Typhoon's claims, and Typhoon ceased doing business in 2008 after the U.S. Securities and Exchange Commission (SEC) suspended its trading in a fraud investigation.
Nokia v. Apple (wireless, iPhone)
In October 2009, Nokia Corporation sued Apple for infringement of Nokia's patents relating to wireless technology; Apple countersued Nokia in December 2009. The two companies engaged in nearly two years of litigation, and both parties amended their claims multiple times and in multiple courts before finally settling in June 2011. Nokia agreed to settle for an undisclosed amount of cash and ongoing iPhone royalties to be paid by Apple, with Apple's royalty payments retroactively payable back to the iPhone's introduction in 2007, but with no broad cross-licensing agreement made between the companies. Apple agreed to cross-license only some patents to Nokia, saying in a statement that Nokia would have a license to some technology, "but not the majority of the innovations that make the iPhone unique". Apple received a license to some of Nokia's patents, including ones deemed essential to industry standards on mobile phones.
Apple v. HTC
Apple filed a patent infringement suit against High Tech Computer Corp. (HTC) in March 2010 in the U.S. District Court for the District of Delaware as part of the two companies' ongoing battle, and filed a complaint against HTC under Section 337 of the Tariff Act of 1930 with the U.S. International Trade Commission (ITC) in Washington, D.C. Apple's suit alleged 20 separate patent infringements relating to the iPhone's user interface, underlying architecture, and hardware. Steve Jobs said, "We can sit by and watch competitors steal our patented inventions, or we can do something about it. We've decided to do something about it ... [We] think competition is healthy, but competitors should create their own original technology, not steal ours". The ITC rejected all but one of Apple's claims, ruling for Apple on a single claim relating to data tapping. HTC moved the Delaware court for a change of venue to the Northern District of California, arguing against Apple's desire to consolidate the case with the similar cases brought by Nokia against Apple and alleging insubstantial overlap between those cases and Apple's complaint, but Judge Gregory M. Sleet denied HTC's motion, ruling that Apple's choice of forum would prevail. HTC countersued Apple in September 2011 in the same court, claiming infringement of four patents HTC had obtained from Google, and also filed a counter-complaint with the ITC, with HTC's general counsel saying, "HTC will continue to protect its patented inventions against infringement from Apple until such infringement stops." In May 2012, the Delaware court ordered mediation between the companies. In November 2012, HTC and Apple ended the patent dispute by settling the case but did not disclose the terms of the settlement; the companies reported that the settlement included a 10-year agreement to license both companies' current and future patents to each other.
Kodak v. Apple (digital imaging)
Eastman Kodak sued Apple and Research In Motion (RIM) in January 2010, filing two lawsuits against Apple and a complaint with the U.S. International Trade Commission against both Apple and RIM after the companies refused to pay royalties for use of Kodak's patents for digital cameras. Kodak alleged that Apple's and RIM's phones infringed patented Kodak digital imaging technology. Kodak sought an injunction against further imports into the United States of Apple's iPhone and RIM's BlackBerry. After Kodak filed an additional suit in January 2012 against Apple and another against HTC claiming infringement of four of its key patents, Apple filed a countersuit with the U.S. Bankruptcy Court to block Kodak's efforts to use the disputed patents as collateral for loans. In the January complaint, Kodak claimed violations of the same image-preview technology at issue in the original dispute between Kodak, Apple, and RIM, which, as of 2012, was pending before the ITC. In March 2012, bankruptcy court judge Allen Gropper, overseeing Kodak's restructuring, denied Apple's request to file a patent complaint with the ITC over some of Kodak's cameras, photo frames, and printers. In July 2012, the Court of Appeals for the Federal Circuit ruled that Kodak did not infringe Apple's patent technology for digital cameras, although a few days earlier Kodak had lost its case before the ITC against Apple and RIM; Kodak announced it would appeal that decision.
Motorola Mobility v. Apple
In the year before Apple and Samsung began suing each other on most continents, and while Apple and HTC were already embroiled in a patent fight, Motorola Mobility and Apple started a period of intense patent litigation. The Motorola-Apple patent imbroglio commenced with claims and cross-claims between the companies for patent infringement and encompassed multiple forums in multiple countries as each party sought friendly venues for litigating its respective claims; the fight also included administrative law rulings as well as ITC and European Commission involvement. As of April 2012, the controversy centered on whether a FRAND license to a components manufacturer carries over to an equipment manufacturer incorporating the component into equipment, an issue not addressed in the U.S. Supreme Court's default exhaustion doctrine in Quanta v. LG Electronics. In June 2012, appellate Judge Richard Posner ordered dismissal of the case with prejudice and Apple announced its intention to appeal a month later.
VirnetX patent infringement lawsuits
Since 2010, at least three different cases have been filed against Apple by VirnetX alleging infringement of at least thirteen of its patents by Apple's FaceTime and VPN On Demand technology in iOS. The first case, involving four of VirnetX's patents, was decided in favor of VirnetX, and while Apple was able to contest one of the patents with the Patent Office, the other three stood up to scrutiny. Apple appealed as far as the Supreme Court, but the Supreme Court refused to hear the case in February 2020, leaving in place a verdict against it. Other cases cover redesigned versions of FaceTime that VirnetX claims still violate its patents.
Apple v. Samsung: Android phones and tablets
Apple Inc. v. Samsung Electronics Co., Ltd. was the first of many lawsuits between Apple and Samsung. In the spring of 2011, Apple sued Samsung while already fully engaged in a patent war with Motorola. Apple's multinational litigation over technology patents became known as the Smartphone patent wars: Extensive litigation followed fierce competition in the global market for consumer mobile communications.
By August 2011, Apple and Samsung were engaged in 19 ongoing lawsuits in 12 courts in nine countries on four continents; by October, the fight expanded to 10 countries, and by July 2012, the two companies were embroiled in more than 50 lawsuits around the globe with billions of dollars in damages claimed between them. As of August 2013, the ultimate cost of these patent wars to consumers, shareholders, and investors is not known.
A U.S. jury trial was held on July 30, 2012, with Apple prevailing and Samsung ordered to pay more than $1 billion in damages, after which Samsung stated: "This is not the final word in this case or in battles being waged in courts and tribunals around the world, some of which have already rejected many of Apple's claims." Judge Lucy H. Koh later decided that the jury had miscalculated $450 million in its initial damage assessment and ordered a retrial that commenced in November 2013. Following a week-long trial, also overseen by Judge Koh, Samsung was ordered to pay $600 million to Apple for the 2012 lawsuit.
On August 9, 2013, the U.S. International Trade Commission (USITC) announced its decision regarding an Apple-initiated case, whereby Samsung is accused of infringing four Apple patents related to user interfaces and headphone input functionality. The USITC sided with Apple in what was described in the media as a "mixed ruling" and stated that some of Samsung's older devices infringe on two of Apple's patents—one covering touch-screen technology and another regarding headphone jacks; however, no violations were identified in four other patents. The final determination of the ITC was signed by Lisa Barton, Acting Secretary to the Commission.
In the damages-only retrial ordered by Judge Koh in December 2012, which began on November 13, 2013, a Samsung Electronics representative acknowledged in a San Jose courtroom that the jury in Apple's hometown had found that Samsung copied some features of both the iPhone and iPad. Samsung's attorney clarified the purpose of the damages-only retrial and summarized the result of the first trial: "This is a case not where we're disputing that the 13 phones contain some elements of Apple's property," but the company disputed the $379.8 million amount that Apple claimed, presenting a figure of $52 million instead. The San Jose jury eventually awarded Apple $290 million in damages after jurors completed a one-page assessment form for each infringed patent. The six-woman, two-man jury reached its decision after three days.
In the first week of January 2014, a filing with the U.S. District Court in San Jose showed that legal executives from both parties had agreed to meet before February 19, 2014, for settlement discussions. Both Samsung and Apple were responding to a court order instructing that such a meeting be completed before a new trial began in March 2014. One of three Samsung chiefs met with Apple CEO Tim Cook, but the filing did not reveal the name of the representative.
A new trial was scheduled for March 2014, in which Apple would seek to prevent Samsung from selling some of its then-current devices in the U.S.; the case would also involve further debate over monetary compensation. In the 2014 lawsuit, Samsung was accused of infringing five of Apple Inc.'s patents in 10 phone and tablet models, while Samsung responded with a counterclaim stating that Apple had infringed two of its patents in nine phones and tablets. Jury selection for the trial occurred on March 31, 2014. Samsung stood to gain $6 million if the jury ruled in its favor, while Apple sought $2 billion in damages and could proceed with similar lawsuits against other Android handset makers, as the relevant patent issues extend beyond Samsung's software technology.
Corephotonics v. Apple
On 6 November 2017, Israeli start-up Corephotonics sued Apple, claiming that the technology behind the dual-camera systems in Apple's iPhone 7 Plus and 8 Plus infringed four of Corephotonics' patents. Corephotonics said that it had approached Apple about a possible partnership, but Apple's lead negotiator declined the idea, and Apple went ahead and launched the iPhone 7 Plus in late 2016 and the 8 Plus in late 2017.
The patents Corephotonics claimed were infringed were: two patents on mini telephoto lens assemblies, one patent on dual-aperture zoom digital cameras, and one on high-resolution thin multi-aperture imaging systems.
Corephotonics also accused consumers who bought the 7 Plus or 8 Plus of infringing the patents, claiming that Apple sold the products with "knowledge of or willful blindness" to the infringement.
The lawsuit sought monetary damages as well as compensation for the start-up's legal fees, and asked that Apple immediately stop producing dual-lens camera systems. The iPhone X was not included in the lawsuit, despite having a dual-lens camera.
Licensing
Norwegian Consumer Council
In June 2006, the Consumer Ombudsmen in Norway, Sweden, and Denmark challenged Apple's iTunes end user license agreement (EULA) through the Norwegian Consumer Ombudsman Bjørn Erik Thon, who claimed that Apple was violating contract and copyright laws in their countries. Thon stated that Apple's "being an international company does not entitle [it] to disregard the laws of the countries in which it operates. The company's standard customer contract violates Norwegian law". An official complaint had been filed by the Norwegian Consumer Council in January 2006, after which German and French consumer groups joined the Nordic-led drive to force Apple to make its iTunes online store compatible with digital music players made by rival companies. A French law allows regulators to force Apple to make its player and store compatible with rival offerings. The consumer protection regulators of Norway, Sweden, and Finland met with Apple in September 2006 in hopes of resolving the issues without litigation, but the matter was only resolved after Apple discontinued its FairPlay digital rights management (DRM) scheme.
Office of Fair Trading investigation
In 2008, the UK National Consumer Council (NCC, now Consumer Focus) called on the UK's Office of Fair Trading (OFT) to investigate Apple's EULA, claiming Apple's EULA, and those of multiple other technology companies, misled consumers and infringed legal rights. The NCC's product complaint included Apple's iLife as well as Microsoft's Office for Mac, and products by Corel, Adobe, Symantec, Kaspersky, McAfee, and others. The OFT determined the licensing agreements were unfair and Apple agreed to improve its terms and conditions to make them clearer and fairer to consumers.
Apple Inc. v. Psystar Corporation
In July 2008, Apple Inc. filed suit against Psystar Corporation alleging that Psystar sold Intel-based systems with Mac OS X pre-installed and, in so doing, violated Apple's copyright and trademark rights and the software licensing terms of Apple's shrink-wrap license. That license restricted the use of Mac OS X to Apple computers and specifically prohibited customers from installing the operating system on non-Apple computers. The case brought the anti-circumvention and anti-trafficking facets of the DMCA into this licensing dispute; Apple ultimately prevailed and was awarded permanent injunctive relief, and the decision was affirmed on appeal in 2011. Psystar's appeal asserted copyright misuse as a defense, arguing that Apple's license agreement was an unlawful attempt to extend copyright protection to products that are not copyrightable. The appeals court ruled that Psystar failed to demonstrate copyright misuse by Apple because Psystar would have to show either that the license agreement restricts creativity or that it restricts competition, and Apple's license agreement does neither.
Corporate espionage and data theft
QuickTime code theft litigation
In 1995, Apple added Microsoft and Intel to an existing lawsuit against the San Francisco Canyon Company, alleging that Microsoft and Intel knowingly used the software company to aid them in stealing several thousand lines of Apple's QuickTime code in an effort to improve the performance of Video for Windows. After a threat to withdraw support for the Macintosh edition of Microsoft Office, the suit was settled in 1997, along with all lingering issues from the Apple Computer, Inc. v. Microsoft Corporation "look & feel" suit. Apple agreed to make Internet Explorer the default browser over Netscape, while Microsoft agreed to continue developing Office and other software for the Mac for the next five years and to purchase $150 million of non-voting Apple stock.
FBI demand to unlock iPhone
In February 2016, the Federal Bureau of Investigation, as part of its investigation into the 2015 San Bernardino attack, obtained a court order that demanded that Apple create a version of its operating system that would allow the FBI to circumvent security controls, so that it could inspect the contents of an iPhone used by one of the terrorists involved in the attack. Apple claimed the order "would undermine the very freedoms and liberty our government is meant to protect" and appealed. On March 28, 2016, the DOJ reported that it had retrieved the data from the attacker's iPhone through an alternative method without Apple's assistance, ending the legal proceedings.
See also
Antennagate
Batterygate
References
External links
Apple Legal
Lawsuits
Class action lawsuits
Conflict of laws case law
Defamation case law
Intellectual property case law
Patent case law
United States case law lists
United States administrative case law
United States antitrust case law
United States computer case law
United States contract case law
United States copyright case law
United States defamation case law
United States Internet case law
United States patent case law
United States trademark case law
Ongoing legal cases
United States district court cases |
40141710 | https://en.wikipedia.org/wiki/Open%20Windows%20%28film%29 | Open Windows (film) | Open Windows is a 2014 found footage techno-thriller film directed and written by Nacho Vigalondo. The film stars Elijah Wood, Sasha Grey and Neil Maskell, and had its world premiere at South by Southwest on 10 March 2014. It is Vigalondo's first English-language film.
Plot
Nick Chambers wins a contest to meet his favorite actress, Jill Goddard. Nick, the webmaster of a fansite dedicated to Jill, is crushed when Chord, Jill's manager, informs him that she has not only failed to invite him to the film's publicity event but also canceled the contest. Chord remotely sends Nick a link to his laptop that opens a live stream. Chord explains that he has hacked into Jill's cell phone and activated the microphone and camera without her knowledge. Although uneasy about invading her privacy, Nick goes along with Chord's plans to spy on her. By eavesdropping on her phone conversations, they learn that she will secretly meet her agent, Tony, with whom she is having an affair, at the same hotel in which Nick is staying.
Chord directs Nick to use preexisting high-end surveillance equipment to spy on Jill and Tony. As he watches them, Nick is briefly contacted by a trio of hackers who address him as Nevada. Jill leaves Tony's room. When Nick's lights spontaneously turn on and Tony can see the camera pointed at his room, Nick panics as Tony leaves his room to investigate. Chord orders Nick to use a Taser to incapacitate Tony. Feeling that he has no choice, Nick agrees. Nick initially refuses to tie up Tony but does so once Chord threatens to stop helping him. Suspicious of why all this equipment is available in his hotel room, Nick questions who Chord really is; Chord ignores him and guides him out of the hotel by hacking into its security system.
Chord blackmails Nick into further compliance by revealing that the entire contest was a hoax, and Chord now has video proof of Nick's crimes. Chord forces Nick to follow Jill to her house, and he is contacted once again by the trio of hackers who believe Nick to be a famous hacker. They offer to help him in his latest hack and Nick recruits them to counteract Chord. Meanwhile, Chord hacks into Jill's PC when she goes home. When Nick refuses to send her PC a file, Chord demonstrates that he is capable of sneaking into Jill's house and killing her.
The file turns out to be a live feed of Tony's torture by electric current. Horrified, Nick attempts to bargain with Chord for Tony's release, but Chord only tortures Tony further. Chord forces Nick to give commands to Jill through her PC, and Nick demands that she reveal her breasts. Satisfied with the resulting video, Chord breaks the connection. Nick frantically attempts to warn Jill but she is kidnapped by Chord. With the help of the hackers, Nick pursues Chord. However, once they realize that Chord is actually the master hacker, Nevada, their loyalties are torn. Although they continue to help him, they warn Nick that Nevada is the best in the world and a veteran of numerous anarchist operations, though none of them have resulted in physical harm to anyone.
The hackers later discover that Chord has killed Nevada and taken his place. After both Nick and Chord throw off the police, Nick crashes his car and Chord shoots him. Chord hacks into the entire Internet and virtually every website is replaced with a teaser of Jill's revealing video. When the site goes live, Chord explains that instead of a sex tape, she will be killed live on the Internet unless her fans immediately close the browser window. The site's traffic increases dramatically and Chord fakes her death at an abandoned factory. Jill plays along with Chord and says that she understands the point about society that he is making. However, when his guard is down, she flees.
Nevada reveals to Chord that he is still alive and has been impersonating Nick the whole time. The real Nick was safely hidden in Nevada's car trunk, and the whole scenario was an operation designed to flush Chord out. Nevada and Jill escape to safety in a bunker before explosives blow up the factory, killing Chord in his own trap. Nevada and Jill discuss what to do next, and she asks to accompany him as he retreats back into the underground hacker movement.
Cast
Elijah Wood as Nick Chambers/Nevada
Sasha Grey as Jill Goddard
Neil Maskell as Chord
Nacho Vigalondo as Richy Gabilondo
Iván González as Tony Hillman
Scott Weinberg as Don Delano
Trevante Rhodes as Brian
Brian Elder as Fantastic Fest Attendee
Adam Quintero as Pierre
Adam J. Reeb as Fantastic Fest Fan
Daniel Pérez Prada as Triop
Mike McCutchen as Moviegoer
Jaime Olías
Rachel Arieff
Ulysses Lopez
Production
Vigalondo was inspired to create Open Windows after he was asked to create a thriller film that heavily featured the Internet, akin to Mike Nichols's Closer. He found writing the script a challenge, as he had to create the film's plot as well as give specific reasons for each window that opened and why the point of view would shift between the characters. Vigalondo approached Wood specifically to star in the film and actress Sasha Grey was brought on board the project after she asked her manager to get her a copy of the script and set up a meeting with the director. The film appealed to Grey, as she was a fan of Vigalondo's work but was also intrigued by the character of Jill as a public figure and as someone who has to deal with "criticism and scrutiny and online haters and cyber stalkers". On 1 April 2014, Cinedigm acquired the US distribution rights to the film.
Filming took place in Madrid, Spain, during the last week of October 2012, and in Austin, Texas.
Release
The film had its world premiere at South by Southwest on 10 March 2014, and was screened in Los Angeles as part of SpectreFest on 4 October that year.
Reception
Rotten Tomatoes, a review aggregator, reports a 40% approval rating based on 40 critics surveyed, with an average rating of 5.35/10. The site's consensus states: "Open Windows is undeniably ambitious; unfortunately, director Nacho Vigalondo's reach far exceeds his grasp." The film has a score of 47 out of 100 based on 10 reviews at Metacritic.
We Got This Covered praised the acting of Wood and Grey while stating overall that "Open Windows spams audiences with an overload of development without much explanation, much like those information-less ads claiming to solve your impotency problem with a magic formula." Shock Till You Drop panned the film and gave it a rating of 4 out of 10, criticizing it as the "biggest disappointment of the fest." Justin Chang of Variety wrote, "A fiendishly inventive thriller built around an audacious if unsustainable gimmick, Open Windows elevates Hitchcockian suspense to jittery new levels of mayhem and paranoia." John DeFore of The Hollywood Reporter wrote that only genre diehards are likely to accept the level of suspension of disbelief necessary to enjoy the film. Jeannette Catsoulis of The New York Times described it as "cleverly designed but hellish to watch" due to its overdone plot twists.
References
External links
2014 films
2010s thriller films
American films
American thriller films
English-language films
English-language Spanish films
Films about kidnapping
Films about computing
Films about security and surveillance
Films directed by Nacho Vigalondo
Films shot in Austin, Texas
Films shot in Madrid
Spanish films
Spanish thriller films
Techno-thriller films
Works about computer hacking |
2313092 | https://en.wikipedia.org/wiki/Education%20in%20Africa | Education in Africa | The history of education in Africa can be roughly divided into pre- and post-colonial periods. Since the introduction of formal education to Africa by European colonists, African education, particularly in West and Central Africa, has been characterised by both traditional African teachings and European-style schooling systems. The state of education reflects not only the effects of colonialism, but also instability resulting from and exacerbated by armed conflicts in many regions of Africa, as well as fallout from humanitarian crises such as famine, lack of drinking water, and outbreaks of diseases such as malaria and Ebola. Although the quality of education and the number of well-equipped schools and teachers have steadily increased since the onset of the colonial period, numerous inequalities based on region, economic status, and gender are still evident in the existing educational systems.
History
Education in Precolonial Africa
Precolonial Africa was mostly made up of tribes that often migrated depending on the seasons, the availability of fertile soil, and political circumstances. Power in precolonial Africa was therefore decentralized: many people held some form of authority, as power was not concentrated in a particular person or institution. Usually, a person's entitlement to land (which was mostly granted patriarchally) gave that person some form of power within their household or tribe. Households were also economically independent, such that members of a household produced their own food, shelter, and security. There was therefore no need for formally organized education in precolonial Africa, as members of each household learned their skills, values, responsibilities, socialization, and the norms of their tribe, community, and household by observing and assisting older household or community members.
Education in precolonial Africa therefore took the form of apprenticeship, a form of informal education, in which children and younger members of each household mostly learned from older members of their tribe, household, and community. In most cases, each household member learned more than one skill in addition to learning the values, socialization, and norms of the community, tribe, or household. Common skills that people in precolonial Africa learned included dancing, farming, wine making, and cooking (mostly for women), and in some cases selected people learned how to practice herbal medicine or how to carve stools, masks, and other furniture.
Storytelling also played a significant role in education in precolonial Africa. Parents, other older members of households, and griots used oral storytelling to teach children about the history, norms, and values of their household, tribe, or community. Children usually gathered around the storyteller, who narrated stories, often using personification, that encouraged conformity, obedience, and values such as endurance, integrity, and other ethical values important for cooperation in the community.
Festivals and rituals were also used in most cases to teach younger members of a household, tribe, or community about its history. Rituals were mainly used to teach young adults about the responsibilities and expectations of adulthood, such as teaching females how to cook and care for a household and teaching males how to hunt, farm, and make masks. An example of a ritual used to teach young girls about womanhood is Dipo, which taught girls, usually adolescents, about cooking, motherhood, and other necessary womanhood skills and values before they married (that is, before they engaged in sexually related activities).
The origins of African education may be found in Egypt in Northern Africa. One of the first convenient media for retaining accurate information, papyrus, was used to develop systems for learning and developing new ideas. Among the first institutions of higher education in Africa were the School of Holy Scriptures, built in Ethiopia, and Al-Azhar in Egypt. These schools became cultural and academic centers as people traveled from all over the globe for knowledge and instruction. Well before contact with external cultures, Africans had developed pools of understanding and educational tools.
Overview of Education in Colonial Africa
The onset of the colonial period in the 19th century marked the beginning of the end for traditional African education as the primary method of instruction. European military forces, missionaries, and colonists all came ready and willing to change existing traditions to meet their own needs and ambitions. Colonial powers such as Spain, Portugal, Belgium, and France colonized the continent without putting in place a system of education. Because the primary focus of colonization was reaping benefits from commercial colonial economies, cash crop production, extraction of raw materials, and other physically laborious tasks were prioritized. These economies did not expand in ways that required more highly skilled jobs or more labor; instead, intensive labor that required little skill was in high demand. Under such circumstances, there was little demand to educate or train the colonized populations. Furthermore, colonial powers were unwilling to offer education to those they colonized unless it benefited them: either they did not view investing in African education as a practical use of their revenue, or they refrained from educating Africans in order to avoid uprisings. Those in positions of authority particularly feared widespread access to higher education. Colonial powers often debated whether to educate their colonized populations and, if so, to what extent. The British Education Committee of the Privy Council, for example, advocated vocational education and training rather than an academic education. This vocational training, however, neglected professions such as engineering and technology; instead, it had a dominant racial overtone that stressed African training for skills fitting with their assumed social and mental inadequacy. Notably, the Belgians under Leopold II prohibited access to higher education in their colonies, whilst other colonial powers put in place barriers in infrastructure or access, such as limiting the language of instruction to the language of the colonizer, restricting teaching curricula, and ensuring the curriculum did not reflect any Afro-ethnicity. By demanding that communities create physical schools with strict curricula, the foreign powers were able to dictate what the people learned, adjusting it to further their agenda. This not only forced new form and content onto education, but also abandoned the knowledge gained from the largely informal traditional education. With less community awareness, less efficiency in learning skills, and especially less understanding of the past, African communities began to dwindle in education and prosperity. Aspects of colonialism and its tumultuous effects on the ethos of education are still prevalent in African countries, which continue to struggle to escape the effects of colonization today.
However, a 2021 study found that colonial education systems may also have had some positive effects on education levels in Africa, namely on numeracy. The increase of numeracy in Africa had been accelerating since the 1830s, but it picked up speed during the late 19th and the first two decades of the 20th century. This suggests that colonial education was a determining factor for better education. This positive relationship might have existed due to the effort to spread European schooling among native populations to legitimize colonial power, since this accelerated the organization of schools. At the same time, demand for European-prompted education was rising because the colonial economy brought about new export opportunities, which African farmers responded to.
Between the 1950s and 1990s, African countries finally regained their independence. With this recovered freedom, they began to rebuild their traditional forms of education. What had inevitably evolved, however, was a hybrid of the two models. With the collaboration of donor agencies and Western demand, pushes for development of African education and the building of human capital dominated global conversation. Namely, the 1960s were known as the First Development Decade by the UN. Policymakers prioritized secondary and tertiary education before also setting their sights for universal primary education around 1980. This set the precedent for educational planning. Although children and adults may learn from their families and community, a sense of individuality has also developed that today both drives ingenuity and creates separation between groups and cultural tradition. African education programs have developed that involve both groups; an HIV/AIDS awareness program, for example, may involve members coming into communities and sharing their knowledge. Although this is a direct, cognitive approach, they also try to involve all members of the community, allowing for the creation of ownership and cultural acceptance.
French Colonial Africa
The use of education as a tool of colonization was widespread throughout the French Colonial Empire. Hubert Lyautey, the first Resident-General of French Morocco, advocated for the facilitation of ruling and conquest through cooperation with native elites. To facilitate the relationship with this “bourgeois” class of francophone Africans, selective educational institutions were established across the French Empire.
The teaching of the French language in Moroccan institutions of higher education, such as the University of Fez, was intended to “promote economic development and political compliance without assimilating or deracinating the students or preparing them for political agency”. This system allowed colonial authorities to educate a class of native Moroccans that could carry out administrative roles and functions. In his book, French Colonial Education and the Making of the Francophone African Bourgeoisie, Program Chair of Africana Studies at Washington and Lee University, Mohamed Kamara writes, “For the kind of society the colonialist had in mind, he must create and nurture an elite that will assist for as long as possible in the administration and exploitation of its vast overseas territories”.
In classrooms, students were given a predetermined curriculum. The basic goal of this classroom practice was to provide only a limited selection of information for students, leaving very little margin for questioning or critical thinking. Only a limited number of families were permitted to send their children to school, which fit with the underlying goal of creating an exclusive class of native-born Moroccans, who would serve as a sort of liaison between white colonial officials and the masses.
British Colonial Africa
Education in British Colonial Africa can be characterized by three primary phases. The first of these is from the end of the 19th century until the outbreak of the First World War, then the Interwar Period, and finally, the conclusion of the Second World War until independence.
From the late 19th-Century until the First World War, British colonial education in Africa was largely carried out by missionaries at mission schools. Although these schools were founded with religious intent, they played a significant role in the early colonial machine. Much like in French Colonial Africa, British colonists sought out English-speaking natives who could serve as ‘liaisons” between them and the native population, however, this was done far more out of an economic incentive than a political one. As the demand for English-speaking Africans increased, mission schools provided training in the form of teaching of the Bible. As time went on, however, British industrialists began to complain about the lack of skilled labor, and as such, the British Government supplied mission schools with grants for the vocational training of Africans in various trades critical to British industrial efforts.
British colonial education in Africa during the Interwar Period can be characterized by a push for uniformity, despite colonial authorities demonstrating their acute awareness of the notable differences between the different regions of the Empire. Critical to this, as well, was the universal recognition of nationality as a basic human right under the Covenant of the League of Nations. Colonies were, as outlined by the League of Nations, to be eventually granted independence, with the European powers entrusted as the stewards of “civilization” for their respective colonies. Colonies were only to be allowed independence once they could demonstrate their capacity for self-rule. In his 1922 book, The Dual Mandate in British Tropical Africa, Lord Lugard, former Governor General of Nigeria (1914–1919), writes,
“...do not enter the tropics on sufferance, or employ their technical skill, their energy, and their capital as ‘interlopers’ or as ‘greedy capitalists’, but in the fulfillment of the Mandate of civilization”.
In accordance with this, in 1923 the British Government established the Advisory Committee on Education in British Tropical Africa (the word ‘tropical’ was later removed to broaden its jurisdiction). With its establishment, for the first time, the colonial authority would be uniformly administering its educational goals across all British African colonies. Programs begun under the new committee were aimed at increasing the “self-sufficiency” of village economies and providing community incentives to counteract flight into big cities. Educational practices under the committee came to be known as ‘adapted’, as colonial authorities sought to adjust western education to the contemporary European understanding of the ‘African Mind’ as inherently different; education was often administered through local contexts and practices, all the while teaching western curriculum. In his essay British Colonial Education in Africa: Policy and Practice in the Era of Trusteeship, Aaron Windel of Bowdoin College describes it as such,
“Typical lessons in a village school operating on adapted principles focused on hygiene, vernacular word building, drill, and basic local geography. Ideally, lessons would be taught on the principle of ‘teach by doing’ and would include objects from village life. One geography lesson used a bicycle pump, a pail of water, and a small gourd to simulate a ship carrying sugar from India and caught in a monsoon. Adapted pedagogy could also include dramatizations of ‘African tribal histories’ or special holiday plays with an African focus".
Most British officials (including Lord Lugard) believed that trusteeship would continue for many generations to come, and the goals of ‘civilizing’ the native population began to take precedence. Treatment of colonial subjects continued to vary wildly as determined by race, and white settlers were continuously given preferential treatment in the distribution of land and opportunities for careers, among other benefits.
The British education system proved to be quite effective. A 2021 study observed a positive effect of British colonization on education levels. Areas that were influenced by the British education system showed a rapid increase in numeracy. For example, in South Africa – where the colonial education and political system switched from Dutch to British in 1806 – the increase in numeracy was rapid from the early 19th century. The reliance on local resources and languages in education, as well as mission schools being largely run by Africans, seems to have had a positive impact.
As British-administered schools were taking shape during the Interwar Period, a number of independent schools focusing on literacy and offering alternative curricula began to emerge. Such schools were thought of as a threat to the colonial system, and colonial governments worried that these so-called ‘outlaw’ schools would instill thoughts of subversion and anti-colonialism in the native populations. One such independent school was formed in Kenya among the Kikuyu and made English its language of instruction, with the ultimate goal of enabling the Kikuyu to fight for land property rights in colonial legal and administrative bodies. Over time, as anti-colonial sentiment gained momentum, independent schools were increasingly viewed by the colonial government as breeding grounds for freedom fighters and independence advocates, which culminated in their banning in 1952 as part of the Mau Mau Emergency.
Education in Postcolonial Africa
In 2000, the United Nations adopted the Millennium Development Goals, a set of development goals for the year 2015, more specifically, “to ensure that by 2015, children everywhere, boys and girls alike will be able to complete a full course of primary schooling.” That same year, the World Education Forum met in Dakar, Senegal, and adopted the Dakar Framework for Action reaffirming the commitment to achieving Education for All by the year 2015.
At the time, according to UNESCO, only 57% of African children were enrolled in primary schools, the lowest enrollment rate of any region surveyed. The report also showed marked gender inequalities: in almost all countries enrollment of boys far outpaced that of girls. However, in some countries, education is relatively strong. In Zimbabwe, literacy has reached 92%.
Steps such as the abolition of school fees, investments in teaching infrastructure and resources, and school meals from the World Food Programme helped drive enrollment up by millions. Yet despite the significant progress of many countries, the world fell short of meeting its goal of Universal Primary Education (UPE). In sub-Saharan Africa as of 2013, only about 79% of primary school-age children were enrolled in school. 59 million children of primary-school age were out of school, and enrollment of girls continued to lag behind that of boys. The disparity between genders is partially due to girls being excluded from school because of pregnancy.
Following the expiration of the MDGs in 2015, the UN adopted a set of Sustainable Development Goals for the year 2030. The fourth goal addressed education, with the stated aim to “ensure inclusive and equitable quality education and promote lifelong learning opportunities for all.” The World Education Forum also convened in Incheon, Korea to discuss the implementation of this goal, and adopted the Incheon Declaration for Education 2030. Data reflecting the effects the latest measures have on the state of education participation in African countries is not readily available. There are many underlying causes that deter progress towards education equity, such as high attrition rates of students, teacher shortages, poor infrastructure and supplies, access to education for rural and remote areas, and stigmas surrounding marginalized groups, among many others.
Language
Due to high linguistic diversity, the legacy of colonialism, and the need for knowledge of international languages such as English and French in employment and higher education, most schooling in Africa takes places in languages that teachers and pupils do not speak natively, and in some cases simply do not understand. There is considerable evidence that pupils schooled in a second language achieve poorer results than those schooled in their mother tongue, as lack of proficiency in the second language impairs understanding and encourages ineffective rote learning. Although UNESCO have recommended since the 1950s that children be taught early literacy in their mother tongue, progressing later to other languages, not all African countries implement this effectively. Even where the earliest grades are taught in the mother tongue, pupils are typically forced to switch to languages such as English and French before acquiring proficiency in these languages.
Lack of proper facilities and educators
Another reason for the low education rates in Africa is the lack of proper schooling facilities and unequal opportunities for education across countries. Many schools across Africa find it hard to employ teachers because of the low pay and the lack of suitable people. This is particularly true for schools in remote areas. Most people who manage to receive an education would prefer to move to big cities or even overseas, where more opportunities and higher pay await. The result is overly large class sizes and a high average number of students per teacher in a school. Moreover, the teachers are often unqualified and have few teaching aids or textbooks. Because of this, children attending schools in rural areas usually attain poorer results in standardised tests than their urban counterparts. This can be seen in the reports of the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ): pupils taking the tests in rural areas score much lower than those in small towns and big cities. This shows a lack of equal education opportunity for children from different parts of the same country.
Because teachers in rural areas tend to be less qualified than those in urban areas, the quality of teaching and learning suffers. In one instance, teachers took the same test as their students and three-fourths of them failed. In addition, students who do not receive the same education as those in bigger cities often struggle with reading, writing, and basic mathematics even after graduation, and are less likely to establish successful careers. Since education is central to finding work and building a future, equal education needs to be established in all schools across each country.
Emigration
Emigration has led to a loss of highly educated people and to financial loss. Skilled people who leave can only be replaced at great cost: the money spent educating those who emigrate is lost, and more must be spent to train their replacements. Even with an investment in education of almost 5.5% of GDP, this loss makes it difficult for governments to budget more for education, as they must also prioritize other needs such as military spending and debt servicing.
Culture
Western models and standards continue to dominate African education. Because of colonization, African institutions, particularly universities, still teach Euro-centric curricula with little connection to life in Africa. This is further perpetuated by the use of textbooks imported from Europe and America. Many view this lack of self-sufficiency as an ongoing effect of colonization upheld by a modern, corrupt African elite, an attitude that rests on the view that during colonization the African ruling elite exploited their own people for personal benefit rather than advocating for their interests.
Global Water Crisis
The global water crisis has severe effects on education in rural parts of Africa. Limited access to education can be further compounded by inadequate water systems and the diseases that follow. Malaria, cited as a main cause of death in Africa, is a mosquito-borne disease; the mosquitoes breed in unmanaged pools of still water, and children who drink from contaminated pools can also die or fall severely ill from waterborne disease. Severe illness at a young age can later affect children's cognitive abilities, and children who miss a significant amount of school because of illness are unable to make the most of their education.
Military and Conflict
Military spending is causing education spending to decrease markedly. According to a March 2011 report by UNESCO, armed conflict is the biggest threat to education in Africa. While the number of dropouts across the continent has been increasing dramatically, one of the effects of war and conflict on education is the diversion of public funds from education to military spending, so an already underfunded system loses even more money. Twenty-one African countries have been identified as among the world's highest spenders of gross domestic product on the military relative to the amount directed toward education. Military conflict also leads to the displacement of children, often forcing them to remain in camps or flee to neighboring countries where education is not available.
Influential initiatives
Initiatives to improve education in Africa include:
Intracontinental
NEPAD's E-school programme is an ambitious plan to provide internet and computer facilities to all schools on the continent.
SACMEQ is a consortium of 15 Ministries of Education in Southern and Eastern Africa that undertakes integrated research and training activities to monitor and evaluate the quality of basic education, and generates information that can be used by decision-makers to plan and improve the quality of education.
For 10 years, the Benin Education Fund (BEF) has provided scholarships and education support to students from the Atakora province in northeastern Benin. Over 450 students have been able to stay in school because of their programmes.
International
She's the First is a New York City, New York-based non-profit organization. The organization seeks to empower girls in Asia, Africa, and Latin America by facilitating the sponsorship of their education through creative and innovative means.
Working through local organizations, The African Children's Educational Trust is supporting thousands of youngsters with long-term scholarships and a community rural elementary school building programme. It has built seven schools to date and is raising funds for more.
A British Airways project, run in collaboration with UNICEF, opened the model school Kuje Science Primary School in Nigeria in 2002.
The Elias Fund provides scholarships to children in Zimbabwe to get a better education.
The Ahmadiyya Muslim Community in association with Humanity First, an international charity organisation, has built over 500 schools in the African continent and is running a 'learn a skill' initiative for young men and women.
Fast Track Initiative
The Volkswagen Foundation has been running a funding initiative called "Knowledge for Tomorrow – Cooperative Research Projects in Sub-Saharan Africa" since 2003. It provides scholarships for young African researchers and helps to establish a scientific community in African universities.
Corruption in Education
A 2010 Transparency International report, based on research gathered from 8,500 educators and parents in Ghana, Madagascar, Morocco, Niger, Senegal, Sierra Leone and Uganda, found that education is being denied to very large numbers of African children.
A lack of parental involvement, especially as a check on government activities, also enables corruption. This is most often because parents and communities feel they lack any power over their child's education. In Uganda only 50% of parents believe that they have the power to influence decisions regarding their child's education; in Morocco, just 20% of parents believe they hold any such power.
The unavailability and incompleteness of records in schools and districts prevents the documentation and prevention of corrupt practices. The African Education Watch conducted surveys all over the continent and identified the three most common practices of corruption:
Illegal collection of fees: One part of the research focused on so-called registration fees. Parents in every country surveyed reported paying them even though, by law, primary schooling is free. The report found that the share of parents forced to pay these illegal fees ranged from 9% in Ghana to 90% in Morocco, with an average of 44% of parents across the study reporting such payments. The average fee was $4.16, a major expense for families in countries like Madagascar, Niger and Sierra Leone.
Embezzlement of school funds: In the study, Transparency International found that 64% of the schools surveyed on the continent published no financial information at all.
Power abuse: Another major problem is incompetent management. The report found that in many schools the few resources available were being wasted or lost. Overall, 85% of schools across all countries had either deficient accounting systems or none at all. In Morocco, just 23% of head teachers had received training in financial management, despite being responsible for budgets. The TI report also found instances of sexual abuse of pupils by teachers, and that many schools were plagued by teacher absenteeism and alcoholism.
Without this basic education, the report found, it is nearly impossible to go on to high school or college; children denied it lose the link that would give them a chance in a trade or a life beyond their villages.
NGO Involvement
A report by USAID and the Bureau for Africa, Office of Sustainable Development, found that NGOs are increasingly contributing to the delivery of education services and to education policy decisions, and are included by donors and government officials in many parts of the education system. This varies from country to country and region to region.
NGOs working in education in Africa often encounter tension and competition. Schools, parents and, most often, government officials feel threatened by third-party involvement, seeing NGOs as "crashing the party." The report continues that for NGOs to be effective, they must understand that they do not share government officials' perspective on who is in control; if they do not acknowledge the government of the country they are working in, they will compromise their own objectives.
The report goes into more detail about NGO relations with governments in education. The relationship is viewed from quite different standpoints. African governments see NGO work as "an affair of government", that is, work carried out as part of and in collaboration with the country's government. NGOs, on the other hand, view themselves as quite separate entities fulfilling a moral responsibility: identifying needs or areas of development where the government has been unaccountable, and independently mobilizing resources toward them. Governments and NGOs may also hold contrasting beliefs about each other's abilities. Governments often think NGOs are unqualified to make important policy decisions and worry that NGOs could undermine their legitimacy if seen as superior; in some cases, NGOs have found governments incompetent, whether through their own fault or through a lack of resources. In the best cases, NGOs and government officials recognize each other's strengths in education policy and find ways to collaborate practically and reach both of their objectives.
To be effective in education in Africa, NGOs must influence policy and create policy changes that support their projects. NGOs have also found that to achieve the policy change they are striving for, they must create and foster relationships with many different stakeholders, the most important of which are usually donors and government officials. The biggest challenge for NGOs has been linking these networks together. NGO interventions have so far failed to change the policy process in a way that also ensures the public understands and takes part in education policy, a problem that will grow in importance if it is not solved.
Adult Education
Adult education in Africa, having experienced a comeback following the independence and increasing prosperity of many African nations, requires policymakers and planners to take into consideration indigenous cultural traits and characteristics. With a moderate backlash against Western ideals and educational traditions, many universities and other institutes of higher education are developing new approaches to higher and adult education.
Most contemporary analysts regard illiteracy as a development issue because of the link between poverty and illiteracy. Funding is inadequate and inconsistent, and is needed for priority areas such as educator training, monitoring, and evaluation. There is a clear need for investment in capacity development, for a full, sufficiently paid and well qualified professional staff, and for meeting the growing demand for adult education professionals. The majority of adult educators are untrained, especially in basic literacy, and governments often employ school teachers and others in adult education posts rather than experienced adult educators. Many of the difficulties could be solved by allocating resources to meet the needs: adequate funds, more staff, appropriate training for staff, and suitable materials. Underfunding is a serious threat to the sustainability of these programs, and in some cases to their continued existence. The best-reported data on funding concern adult literacy and non-formal education programs; funding for continuing education, whether academic or vocational, is provided, but little data is reported on its financing. Funding may come from public or private sector sources, and international and foreign aid is also likely to be important. The costs of much adult education seem to be kept artificially low by the use of state facilities and by the extremely low salaries paid to many adult education specialists.
Public universities have not been successful in attracting older students onto mainstream degree programs, so the post-apartheid ideal of opening access to public higher education for growing numbers of non-traditional students is not yet a reality. However, certain countries have reported some success with adult education programs. Between 1990 and 2007, Uganda enrolled over 2 million participants in its functional adult literacy program. The Family Basic Education program was active in 18 schools by 2005, reaching over 3,300 children and 1,400 parents; it is a successful family literacy intervention whose impact at household, school and community level has been evaluated.
Unfortunately, the national reports typically do not provide sufficient information on the content of the adult education programs that run in their countries. In the majority of cases, the name of the program is as much detail as is given. Curriculum content does not seem to be a major issue.
Cultural Considerations
African communities are very close knit; activities, lifestyles, particularities of individuals are nearly always common knowledge. Because of this, it is difficult for any one member or group within an area to take a significantly different approach to any facet of life within the community. For this reason, program planners for adult learners in Africa find higher rates of success when they employ a participatory approach. Through open and honest dialogue about the fears, motivations, beliefs and ambitions of the community as a whole, there is less social strain concerning individual divergent behavior.
In addition to strong traditional beliefs, years of slavery through colonization have led to a sense of unity and common struggle in African communities. Therefore, lesson plans in these areas should reflect this cultural sensibility; collaboration and cooperation are key components of successful programs. Teaching techniques that utilize these ideas may include story-telling, experiential simulation, and the practice of indigenous traditions with slight modifications. Every program and lesson must be tailored to the particular community because they almost always learn, live, and achieve as a group or not at all.
Informal education plays a strong role within indigenous learning in African communities. This poses a significant challenge to Western-style program planners who emphasize formal learning within a designated time-frame and setting. These requirements must often be abandoned in order to achieve success in communities that have no strong affinity for rigid schedules and formal education. Programs must be planned so that they become ingrained in the daily life of participants, reflect their values, and add positive functionality to their lives. Successful programs often involve longer-term learning arrangements consisting of regular visits and the free, unforced exchange of information.
Philosophies
African philosophy of adult education recognizes Western ideas such as liberalism, progressivism, humanism and behaviorism, while complementing them with native African perspectives.
Ethnophilosophy is the idea that the main purpose of adult education is to enable social harmony at all levels of society, from immediate family to community and country. It is of primary importance to ensure the retention of knowledge passed down from one generation to another concerning values, cultural understanding and beliefs. This philosophy promotes active learning – learning by doing, following, practicing the work of the elders. Particular lessons may be taught through activities such as role-play, practical demonstrations, exhibitions, discussions or competitions.
The nationalist-ideological philosophy separates itself from ethnophilosophy in that it is less concerned with the methods of learning and more with their use. As a philosophy born of the revolutionary movements of the 1950s, its main focus is, unsurprisingly, the ability to apply knowledge to active participation in politics and civil society. Although this philosophy considers it important to retain the communal nature of traditional African society, functionalism for social understanding and change takes prime importance in its implementation.
Professional philosophy represents the strongest bridge between western and traditional African educational systems. It promotes a hybrid approach to adult programs, allowing for a wide range of learning techniques, even purely cognitive lecture, so long as community values are accounted for within the lesson. Finally, philosophic sagacity suggests that the only true African philosophies are those that have developed with no contact with the West whatsoever. Rather than a specific approach, this idea simply notes the huge range of educational techniques that may exist throughout the continent by a wide variety of people. It essentially states that there is no one correct method, and that the subject and activities should always be set by the participants.
Women's education
In 2000, 93.4 million women in Sub-Saharan Africa were illiterate. Many reasons exist for why formal education for females is unavailable to so many, including cultural reasons. For example, some believe that a woman's education will get in the way of her duties as a wife and a mother. In some places in Africa where women marry at age 12 or 13, education is considered a hindrance to a young woman's development.
A positive correlation exists between the enrollment of girls in primary school and gross national product and increased life expectancy. However, women's education in Africa has sometimes been marked by instances of sexual violence. Sexual violence against girls and female students affects many African education systems; in Sub-Saharan Africa, it is one of the most common and least recognized forms of corruption.
Disparity in Education
While most of the Millennium Development Goals faced a deadline of 2015, the gender parity target was set to be achieved a full ten years earlier, an acknowledgment that equal access to education is the foundation for all other development goals. Gender disparity is defined as inequality in some measure attributable to gender. In countries where resources and school facilities are lacking and total enrollments are low, families must often choose between sending a girl or a boy to school. Of an estimated 101 million children not in school, more than half are girls, and the disparity widens at the secondary level. In high-income countries, 95% as many girls as boys attend primary and secondary schools; in sub-Saharan Africa the figure is just 60%.
The foremost factor limiting female education is poverty. Economic poverty plays a key role when it comes to coping with direct costs such as tuition fees, textbooks, uniforms, transportation and other expenses. Wherever these costs exceed a family's income, especially in families with many children, girls are the first to be denied schooling. This gender-biased decision about who attends school is also shaped by culturally dictated gender roles. Girls are usually required to complete household chores or take care of younger siblings when they return home, which limits their time to study, and in many cases they may even have to miss school to complete their duties. It is common for girls to be taken out of school at this point, while boys may be given more time to study if their parents believe that education will allow them to earn more in the future. Expectations, attitudes and biases in communities and families, economic costs, social traditions, and religious and cultural beliefs all limit girls' educational opportunities.
Additionally, in most African societies, women are seen as the collectors, managers, and guardians of water, especially within the domestic sphere of household chores, cooking, washing, and child rearing. Because of these traditional gender roles, women are forced to spend around sixty percent of each day collecting water, which translates into approximately 200 million collective work hours by women globally per day and a corresponding decrease in the time available for education. This is reflected in the correlation between reduced access to water and lower combined primary, secondary, and tertiary enrollment of women.
Whatever the underlying reasons, having large numbers of girls outside the formal schooling system brings developmental challenges to both current and future generations. According to UNESCO, the rate of female children out of primary school is higher than that of male children in every African country for which data are available. Until equal numbers of girls and boys are in school, it will be impossible to build the knowledge necessary to eradicate poverty and hunger, combat disease and ensure environmental sustainability, and millions of children and women will continue to die needlessly, placing the rest of the development agenda at risk.
Significance of a Gender-Equitable Education System
In Africa and the Arab world, promoting gender equality and empowering women is perhaps the most important of the eight Millennium Development Goals. The target associated with this goal is to eliminate gender disparity in primary and secondary enrollment, preferably by 2005, and at all levels by 2015. Female education has both an instrumental value, through the wider effects of gender equality in education, and an intrinsic value deriving from the role of education in enhancing a woman's set of capabilities. Thus, in theory, there is a direct effect from female education to income (or growth).
Education, especially for girls, has social and economic benefits for society as a whole. Women earn only one-tenth of the world's income and own less than one percent of property, so households without a male head are at special risk of impoverishment. Women without schooling are also less likely to immunize their children or know how to help them survive, while women who are educated tend to have fewer and healthier children, and these children are more likely to attend school. Higher female education makes women better-informed mothers and hence could contribute to lowering child mortality rates and malnutrition. In Africa, limited education and employment opportunities for women are estimated to reduce annual per capita growth by 0.8%; had this growth taken place, Africa's economies would have doubled over the past 30 years. It is estimated that some low-income countries in Africa would need up to $23.8 billion annually to achieve the Millennium Development Goal of promoting gender equality and empowering women by 2015, which would translate into $7 to $13 per capita per year from 2006 to 2015, according to OECD-DAC.
Education is also key to an effective response to HIV/AIDS. Studies show that educated women are more likely to know how to prevent HIV infection, to delay sexual activity and to take measures to protect themselves. New analysis by the Global Campaign for Education suggests that if all children received a complete primary education, the economic impact of HIV/AIDS could be greatly reduced and around 700,000 cases of HIV in young adults could be prevented each year—seven million in a decade. According to the Global Campaign for Education, "research shows that a primary education is the minimum threshold needed to benefit from health information programmes. Not only is a basic education essential to be able to process and evaluate information, it also gives the most marginalized groups in society—notably young women—the status and confidence needed to act on information and refuse unsafe sex."
Current Policies of Progression
The Convention on the Elimination of All Forms of Discrimination against Women (CEDAW), adopted in 1979 by the UN General Assembly and acceded to by 180 States, sets down women's rights to freedom from discrimination and to equality under the law. CEDAW recognizes that the rights and equality of women are also key to the survival and development of children and to building healthy families, communities and nations. Article 10 identifies eight areas in which change is needed to help African women and other women suffering from gender disparity. First, there must be the same conditions for careers, vocational guidance, and the achievement of diplomas in educational establishments of all categories in rural as well as urban areas; this equality shall be ensured in pre-school, general, technical, professional and higher technical education, as well as in all types of vocational training. Second, access to the same curricula, the same examinations, teaching staff with qualifications of the same standard, and school premises and equipment of the same quality. Third, the elimination of any stereotyped concept of the roles of men and women at all levels and in all forms of education, encouraged by coeducation and other types of education which help to achieve this aim and, in particular, by the revision of textbooks and school programmes and the adaptation of teaching methods. Fourth, the same opportunities to benefit from scholarships and other study grants. Fifth, the same opportunities of access to programmes of continuing education, including adult and functional literacy programmes, particularly those aimed at reducing, at the earliest possible time, any gap in education existing between men and women. Sixth, the reduction of female student drop-out rates and the organization of programmes for girls and women who have left school prematurely. Seventh, the same opportunities to participate actively in sports and physical education. Lastly, access to specific educational information to help ensure the health and well-being of families, including information and advice on family planning.
Other global goals echoing these commitments include the World Education Forum's Dakar platform, which stresses the rights of girls, ethnic minorities and children in difficult circumstances, and A World Fit for Children's emphasis on ensuring girls' equal access to and achievement in basic education of good quality. In April 2000 more than 1,100 participants from 164 countries gathered in Dakar, Senegal, for the World Education Forum. Ranging from teachers to prime ministers, academics to policymakers, non-governmental bodies to the heads of major international organizations, they adopted the Dakar Framework for Action, Education for All: Meeting Our Collective Commitments, whose goal is education for all as laid out by the World Conference on Education for All and other international conferences. Between 1990 and 1998 the net enrollment of boys in sub-Saharan Africa increased by 9% to 56%, and that of girls by 7% to 48%, although these figures mask considerable regional variations. In countries of the Indian Ocean, both girls and boys attained over 70% net enrollment. The most outstanding progress in percentage terms was in East Africa, where the net enrollment of boys increased by 27% (to 60%) and of girls by 18% (to 50%). In Southern Africa, the comparable figures were 23% (to 7%) for girls and 16% (to 58%) for boys. These efforts aim at the resurgence of a vibrant Africa, rich in its cultural diversity, history, languages and arts, standing united to end its marginalization in world progress and development, and to create a prosperous Africa where the knowledge and skills of its people are its first and most important resource.
The Forum for African Women Educationalists (FAWE) issued a call for a second round of research proposals from research institutions for its Strengthening Gender Research to Improve Girls' and Women's Education in Africa initiative. The initiative, supported by the Norwegian Agency for Development Cooperation (NORAD), promotes girls' and women's education through the integration of gender into education policy and practice in sub-Saharan Africa. FAWE believes it is vital to invest in research in Africa as a way to produce current information for advocacy in education policy. This three-year research initiative aims to work collaboratively with established research institutions to produce pertinent and robust research, which can be used to constructively engage governments, policy makers and other regional bodies on strategies to advance girls' education in Africa. Findings from the research will be used to inform FAWE's advocacy work and help redress gender inequities that hinder women's fulfillment of their right to education and meaningful participation in Africa's social and economic advancement.
Major progress in access to education
A joint study by the World Bank and AFD carried out by Alain Mingat, Blandine Ledoux and Ramahatra Rakotomalala sought to anticipate the pressures that would be brought to bear on post-primary teaching. The study puts it this way: “In the reference year (2005), our sample of 33 countries in sub-Saharan Africa had 14.9 million pupils enrolled in the first year of secondary school. If the rate of completion of the primary stage reaches 95% by 2020 with levels of transition from primary to the first year of secondary maintained at their current level in each country, the first year of secondary school would have 37.2 million pupils in 2020, or 2.5 times the current number. If all the pupils finishing primary school could continue with their education, the number of pupils in the first year of secondary school would reach 62.9 million by 2020, a multiplication by 4.2 over the period.” Behind the regional averages, there are still enormous disparities between the countries, and even between the different zones and regions within countries, which means that it is not possible to “[…] identify conditions that apply uniformly to education across the different countries of sub-Saharan Africa.” While some countries have lower demographic growth, others enjoy a more satisfactory level of school enrolment. Only a few countries are falling seriously behind in education at the same time as having to address a steady growth in their school-age population: Niger, Eritrea, Burundi, Guinea-Bissau, Uganda and to a lesser extent Burkina Faso, Chad, Mali, Mozambique, Rwanda, Senegal and Malawi are particularly affected by this dual constraint. The EFA 2012 report highlights great disparities between the sub-Saharan African countries: the percentage of children excluded from primary school is only 7% in Gabon and 14% in Congo compared to over 55% in Burkina Faso and Niger. The gap in terms of the proportion of those excluded from the first year of middle school is even wider, with 6% in Gabon compared to 68% in Burkina Faso and 73% in Niger.
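As an illustration only, the multiplication factors quoted above follow directly from the study's reported aggregates; the short Python sketch below simply reproduces that arithmetic (the study itself is computed country by country, which is not reproduced here):

    # Illustrative arithmetic only; the study's model works country by country.
    pupils_2005 = 14.9e6                 # first-year secondary pupils, 2005, 33-country sample
    scenario_95_completion = 37.2e6      # 95% primary completion, transition rates unchanged
    scenario_full_transition = 62.9e6    # all primary completers continue to secondary
    print(round(scenario_95_completion / pupils_2005, 1))    # 2.5
    print(round(scenario_full_transition / pupils_2005, 1))  # 4.2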
The majority of out-of-school populations are to be found in countries where there is conflict or very weak governance. At the Dakar Forum, the 181 signatory countries of the Dakar Framework for Action identified armed conflict as well as internal instability within a country as “a major barrier towards attaining Education for All” (EFA) – education being one of the sectors to suffer most from the effects of armed conflict and political instability. In the 2011 EFA Global Monitoring Report, UNESCO pointed out that the countries touched by conflict showed a gross rate of secondary school admissions almost 30% lower than countries of equivalent revenue that were at peace. Conflicts also affect the rate of literacy of the population. At the global level, the rate of literacy among adults in countries touched by conflict was 69% in 2010 compared to 85% in peaceful countries. Twenty states in sub-Saharan Africa have been touched by conflict since 1999. Those countries affected by armed conflict, such as Somalia and the Democratic Republic of the Congo, are furthest from meeting the EFA goals and contain the majority of the unschooled inhabitants of sub-Saharan Africa. In the Democratic Republic of the Congo, in North Kivu, a region particularly affected by conflicts, for example, the likelihood of young people aged between 17 and 22 having had only two years of schooling was twice the national average.
Fewer than half the children in sub-Saharan Africa can read and write: a quarter of primary-school-age children reach the fourth year without having acquired the basics, and over a third do not reach the fourth year at all. According to the 2010 EFA Global Monitoring Report, "millions of children are leaving school without having acquired basic skills. In some countries in sub-Saharan Africa, young adults with five years of education had a 40% probability of being illiterate". The teacher training systems are generally not able to meet the quantitative and qualitative needs for training; in Chad, for example, only 35.5% of teachers are certified to teach.
In addition to the lack of qualified teachers, there is the problem of extremely large classes in public schools. In Nigeria, some schools have a pupil-teacher ratio of 80:1, which makes personalized instruction difficult. There is also a lack of culturally relevant teaching and learning aids for teachers and students.
Educational Technology
Educational technology in sub-Saharan Africa refers to the promotion, development and use of information and communication technologies (ICT), m-learning, media, and other technological tools to improve aspects of education in sub-Saharan Africa. Since the 1960s, various information and communication technologies have aroused strong interest in sub-Saharan Africa as a way of increasing access to education, and enhancing its quality and fairness.
The development of individual computer technology has proved a major turning point in the implementation of projects dependent on technology use, and calls for the acquisition of computer skills first by teachers and then by pupils. Between 1990 and 2000, multiple actions were started in order to turn technologies into a lever for improving education in sub-Saharan Africa. Many initiatives focused on equipping schools with computer hardware. A number of NGOs contributed, on varying scales, to bringing computer hardware into Africa, such as groups like Computer Aid International, Digital Links, SchoolNet Africa and World Computer Exchange. Sometimes with backing from cooperation agencies or development agencies like USAID, the African Bank or the French Ministry of Foreign Affairs, these individual initiatives grew without adequate coordination. States found it difficult to define their national strategies with regard to ICT in Education.
The American One Laptop per Child (OLPC) project, launched in several African countries in 2005, aimed to equip schools with laptop computers at low cost. While the average price of an inexpensive personal computer was between US$200 and US$500, OLPC offered its ultraportable XO-1 computer at the price of US$100. This technological breakthrough marked an important step in potential access to ICT. OLPC became an institutional system: the programme was "bought" by governments, which then took responsibility for distribution to the schools. The underlying logic of the initiative was one of centralization, enabling the large-scale distribution of the equipment. Almost 2 million teachers and pupils are now involved in the programme worldwide (http://one.laptop.org/) and more than 2.4 million computers have been delivered. Following on from OLPC, the Intel group launched Classmate PC, a similar programme also intended for pupils in developing countries. Though it has a smaller presence in sub-Saharan Africa than the OLPC project, Classmate PC has enabled laptop computers to be delivered to primary schools in the Seychelles and Kenya, particularly in rural areas. Also in Kenya, the CFSK (Computer for School in Kenya) project was started in 2002 with the aim of distributing computers to almost 9,000 schools.
The cross-fertilization of teaching models and tools has now broadened the potential of ICT within the educational framework. Certain technologies, perceived as outdated compared to more innovative technology, nonetheless remain very much embedded in local practice. Today they are undergoing a partial revival, thanks to the combination of different media that can be used in any single project. Despite its limited uses in teaching, radio is a medium that still has considerable reach in terms of its audience. Cheaper than a computer, it also has a cost-benefit ratio that makes it attractive to many project planners. Launched in 2008, the BBC Janala programme, offering English courses in a combination of different media, including lessons of a few minutes via mobile phone, received more than 85,000 calls per day in the weeks following the launch of the service. In 15 months, over 10 million calls (paid, but at a reduced price compared to a normal communication) were made, by over 3 million users. Television, a feature of very many households, is witnessing a revival in its educational uses, by being combined with other media. As part of the Bridge IT programme in Tanzania, short educational videos, also available on mobile phones, are broadcast on the classroom television so that all the pupils can take part collectively. The e-Schools’ Network in South Africa has also, since March 2013, been developing an educational project, the object of which is to utilize unused television frequencies. There are currently ten schools taking part in the project.
Another digital tool with multiple uses, the interactive whiteboard (IWB), is also being used in some schools in sub-Saharan Africa. At the end of the 2000s, the Education for All Network (REPTA), in partnership with the Worldwide Fund for Digital Solidarity (FSN) and, in France, the interministerial delegation for digital education in Africa (DIENA) made interactive whiteboards available to schools in Burkina Faso, Niger, Benin, Senegal and Mali, along with open content. The use of the IWB has had a positive effect on motivation, for pupils and teachers alike. However, their impact in terms of learning has been muted. This system marginalizes the direct participation of the pupils in favour of multi-media demonstrations initiated by the teacher.
The main initiatives based on the use of ICT and the Internet in education originally focused on distance learning at university level. Thus, the African Virtual University (AVU), set up by the World Bank in 1997, was originally conceived as an alternative to traditional teaching. When it became an intergovernmental agency in 2003, it was training 40,000 people, mostly on short programmes. It shifted its focus to teacher training and to integrating technology into higher education. The AVU has ten e-learning centres. The Agence universitaire de la Francophonie (AUF) has also, since 1999, set up around forty French-speaking digital campuses, more than half of them in Africa. In these infrastructures, dedicated to technology and set up within the universities, the AUF offers access to over 80 first and master's degrees entirely by distance learning, about 30 of which are awarded by African institutions and created with its support. More recently, the MOOCs (Massive Open Online Courses) phenomenon has grown up, first in the United States and then in Europe.
Recommendations for Reform
Government review and regulate school and district financial record-keeping.
More comprehensive training of head teachers and administrators in economic administration.
Regular government inspection of schools.
Encourage parents to complain or fight against school fees and proactively help parents to know their rights.
Empower and mobilize local watchdog organizations such as parent-teacher organizations and school-management committees.
Improve teacher compensation.
Government investment in child and youth development through appropriate education and health policies and programmes.
Increase access to early childhood development programmes.
Increase access to schools.
Improve transportation infrastructure in rural areas.
Diversifying systems of education and broadening skills taught to make education more pertinent to the demands of the economy.
There is also a push in many African countries to reform colonial education standards to emphasize the importance of indigenous languages and cultures instead of European languages and cultures. Critics of these reforms maintain that European languages should continue to be the focus of education to ensure that African students can be competitive in a European-dominated global economy.
Recommendations for Higher Education Reform
Curriculum reform geared towards entrepreneurial skills and jobs in the private sector.
Greater emphasis on locally-relevant diploma and certificate programs, instead of overproducing university graduates.
Adoption of a system of easily identifiable and comparable degrees.
Adoption of a system based on undergraduate and graduate degree cycles.
Promotion of student and faculty mobility.
Training and study opportunities should be accessible to all students.
All educational faculty such as administrative staff, professors, and researchers should have access to services relevant to their fields of study.
See also
Education in Tanzania
Adult education in Africa
History of female education in Africa
Computers for African Schools
Education in Mali
Education in Nigeria
Education in Uganda
Education in the Middle East and North Africa
Education in South Africa
Education in Kenya
Multilingual education in Africa
References
Sources
Further reading
Ajayi, J. F. A., Lameck, K. H. Goma and G. Ampah Johnson. The African Experience with Higher Education (Accra: Association of African Universities, 1996).
Ashby, Eric, with Mary Anderson. Universities: British, Indian, African: A Study in the Ecology of Higher Education (London: Weidenfeld & Nicolson, 1966).
Dilger, Hansjörg. Learning Morality, Inequalities, and Faith: Christian and Muslim Schools in Tanzania (Cambridge: Cambridge University Press & International African Institute, 2022).
Fafunwa, A. Babs. History of Education in Nigeria (London: Allen & Unwin, 1974).
Gamble, Harry. Contesting French West Africa: Battles over Schools and the Colonial Order, 1900-1950 (U of Nebraska Press, 2017). 378 pp. online review
Harper, Jim C. Western-educated elites in Kenya, 1900-1963: the African American factor (Routledge, 2005).
Kithinji, Michael Mwenda. "An imperial enterprise: The making and breaking of the University of East Africa, 1949–1969." Canadian Journal of African Studies/La Revue canadienne des études africaines 46.2 (2012): 195-214.
Livsey, Timothy. "Imagining an Imperial Modernity: Universities and the West African Roots of Colonial Development." Journal of Imperial and Commonwealth History 44#6 (2016): 952-975.
Lulat, Y. G. M. "The development of higher education in Africa: A historical survey." in Damtew Teferra and Philip G. Altbach, eds. African higher education: An international reference handbook (2003): 15-31.
Mills, David. "Life on the hill: students and the social history of Makerere." Africa 76.2 (2006): 247-266.
Njagi, Mwangi Daniel. Imperial Education and the Crisis of Political Leadership in Postcolonial Kenya ( Dissertation, State University of New York at Stony Brook, 2011) online.
Nwauwa, Apollos O. Imperialism, Academe and Nationalism: Britain and University Education for Africans, 1860–1960 (London: Frank Cass, 1997).
Ogunlade, Festus O. “Education and Politics in Colonial Nigeria: The Case of King’s College, Lagos (1906–1911).” Journal of the Historical Society of Nigeria 7#2 (1974): 325–345.
Okafor, N. The Development of Universities in Nigeria (London: Longman, 1971).
Teferra, Damtew and Philip G. Altbach, eds. African higher education: An international reference handbook (2003)
Whitehead, Clive. “The ‘Two-way Pull’ and the Establishment of University Education in British West Africa.” History of Education 16#2 (1987): 119–133.
External links
AET Africa | Portal for Agricultural Education and Training in Africa - Provides information on agricultural education in Africa
PROTA - Provides information on the approximately 7,000 useful plants of Tropical Africa and offers wide access to this information through web databases, books, CD-ROMs and special products.
Portal for education in Africa
Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ)
The African Children's Educational Trust
African Sage Philosophy entry discussing philosophic sagacity by Gail M. Presbey |
2952692 | https://en.wikipedia.org/wiki/GCompris | GCompris | GCompris is a software suite comprising educational entertainment software for children aged 2 to 10. GCompris was originally written in C and Python using the GTK+ widget toolkit, but a rewrite in C++ and QML using the Qt widget toolkit has been undertaken since early 2014. GCompris is free and open-source software and the current version is subject to the requirements of the AGPL-3.0-only license. It has been part of the GNU project.
The name GCompris is a pun: in French it is pronounced the same as the phrase J'ai compris, "I have understood".
It is available for Linux, BSD, macOS, Windows and Android. While binaries compiled for Microsoft Windows and macOS were initially distributed with a restricted number of activities and a small fee was required to unlock all the activities, since February 2020 the full version is entirely free for all platforms.
Extent
At the time of writing GCompris comprised more than 130 games, called "activities". These are bundled into the following groups:
Computer discovery: keyboard, mouse, different mouse gestures
Numeracy: table memory, enumeration, double entry table, mirror images
Science: the canal lock, the water cycle, the submarine, electric simulations
Geography: place the country on the map
Games: chess, memory, connect 4, oware, sudoku
Reading: reading practice
Other: learn to tell time, puzzle of famous paintings, vector drawing, cartoon making
Development history
The first version of the game was made in 2000 by Bruno Coudoin, a French software engineer. Since the first release it was distributed freely on the Internet and was protected by the GNU General Public License. The motivation behind the development was to provide native educational application for Linux. Since then, the software has seen continuous improvements, in terms of graphics and number of activities, thanks to the help of many developers and graphic artists joining the project over the years.
There are two branches of GCompris, each with released versions. The older branch, based on GTK+ and containing 140 activities, is now considered a legacy branch in maintenance mode, with no new development. The latest release of the GTK+ version is 15.10, dated 18 October 2015.
The newer branch of GCompris has been completely rewritten using Qt Quick. The current version is developed using the JavaScript, QML and C++ languages.
References
External links
Download Windows, Linux and macOS versions
Source code (Qt)
Source code (Legacy)
2000 software
Educational software for Linux
Educational software for MacOS
Educational software for Windows
Educational software that uses GTK
Educational software that uses Qt
Free and open-source Android software
Free educational software
Free learning support software
GNOME Kids
GNU Project software
KDE
Linux games
Educational video games
Open-source video games
Software for children
Software that uses QML
Software that was ported from GTK to Qt |
1902163 | https://en.wikipedia.org/wiki/Lean%20software%20development | Lean software development | Lean software development is a translation of lean manufacturing principles and practices to the software development domain. Adapted from the Toyota Production System, it is emerging with the support of a pro-lean subculture within the agile community. Lean offers a solid conceptual framework, values and principles, as well as good practices, derived from experience, that support agile organizations.
Origin
The term lean software development originated in a book by the same name, written by Mary Poppendieck and Tom Poppendieck in 2003. The book restates traditional lean principles, as well as a set of 22 tools and compares the tools to corresponding agile practices. The Poppendiecks' involvement in the agile software development community, including talks at several Agile conferences has resulted in such concepts being more widely accepted within the agile community.
Lean principles
Lean development can be summarized by seven principles, very close in concept to lean manufacturing principles:
Eliminate waste
Amplify learning
Decide as late as possible
Deliver as fast as possible
Empower the team
Build integrity in
Optimize the whole
Eliminate waste
Lean philosophy regards everything not adding value to the customer as waste (muda). Such waste may include:
Partially done work
Extra features
Relearning
Task switching
Waiting
Handoffs
Defects
Management activities
In order to eliminate waste, one should be able to recognize it. If some activity could be bypassed or the result could be achieved without it, it is waste. Partially done coding eventually abandoned during the development process is waste. Extra features like paperwork and features not often used by customers are waste. Switching people between tasks is waste (because of time spent, and often lost, by people involved in context-switching). Waiting for other activities, teams, processes is waste. Relearning requirements to complete work is waste. Defects and lower quality are waste. Managerial overhead not producing real value is waste.
A value stream mapping technique is used to identify waste. The second step is to point out sources of waste and to eliminate them. Waste-removal should take place iteratively until even seemingly essential processes and procedures are liquidated.
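As a purely hypothetical illustration of how a value stream map can expose waste (the step names and times below are invented), the total lead time for one feature can be compared with the time that actually adds value, and the largest non-value-adding step identified for elimination first:

    # Hypothetical value stream for one feature, from request to release.
    steps = [
        {"name": "waiting in backlog",    "hours": 120, "value_adding": False},
        {"name": "analysis",              "hours": 8,   "value_adding": True},
        {"name": "waiting for developer", "hours": 40,  "value_adding": False},
        {"name": "coding and testing",    "hours": 24,  "value_adding": True},
        {"name": "waiting for release",   "hours": 60,  "value_adding": False},
    ]
    lead_time = sum(s["hours"] for s in steps)
    value_time = sum(s["hours"] for s in steps if s["value_adding"])
    print(f"process cycle efficiency: {value_time / lead_time:.0%}")  # about 13%
    # The waste to attack first is simply the largest non-value-adding step.
    worst = max((s for s in steps if not s["value_adding"]), key=lambda s: s["hours"])
    print("largest source of waste:", worst["name"])  # waiting in backlog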
Amplify learning
Software development is a continuous learning process based on iterations when writing code. Software design is a problem-solving process in which developers write code and apply what they have learned. Software value is measured in fitness for use, not in conformance to requirements.
Instead of adding more documentation or detailed planning, different ideas could be tried by writing code and building. The process of user requirements gathering could be simplified by presenting screens to the end-users and getting their input. The accumulation of defects should be prevented by running tests as soon as the code is written.
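A minimal, hypothetical sketch of the idea of running tests as soon as the code is written (the function and values are invented purely for illustration):

    def order_total(prices, discount_percent=0):
        """Sum the item prices and apply a whole-number percentage discount."""
        return sum(prices) * (100 - discount_percent) / 100

    # The test is written and run as soon as the function exists, not at project end.
    def test_order_total():
        assert order_total([10, 5]) == 15
        assert order_total([10, 5], discount_percent=20) == 12

    test_order_total()  # run immediately, so a defect surfaces the day it is introduced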
The learning process is sped up by usage of short iteration cycles – each one coupled with refactoring and integration testing. Increasing feedback via short feedback sessions with customers helps when determining the current phase of development and adjusting efforts for future improvements. During those short sessions, both customer representatives and the development team learn more about the domain problem and figure out possible solutions for further development. Thus the customers better understand their needs, based on the existing result of development efforts, and the developers learn how to better satisfy those needs. Another idea in the communication and learning process with a customer is set-based development – this concentrates on communicating the constraints of the future solution and not the possible solutions, thus promoting the birth of the solution via dialogue with the customer.
Decide as late as possible
As software development is always associated with some uncertainty, better results should be achieved with a set-based or options-based approach, delaying decisions as much as possible until they can be made based on facts and not on uncertain assumptions and predictions. The more complex a system is, the more capacity for change should be built into it, thus enabling the delay of important and crucial commitments. The iterative approach promotes this principle – the ability to adapt to changes and correct mistakes, which might be very costly if discovered after the release of the system.
With set-based development: If a new brake system is needed for a car, for example, three teams may design solutions to the same problem. Each team learns about the problem space and designs a potential solution. As a solution is deemed unreasonable, it is cut. At the end of a period, the surviving designs are compared and one is chosen, perhaps with some modifications based on learning from the others - a great example of deferring commitment until the last possible moment. Software decisions could also benefit from this practice to minimize the risk brought on by big up-front design. Additionally, there would then be multiple implementations that work correctly, yet are different (implementation-wise, internally). These could be used to implement fault-tolerant systems which check all inputs and outputs for correctness, across the multiple implementations, simultaneously.
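The fault-tolerance idea described above, in which several independently built implementations are run and their outputs cross-checked, can be sketched as follows (the sorting task and function names are invented; this is only an illustration, not a prescribed lean practice):

    def sort_merge(xs):
        # First implementation: a straightforward merge sort.
        if len(xs) <= 1:
            return list(xs)
        mid = len(xs) // 2
        left, right = sort_merge(xs[:mid]), sort_merge(xs[mid:])
        out = []
        while left and right:
            out.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
        return out + left + right

    def sort_builtin(xs):
        # Second, independently written implementation: the language's built-in sort.
        return sorted(xs)

    def checked_sort(xs):
        # Run both implementations and cross-check the outputs before trusting either.
        a, b = sort_merge(xs), sort_builtin(xs)
        if a != b:
            raise RuntimeError("implementations disagree; investigate before shipping")
        return a

    print(checked_sort([3, 1, 2]))  # [1, 2, 3]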
An agile software development approach can move the building of options earlier for customers, thus delaying certain crucial decisions until customers have realized their needs better. This also allows later adaptation to changes and the prevention of costly earlier technology-bounded decisions. This does not mean that no planning should be involved – on the contrary, planning activities should be concentrated on the different options and adapting to the current situation, as well as clarifying confusing situations by establishing patterns for rapid action. Evaluating different options is effective as soon as it is realized that they are not free, but provide the needed flexibility for late decision making.
Deliver as fast as possible
In the era of rapid technology evolution, it is not the biggest that survives, but the fastest. The sooner the end product is delivered without major defects, the sooner feedback can be received, and incorporated into the next iteration. The shorter the iterations, the better the learning and communication within the team. With speed, decisions can be delayed. Speed assures the fulfilling of the customer's present needs and not what they required yesterday. This gives them the opportunity to delay making up their minds about what they really require until they gain better knowledge. Customers value rapid delivery of a quality product.
The just-in-time production ideology can be applied to software development, recognizing its specific requirements and environment. This is achieved by presenting the needed result and letting the team organize itself and divide the tasks required to accomplish it for a specific iteration. At the beginning, the customer provides the needed input, which can be presented simply as small cards or stories; the developers estimate the time needed to implement each card. The work organization thus changes into a self-pulling system: each morning during a stand-up meeting, each member of the team reviews what was done the previous day, what is to be done today and tomorrow, and asks for any inputs needed from colleagues or the customer. This requires transparency of the process, which also benefits team communication.
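A rough sketch of the pull mechanism just described, with invented story names and estimates, in which developers pull the next card themselves rather than having work assigned to them:

    from collections import deque

    # Customer-provided story cards with the team's own estimates (in hours).
    backlog = deque([
        ("display account balance", 6),
        ("export statement as CSV", 10),
        ("password reset by email", 8),
    ])

    def pull_next_card(developer, backlog):
        # Each developer pulls work when free, instead of having it pushed onto them.
        if not backlog:
            return None
        story, estimate = backlog.popleft()
        print(f"{developer} pulls '{story}' (estimate: {estimate}h)")
        return story

    # A morning stand-up in miniature: each free team member pulls one card.
    for developer in ("Asha", "Birgit"):
        pull_next_card(developer, backlog)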
The myth underlying this principle is that "haste makes waste". However, lean implementation has shown that it is good practice to deliver fast in order to see and analyze the output as early as possible.
Empower the team
There has been a traditional belief in most businesses about the decision-making in the organization – the managers tell the workers how to do their own job. In a work-out technique, the roles are turned – the managers are taught how to listen to the developers, so they can explain better what actions might be taken, as well as provide suggestions for improvements. The lean approach follows the agile principle "build projects around motivated individuals [...] and trust them to get the job done", encouraging progress, catching errors, and removing impediments, but not micro-managing.
Another mistaken belief has been the consideration of people as resources. People might be resources from the point of view of a statistical data sheet, but in software development, as well as in any organizational business, people need more than just a list of tasks and the assurance that they will not be disturbed while completing them. People need motivation and a higher purpose to work for – a purpose within reach, with the assurance that the team may choose its own commitments. The developers should be given access to the customer; the team leader should provide support and help in difficult situations, as well as ensure that skepticism does not ruin the team’s spirit. Respecting people and acknowledging their work is one way to empower the team.
Build integrity in
The customer needs to have an overall experience of the system. This is the so-called perceived integrity: how it is being advertised, delivered, deployed, accessed, how intuitive its use is, its price and how well it solves problems.
Conceptual integrity means that the system’s separate components work well together as a whole with balance between flexibility, maintainability, efficiency, and responsiveness. This could be achieved by understanding the problem domain and solving it at the same time, not sequentially. The needed information is received in small batch pieces – not in one vast chunk - preferably by face-to-face communication and not any written documentation. The information flow should be constant in both directions – from customer to developers and back, thus avoiding the large stressful amount of information after long development in isolation.
One of the healthy ways towards integral architecture is refactoring. The more features that are added to the original code base, the harder it becomes to add further improvements. Refactoring is about keeping simplicity, clarity, and a minimum number of features in the code. Repetition in the code is a sign of bad design and should be avoided (e.g. by applying the DRY rule). The complete and automated building process should be accompanied by a complete and automated suite of developer and customer tests, having the same versioning, synchronization and semantics as the current state of the system. At the end, the integrity should be verified with thorough testing, ensuring the system does what the customer expects it to. Automated tests are also considered part of the production process, and therefore if they do not add value they should be considered waste. Automated testing should not be a goal, but rather a means to an end, specifically the reduction of defects.
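The following minimal sketch (hypothetical names, not from any cited source) illustrates the DRY rule and the accompanying automated test just mentioned: the pricing rule lives in exactly one function that all call sites reuse, and a small assertion exercises it as part of the build.

```c
/* Minimal, hypothetical illustration of the DRY rule. Before: the same
 * discount formula was copy-pasted in two places, so a change in one
 * could silently miss the other. After: the rule lives in exactly one
 * function that both call sites reuse. */
#include <assert.h>
#include <stdio.h>

/* Single source of truth for the pricing rule. */
static double discounted_price(double price, double discount_rate) {
    return price * (1.0 - discount_rate);
}

static double invoice_total(const double *prices, int n, double rate) {
    double total = 0.0;
    for (int i = 0; i < n; i++)
        total += discounted_price(prices[i], rate);  /* reuse, don't repeat */
    return total;
}

int main(void) {
    /* A tiny automated check, run as part of the build, in the spirit of
     * the "complete and automated suite of developer tests" above. */
    assert(discounted_price(100.0, 0.25) == 75.0);
    double prices[] = {10.0, 20.0};
    printf("total: %.2f\n", invoice_total(prices, 2, 0.10));
    return 0;
}
```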
Optimize the whole
Modern software systems are not simply the sum of their parts, but also the product of their interactions. Defects in software tend to accumulate during the development process – by decomposing the big tasks into smaller tasks, and by standardizing different stages of development, the root causes of defects should be found and eliminated. The larger the system, the more organizations that are involved in its development and the more parts are developed by different teams, the greater the importance of having well defined relationships between different vendors, in order to produce a system with smoothly interacting components. During a longer period of development, a stronger subcontractor network is far more beneficial than short-term profit optimizing, which does not enable win-win relationships.
Lean thinking has to be understood well by all members of a project, before implementing in a concrete, real-life situation. "Think big, act small, fail fast; learn rapidly" – these slogans summarize the importance of understanding the field and the suitability of implementing lean principles along the whole software development process. Only when all of the lean principles are implemented together, combined with strong "common sense" with respect to the working environment, is there a basis for success in software development.
Lean software practices
Lean software development practices, or what the Poppendiecks call "tools", are restated slightly from the original equivalents in agile software development. Examples of such practices include:
Seeing waste
Value stream mapping
Set-based development
Pull systems
Queueing theory
Motivation
Measurements
Test-driven development
Since agile software development is an umbrella term for a set of methods and practices based on the values and principles expressed in the Agile Manifesto, lean software development is considered an agile software development method.
See also
Extreme programming
DevOps
Kanban
Kanban board
Lean integration
Lean services
Scrum (development)
References
Further reading
Software development philosophies
Agile software development
Lean manufacturing |
12211124 | https://en.wikipedia.org/wiki/VoiceObjects | VoiceObjects | VoiceObjects is a company that produces VoiceXML-based self-service phone portals which personalize each caller’s experience and provide mobile access that integrates speech recognition, touch-tone response, texting and mobile web.
In December 2008, VoiceObjects was acquired by Voxeo Corporation. The VoiceObjects products have since been part of the Voxeo portfolio, and the VoiceObjects office and staff now operate as Voxeo Germany.
At the end of July 2013, Aspect Software acquired Voxeo to strengthen Aspect's IVR and multichannel self-service offerings.
History
VoiceObjects was founded in 2001 by Karl-Heinz Land and a team of six co-founders (Georg Arens, Michael Codini, Jörg Schulz, Christoph Sieberz, Georg R. Steimel and Tiemo Winterkamp) as OneBridge software just outside downtown Cologne, Germany. The company has since grown to serve well over 200 million callers per year with a global presence, and was incorporated in 2005 as a US company with headquarters in San Mateo, California.
The company’s products are based on a new software architecture for delivering voice portals that leverages industry standards for the Internet such as VoiceXML, SQL, Eclipse, SNMP, XML, Java, and SOA. In 2007, the company introduced software to support text-based applications for mobile phones using the USSD standard over GSM wireless networks, as well as software to support Web-based applications for mobile phones with Web browsers supporting the XHTML 1.0 standard.
Products
VoiceObjects Desktop - Service Creation Environment
VoiceObjects Server - Multi-channel Phone Application Server
VoiceObjects Analyzer - Real-time Analysis and Reporting Environment
In January 2013, Voxeo renamed the VoiceObjects product to Voxeo CXP.
A few months later, in July 2013, Voxeo was acquired by Aspect Software and the product offering was renamed Aspect CXP Pro.
In May 2021 Aspect Software merged with Noble Systems to form Alvaria.
Customers
VoiceObjects customers included recognizable names such as Adobe, Deutsche Telekom, Hershey's, Kellogg Company, Lufthansa, and Swisscom. The company’s partners included SAP, Nortel, Genesys, Avaya, BEA, Oracle, and IBM.
See also
IVR
Speech recognition
VoiceXML
References
Telephony
Companies based in San Mateo, California
Telecommunications companies of the United States
Companies established in 2001
Privately held companies based in California |
6259 | https://en.wikipedia.org/wiki/Civilization%20%28video%20game%29 | Civilization (video game) | Sid Meier's Civilization is a 1991 turn-based strategy 4X video game developed and published by MicroProse. The game was originally developed for MS-DOS running on a PC, and has undergone numerous revisions for various platforms. The player is tasked with leading an entire human civilization over the course of several millennia by controlling various areas such as urban development, exploration, government, trade, research, and military. The player can control individual units and advance the exploration, conquest and settlement of the game's world. The player can also make such decisions as setting forms of government, tax rates and research priorities. The player's civilization is in competition with other computer-controlled civilizations, with which the player can enter diplomatic relationships that can either end in alliances or lead to war.
Civilization was designed by Sid Meier and Bruce Shelley following the successes of Silent Service, Sid Meier's Pirates! and Railroad Tycoon. Civilization has sold 1.5 million copies since its release, and is considered one of the most influential computer games in history due to its establishment of the 4X genre. In addition to its commercial and critical success, the game has been deemed pedagogically valuable due to its presentation of historical relationships. A multiplayer remake, Sid Meier's CivNet, was released for the PC in 1995. Civilization was followed by several sequels starting with Civilization II, with similar or modified scenarios.
Gameplay
Civilization is a turn-based single or multiplayer strategy game. The player takes on the role of the ruler of a civilization, starting with one (or occasionally two) settler units, and attempts to build an empire in competition with two to seven other civilizations. The game requires a fair amount of micromanagement (although less than other simulation games). Along with the larger tasks of exploration, warfare and diplomacy, the player has to make decisions about where to build new cities, which improvements or units to build in each city, which advances in knowledge should be sought (and at what rate), and how to transform the land surrounding the cities for maximum benefit. From time to time the player's towns may be harassed by barbarians, units with no specific nationality and no named leader. These threats only come from huts, unclaimed land or sea, so that over time and turns of exploration, there are fewer and fewer places from which barbarians will emanate.
Before the game begins, the player chooses which historical or current civilization to play. In contrast to later games in the Civilization series, this is largely a cosmetic choice, affecting titles, city names, musical heralds, and color. The choice does affect their starting position on the "Play on Earth" map, and thus different resources in one's initial cities, but has no effect on starting position when starting a random world game or a customized world game. The player's choice of civilization also prevents the computer from being able to play as that civilization or the other civilization of the same color, and since computer-controlled opponents display certain traits of their civilizations this affects gameplay as well. The Aztecs are both fiercely expansionist and generally extremely wealthy, for example. Other civilizations include the Americans, the Mongols, and Romans. Each civilization is led by a famous historical figure, such as Mahatma Gandhi for India.
The scope of Civilization is larger than most other games. The game begins in 4000 BC, before the Bronze Age, and can last through to AD 2100 (on the easiest setting) with Space Age and "future technologies". At the start of the game there are no cities anywhere in the world: the player controls one or two settler units, which can be used to found new cities in appropriate sites (and those cities may build other settler units, which can go out and found new cities, thus expanding the empire). Settlers can also alter terrain, build improvements such as mines and irrigation, build roads to connect cities, and later in the game they can construct railroads which offer unlimited movement.
As time advances, new technologies are developed; these technologies are the primary way in which the game changes and grows. At the start, players choose from advances such as pottery, the wheel, and the alphabet to, near the end of the game, nuclear fission and spaceflight. Players can gain a large advantage if their civilization is the first to learn a particular technology (the secrets of flight, for example) and put it to use in a military or other context. Most advances give access to new units, city improvements or derivative technologies: for example, the chariot unit becomes available after the wheel is developed, and the granary building becomes available to build after pottery is developed. The whole system of advancements from beginning to end is called the technology tree, or simply the Tech tree; this concept has been adopted in many other strategy games. Since only one tech may be "researched" at any given time, the order in which technologies are chosen makes a considerable difference in the outcome of the game and generally reflects the player's preferred style of gameplay.
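The description above can be made concrete with a small, purely illustrative sketch: a technology tree is essentially a directed graph of prerequisites, and an advance becomes researchable once all of its prerequisites are known. This is not Civilization's actual code, and the specific advances and prerequisites shown are partly hypothetical.

```c
/* Illustrative sketch only - not Civilization's actual code. A technology
 * tree can be modelled as a directed graph of prerequisites: an advance
 * becomes researchable once all of its prerequisites are known. */
#include <stdbool.h>
#include <stdio.h>

enum tech { POTTERY, ALPHABET, WRITING, THE_WHEEL, MATHEMATICS, NUM_TECH };

static const char *tech_name[NUM_TECH] = {
    "Pottery", "Alphabet", "Writing", "The Wheel", "Mathematics"
};

/* prereq[t] lists up to two prerequisite techs for t (-1 = none).
 * The prerequisites here are hypothetical examples. */
static const int prereq[NUM_TECH][2] = {
    [POTTERY]     = {-1, -1},
    [ALPHABET]    = {-1, -1},
    [WRITING]     = {ALPHABET, -1},
    [THE_WHEEL]   = {-1, -1},
    [MATHEMATICS] = {ALPHABET, THE_WHEEL},
};

static bool researchable(const bool known[NUM_TECH], enum tech t) {
    if (known[t]) return false;                  /* already discovered */
    for (int i = 0; i < 2; i++)
        if (prereq[t][i] != -1 && !known[prereq[t][i]])
            return false;                        /* missing prerequisite */
    return true;
}

int main(void) {
    bool known[NUM_TECH] = {false};
    known[ALPHABET] = true;                      /* suppose Alphabet is known */
    for (int t = 0; t < NUM_TECH; t++)
        if (researchable(known, (enum tech)t))
            printf("can research: %s\n", tech_name[t]);
    return 0;
}
```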
Players can also build Wonders of the World in each of the epochs of the game, subject only to obtaining the prerequisite knowledge. These wonders are important achievements of society, science, culture and defense, ranging from the Pyramids and the Great Wall in the Ancient age, to Copernicus' Observatory and Magellan's Expedition in the middle period, up to the Apollo program, the United Nations, and the Manhattan Project in the modern era. Each wonder can only be built once in the world, and requires a lot of resources to build, far more than most other city buildings or units. Wonders provide unique benefits to the controlling civilization. For example, Magellan's Expedition increases the movement rate of naval units. Wonders typically affect either the city in which they are built (for example, the Colossus), every city on the continent (for example, J.S. Bach's Cathedral), or the civilization as a whole (for example, Darwin's Voyage). Some wonders are made obsolete by new technologies.
The game can be won by conquering all other civilizations or by winning the space race by reaching the star system of Alpha Centauri.
Development
Prior Civilization-named games
British designer Francis Tresham released his Civilization board game in 1980 under his company Hartland Trefoil. Avalon Hill had obtained the rights to publish it in the United States in 1981.
There were at least two attempts to make a computerized version of Tresham's game prior to 1990. Danielle Bunten Berry planned to start work on the game after completing M.U.L.E. in 1983, and again in 1985, after completing The Seven Cities of Gold at Electronic Arts. In 1983 Bunten and producer Joe Ybarra opted to first do Seven Cities of Gold. The success of Seven Cities in 1985 in turn led to a sequel, Heart of Africa. Bunten never returned to the idea of Civilization. Don Daglow, designer of Utopia, the first simulation game, began work programming a version of Civilization in 1987. He dropped the project, however, when he was offered an executive position at Brøderbund, and never returned to the game.
Development at MicroProse
Sid Meier and Bill Stealey co-founded MicroProse in 1982 to develop flight simulators and other military strategy video games based on Stealey's past experiences as a United States Air Force pilot. Around 1989, Meier wanted to expand his repertoire beyond these types of games; having just finished F-19 Stealth Fighter (1988, 1990), he said "Everything I thought was cool about a flight simulator had gone into that game." He took to heart the success of the new god game genre, in particular SimCity (1989) and Populous (1989). Specifically with SimCity, Meier recognized that video games could still be entertaining based on building something up. By then, Meier was not an official employee of MicroProse but worked under contract, in which the company paid him upfront for game development, a large payment on delivery of the game, and additional royalties on each of his games sold.
MicroProse had hired a number of Avalon Hill game designers, including Bruce Shelley. Among other works, Shelley had been responsible for adapting the railroad-based 1829 board game developed by Tresham into 1830: The Game of Railroads and Robber Barons. Shelley had joined MicroProse finding that the board game market was weakening in contrast to the video game market, and initially worked on F-19 Stealth Fighter. Meier recognized Shelley's abilities and background in game design and took him on as personal assistant designer to brainstorm new game ideas. The two initially worked on ideas for Covert Action, but had put these aside when they came up with the concepts for Railroad Tycoon (1990), based loosely on the 1829/1830 board games. Railroad Tycoon was generally well received at its release, but the title did not fit within the nature of flight simulators and military strategy from MicroProse's previous catalog. Meier and Shelley had started a sequel to Railroad Tycoon shortly after its release, but Stealey canceled the project.
One positive aspect both had taken from Railroad Tycoon was the idea of multiple smaller systems working together at the same time and the player having to manage them. Both Meier and Shelley recognized that the complex interactions between these systems led players to "make a lot of interesting decisions", and that ruling a whole civilization would readily work well with these underlying systems. Some time later, both discussed their love of the original Empire computer games, and Meier challenged Shelley to give him ten things he would change about Empire; Shelley provided him with twelve. Around May 1990, Meier presented Shelley with a 5-1/4" floppy disk which contained the first prototype of Civilization based on their past discussions and Shelley's list.
Meier described his development process as sculpting with clay. His prototype took elements from Empire, Railroad Tycoon, SimCity and the Civilization board game. The initial version of the game was a real-time simulation, with the player defining zones for their population to grow, similar to zoning in SimCity. Meier and Shelley went back and forth with this, with Shelley providing suggestions based on his playthrough and acting as the game's producer, and Meier coding and reworking the game to address these points, and otherwise without the involvement of other MicroProse staff. During this period, Stealey and the other managers became concerned that the game did not fit MicroProse's general catalog, as computer strategy games had not yet proven successful. A few months into the development, Stealey requested that they put the project on hold and complete Covert Action, after which they could go back to their new game. Meier and Shelley completed Covert Action, which was published in 1990.
Once Covert Action was released, Meier and Shelley returned to the prototype. The time away from the project allowed them to recognize that the real-time aspect was not working well, and reworked the game to become turn-based and dropped the zoning aspect, a change that Meier described as "like tossing the clay in the trash and getting a new lump". They incorporated elements of city management and military aspect from Empire, including creating individual military units as well as settler units that replaced the functionality of the zoning approach. Meier felt adding military and combat to the game was necessary as "The game really isn't about being civilized. The competition is what makes the game fun and the players play their best. At times, you have to make the player uncomfortable for the good of the player". Meier also opted to include a technology tree that would help to open the game to many more choices to the player as it continued, creating a non-linear experience. Meier felt players would be able to use the technology tree to adopt a style of play and from which they could use technologies to barter with the other opponents. While the game relies on established recorded history, Meier admitted he did not spend much time in research, usually only to assure the proper chronology or spellings; Shelley noted that they wanted to design for fun, not accuracy, and that "Everything we needed was pretty much available in the children’s section of the library."
Computer Gaming World reported in 1994 that "Sid Meier has stated on numerous occasions that he emphasizes the 'fun parts' of a simulation and throws out the rest". Meier described the process as "Add another bit [of clay]—no, that went too far. Scrape it off". He eliminated the potential for any civilization to fall on its own, believing this would be punishing to the player. "Though historically accurate", Meier said, "The moment the Krakatoa volcano blew up, or the bubonic plague came marching through, all anybody wanted to do was reload from a saved game". Meier omitted multiplayer alliances because the computer used them too effectively, causing players to think that it was cheating. He said that by contrast, minefields and minesweepers caused the computer to do "stupid things ... If you've got a feature that makes the AI look stupid, take it out. It's more important not to have stupid AI than to have good AI". Meier also omitted jets and helicopters because he thought players would not find obtaining new technologies in the endgame useful, and online multiplayer support because of the small number of online players ("if you had friends, you wouldn't need to play computer games"); he also did not believe that online play worked well with turn-based play. The game was developed for the IBM PC platform, which at the time had support ranging from 16-color EGA to 256-color VGA; Meier opted to support both 16-color and 256-color graphics to allow the game to run on both EGA/Tandy and VGA/MCGA systems.
"I’ve never been able to decide if it was a mistake to keep Civ isolated as long as I did", Meier wrote; while "as many eyes as possible" are beneficial during development, Meier and Shelley worked very quickly together, combining the roles of playtester, game designer, and programmer. Meier and Shelley neared the end of their development and started presenting the game to the rest of MicroProse for feedback towards publication. This process was slowed by the then-current vice president of development, who had taken over Meier's former position at the company. Because of Meier's contract terms, this vice president did not receive any financial bonuses for successful publication of Meier's games, and so had little incentive to provide the resources needed to finish the game. The management had also taken issue with the lack of a firm completion date, as according to Shelley, Meier would consider a game completed only when he felt he had completed it. Eventually the two got the required help for publication, with Shelley overseeing these processes and Meier making the necessary coding changes.
"One of my big rules has always been, 'double it, or cut it in half,'" Meier wrote. He cut the map's size in half less than a month before Civilization's release after playtesting revealed that the previous size was too large and made for boring and repetitive gameplay. Other automated features, like city management, were modified to require more player involvement. They also eliminated a secondary branch of the technology tree with minor skills like beer brewing, and spent time reworking the existing technologies and units to make sure they felt appropriate and did not break the game. Most of the game was originally developed with art crafted by Meier, and MicroProse's art department helped to create most of the final assets, though some of Meier's original art was used. Shelley wrote out the "Civilopedia" entries for all the elements of the game and the game's large manual.
The name Civilization came late in the development process. MicroProse recognized at this point that the 1980 Civilization board game might conflict with their video game, as it shared a similar theme including the technology tree. Meier had noted the board game's influence but considered it not as great as that of Empire or SimCity, while others have noted significant differences that made the video game far different from the board game, such as the non-linearity introduced by Meier's technology tree. To avoid any potential legal issues, MicroProse negotiated a license to use the Civilization name from Avalon Hill. The addition of Meier's name to the title came from a practice established by Stealey of attaching Meier's name to games, like Civilization, that diverged from MicroProse's past catalog, so that players who had played Meier's combat simulators and recognized his name would give these new games a try. This approach worked, according to Meier, and he would continue this naming scheme for other titles in the future as a type of branding.
By the time the game was completed and ready for release, Meier estimated that it had cost $170,000 in development. Civilization was released in September 1991. Because of the animosity that MicroProse's management had towards Meier's games, there was very little promotion of the title, though interest in the game through word-of-mouth helped to boost sales. Following the release on the IBM PC, the game was ported to other platforms; Meier and Shelley provided this code to contractors hired by MicroProse to complete the ports.
CivNet
Civilization was released with only single-player support, with the player working against multiple computer opponents. In 1991, Internet and online gaming were still in their infancy, so this option was not considered for Civilization's release. Over the next few years, as home Internet accessibility took off, MicroProse looked to develop an online version of Civilization. This led to the 1995 release of Sid Meier's CivNet. CivNet allowed for up to seven players to play the game, with computer opponents available to fill out up to six active civilizations. Games could be played either in a turn-based mode, or in a simultaneous mode where each player took their turn at the same time, only progressing to the next turn once all players had confirmed they were finished with that turn. The game, in addition to better support for Windows 3.1 and Windows 95, supported connectivity through LAN, primitive Internet play, modem, and direct serial link, and included a local hotseat mode. CivNet also included a map editor and a "king builder" to allow a player to customize the names and looks of their civilization as seen by other players.
According to Brian Reynolds, who led the development of Civilization II, MicroProse "sincerely believed that CivNet was going to be a much more important product" than the next single-player Civilization game that he and Jeff Briggs had started working on. Reynolds said that because their project was seen as a side effort with little risk, they were able to innovate new ideas into Civilization II. As a net result, CivNet was generally overshadowed by Civilization II which was released in the following year.
Post-release
Civilization's critical success created a "golden period of MicroProse" where there was more potential for similar strategy games to succeed, according to Meier. This put stress on the company's direction and culture. Stealey wanted to continue to pursue the military-themed titles, while Meier wanted to continue his success with simulation games. Shelley left MicroProse in 1992 and joined Ensemble Studios, where he used his experience with Civilization to design the Age of Empires games. Stealey had pushed MicroProse to develop console and arcade-based versions of their games, but this put the company into debt, and Stealey eventually sold the company to Spectrum HoloByte in 1993; Spectrum HoloByte kept MicroProse as a separate company on acquisition.
Meier went on to develop Civilization II along with Brian Reynolds, who served in a similar role to Shelley as design assistant, with additional help from Jeff Briggs and Douglas Kaufman. This game was released in early 1996, and is considered the first sequel of any Sid Meier game. Stealey eventually sold his shares in MicroProse and left the company, and Spectrum HoloByte opted to consolidate the two companies under the name MicroProse in 1996, eliminating numerous positions at MicroProse in the process. As a result, Meier, Briggs, and Reynolds all opted to leave the company and founded Firaxis, which by 2005 had become a subsidiary of Take-Two. After a number of acquisitions and legal actions, the Civilization brand (both as a board game and video game) is now owned by Take-Two, and Firaxis, under Meier's oversight, continues to develop games in the Civilization series.
Reception
Civilization has been called one of the most important strategy games of all time, and has a loyal following of fans. This high level of interest has led to the creation of a number of free and open source versions and inspired similar games by other commercial developers.
Computer Gaming World stated that "a new Olympian in the genre of god games has truly emerged", comparing Civilization's importance to computer games to that of the wheel. The game was reviewed in 1992 in Dragon #183 by Hartley, Patricia, and Kirk Lesser in "The Role of Computers" column. The reviewers gave the game 5 out of 5 stars. They commented: "Civilization is one of the highest dollar-to-play-ratio entertainments we've enjoyed. The scope is enormous, the strategies border on being limitless, the excitement is genuinely high, and the experience is worth every dime of the game's purchase price."
Jeff Koke reviewed Civilization in Pyramid #2 (July/Aug., 1993), and stated that "Ultimately, there are games that are a lot flashier than Civilization, with cool graphics and animation, but there aren't many - or any - in my book that have the ability to absorb the player so totally and to provide an interesting, unique outcome each and every time it's played."
Civilization won the Origins Award in the category Best Military or Strategy Computer Game of 1991. A 1992 Computer Gaming World survey of wargames with modern settings gave the game five stars out of five, describing it as "more addictive than crack ... so rich and textured that the documentation is incomplete". In 1992 the magazine named it the Overall Game of the Year, in 1993 added the game to its Hall of Fame, and in 1996 chose Civilization as the best game of all time.
A critic for Next Generation judged the Super NES version to be a disappointing port, with a cumbersome menu system (particularly that the "City" and "Production" windows are on separate screens), an unintuitive button configuration, and ugly scaled down graphics. However, he gave it a positive recommendation due to the strong gameplay and strategy of the original game: "if you've never taken a crack at this game before, be prepared to lose hours, even days, trying to conquer those pesky Babylonians." Sir Garnabus of GamePro, in contrast, was pleased with the Super NES version's interface, and said the graphics and audio are above that of a typical strategy game. He also said the game stood out among the Super NES's generally action-oriented library.
In 2000, GameSpot rated Civilization as the tenth most influential video game of all time. It was also ranked fourth on IGN's 2000 list of the top PC games of all time. In 2004, readers of Retro Gamer voted it the 29th top retro game. In 2007, it was named one of the 16 most influential games in history at Telespiele, a German technology and games trade show. In Poland, it was included in the retrospective lists of the best Amiga games by Wirtualna Polska (ranked ninth) and CHIP (ranked fifth). In 2012, Time named it one of the 100 greatest video games of all time. In 1994, PC Gamer US named Civilization the second best computer game ever. The editors wrote, "The depth of strategies possible is impressive, and the look and feel of the game will keep you playing and exploring for months. Truly a remarkable title." That same year, PC Gamer UK named its Windows release the sixth best computer game of all time, calling it Sid Meier's "crowning glory".
On March 12, 2007, The New York Times reported on a list of the ten most important video games of all time, the so-called game canon, including Civilization.
By the release of Civilization II in 1996, Civilization had sold over 850,000 copies. By 2001, sales had reached 1 million copies. Shelley stated in a 2016 interview that Civilization had sold 1.5 million copies.
Reviews
Casus Belli #70 (July 1992)
Legacy
There have been several sequels to Civilization, including Civilization II (1996), Civilization III (2001), Civilization IV (2005), Civilization Revolution (2008), Civilization V (2010), and Civilization VI in 2016. In 1994, Meier produced a similar game titled Colonization.
Civilization is generally considered the first major game in the genre of "4X", with the four "X"s equating to "explore, expand, exploit, and exterminate", a term developed by Alan Emrich in promoting 1993's Master of Orion. While other video games with the principles of 4X had been released prior to Civilization, future 4X games would attribute some of their basic design principles to Civilization.
An Easter egg named "Nuclear Gandhi" in most of the games in the series references a supposed integer overflow bug in Civilization that causes a computer-controlled Gandhi, normally a highly peaceful leader, to become a nuclear warmonger. The game is said to start Gandhi's "aggression value" at 1 out of a maximum 255 possible for an 8-bit unsigned integer, making a computer-controlled Gandhi tend to avoid armed conflict. However, once a civilization achieves democracy as its form of government, its leader's aggression value falls by 2. Under normal arithmetic principles, Gandhi's "1" would be reduced to "-1", but because the value is an 8-bit unsigned integer, it wraps around to "255", causing Gandhi to suddenly become the most aggressive opponent in the game. Interviewed in 2019, developer Brian Reynolds said with "99.99% certainty" that this story was apocryphal, recalling Gandhi's coded aggression level as being no lower than other peaceful leaders in the game, and doubting that a wraparound would have had the effect described. He noted that all leaders in the game become "pretty ornery" after their acquisition of nuclear weapons, and suggested that this behaviour simply seemed more surprising and memorable when it happened to Gandhi. Meier, in his autobiography, stated "That kind of bug comes from something called unsigned characters, which are not the default in the C programming language, and not something I used for the leader traits. Brian Reynolds wrote Civ II in C++, and he didn't use them, either. We received no complaints about a Gandhi bug when either game came out, nor did we send out any revisions for one. Gandhi's military aggressiveness score remained at 1 throughout the game." He then explained that the overflow error story was made up in 2012. It spread from there to a Wikia entry, then eventually to Reddit, and was picked up by news sites like Kotaku and Geek.com.
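For readers unfamiliar with the mechanism the legend describes, the following sketch demonstrates only the arithmetic involved – how an unsigned 8-bit value wraps from 1 to 255 when reduced by 2 – and does not reproduce any actual Civilization code, which, per Meier and Reynolds above, never contained such a bug.

```c
/* Demonstration of the arithmetic the "Nuclear Gandhi" legend describes -
 * not actual Civilization code (Meier and Reynolds state no such bug
 * existed). An unsigned 8-bit value cannot go below zero; subtracting 2
 * from 1 wraps around modulo 256 to 255. */
#include <stdio.h>

int main(void) {
    unsigned char aggression = 1;   /* the supposed starting value */
    aggression -= 2;                /* adopting democracy: -2 in the legend */
    printf("aggression is now %u\n", (unsigned)aggression);  /* prints 255 */
    return 0;
}
```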
Another relic of Civilization was the nature of combat, in which a military unit from earlier civilization periods could remain in play through modern times, gaining combat bonuses due to veteran proficiency; against all common sense, these primitive units could easily beat out modern technology, the common example being a veteran phalanx unit able to fend off a battleship. Meier noted that this resulted from not anticipating how players would use units, expecting them to have used their forces more like a war-based board game to protect borders and maintain zones of control rather than creating "stacks of doom". Future Civilization games have had many changes in combat systems to prevent such oddities, though these games do still allow for such random victories.
The 1999 game Sid Meier's Alpha Centauri was also created by Meier and is in the same genre, but with a futuristic/space theme; many of the interface and gameplay innovations in this game eventually made their way into Civilization III and IV. Alpha Centauri is not actually a sequel to Civilization, despite beginning with the same event that ends Civilization and Civilization II: a crewed spacecraft from Earth arrives in the Alpha Centauri star system. Firaxis' 2014 game Civilization: Beyond Earth, although bearing the name of the main series, is a reimagining of Alpha Centauri running on the engine of Civilization V.
A 1994 Computer Gaming World survey of space war games stated that "the lesson of this incredibly popular wargame has not been lost on the software community, and technological research popped up all over the place in 1993", citing Spaceward Ho! and Master of Orion as examples. That year MicroProse published Master of Magic, a similar game but embedded in a medieval-fantasy setting where instead of technologies the player (a powerful wizard) develops spells, among other things. In 1999, Activision released Civilization: Call to Power, a sequel of sorts to Civilization II but created by a completely different design team. Call to Power spawned a sequel in 2000, but by then Activision had lost the rights to the Civilization name and could only call it Call to Power II.
An open source clone of Civilization has been developed under the name of Freeciv, with the slogan "'Cause civilization should be free." This game can be configured to match the rules of either Civilization or Civilization II. Another game that partially clones Civilization is a public domain game called C-evo.
References
The Official Guide to Sid Meier's Civilization, Keith Ferrell, Edmund Ferrell, Compute Books, 1992.
External links
Official website
Civilization at MobyGames
Civilization at myabandonware.com
1991 video games
4X video games
Amiga games
Amiga 1200 games
Atari ST games
Cultural depictions of Abraham Lincoln
DOS games
Historical simulation games
Classic Mac OS games
Multiplayer and single-player video games
NEC PC-9801 games
Origins Award winners
PlayStation (console) games
Sid Meier games
Super Nintendo Entertainment System games
Top-down video games
Turn-based strategy video games
Windows games
Video games based on board games
Video games developed in the United States
Video games scored by Jeff Briggs
Video games scored by John Broomhall
Video games using procedural generation
Cultural depictions of Mahatma Gandhi |
1525837 | https://en.wikipedia.org/wiki/THE%20multiprogramming%20system | THE multiprogramming system | The THE multiprogramming system or THE OS was a computer operating system designed by a team led by Edsger W. Dijkstra, described in monographs in 1965-66 and published in 1968.
Dijkstra never named the system; "THE" is simply the abbreviation of "Technische Hogeschool Eindhoven", then the name (in Dutch) of the Eindhoven University of Technology of the Netherlands. The THE system was primarily a batch system that supported multitasking; it was not designed as a multi-user operating system. It was much like the SDS 940, but "the set of processes in the THE system was static".
The THE system apparently introduced the first forms of software-based paged virtual memory (the Electrologica X8 did not support hardware-based memory management), freeing programs from being forced to use physical locations on the drum memory. It did this by using a modified ALGOL compiler (the only programming language supported by Dijkstra's system) to "automatically generate calls to system routines, which made sure the requested information was in memory, swapping if necessary". Paged virtual memory was also used for buffering input/output (I/O) device data, and for a significant portion of the operating system code, and nearly all the ALGOL 60 compiler. In this system, semaphores were used as a programming construct for the first time.
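As a minimal illustration of the semaphore construct that the THE system introduced (expressed with today's POSIX threads API rather than the original Electrologica X8 code), Dijkstra's P and V operations correspond to sem_wait() and sem_post(); in the sketch below two threads use a binary semaphore to protect a shared counter.

```c
/* Minimal sketch of the semaphore construct pioneered in the THE system,
 * expressed with the modern POSIX API. Dijkstra's P (wait) and V (signal)
 * operations correspond to sem_wait() and sem_post(). */
#include <pthread.h>
#include <semaphore.h>
#include <stdio.h>

static sem_t mutex;               /* binary semaphore guarding the counter */
static long shared_counter = 0;

static void *worker(void *arg) {
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        sem_wait(&mutex);         /* P: block until the resource is free */
        shared_counter++;         /* critical section */
        sem_post(&mutex);         /* V: release the resource */
    }
    return NULL;
}

int main(void) {
    pthread_t a, b;
    sem_init(&mutex, 0, 1);       /* initial value 1 = "free" */
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld\n", shared_counter);   /* 200000, race-free */
    sem_destroy(&mutex);
    return 0;
}
```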
Design
The design of the THE multiprogramming system is significant for its use of a layered structure, in which "higher" layers depend on "lower" layers only:
Layer 0 was responsible for the multiprogramming aspects of the operating system. It decided which process was allocated to the central processing unit (CPU), and accounted for processes that were blocked on semaphores. It dealt with interrupts and performed the context switches when a process change was needed. This is the lowest level. In modern terms, this was the scheduler.
Layer 1 was concerned with allocating memory to processes. In modern terms, this was the pager.
Layer 2 dealt with communication between the operating system and the system console.
Layer 3 managed all I/O between the devices attached to the computer. This included buffering information from the various devices.
Layer 4 consisted of user programs. There were 5 processes: in total, they handled the compiling, executing, and printing of user programs. When finished, they passed control back to the schedule queue, which was priority-based, favoring recently started processes and ones that blocked because of I/O.
Layer 5 was the user; as Dijkstra notes, "not implemented by us".
The constraint that higher layers can only depend on lower layers was imposed by the designers in order to make reasoning about the system (using quasi-formal methods) more tractable, and also to facilitate building and testing the system incrementally. The layers were implemented in order, layer 0 first, with thorough testing of the abstractions provided by each layer in turn. This division of the kernel into layers was similar in some ways to Multics' later ring-segmentation model. Several subsequent operating systems have used layering to some extent, including Windows NT and macOS, although usually with fewer layers.
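A schematic sketch of that layering discipline follows, in C – purely illustrative, not the system's actual assembly code: each layer exposes an interface and calls only into the layers beneath it, which is what made the bottom-up, layer-by-layer testing tractable.

```c
/* Schematic, hypothetical sketch of the layering discipline described
 * above - not the system's actual Electrologica X8 assembly. Each layer
 * may call only into the layers beneath it. */
#include <stdio.h>

/* Layer 0: processor allocation (the "scheduler"). */
static void layer0_schedule(int process_id) {
    printf("layer 0: giving the CPU to process %d\n", process_id);
}

/* Layer 1: memory allocation (the "pager") - may call layer 0 only. */
static void layer1_ensure_page(int process_id, int page) {
    printf("layer 1: making page %d resident for process %d\n", page, process_id);
    layer0_schedule(process_id);
}

/* Layer 2: operator-console communication - may call layers 1 and 0. */
static void layer2_console_message(int process_id, const char *msg) {
    printf("layer 2: console for process %d: %s\n", process_id, msg);
    layer1_ensure_page(process_id, 0);
}

int main(void) {
    /* A request entering at a higher layer flows strictly downward. */
    layer2_console_message(7, "compile started");
    return 0;
}
```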
The code of the system was written in assembly language for the Dutch Electrologica X8 computer. This computer had a word size of 27 bits, 32 kilowords of core memory, 512 kilowords of drum memory providing backing store for the LRU cache algorithm, paper tape readers, paper tape punches, plotters, and printers.
See also
RC 4000 Multiprogramming System
Ring (computer security)
Timeline of operating systems
References
Assembly language software
Discontinued operating systems
Dutch inventions
Computer science in the Netherlands
Information technology in the Netherlands
Edsger W. Dijkstra
1968 software |
42531991 | https://en.wikipedia.org/wiki/High%20Efficiency%20Video%20Coding%20implementations%20and%20products | High Efficiency Video Coding implementations and products | High Efficiency Video Coding implementations and products covers the implementations and products of High Efficiency Video Coding (HEVC).
History
2012
On February 29, 2012, at the 2012 Mobile World Congress, Qualcomm demonstrated a HEVC decoder running on an Android tablet, with a Qualcomm Snapdragon S4 dual-core processor running at 1.5 GHz, showing H.264/MPEG-4 AVC and HEVC versions of the same video content playing side by side. In this demonstration HEVC reportedly showed almost a 50% bit rate reduction compared with H.264/MPEG-4 AVC.
On August 22, 2012, Ericsson announced that the world's first HEVC encoder, the Ericsson SVP 5500, would be shown at the upcoming International Broadcasting Convention (IBC) 2012 trade show. The Ericsson SVP 5500 HEVC encoder is designed for real-time encoding of video for delivery to mobile devices. On the same day, it was announced that researchers were planning to extend MPEG-DASH to support HEVC by April 2013.
On September 2, 2012, Vanguard Video, formerly Vanguard Software Solutions (VSS), announced a real-time HEVC software encoder running at 1080p30 (1920x1080, 30fps) on a single Intel Xeon processor. This encoder was demonstrated at IBC 2012.
On September 6, 2012, Rovi Corporation announced that a MainConcept SDK for HEVC would be released in early 2013 shortly after HEVC is officially ratified. The HEVC MainConcept SDK includes a decoder, encoder, and transport multiplexer for Microsoft Windows, macOS, Linux, iOS, and Android. The HEVC MainConcept SDK encoder was demonstrated at the IBC 2012 trade show.
On September 9, 2012, ATEME demonstrated at the IBC 2012 trade show a HEVC encoder that encoded video with a resolution of 3840x2160p at 60 fps with an average bit rate of 15 Mbit/s. ATEME is planning to release their HEVC encoder in October 2013.
2013
On January 7, 2013, ViXS Systems announced that they would show the first hardware SoC capable of transcoding video to the Main 10 profile of HEVC at the 2013 International CES. On the same day, Rovi Corporation announced that, after the HEVC standard was released, they planned to start adding support for HEVC to their MainConcept SDK and to their DivX products.
On January 8, 2013, Broadcom announced the BCM7445 which is an Ultra HD decoding chip capable of decoding HEVC at up to 4096x2160p at 60 fps. The BCM7445 is a 28 nm ARM architecture chip capable of 21,000 Dhrystone MIPS with volume production estimated for the middle of 2014.
On January 8, 2013, Vanguard Video announced the availability of V.265, a professional pure-software HEVC encoder capable of real-time performance.
On January 25, 2013, NGCodec announced the availability of free H.265/HEVC compliance test clips.
On February 4, 2013, NTT DoCoMo announced that starting in March it would begin licensing its implementation of HEVC decoding software. The decoding software can allow playback of 4K UHDTV at 60 fps on personal computers and 1080p on smartphones, and was planned to be demonstrated at the 2013 Mobile World Congress. In a JCT-VC document NTT DoCoMo showed that their HEVC software decoder could decode 3840x2160 at 60 fps using 3 decoding threads on a 2.7 GHz quad core Ivy Bridge CPU.
On February 11, 2013, researchers from MIT demonstrated the world's first published HEVC ASIC decoder at the International Solid-State Circuits Conference (ISSCC) 2013. Their chip was capable of decoding a 3840x2160p at 30 fps video stream in real time, consuming under 0.1W of power.
On March 14, 2013, Ittiam Systems announced the availability and software licensing of its HEVC (H.265) video encoder and decoder for professional, enterprise and consumer digital media markets. The HEVC encoder is a software implementation on Intel x86 based platforms, capable of high definition (HD) broadcast quality video encoding. The decoder software, available on ARM Cortex-A9 and Cortex-A15 based SoCs, allows a wide range of existing consumer electronics (CE) devices such as smartphones, tablets, smart TVs and set-top boxes to play back high definition (HD) HEVC content. Ittiam's HEVC solutions were showcased at CES 2013, MWC 2013 and NAB 2013.
On April 3, 2013, ATEME announced the availability of the first open source implementation of a HEVC software player based on the OpenHEVC decoder and GPAC video player which are both licensed under LGPL. The OpenHEVC decoder supports the Main profile of HEVC and can decode 1080p at 30 fps video using a single core CPU. A live transcoder that supports HEVC and used in combination with the GPAC video player was shown at the ATEME booth at the NAB Show in April 2013.
On April 19, 2013, SES announced the first Ultra HD transmission using the HEVC standard. The transmission had a resolution of 3840x2160 and a bit rate of 20 Mbit/s. SES used Harmonic's ProMedia Xpress HEVC encoder and Broadcom's BCM7445 HEVC decoder.
On May 9, 2013, NHK and Mitsubishi Electric announced that they had jointly developed the first HEVC encoder for 8K Ultra HD TV, which is also called Super Hi-Vision (SHV). The HEVC encoder supports the Main 10 profile at Level 6.1 allowing it to encode 10-bit video with a resolution of 7680x4320 at 60 fps. The HEVC encoder has 17 3G-SDI inputs and uses 17 boards for parallel processing with each board encoding a row of 7680x256 pixels to allow for real time video encoding. The HEVC encoder is compliant with draft 4 of the HEVC standard and has a maximum bit rate of 340 Mbit/s. The HEVC encoder was shown at the NHK Science & Technology Research Laboratories Open House 2013 that took place from May 30 to June 2. At the NHK Open House 2013 the HEVC encoder used a bit rate of 85 Mbit/s which gives a compression ratio of 350:1.
On May 15, 2013, DivX released a draft version of DivX HEVC video profiles that are based on the Main profile and Main tier of HEVC with additional restrictions specific to the DivX HEVC video profiles. The draft version of DivX HEVC 4K, 1080p, and 720p video profiles currently define only the video and DivX is planning to define other elements of the profiles in the future. The DivX HEVC 4K video profile allows for a maximum bit rate of HEVC Level 5.1 (40 Mbit/s) but the maximum number of samples per second is limited to HEVC Level 5 (4096x2160 at 30 fps).
On May 31, 2013, Orange announced the first public HEVC demonstration of a real time end-to-end delivery chain. The HEVC demonstration included a high definition broadcast of the 2013 French Open from June 1 to June 9 that uses both IPTV and DVB-T2.
On June 4, 2013, Rovi Corporation released the MainConcept HEVC SDK 1.0. SDK 1.0 supports Smart Adaptive Bitrate Encoding Technology (SABET) which allows for the simultaneous encoding of up to 10 video output streams with reduced computing cost. SDK 1.0 is available for Windows and SDK 1.0.1, which will be released in July 2013, will add support for Linux and macOS. SDK 1.0 supports the Main profile while SDK 2.0, which will be released in Q4 2013, will add support for the Main 10 profile.
On June 10, 2013, Vanguard Video announced that support for the Main 10 profile was added to their V.265 professional HEVC encoder. V.265 is the first real time HEVC software encoder to support the Main 10 profile.
On June 20, 2013, Imagination Technologies announced their PowerVR Series5 HEVC decoder. The PowerVR D5500 decoder core supports 10-bits per sample video decoding.
On July 19, 2013, Allegro DVT announced that they had improved their HEVC decoder IP by adding support for the Main 10 profile.
On July 23, 2013, VITEC announced the Stradis HDM850+ Professional Decoder Card. The HDM850+ is the first PCIe based card supporting real time HEVC decoding (as well as H.264 and MPEG-2). It decodes and displays HEVC/H.265 clips or streams over 3G-SDI/HDMI video outputs, and supports HEVC/H.265 decoding up to 1080p60 4:2:2 10-bit (Main10 profile at Level 4.1 High Tier).
On July 23, 2013, MulticoreWare released alpha source code for x265, a video encoder application and library for encoding video into an HEVC bitstream. x265 is an open source software available under GNU GPL v2 license.
On August 8, 2013, Nippon Telegraph and Telephone announced the release of their HEVC-1000 SDK software encoder which supports the Main 10 profile, resolutions up to 7680x4320, and frame rates up to 120 fps.
On August 21, 2013, Microsoft released a DirectX Video Acceleration (DXVA) specification for HEVC which supports the Main profile, the Main 10 profile, and the Main Still Picture profile. DXVA 2.0 is required for HEVC decoding to be hardware accelerated and compatible decoders can use DXVA 2.0 for the following operations: bitstream parsing, deblocking, inverse quantization scaling, inverse transform processing, and motion compensation.
On September 4, 2013, Ittiam Systems demonstrated real-time 1080p HEVC encoding and 4K HEVC decoding at IBC 2013. Ittiam's software HEVC encoder on Intel x86 platforms supports UHD resolutions and is capable of real-time broadcast grade encoding of HD 1080p content. The software decoders on Intel and ARM Cortex platforms are capable of performing 4K/UHD real time decoding.
On September 11, 2013, ViXS Systems announced the XCode 6400 SoC which supports 4K resolution at 60 fps, the Main 10 profile of HEVC, and the Rec. 2020 color space.
On September 6, 2013, Thomson Video Networks demonstrated a trial of the HEVC codec for ultra HD transmission, conducted by satellite transmission operator HISPASAT.
On September 11, 2013, NGCodec Inc. announced availability of free 4K HEVC/H.265 test clips.
At the September 12–17, 2013 IBC show in Amsterdam, HEVC was a significant theme – with HEVC technology products being demonstrated by several companies, including Advantech, Allegro DVT, Ateme, Broadcom, Elemental Technologies, Envivio, Ericsson, Fraunhofer HHI, Fujitsu, Haivision, Harmonic, Ittiam, Kontron, Media Excel, NGCodec Inc., NTT-AT, NXP Software, Pace, QuickFire Networks, Rovi/MainConcept, SES, Squid Systems, STMicroelectronics, Tata Elxsi, Technicolor, Telestream, Thomson Video Networks, Vanguard Video, VITEC and VisualOn.
On October 16, 2013, the OpenHEVC decoder was added to FFmpeg.
On October 23, 2013, Ittiam demonstrated its low power HEVC decoder optimized for ARM Mali GPU Compute and ARM Cortex-A processors at ARM TechCon 2013. Ittiam's HEVC decoder has been designed to take advantage of the full capabilities of mobile SoCs; it harnesses the computational power and energy efficiency of GPUs to reduce battery drain.
On October 29, 2013, Elemental Technologies announced support for real-time 4K HEVC video processing. Elemental provided live video streaming of the 2013 Osaka Marathon on October 27, 2013, in a workflow designed by K-Opticom, a telecommunications operator in Japan. Live coverage of the race in 4K HEVC was available to viewers at the International Exhibition Center in Osaka. This transmission of 4K HEVC video in real-time was an industry-first.
On November 14, 2013, DivX developers released information on HEVC decoding performance using an Intel i7 CPU at 3.5 GHz which had 4 cores and 8 threads. The DivX 10.1 Beta decoder was capable of 210.9 fps at 720p, 101.5 fps at 1080p, and 29.6 fps at 4K.
On December 18, 2013, ViXS Systems announced shipments of their XCode 6400 SoC which is the first SoC to support the Main 10 profile of HEVC.
2014
On January 15, 2014, oViCs announced the ViC-1 HEVC decoder which supports the Main 10 profile at up to 4K at 120 fps.
On February 13, 2014, PathPartner Technology Pvt. Ltd announced an HEVC decoder for ARM Cortex-A family processors that takes advantage of the full capabilities of mobile SoCs built on the latest ARM processors.
On April 7, 2014, Vantrix released source code for the f265 HEVC encoder under the BSD license.
On August 5, 2014, Squid Systems announced that its video codec hardware IP was available for licensing.
On August 13, 2014, Ittiam Systems announced the availability of its third generation H.265/HEVC codec with 4:2:2 12-bit support.
On August 19, 2014, Nippon Telegraph and Telephone announced that support for the HEVC range extensions was added to their HEVC-1000 SDK software encoder which can encode video using the Main 4:2:2 12 profile.
On September 5, 2014, the Blu-ray Disc Association announced that the 4K Blu-ray Disc (also known as Ultra HD Blu-ray) specification will support 4K video at 60 fps, High Efficiency Video Coding, the Rec. 2020 color space, high dynamic range, and 10-bit color depth. 4K Blu-ray Disc will have a data rate of at least 50 Mbit/s and may include support for 66/100 GB discs. 4K Blu-ray Disc will be licensed in the spring or summer of 2015 and 4K Blu-ray Disc players have an expected release date of late 2015.
On September 9, 2014, Apple announced the iPhone 6 and iPhone 6 Plus, which support H.265 for FaceTime over cellular.
In October 2014, Nvidia included 4K HEVC hardware encode support on its Geforce GTX 980 video card.
On October 22, 2014, Ambarella announced their S3 SoC that encodes HEVC video at 4K resolution.
On October 31, 2014, Microsoft confirmed that Windows 10 will support HEVC out of the box, according to a statement from Gabriel Aul, the leader of Microsoft Operating Systems Group's Data and Fundamentals Team. Windows 10 Technical Preview Build 9860 added platform level support for HEVC and Matroska.
On November 7, 2014, DivX developers announced that DivX265 version 1.4.21 has added support for the Main 10 profile of HEVC and the Rec. 2020 color space.
On December 10, 2014 Better Portable Graphics was announced, an HEVC-based still image format offering significantly better quality than JPEG at the same file size.
In late December 2014, Google released Android 5.0 "Lollipop", which includes APIs for HEVC.
2015
On January 5, 2015, ViXS Systems announced the XCode 6800 which is the first SoC to support the Main 12 profile of HEVC.
In January, 2015, Intel released a new driver version for its HD Graphics allowing hardware decoding support for HEVC.
On January 22, 2015, Nvidia released their GeForce GTX 960 mainstream video card which includes a fixed function HEVC Main/Main10 hardware decoder.
On March 9, 2015, Nvidia released VDPAU version 1.0 which supports HEVC decoding for the Main, Main 4:4:4, Main Still Picture, Main 10, and Main 12 profiles.
On March 31, 2015, VITEC announced the MGW Ace which was the first 100% hardware-based portable HEVC encoder that provides mobile HEVC encoding.
In June 2015, Nvidia released their Shield gaming/streaming media console, which supports HEVC decode and 4K video.
On 6 June 2015, Microsoft updated the Xbox One to support 10-bit HEVC decoding.
On August 5, 2015, Intel launched Skylake products with full fixed function Main/8bit decoding/encoding and hybrid/partial Main10/10bit decoding.
On August 20, 2015, Nvidia released the GeForce GTX 950 (GM206), which includes a full fixed-function HEVC Main/Main10 hardware decoder like the GTX 960.
On November 10, 2015, Qualcomm detailed the Snapdragon 820 smartphone SoC, the first chip in the series to support 4K HEVC 10-bit decoding.
2016
On April 11, 2016, full HEVC (H.265) support was announced in the newest MythTV version (0.28).
On May 27, 2016, Nvidia released the GeForce GTX 1080 (GP104), which includes a full fixed-function HEVC Main10/Main12 hardware decoder.
On June 10, 2016, Nvidia released the GeForce GTX 1070 (GP104), which includes a full fixed-function HEVC Main10/Main12 hardware decoder.
On July 19, 2016, Nvidia released the GeForce GTX 1060 (GP106), which includes a full fixed-function HEVC Main10/Main12 hardware decoder.
On August 2, 2016, Nvidia released the NVIDIA TITAN X (GP102), which includes a full fixed-function HEVC Main10/Main12 hardware decoder.
On August 30, 2016, Intel officially announced 7th generation Core CPUs (Kaby Lake) products with full fixed-function HEVC Main10 hardware decoding support.
On September 7, 2016, BBC R&D announced and made the source code available for the Turing Codec, an open source HEVC encoder.
On October 25, 2016, Nvidia released the GeForce GTX 1050 Ti (GP107) & GeForce GTX 1050 (GP107), which include a full fixed-function HEVC Main10/Main12 hardware decoder.
On November 6, 2016, Google released the Chromecast Ultra, which features "expanded codec support" for hardware HEVC Main/Main10 decoding.
2017
On January 3, 2017, Intel officially announced 7th generation Core CPUs (Kaby Lake) desktop products with full fixed function HEVC Main10 hardware decoding support.
On March 10, 2017, Nvidia released the GeForce GTX 1080 Ti (GP102), which includes a full fixed-function HEVC Main10/Main12 hardware decoder.
On April 6, 2017, Nvidia released the NVIDIA TITAN Xp (GP102), which includes a full fixed-function HEVC Main10/Main12 hardware decoder.
On May 17, 2017, Nvidia released the GeForce GT 1030 (GP108), which includes a full fixed-function HEVC Main10/Main12 hardware decoder.
On June 5, 2017, Apple announced HEVC and HEIF support in macOS High Sierra and iOS 11.
On August 21, 2017, Intel officially unveiled their 8th generation Core CPUs (Kaby Lake Refresh) mobile products with full fixed function HEVC Main10 hardware decoding support.
On September 19, 2017, Apple released iOS 11 with full HEVC encoding and decoding support. Prior to this, HEVC support was limited to FaceTime.
On September 25, 2017, Apple released macOS High Sierra with HEVC encoding and decoding support.
On September 26, 2017, Microsoft released Windows 10 Fall Creators Update (version 1709), which removed the out-of-the-box HEVC support, to Windows Insiders. When questioned about the removal, a Microsoft employee claimed that it happened because HEVC (and HEIC) files were only supported by Apple devices. A replacement, called “HEVC Video Extension”, was added to Windows Store, at first for free. With a later version, now named “HEVC Video Extensions” (plural form), it became paid software, costing US$0.99. A separate version called “HEVC Video Extensions from Device Manufacturer”, presumably intended for computers with HEVC support in hardware, is still available for free.
On September 28, 2017, GoPro released the Hero6 Black action camera, with 4K60P HEVC video encoding.
On October 5, 2017, Intel officially launched their 8th generation Core CPUs (Coffee Lake) desktop products with full fixed function HEVC Main10 hardware decoding support.
On November 2, 2017, Nvidia released the GeForce GTX 1070 Ti (GP104), which includes a full fixed-function HEVC Main10/Main12 hardware decoder.
On December 11, 2017, Intel officially launched their Pentium Silver & Celeron CPUs (Gemini Lake) desktop & mobile products with full fixed function HEVC Main10 hardware decoding support.
2018
On April 9, 2018, PathPartner Technology Pvt. Ltd. announced a free instance of its HEVC decoder on Amazon Web Services.
On August 28, 2018, Intel officially unveiled their 8th generation Core CPUs (Whiskey Lake & Amber Lake) mobile products with full fixed function HEVC Main10 hardware decoding support.
On September 20, 2018, Nvidia released the GeForce RTX 2080 (TU104), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On September 27, 2018, Nvidia released the GeForce RTX 2080 Ti (TU102), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On October 17, 2018, Nvidia released the GeForce RTX 2070 (TU106), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On October 19, 2018, Intel officially launched their 9th generation Core CPUs (Coffee Lake Refresh) 9900K, 9700K & 9600K desktop products with full fixed function HEVC Main10 hardware decoding support.
2019
On January 15, 2019, Nvidia released the GeForce RTX 2060 (TU106), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On February 22, 2019, Nvidia released the GeForce GTX 1660 Ti (TU116), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On March 14, 2019, Nvidia released the GeForce GTX 1660 (TU116), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On April 23, 2019, Nvidia released the GeForce GTX 1650 (TU117), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On April 23, 2019, Intel officially launched their 9th generation Core CPUs (Coffee Lake Refresh) desktop & mobile products with full fixed function HEVC Main10 hardware decoding support.
On July 9, 2019, Nvidia released the GeForce RTX 2070 Super (TU104) & GeForce RTX 2060 Super (TU106), which include a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On July 23, 2019, Nvidia released the GeForce RTX 2080 Super (TU104), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On August 1, 2019, Intel officially launched their 10th generation Core CPUs (Ice Lake) mobile products with full fixed function HEVC Main10 hardware decoding support.
On August 21, 2019, Intel officially launched their 10th generation Core CPUs (Comet Lake) mobile products with full fixed function HEVC Main10 hardware decoding support.
On October 29, 2019, Nvidia released the GeForce GTX 1660 Super (TU116), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
On November 4, 2019, Intel officially launched their Pentium Silver & Celeron CPUs (Gemini Lake Refresh) desktop & mobile products with full fixed function HEVC Main10 hardware decoding support.
On November 22, 2019, Nvidia released the GeForce GTX 1650 Super (TU116), which includes a full fixed-function HEVC Main 4:4:4 12 hardware decoder.
2020
On April 2, 2020, Intel officially launched their 10th generation Core CPUs (Comet Lake-H series) mobile products with full fixed function HEVC Main10 hardware decoding support.
On April 30, 2020, Intel officially launched their 10th generation Core CPUs (Comet Lake-S series) desktop products with full fixed function HEVC Main10 hardware decoding support.
See also
High Efficiency Video Coding
UHDTV - Digital video formats with resolutions of 4K (3840×2160) and 8K (7680×4320)
Rec. 2020 - ITU-R Recommendation for UHDTV
H.264/MPEG-4 AVC - The predecessor video standard of HEVC
x264 - Free open-source implementation of an H.264 encoder
x265 - Free open-source implementation of an HEVC (H.265) encoder
VP9
References
External links
Fraunhofer Heinrich Hertz Institute HEVC website
Joint Collaborative Team on Video Coding (JCT-VC)
JCT-VC Document Management System
Computer file formats
Graphics file formats
High-definition television
Lossy compression algorithms
MPEG-H
Open standards covered by patents
Ultra-high-definition television
Video compression
Video codecs
Videotelephony
ITU-T recommendations
ITU-T H Series Recommendations
H.26x
IEC standards
ISO standards |
41116142 | https://en.wikipedia.org/wiki/Don%20Dailey | Don Dailey | Don Dailey (March 10, 1956 – November 22, 2013) was an American longtime researcher in computer chess and a game programmer. Along with collaborator Larry Kaufman, he was the author of the chess engine Komodo. Dailey started chess programming in the 1980s, and was the author and co-author of multiple commercial as well as academic chess programs. He has been an active poster in computer chess forums and computer Go newsgroups. He was raised as a Jehovah's Witness and served in recent years as an elder in the church of Roanoke.
In October 2013, Dailey announced the release of Komodo 6, along with news about the future of Komodo in light of his illness, an acute form of leukemia, and introduced Mark Lefler as a new member of the Komodo team. Dailey died of leukemia at the age of 57 on November 22, 2013.
Rex
Rex was Dailey's first chess program in the 1980s, in collaboration with Sam Sloan and Larry Kaufman. It competed at various ACM North American Computer Chess Championships and World Computer Chess Championships. Rex was improved further and marketed as RexChess.
Heuristic software
In the early 1990s, Dailey started to work with chess master and computer chess programmer Julio Kaplan within his company Heuristic Software. The program they developed was called Heuristic Alpha, which later evolved into Socrates, Socrates II and the mass market entry Kasparov's Gambit.
MIT connection
At the ACM 1993 computer chess tournament, which was won by Dailey's program Socrates II on an IBM PC ahead of Cray Blitz, he met Bradley Kuszmaul and Charles Leiserson from MIT competing with StarTech, and they asked him to help develop a new parallel chess program. Some time later when Heuristic went out of business, he began working part-time for Leiserson at the lab at MIT on the new parallel program Star Socrates, beside his duty as official systems administrator. Star Socrates played a strong World Computer Chess Championship 1995 in Shatin, Hong Kong, finally losing the playoff versus Fritz. Dailey continued his cooperation with Charles Leiserson on the massively parallel chess program Cilkchess, written in Cilk.
Corel and Mini
In the 1990s, Dailey also worked with Larry Kaufman on the commercial mass-market entry Corel Chess. Besides competing with Cilkchess, their serial chess program Mini played in the World Computer Chess Championship 1999 in Paderborn.
Doch and Komodo
After a break from computer chess and a few years focusing on other domains, Dailey returned with the 2009/2010 chess program Doch and its successor Komodo, both again joint efforts with Larry Kaufman. In the fall of 2013, a development version of Komodo won Stage 3 of the Thoresen Chess Engines Competition and, shortly after Dailey's death, the final, beating Stage 4 winner Stockfish by a margin of 25–23 in a 48-game match. Finalist Stockfish DD, dedicated to Don Dailey, was officially released during the final, and the commercial Komodo-TCEC a few days later.
References
External links
Don Dailey's ICGA Tournaments
Don Dailey: Chessprogramming wiki
Computerschach, Interview with Don Dailey by Frank Quisinsky, Schachwelt, December 18–20, 2009
Interview with Don Dailey (Komodo programmer), nTCEC interview by Martin Thoresen, April 7, 2013
1956 births
2013 deaths
Deaths from myelodysplastic syndrome
American computer programmers
Deaths from cancer in Virginia
Computer chess people
Deaths from leukemia
People from Kalamazoo, Michigan |
30444145 | https://en.wikipedia.org/wiki/Forest%20informatics | Forest informatics | Forest informatics is the combined science of Forestry and informatics, with a special emphasis on collection, management, and processing of data, information and knowledge, and the incorporation of informatic concepts and theories specific to enrich forest management and forest science; it has a similar relationship to library science and information science.
It is an interdisciplinary science primarily concerned with the collection, classification, manipulation, storage, retrieval and dissemination of information. Information, in this context, includes both human and machine readable documents. Examples of human readable documents include maps, field data sheets, operational schedules, and long term asset management plans with narrative text. Machine readable documents include files for geographic information systems (GIS), Global Positioning Systems (GPS), and other applications like spreadsheets, and relational database management systems.
As in management science, Forest Informatics uses decision support systems, mathematical modeling, statistics, and algorithms from engineering, operations research, computer science, and artificial intelligence to support decision-making activities. Common forestry problems include harvest scheduling, model fitting, optimal sampling, remote sensing, crew assignment, image classification, treatment timing, and log bucking problems, many of which can be formulated as optimization problems (e.g. generalized assignment problem, traveling salesman problem, knapsack problem, job shop scheduling, and vehicle routing problems). The practice includes information processing and the engineering of information systems, decision support systems, geographic information systems, and
global positioning systems. The research field studies the structure, algorithms, behavior, and interactions of natural and artificial systems that store, process, access and communicate information about forested ecosystems.
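Many of these problems reduce to standard combinatorial formulations. As a purely illustrative sketch (not taken from any particular forestry system; the stand names, costs, and values below are hypothetical), a simplified harvest-selection decision can be written as a 0/1 knapsack problem in Python:

```python
def plan_harvest(stands, budget):
    """Pick the subset of stands that maximizes expected value within a budget.

    Solves a 0/1 knapsack by dynamic programming. `stands` is a list of
    (name, cost, value) tuples; `cost` and `budget` are non-negative integers
    in the same units (for example, thousands of dollars).
    """
    # best[b] holds the best (total_value, chosen_names) achievable with cost <= b
    best = [(0, [])] * (budget + 1)
    for name, cost, value in stands:
        updated = best[:]
        for b in range(cost, budget + 1):
            prev_value, prev_chosen = best[b - cost]
            if prev_value + value > updated[b][0]:
                updated[b] = (prev_value + value, prev_chosen + [name])
        best = updated
    return best[budget]

# Hypothetical stands: (name, harvest cost, expected net timber value)
stands = [("Stand A", 40, 70), ("Stand B", 30, 40),
          ("Stand C", 50, 85), ("Stand D", 20, 30)]
value, chosen = plan_harvest(stands, budget=90)
print(value, chosen)  # 155 ['Stand A', 'Stand C']
```

Real harvest-scheduling models add multi-period, adjacency, and sustainability constraints, which is why the field draws on the operations-research methods listed above.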
History
In 1970, J. G. Grevatt wrote an article titled, "Management Information and Computers in Forestry."
In the article, the author describes and discusses different dimensions of management information (i.e. operation, expenditure, location, and time) including the nature of management information and decisions, management information in forestry, the management information system itself, the application of computers, the structure of a computer based system, comparisons between clerical and computer systems, and the impact on the field manager. The author concludes that the use of computers to process management data may be justified on grounds of cost and improved information in organizations of a critical size.
At the time of that article, computers, databases, and geographic information systems were still in their infancy, and tools like today's Global Positioning System had not yet been invented. Management database systems for business were more prevalent. Over the next 30 years, computers became more powerful, smaller, and less expensive; relational database management systems became commonplace in business; querying such systems was standardized with languages like SQL; and faster networks made data and information integration routine. In that time,
geographic information systems that could run on desktop computers and could be customized for various tasks were also developed, but as
separate systems.
Within the last 10 years, specialized fields of study at the university level have been offered at several forestry schools, where students learn the principles of quantification, modeling, and descriptive and predictive analyses of natural resource attributes needed for sound management of forested ecosystems.
At the small forester practitioner level, more software "back-end" programs have become available to model likely forest growth outcomes based on treatment prescriptions. These are provided by private businesses, such as Assisisoft, as well as government agencies, such as the U.S. Forest Service's NED system. The basic functions of the NED system are commonly used among American consulting foresters, as it is free, although a small proportion of that group uses the full modeling capabilities of the software. With the prevalence of smartphone and tablet access, the computing power available in the woods is now much higher than it was just a few years ago, when PocketPC-based systems were prevalent; Microsoft announced it would deprecate the PocketPC platform in March 2013. New "apps", such as Forest Metrix, are now becoming available for foresters and timber cruisers to employ their devices for data collection for later export into more sophisticated software.
Software specifically devoted to analyzing management decisions for forested ecosystems has been developed and used in several large-scale planning projects. For example, the Ecosystem Management Decision Support (EMDS) system is an application framework for knowledge-based decision support of ecological analysis and planning. Open source software solutions have also become more widely accepted, as seen in the expansion of ecological extensions for statistical tools like R. A recent example is the book by Andrew Robinson and Jeff D. Hamann on using R for forest analytics.
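As a rough analogue of the model fitting such tools support (shown here in Python with SciPy rather than R, and with made-up measurements), the snippet below fits a simple asymptotic height-diameter curve to hypothetical tree data:

```python
import numpy as np
from scipy.optimize import curve_fit

def height_model(dbh, a, b):
    # Asymptotic height-diameter curve: 1.3 m breast height plus an
    # exponential rise toward the asymptote `a` (metres).
    return 1.3 + a * (1.0 - np.exp(-b * dbh))

# Hypothetical field measurements: diameter at breast height (cm) and height (m)
dbh = np.array([8, 12, 18, 25, 31, 38, 45, 52], dtype=float)
height = np.array([7.1, 10.4, 14.2, 18.0, 20.5, 22.8, 24.1, 25.0])

(a, b), _ = curve_fit(height_model, dbh, height, p0=(30.0, 0.05))
print(f"asymptote ~ {a:.1f} m, rate ~ {b:.3f} per cm")
print(f"predicted height at 40 cm dbh: {height_model(40.0, a, b):.1f} m")
```

The same curve could of course be fitted in R with nls(); the point is that routine growth-and-yield relationships can be estimated directly from field data with open source tools.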
In 2006, the United Nations declared 2011 to be the International Year of Forests.
Forest Informatics, Inc. has developed a PostgreSQL template, a set of software agents, and a collection of reports, maps, and
data feeds. The application uses an intelligent agent architecture to preemptively generate possible strategic, tactical, and operational solutions for forest managers.
Contributing disciplines
Math
Artificial intelligence
Computer science
Information science
Information theory
Information technology
Biodiversity Informatics
Ecoinformatics
Evolutionary informatics
Geoinformatics
Mathematical logic
Graph theory
Computational geometry
Geographic Information Systems
See also
Environmental informatics
Conservation biology
Farm Forestry Toolbox
Forestknowledge.net
Global Forest Information Service
i-Tree
Sustainability
References
Forest management
Forest modelling
Information science |
6559697 | https://en.wikipedia.org/wiki/Comparison%20of%20X%20Window%20System%20desktop%20environments | Comparison of X Window System desktop environments | A desktop environment is a collection of software designed to give functionality and a certain look and feel to an operating system.
This article applies to operating systems which are capable of running the X Window System, mostly Unix and Unix-like operating systems such as Linux, Minix, illumos, Solaris, AIX, FreeBSD and Mac OS X. Microsoft Windows is incapable of natively running X applications; however, third-party X servers like Cygwin/X, Exceed, or Xming are available.
Technical elements of a desktop environment
A desktop environment (DE) can be broken up into several components that function independently and interact with one another to provide the look and feel and functionality of the desktop environment. A fundamental part of a DE is the window manager or WM. A window manager creates a certain way for application windows to present themselves to the user. It manages the various application windows, keeping track of which ones are open and providing features to switch between them. Another important element of a DE is the file manager. This application manages files and folders and presents them in a way that the user finds convenient. It provides file operations like viewing, copying or moving, changing permissions and deleting. DEs usually provide utilities to set wallpapers and screensavers, display icons on the desktop, and perform some administrative tasks. They may optionally include word processors, CD/DVD writing applications, web browsers and e-mail clients.
There are some exceptions: window managers like Fluxbox, wmii and Ratpoison operate independently of a desktop environment and were written with this objective in mind. Additional hand-picked applications add functionality such as a panel and volume management, which gives them some of the qualities of a full DE. This contrasts with the behaviour of WMs like Metacity and KWin, which were not written with the objective of operating independently of a DE.
KDE Software Compilation and GNOME are written almost completely on the Qt and GTK software libraries, respectively. This usually means that virtually every component of the desktop environment, including the file manager, explicitly depends on that library for its functioning.
Notably, nothing prevents the user from installing any number of software libraries of their choice. In practice, software written on major libraries can be run under any desktop environment. Running a package designed for one desktop (which essentially means that it's written using the same libraries as the desktop itself is) within a different desktop can be visually displeasing, as well as incurring the RAM penalty of loading libraries that wouldn't otherwise be required.
Some of the differences which can influence the choice of desktop environment are:
Look and feel of the desktop environment. The user will be more comfortable with a certain look and feel that they may or may not already be familiar with.
Flexibility and configurability of the desktop environment. A sophisticated user might want a highly configurable desktop environment to make the desktop environment work the way they want. A beginning user might just want an easy-to-use environment to which they will adjust.
Personal preferences for choice of software, which has two aspects:
Each desktop environment comes packaged with various default software and various "ways things are done" under that desktop. A casual user might like a highly integrated graphical interface to change various settings while a more experienced user might prefer to use individual configuration utilities or even CLI tools.
Desktops are also often closely tied into various major functional components of the desktop manager (example: file manager, browser, word processor); whilst "mix and match" is possible, it is generally pleasing to make choices which result in a consistent look and feel of programs under the chosen desktop environment. Making choices based on what software integrates with a chosen desktop environment necessarily limits the weight that can be given to other application features.
Desktop comparison information
Overview
Default programs packaged
This table shows basic information on the programs distributed with some desktop environments for the X Window System.
Note that Razor-qt has become LXQt, a port of LXDE to the Qt framework.
Comparison of ease of use and stability
GNOME's graphical file manager Files (Nautilus) is intended to be very easy to use and has many features. KDE's file manager Dolphin is described as focused on usability. Prior to KDE version 4, the KDE project's standard file manager was Konqueror, which was also designed for ease of use.
Both GNOME and KDE come with many graphical configuration tools, reducing the need to manually edit configuration files for new users. They have extensive bundled software such as graphical menu editors, text editors, audio players, and software for doing administrative work. All applications installed in most distributions are automatically added to the GNOME and KDE menus. No major configuration changes are necessary to begin working. However, by using graphical tools, the extent to which the desktops can be configured is determined by the power provided by those tools.
Compatibility and interoperability issues
Some desktop environments and window managers claim that they support applications made for other desktop environments explicitly. For example, Fluxbox states KDE support in its feature list. Using software made specifically for the desktop environment in use or window manager agnostic software is a way to avoid issues. For software developers, the Portland Project has released a set of common interfaces that allows applications to integrate across many desktop environments.
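One well-known outcome of the Portland Project is the xdg-utils collection of command-line tools. As a minimal illustration (assuming xdg-utils is installed, as it is on most Linux desktops), a program can hand a file or URL to whatever desktop environment is running instead of hard-coding a GNOME- or KDE-specific helper:

```python
import os
import shutil
import subprocess

def open_with_desktop(target: str) -> None:
    """Open a file or URL with the user's preferred application.

    xdg-open (part of xdg-utils) dispatches to the handler of the running
    desktop environment, so the caller does not need to know whether the
    user is on GNOME, KDE, Xfce or something else.
    """
    if shutil.which("xdg-open") is None:
        raise RuntimeError("xdg-utils does not appear to be installed")
    subprocess.run(["xdg-open", target], check=True)

if __name__ == "__main__":
    # Purely informational: the running desktop can be inspected, but the
    # call below works the same way regardless of its value.
    print("Current desktop:", os.environ.get("XDG_CURRENT_DESKTOP", "unknown"))
    open_with_desktop("https://www.freedesktop.org/")
```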
System resources utilization
A 2011 test by Phoronix with the default installation of Ubuntu 10.04 showed that LXDE 0.5's memory utilization was lower than that of Xfce 4.6, which in turn was lower than that of GNOME 2.29, with KDE 4.4 using the most RAM compared to the aforementioned desktops.
In 2015, it was demonstrated in benchmarks that LXDE performed slightly faster than Xfce overall (in the average of all tests), using the Fedora Linux operating system.
See also
Comparison of X window managers
Comparison of file managers
Croquet Project
DistroWatch – a website containing information on several hundred distributions
freedesktop.org
Minimalism (computing)
Software bloat
References
External links
Best Linux desktop of 2018 TechRadar
Fedora 24: Comparing Gnome, KDE Plasma, Cinnamon, MATE, Xfce, LXDE ZDNet
Freedom of choice: 7 top Linux desktop environments compared PC World
11 Best Linux Desktop Environments And Their Comparison | 2018 Edition fossbytes.com
The 10 Best Linux Desktop Environments lifewire.com
7 Best Desktop Environments For Linux itsfoss.com
What is the difference between Gnome, KDE, Xfce & LXDE pclosmag.com
Should You Use a Window Manager as Your Desktop Environment? makeuseof.com
Six Popular Linux Desktop Environments techspot.com
10 Best and Most Popular Linux Desktop Environments of All Time tecmint.com
5 Best Linux Desktop Environments With Pros & Cons linuxandubuntu.com
The 8 Best Ubuntu Desktop Environments (18.04 Bionic Beaver Linux) linuxconfig.org
Best New Linux Desktop Environments Datamation
6 reasons why GNOME is still the best Linux desktop environment opensource.com
Best Linux Desktop Environments for 2016 linux.com
WTF Desktop Environments: GNOME, KDE, and More Explained Lifehacker
A visual history of OS desktop environments NetworkWorld
X Window System desktop environments |
22086683 | https://en.wikipedia.org/wiki/FPGA%20prototyping | FPGA prototyping | Field-programmable gate array prototyping (FPGA prototyping), also referred to as FPGA-based prototyping, ASIC prototyping or system-on-chip (SoC) prototyping, is the method to prototype system-on-chip and application-specific integrated circuit designs on FPGAs for hardware verification and early software development.
Verification methods for hardware design as well as early software and firmware co-design have become mainstream. Prototyping SoC and ASIC designs with one or more FPGAs and electronic design automation (EDA) software has become a good method to do this.
Why prototyping is important
Running a SoC design on FPGA prototype is a reliable way to ensure that it is functionally correct. This is compared to designers only relying on software simulations to verify that their hardware design is sound. About a third of all current SoC designs are fault-free during first silicon pass, with nearly half of all re-spins caused by functional logic errors. A single prototyping platform can provide verification for hardware, firmware, and application software design functionality before the first silicon pass.
FPGA prototyping reduces time-to-market (TTM): in today's technology-driven society, new products are introduced rapidly, and failing to have a product ready for a given market window can cost a company a considerable amount of revenue. If a product is released too late for its market window, the product could be rendered useless, costing the company its investment capital. After the design process, FPGAs are ready for production, while standard-cell ASICs take more than six months to reach production.
Development cost: The development cost of a 90-nm ASIC/SoC design tape-out is around $20 million, with a mask set alone costing over $1 million. Development costs of 45-nm designs are expected to top $40 million. With the increasing cost of mask sets and the continuing decrease of IC feature size, minimizing the number of re-spins is vital to the development process.
Design for prototyping
Design for prototyping (DFP) refers to designing systems that are amenable to prototyping. Many of the obstacles facing development teams who adopt FPGA prototypes can be distilled down to three "laws":
SoCs are larger than FPGAs
SoCs are faster than FPGAs
SoC designs are FPGA-hostile
Putting a SoC design into an FPGA prototype requires careful planning in order to accomplish prototyping goals with minimal effort. To ease the development of the prototype, a set of best practices called Design-for-Prototyping influences both the SoC design style and the project procedures applied by design teams. Procedural recommendations include adding DFP conventions to RTL coding standards, employing a prototype-compatible simulation environment, and instituting a system debug strategy jointly with the software team.
Partitioning issues
Due to increased circuit complexity and shrinking time-to-market, the need for verification of application-specific integrated circuit (ASIC) and system-on-chip (SoC) designs is growing. Hardware platforms are becoming more prominent amongst verification engineers because they can test system designs at speed with on-chip bus clocks, whereas simulation clocks may not provide an accurate reading of system behavior. These multi-million-gate designs are usually placed on a multi-FPGA prototyping platform with six or more FPGAs, since they are unable to fit entirely onto a single FPGA. The fewer FPGAs the design has to be partitioned across, the less effort is required from the design engineer.
System RTL designs or netlists must be partitioned across the FPGAs to fit the design onto the prototyping platform. This introduces new challenges for the engineer, since manual partitioning requires tremendous effort and frequently results in poor speed of the design under test. If the number of partitions can be reduced, or the entire design can be placed onto a single FPGA, implementing the design on the prototyping platform becomes easier.
Balance FPGA resources while creating design partitions
When creating circuit partitions, engineers should first observe the available resources the FPGA offers, since the design will be placed onto the FPGA fabric. The architecture of each FPGA depends on the manufacturer, but the main goal in design partitioning is an even balance of FPGA resource utilization. FPGA resources include lookup tables (LUTs), D flip-flops, block RAMs, digital signal processors (DSPs), clock buffers, etc. Prior to balancing the design partitions, it is also valuable to perform global logic optimization to remove any redundant or unused logic. A typical problem with creating balanced partitions is that it may lead to timing or resource conflicts if the cut crosses many signal lines. For a fully optimized partitioning strategy, the engineer must consider issues such as timing/power constraints and placement and routing while still maintaining a balanced partition amongst the FPGAs; focusing strictly on a single issue during partitioning may create several problems in other areas.
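A toy sketch of the balancing step is shown below; the block names and LUT counts are hypothetical, and real partitioning tools also weigh timing, block RAM, DSP usage and the number of cut signals. It simply assigns the largest remaining block to the least-loaded FPGA, the classic greedy heuristic for multiway number partitioning:

```python
def balance_partitions(blocks, fpga_luts, num_fpgas):
    """Greedily assign (name, lut_count) blocks to FPGAs to balance LUT use."""
    partitions = [{"blocks": [], "luts": 0} for _ in range(num_fpgas)]
    # Largest-first assignment keeps the utilization spread small.
    for name, luts in sorted(blocks, key=lambda blk: blk[1], reverse=True):
        target = min(partitions, key=lambda p: p["luts"])
        if target["luts"] + luts > fpga_luts:
            raise ValueError(f"{name} does not fit; more FPGAs or a finer partition needed")
        target["blocks"].append(name)
        target["luts"] += luts
    return partitions

# Hypothetical SoC blocks and their synthesized LUT counts
blocks = [("cpu_cluster", 420_000), ("gpu", 380_000), ("ddr_ctrl", 150_000),
          ("video_codec", 260_000), ("noc", 90_000), ("peripherals", 60_000)]
for i, part in enumerate(balance_partitions(blocks, fpga_luts=600_000, num_fpgas=3)):
    print(f"FPGA {i}: {part['luts']:,} LUTs -> {part['blocks']}")
```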
Placing and routing partitions
To achieve optimal place and route for partitioned designs, the engineer must focus on FPGA pin count and inter-FPGA signals. After the design is partitioned into separate FPGAs, the number of inter-FPGA signals must not exceed the pin count of each FPGA. This is very difficult to avoid when circuit designs are immense, so signals must use strategies such as time-division multiplexing (TDM), in which multiple signals are transferred over a single line. These multiplexed signals, called sub-channels, take turns being transferred over the line in assigned time slots. When the TDM ratio is high, the bus clock frequency has to be reduced to accommodate a time slot for each sub-channel, and reducing the clock frequency hinders the throughput of the system.
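The trade-off can be estimated before committing to a partition. The back-of-the-envelope sketch below uses hypothetical numbers and a deliberately simplified overhead model to compute the TDM ratio required when the cut signals exceed the available pins, and the resulting ceiling on the design clock:

```python
import math

def tdm_plan(cut_signals, available_pins, io_clock_mhz, overhead_cycles=2):
    """Estimate the TDM ratio and achievable design clock for one FPGA hop.

    cut_signals     -- inter-FPGA signals crossing the partition boundary
    available_pins  -- physical pins usable for this interface
    io_clock_mhz    -- clock rate of the serializing I/O logic
    overhead_cycles -- extra I/O cycles per word for framing/synchronization
    """
    ratio = math.ceil(cut_signals / available_pins)   # sub-channels per pin
    # Each design-clock cycle must fit `ratio` time slots plus the overhead
    # on the faster I/O clock, which caps the design clock.
    design_clock_mhz = io_clock_mhz / (ratio + overhead_cycles)
    return ratio, design_clock_mhz

ratio, fmax = tdm_plan(cut_signals=3200, available_pins=400, io_clock_mhz=200)
print(f"TDM ratio {ratio}:1, design clock limited to about {fmax:.1f} MHz")
```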
Timing requirements
System designs usually encompass several clock domains, with signals traversing separate domains. On-board clock oscillators and global clock lines usually mitigate these issues, but sometimes these resources are limited or do not fulfill all design requirements. Internal clocks should be implemented within FPGA devices, since clock lines and clock buffer connections between FPGAs are limited. Internally clocked designs that are partitioned across multiple FPGAs should replicate the clock generator within each FPGA, ensuring low clock skew between inter-FPGA signals. In addition, any gated-clock logic should be transformed into clock enables to reduce skew while operating at high clock frequencies.
Clock domain crossings should not be split across separate FPGAs. Signals passing through a crossing should be kept internal to a single FPGA, since the added delay between FPGAs can cause problems in the receiving domain. It is also recommended that signals routed between FPGAs be clocked into registers.
Debugging
One of the most difficult and time-consuming tasks in FPGA prototyping is debugging system designs. The term coined for this is "FPGA hell". Debugging has become more difficult and time-consuming with the emergence of large, complex ASICs and SoC designs. To debug an FPGA prototype, probes are added directly to the RTL design to make specific signals available for observation, synthesized and downloaded to the FPGA prototype platform.
A number of standard debugging tools are offered by FPGA vendors including ChipScope and SignalTAP. These tools can probe a maximum of 1024 signals and require extensive LUT and memory resources. For SoC and other designs, efficient debugging often requires concurrent access to 10,000 or more signals. If a bug is not able to be captured by the original set of probes, gaining access to additional signals results in a “go home for the day” situation. This is due to long and complex CAD flows for synthesis and place and route that can require from 8 to 18 hours to complete.
Improved approaches include tools like Certus from Tektronix or EXOSTIV from Exostiv Labs.
Certus brings enhanced RTL-level visibility to FPGA-based debugging. It uses a highly efficient multi-stage concentrator as the basis for its observation network to reduce the number of LUTs required per signal to increase the number of signals that can be probed in a given space. The ability to view any combination of signals is unique to Certus and breaks through one of the most critical prototyping bottlenecks.
EXOSTIV uses large external storage and gigabit transceivers to extract deep traces from an FPGA running at speed. The improvement lies in its ability to capture long traces in time, as a continuous stream or in bursts. This enables exploring extended debugging scenarios that cannot be reached by traditional embedded instrumentation techniques. The solution claims to save both FPGA I/O resources and FPGA memory at the expense of gigabit transceivers, for an improvement in visibility by a factor of 100,000 or more.
See also
Hardware emulation
Prototype
SystemC
System on a chip
References
External links
FPGA Prototyping Solutions
S2C Rapid Prototyping Solutions
Synopsys HAPS Family
proFPGA Prototyping Boards
HyperSilicon Prototyping Boards
FPGA Prototyping Docs & Papers
FPGA-Based Prototyping Methodology Manual (FPMM) - co-authored with Xilinx
Gate arrays |
30250873 | https://en.wikipedia.org/wiki/2011%20in%20the%20United%20States | 2011 in the United States | Events in the year 2011 in the United States.
Incumbents
Federal government
President: Barack Obama (D–Illinois)
Vice President: Joe Biden (D–Delaware)
Chief Justice of the Supreme Court: John Roberts (New York)
Speaker of the House of Representatives: Nancy Pelosi (D–California) (until January 3), John Boehner (R–Ohio) (since January 5)
Senate Majority Leader: Harry Reid (D–Nevada)
Congress: 111th (until January 3), 112th (starting January 3)
Events
January
January 3
According to Dr. Daniel Haber, chief of Massachusetts General Hospital's cancer center, much broader detection of metastatic cancer becomes possible using a screening method that can find cancer cells in peripheral blood. Further, the method appears to be a sound process for monitoring the progress of intervention and thereby modifying the treatment protocol.
Lawmakers in 14 states (Alabama, Arizona, Delaware, Idaho, Indiana, Michigan, Mississippi, Montana, Nebraska, New Hampshire, Oklahoma, Pennsylvania, Texas and Utah) announce plans to curtail application of parts of the 14th Amendment in their respective states.
Wisconsin becomes the 22nd state to sue the federal government over the Patient Protection and Affordable Care Act.
January 6 – The US Constitution is read aloud on the floor of the US House of Representatives for the first time in history.
January 7 – Oklahoma and Wyoming join the other 22 states suing the federal government over the Patient Protection and Affordable Care Act.
January 8 – 2011 Tucson shooting: In Tucson, Arizona, a gunman opens fire at a constituent meeting led by U.S. Representative Gabby Giffords injuring 14, including Giffords, and killing 6, including U.S. Federal Judge John Roll. The primary suspect, Jared Lee Loughner, is taken into custody.
January 10
Former Republican United States House of Representatives Majority Leader Tom DeLay is sentenced to three years in prison for money laundering.
In college football, the #1 Auburn Tigers defeat the #2 Oregon Ducks to win the 2011 BCS National Championship Game by a score of 22–19.
January 11 – Ohio becomes the 25th state to sue the federal government over the Patient Protection and Affordable Care Act.
January 12 – Kansas and Maine join the other 25 states suing the federal government over the Patient Protection and Affordable Care Act.
January 18 – U.S. President Barack Obama begins a four-day meeting with Chinese President Hu Jintao.
January 19
Kermit Gosnell, his wife and eight staff members at his Philadelphia abortion clinic are arrested in connection with murders of babies, manslaughter of a patient and prescription drug charges. Prosecutors alleged that Gosnell and others killed babies at the clinic by severing their spinal cords with scissors.
The US House votes to repeal the Patient Protection and Affordable Care Act with a vote of 245–189.
January 20 – In a study hailed as a step toward an eventual cure for AIDS, researchers demonstrate a new technique that renders T-cells resistant to HIV.
January 25 – U.S. President Barack Obama delivers his 2011 State of the Union Address.
January 31 – Florida federal judge Roger Vinson rules that the Patient Protection and Affordable Care Act is unconstitutional because of the individual mandate it contains.
January 31–February 2 – A blizzard dumps heavy snow across the Midwestern United States, causing at least 24 storm-related deaths.
February
February 2 – The US Senate blocks a repeal of the Patient Protection and Affordable Care Act with a vote of 51–47.
February 6
NASA's STEREO satellites obtain the first simultaneous images of the entire surface of the Sun.
Super Bowl XLV between the Green Bay Packers and the Pittsburgh Steelers becomes the most watched television program in US history at 111 million viewers. The Packers defeat the Steelers 31–25.
February 7 – AOL purchases online publisher The Huffington Post in a $315 million deal.
February 14
President Obama proposes a federal budget for fiscal year 2012. Overall the proposal reduces expenses but also increases funding for some programs and still results in an annual deficit of more than $1 trillion.
The House approves the extension of some parts of the controversial Patriot Act until December.
Disney Channel's daily morning program block for preschoolers, Playhouse Disney, rebrands as Disney Junior, part of the network's plan to establish Disney Junior as a stand-alone network in 2012 (replacing SOAPNet).
February 14–16 – The quiz show Jeopardy! airs the victory of IBM's artificial intelligence program Watson over two of the show's most successful contestants.
February 15 – The Senate approves the same extension of some parts of the controversial Patriot Act until December.
February 17 – Amidst large demonstrations in Wisconsin over a controversial bill (the bill intends to reduce spending on most government employees and remove their collective bargaining rights apart from restricted wage negotiation), 14 Wisconsin Democratic senators flee the state to delay the vote on the bill by preventing a quorum in the senate.
February 20 – 2011 Daytona 500 is won by the Wood Brothers Racing team entrant Trevor Bayne, who became the youngest winner of the race. Carl Edwards was second ahead of David Gilliland.
February 22 – Chicago mayoral election, 2011: Former White House Chief of Staff Rahm Emanuel wins the race for mayor with more than 55% of the vote. He will succeed Mayor Richard M. Daley in May.
February 24 – STS-133: Space Shuttle Discovery launches from Kennedy Space Center for the final time, carrying the Permanent Multipurpose Module to the International Space Station.
February 27
The 83rd Academy Awards, hosted by James Franco and Anne Hathaway, are held at Kodak Theatre in Hollywood. Tom Hooper's The King's Speech wins four awards out of 12 nominations, including Best Picture and Best Director. Christopher Nolan's Inception also wins four awards. The telecast garners 37.9 million viewers.
Frank Buckles, America's last surviving World War I veteran and one of only three verified surviving veterans of the war worldwide, dies at the age of 110. Buckles, who lived in West Virginia, served in Europe as an ambulance driver for 11 months until the war's end in November 1918.
March
March 1 – The U.S. House of Representatives passes a small spending bill that funds the federal government until March 18 and cuts $4 billion in spending, averting a potential government shutdown.
March 3
Serena Williams' spokeswoman confirmed that Williams had suffered from a life-threatening pulmonary embolism.
The U.S. Supreme Court makes an 8–1 decision that the controversial protests of the Westboro Baptist Church at fallen US military members' funerals are a form of protected speech under the First Amendment.
The U.S. Senate passes the same small spending bill that funds the federal government through March 18 and cuts $4 billion in spending.
March 9
Space Shuttle Discovery lands at the Shuttle Landing Facility in Florida on its final mission, STS-133. The vehicle clocked 365 days in orbit during its 27-year career, beginning with STS-41-D in August 1984.
Governor of Illinois Pat Quinn signs legislation abolishing the state's death penalty and commutes the death sentences of the fifteen inmates on Illinois' death row to life imprisonment without the possibility of parole.
The Wisconsin Senate approves a bill that ends most collective bargaining rights for nearly all unions; it was able to pass the legislation without a quorum by removing the budget oriented parts of it (a quorum would have necessitated the presence of at least one of the absent Democratic members).
The world's largest bond fund, Pimco, announces it is dumping all of its U.S. government-related securities, including U.S. Treasurys and agency debt.
March 10 – The Wisconsin State Assembly passes the law that restricts bargaining rights for unions in a 53–42 vote.
March 11 – Following the Tōhoku earthquake and tsunami, the Pacific Tsunami Warning Center issues a tsunami warning to parts of the U.S. West Coast along the affected coastal areas in Alaska, Hawaii and the U.S. Territories in the Pacific Ocean.
March 15 – The U.S. House of Representatives passes another small spending bill, avoiding the U.S. government shutdown until April 8.
March 16 – Wholesale food prices rose 3.9% in February, the largest monthly increase since November 1974. Some economists warned that food prices would continue to rise.
March 17
The House cuts all federal funding for NPR.
The US Senate passes a small spending bill, avoiding a government shutdown until April 8.
March 18 – NASA's MESSENGER spacecraft becomes the first spacecraft to enter orbit around Mercury.
March 19 – In light of the continuing attacks on Libyan rebels by Gaddafi forces, military intervention authorized under UNSCR 1973 began as French fighter jets flew reconnaissance flights over Libya. United States Navy ships were said to be preparing for bombardment of Libyan air defenses.
March 21
AT&T announces plans to buy T-Mobile for $39 billion. If allowed by the Federal Communications Commission, AT&T would become the largest US phone carrier, surpassing Verizon Wireless. If allowed, the number of major US phone carriers would decrease from 4 to 3, leaving AT&T, Verizon and Sprint.
March 24 – According to a landmark study in The New England Journal of Medicine, an orally administered Takeda Pharmaceutical drug called pioglitazone, marketed as Actos, shows 72 percent effectiveness at preventing the development of type 2 diabetes in pre-diabetic participants. Ralph DeFronzo, M.D., study leader and professor in the School of Medicine and chief of the diabetes division at The University of Texas Health Science Center in San Antonio, stated: "It's a blockbuster study. The 72 percent reduction is the largest decrease in the conversion rate of pre-diabetes to diabetes that has ever been demonstrated by any intervention, be it diet, exercise or medication."
March 25 – Archaeologists report that they have found new artifacts at an archaeological site in Texas indicating human presence in the Americas about 15,500 years ago – around 2,000 years before the Clovis culture, which until recently was considered the first human culture in North America.
March 29 – More than 1.5 million web sites around the world were reported to have been infected by the LizaMoon SQL injection attack, which spread scareware. Security researchers advised users who encountered the attack's pop-up windows to close them from the task manager rather than interacting with them.
March 31
Because of federal budget woes and a general migration of information from printed to digital formats, starting in April 2011 most U.S. workers will no longer receive their annual Social Security benefit estimates in the mail. "In light of the current budget situation, we are suspending the mailing of the annual statements beginning in April", said Social Security spokesman Mark Lassiter. Congress has failed to agree on a budget for the current fiscal year, which means that most federal agencies, including the Social Security Administration, are operating at last year's spending levels. The annual statement, which contains a summary of an individual's earnings history and estimated retirement benefits at various ages, generally arrives about three months before the worker's birth month. "So folks born in July will likely be the first ones who won't get the annual statement", Lassiter said. Workers can still get an estimate of their projected retirement benefits based on their actual work history at www.ssa.gov/estimator.
A data breach at one of the world's largest providers of marketing-email services, Dallas-based Epsilon, a subsidiary of Alliance Data Systems Corporation, may have enabled unauthorized people to access the names and email addresses for customers of major financial-services, retailing and other companies, (Citigroup Inc., J.P. Morgan Chase & Co., Barclays PLC, U.S. Bancorp, Capital One Financial Corp., Walgreen Co., New York & Co., Kroger Co., Brookstone, McKinsey & Co., Marriott International Inc., Ritz-Carlton and TiVo Inc.).
April
April 3 – Crystal Mangum, the false accuser in the Duke lacrosse case, is arrested after repeatedly stabbing her boyfriend, Reginald Daye.
April 4
The U.S. Supreme Court upholds Arizona School Vouchers in a 5–4 ruling.
In men's college basketball, the UConn Huskies defeat the Butler Bulldogs to win the 2011 NCAA Men's Division I Basketball Tournament.
April 5
Cuba and its partners announce plans to drill for oil in Cuban waters in The Gulf of Mexico.
In women's college basketball, the Texas A&M Aggies defeat the Notre Dame Fighting Irish to win the 2011 NCAA Women's Division I Basketball Tournament.
April 6 – A United States Navy F/A-18 crashes near Naval Air Station Lemoore in California, killing both crew members.
April 8 – President Obama, House Republicans and Senate Democrats agree on a week-long stopgap spending bill preventing a government shutdown resulting from a failure to pass the 2011 federal budget.
April 10 – 2011 Masters Tournament: South African Charl Schwartzel won the 2011 event by two strokes over Adam Scott and Jason Day.
April 13
An Air France Airbus A380, operating as Air France Flight 007, collides with a Comair Bombardier CRJ-700, operating as Comair flight 553/Delta Connection Flight 6293 in Delta Connection livery, on a taxiway at John F. Kennedy International Airport in New York City. The double-deck Airbus A380 is the world's largest commercial passenger jet. The A380 has 520 people on board, and the smaller plane 66. There are no injuries. The incident brings into question the spatial taxiway requirements for the new large A380's wingspan on existing airport taxiways.
Reginald Daye dies 10 days after being repeatedly stabbed by Crystal Mangum, the false rape accuser in the Duke lacrosse case.
April 14–16 – A tornado outbreak and severe thunderstorms kill at least 43 people across the Southern United States, with fatalities occurring in Oklahoma, Arkansas, Alabama, North Carolina and Virginia. It is the deadliest U.S. tornado outbreak to occur in three years.
April 15 – Rio is released in theaters.
April 18 – Standard & Poor's downgrades its outlook on long-term sovereign debt of the United States from stable to negative for the first time in history, citing "very large budget deficits and rising government indebtedness". A statement from Standard & Poor's explained its reasoning: "We believe there is a material risk that U. S. policy-makers might not reach an agreement on how to address medium- and long-term budgetary challenges by 2013; if an agreement is not reached…this would…render the U.S. fiscal profile meaningfully weaker than [its peers]". The change raised the possibility of the US losing its AAA credit rating.
April 25–April 28 – The most active tornado outbreak in United States history kills 339 people across the Southeastern United States, becoming the third deadliest tornado outbreak in United States history, falling behind the Tupelo-Gainesville tornado outbreak of April 1936 and the outbreak that produced the Tri-State Tornado of March 1925.
April 27
Responding to continued coverage by the mainstream media of Barack Obama citizenship conspiracy theories, President Obama releases his long-form birth certificate, which shows that he was born on August 4, 1961, in Honolulu, Hawaii.
In an unprecedented meeting with reporters, the U.S. Federal Reserve chairman Ben Bernanke states that he expects less economic growth for 2011 as the economy has been weaker in recent months than he had thought it would be. Bernanke refused to speculate on when he would discontinue with The Federal Reserve's monetary stimulus policy, known as quantitative easing.
Eight American troops and one contractor are shot and killed by an Afghan National Army Air Force pilot. Five Afghan soldiers were also wounded in the attack, for which the Taliban claimed responsibility.
May
May 2
U.S. President Barack Obama announces in a media statement that Osama bin Laden, the founder and leader of the militant group Al-Qaeda and the most-wanted fugitive on the U.S. list, was killed by U.S. forces during an American military operation in Pakistan and that his body is in U.S. custody.
bin Laden's body, which was handled in accordance with Islamic practice and tradition, is buried by the U.S. forces at sea less than a day after his death, thus preventing a burial site from becoming a "terrorist shrine".
In order to save the city of Cairo, Illinois from severe flooding, the Army Corps of Engineers blows up a levee on the Missouri side of the Mississippi River, flooding large areas of farmland and forcing some residents from their homes. The issue went all the way to the Supreme Court.
May 6 – Thor, directed by Kenneth Branagh, is released by Marvel Studios as the fourth film of the Marvel Cinematic Universe (MCU).
May 7 – Jockey John R. Velazquez wins the 2011 Kentucky Derby riding Animal Kingdom.
May 8 – Mississippi River flooding worsens, killing 15 people in addition to the 337 killed in preceding storms, with the Army Corps of Engineers saying an area between Simmesport, Louisiana and Baton Rouge would be inundated by 20–30 feet of water.
May 10 – 360,000 Citigroup credit card accounts are hacked.
May 12 – Plans to install prismatic glass on the base of One World Trade Center are cancelled.
May 13 – The federal government predicts that the Medicare hospital fund will run out in 2024, five years earlier than the 2029 previously projected. They also predicted that the Social Security trust fund would run out in 2036, instead of the 2037 previously projected.
May 14
The Morganza Spillway on the Mississippi River is opened for the second time in its history, deliberately flooding parts of rural Louisiana and placing three nuclear power plants at risk in order to save most of Baton Rouge and New Orleans.
The President of the International Monetary Fund and candidate for President of France, Dominique Strauss-Kahn, is charged with raping a maid in a New York City hotel room.
May 16
STS-134: Space Shuttle Endeavour is launched for the final time at 8:56 a.m. EDT.
The U.S. Supreme Court makes a controversial 8–1 decision that warrantless searches under exigent circumstances do not violate the Fourth Amendment when police reasonably believe evidence is about to be destroyed. Writing for the majority, Associate Justice Samuel Alito said that citizens are under no obligation to respond when law enforcement knocks at the door or, if they do open the door, to allow the police to come in. In cases where no exigent circumstances exist, police officers who desire entry would have to persuade a judge to issue a search warrant. But Alito said, "Occupants who choose not to stand on their constitutional rights but instead elect to attempt to destroy evidence [had] only themselves to blame."
Congress is currently considering whether and by how much to extend the debt ceiling again. In a May 16, 2011 letter to Congress, U.S. Treasury Secretary Timothy Geithner declared a "debt issuance suspension period," which provides the Secretary with certain extraordinary authorities to prevent a breach of the debt limit. Geithner had previously sent letters to Congress requesting an increase in the debt ceiling on January 6, April 4 and May 2.
May 19
In Pennsylvania, teenager Angela Marinucci becomes the first of the "Greensburg Six" to be convicted of the murder of Jennifer Daugherty, a mentally disabled woman who was tortured and murdered in February 2010.
During a speech in support of the Arab Spring, Obama stated that a resolution to the Israeli–Palestinian conflict would involve creation of a Palestinian state based on the pre-1967 borders.
May 20
During a meeting between U.S. President Barack Obama and Israeli Prime Minister Benjamin Netanyahu at the White House, Netanyahu emphasizes that Israel would not make a full withdrawal to the pre-1967 borders, as Obama had requested the previous day, because those borders are not defensible.
Travel on the Mississippi River is closed for five miles (8 km) near the US city of Baton Rouge, Louisiana due to flooding.
WWE wrestler Randy Savage dies of a heart attack in Seminole, Florida when he loses control of his Jeep Wrangler and crashes into a tree; he was 58 years old.
May 21
U.S. businessman Herman Cain announces that he will be seeking the Republican Party nomination in the 2012 U.S. presidential election.
The Minnesota House of Representatives votes to put a constitutional referendum on marriage before voters in the US state of Minnesota.
May 22 – A tornado touched down in Joplin, Missouri, causing widespread damage. 158 were killed and 1,150 were injured, making it the deadliest U.S. tornado in 64 years.
May 23 – The U.S. Supreme Court makes a controversial 5–4 decision that court-mandated population limit was necessary to remedy a violation of prisoners’ Eighth Amendment constitutional rights (United States Constitution's prohibition against cruel and unusual punishment). The court requires that there be a controversial prisoner reduction plan forced on California prison administrators whereby the state reduces its inmate population by tens-of-thousands to ease overcrowding. Writing for the majority, Associate Justice Anthony Kennedy said that "after years of litigation, it became apparent that a remedy for the constitutional violations would not be effective absent a reduction in the prison system population."
May 25
Jared Loughner, the man charged with the 2011 Tucson shooting, is found by a federal judge to be incompetent to stand trial.
Oprah Winfrey hosts the finale of her syndicated talk show, which was on the air for 25 years.
May 26 – The U.S. Supreme Court makes a controversial 5–3 decision upholding an Arizona state law that imposes civil penalties, up to and including the revocation of business licenses, on businesses that knowingly hire illegal aliens.
May 27 – STS-134 astronauts complete the final spacewalk of the Space Shuttle program, finishing the U.S. portion of International Space Station assembly.
May 28 – U.S.-based missile producer Lockheed Martin, the largest military contractor in the world, is targeted by a "significant and tenacious" cyber attack.
May 29
British IndyCar driver Dan Wheldon wins the 2011 Indianapolis 500.
The Wallow Fire, named for the Bear Wallow Wilderness area where it originated, begins in eastern Arizona's White Mountains near Alpine. By June 7, 2011, it had burned hundreds of square miles.
May 31
The U.S. Supreme Court issues a narrow 8–0 decision siding with former United States Attorney General John Ashcroft in a claim for damages brought against him as a public official.
The U.S. Supreme Court issues a narrow 8–1 decision siding with SEB S.A. in a patent infringement case.
June
June 1
The Obama administration states that it will boycott a United Nations anti-racism conference because of concerns over anti-Semitism.
A new United States military strategy explicitly states that a major cyberattack can constitute an act of war, potentially warranting a conventional military response.
STS-134: Space Shuttle Endeavour lands for the final time, after 19 years of orbital spaceflight.
June 2
The Federal Bureau of Investigation investigates claims that hackers in China attacked the Google email accounts of officials in the United States and Asian countries, as well as Chinese pro-democracy activists.
Mitt Romney announces plans to seek the Republican Party nomination as President of the United States.
June 3 – John Edwards, former United States presidential candidate and Senator representing North Carolina, is indicted on charges of conspiracy and violating campaign finance laws in connection to his affair with Rielle Hunter; Edwards denies he broke any laws.
June 6
The U.S. Supreme Court makes a 7–2 decision that inventors do not give up their patent rights to their employers if that employer received federal funding. The ruling goes against Stanford University in a patent infringement dispute with Roche over a PCR-based HIV detection test. Writing for the majority, Chief Justice John Roberts said that "Since 1790, the patent law has operated on the premise that rights in an invention belong to the inventor. The question here is whether the University and Small Business Patent Procedures Act of 1980—commonly referred to as the Bayh-Dole Act—displaces that norm and automatically vests title to federally funded inventions in federal contractors. We hold that it does not."
Anthony Weiner photo scandal: Representative Anthony Weiner (D-NY) admits he sent a lewd photo of himself over Twitter to a Washington woman. He also admits sending explicit photos and messages to at least six other women over the past three years. He states that he will not resign.
June 8 – Fazul Abdullah Mohammed, mastermind of the 1998 United States embassy bombings in Kenya and Tanzania, is killed in Somalia.
June 9 – The U.S. Supreme Court makes an 8–0 decision that patent invalidity must be proved by clear and convincing evidence, a standard higher than a preponderance of the evidence. The ruling upholds a 2009 jury verdict in favor of i4i in a patent infringement dispute over an editing feature of Microsoft Word. Writing for the majority, Associate Justice Sonia Sotomayor said that "Under §282 of the Patent Act of 1952, "[a] patent shall be presumed valid" and "[t]he burden of establishing invalidity of a patent or any claim thereof shall rest on the party asserting such invalidity." 35 U. S. C. §282. We consider whether §282 requires an invalidity defense to be proved by clear and convincing evidence. We hold that it does."
June 12
The Dallas Mavericks win their first NBA championship, four games to two, against the star-studded Miami Heat in the 2011 NBA Finals.
The coat of Mad Men star Christina Hendricks catches fire at the Tony Awards after-party; she is reported to be unharmed.
June 13 – Hackers break into US Senate computers.
June 15 – The Boston Bruins win their first NHL title in 39 years over the Vancouver Canucks in the 2011 Stanley Cup Finals.
June 16
Anthony Weiner photo scandal: Representative Anthony Weiner (D-NY) resigns.
The U.S. Supreme Court makes a controversial 5–4 decision that a child's age must be considered when determining whether a suspect is in custody for Miranda purposes. The ruling involves a 13-year-old child questioned by police at school. The court rules in favor of the child, J. D. B., in a dispute over his confession made during a North Carolina theft investigation. Writing for the majority, Associate Justice Sonia Sotomayor said that "This case presents the question whether the age of a child subjected to police questioning is relevant to the custody analysis of Miranda v. Arizona, 384 U. S. 436 (1966). It is beyond dispute that children will often feel bound to submit to police questioning when an adult in the same circumstances would feel free to leave. Seeing no reason for police officers or courts to blind themselves to that commonsense reality, we hold that a child's age properly informs the Miranda custody analysis."
Ten members of Congress announce plans to sue President Barack Obama in federal court for alleged violation of the War Powers Resolution over the U.S. role in the military intervention in Libya, which began on March 19 under UN Security Council Resolution 1973 in response to continuing attacks on Libyan rebels by Gaddafi's forces and included U.S. fighter jets taking part in the bombardment. The ten comprise three Democrats (Dennis Kucinich of Ohio, John Conyers of Michigan and Michael Capuano of Massachusetts) and seven Republicans (Ron Paul of Texas, Walter Jones and Howard Coble of North Carolina, Tim Johnson of Illinois, Dan Burton of Indiana, Jimmy Duncan of Tennessee and Roscoe Bartlett of Maryland).
June 19 – Northern Ireland golfer Rory McIlroy wins the 2011 U.S. Open, setting scoring records in the process.
June 20
The Internet Corporation for Assigned Names and Numbers (ICANN) approves a change allowing internet domain names to end in almost any "dot" suffix.
The U.S. Supreme Court makes a controversial 9–0 decision that large class-action lawsuits must share a cohesive, common element. The ruling involves the class-action status of a sex discrimination case against Wal-Mart with roughly 1.5 million litigants. The court rules in favor of Wal-Mart solely on the class-action status of the women's claims, rejecting the lower courts' loosening of the standards for class certification. Writing for the majority, Associate Justice Antonin Scalia said that "We are presented with one of the most expansive class actions ever. The District Court and the Court of Appeals approved the certification of a class comprising about one and a half million plaintiffs, current and former female employees of petitioner Wal-Mart who allege that the discretion exercised by their local supervisors over pay and promotion matters violates Title VII by discriminating against women. In addition to injunctive and declaratory relief, the plaintiffs seek an award of backpay. We consider whether the certification of the plaintiff class was consistent with Federal Rules of Civil Procedure 23(a) and (b)(2)."
June 21 – The Food and Drug Administration announces that, starting in 2012, U.S. cigarette packs must carry new warning labels featuring graphic images that convey the dangers of smoking.
June 22
The Congressional Budget Office predicts the US debt-to-GDP ratio will top 101% by 2021, 10 percentage points higher than the 91% previously projected, with further increases to 150% by 2030 and 200% by 2037, assuming current spending levels continue.
82-year-old Boston mob boss James "Whitey" Bulger, wanted for his alleged role in 19 murders, is captured by the FBI in Santa Monica, California after 16 years as a fugitive.
June 23 – The U.S. Supreme Court makes a 5–4 decision that bankruptcy courts lack the constitutional authority to enter final judgment on certain state-law counterclaims, leaving such matters to the state probate courts. The ruling involves the US$1.6 billion estate of J. Howard Marshall, Jr., disputed between Anna Nicole Smith and Pierce Marshall. The court rules in favor of the estate of the deceased Pierce Marshall and the Texas Probate Court over the estate of the deceased Vickie Lynn Marshall (a.k.a. Anna Nicole Smith) and the California Bankruptcy Court. Writing for the majority, Chief Justice John Roberts said that "... the Texas state decision controlled, after concluding that the Bankruptcy Court lacked the authority to enter final judgment on a counterclaim that Vickie brought against Pierce in her bankruptcy proceeding. To determine whether the Court of Appeals was correct in that regard, we must resolve two issues: (1) whether the Bankruptcy Court had the statutory authority under 28 U. S. C. §157(b) to issue a final judgment on Vickie's counterclaim; and (2) if so, whether conferring that authority on the Bankruptcy Court is constitutional ... We conclude that, although the Bankruptcy Court had the statutory authority to enter judgment on Vickie's counterclaim, it lacked the constitutional authority to do so."
June 24
New York becomes the sixth state to legalize same-sex marriage.
Pixar Animation Studios' twelfth feature film, Cars 2, the sequel to 2006's Cars, is released in theaters.
June 27
The U.S. Supreme Court makes a 7–2 decision striking down a 2005 California law that banned the sale of certain violent video games to children without parental supervision, upholding the lower court decisions and ruling that video games are protected speech under the First Amendment like other forms of media. The ruling involves a freedom of speech challenge brought by the Entertainment Merchants Association against the California law; the court rules in the association's favor on the ground that the statute's restrictions on minors are overly broad. Writing for the majority, Associate Justice Antonin Scalia said that "We consider whether a California law imposing restrictions on violent video games comports with the First Amendment...Because the Act imposes a restriction on the content of protected speech, it is invalid unless California can demonstrate that it passes strict scrutiny—that is, unless it is justified by a compelling government interest and is narrowly drawn to serve that interest. R. A. V., 505 U. S., at 395. The State must specifically identify an "actual problem" in need of solving, Playboy, 529 U. S., at 822–823, and the curtailment of free speech must be actually necessary to the solution, see R. A. V., supra, at 395. That is a demanding standard. "It is rare that a regulation restricting speech because of its content will ever be permissible." Playboy, supra, at 818. California cannot meet that standard...And finally, the Act's purported aid to parental authority is vastly overinclusive. Not all of the children who are forbidden to purchase violent video games on their own have parents who care whether they purchase violent video games."
Former Illinois Governor Rod Blagojevich is found guilty of 17 of the 20 counts against him, including trying to sell President Barack Obama's Senate seat.
June 28 – In baseball, a judge in the U.S. state of Delaware authorizes the Los Angeles Dodgers to enter into a $150 million bankruptcy financing deal after the club addresses concerns of Major League Baseball.
Richard Poplawski is sentenced to death in the murder of three Pittsburgh police officers in April 2009.
June 29 – Transformers: Dark of the Moon is released in theaters as the third film in the Transformers film series.
July
July 1
The New York Times reports that the sexual assault case against former International Monetary Fund head Dominique Strauss-Kahn is on the verge of collapse due to concerns over the credibility of the alleged victim's testimony. A judge releases him from house arrest after prosecutors say that the maid had made false statements.
Owners in the North American National Basketball Association begin a lockout after failing to reach a new collective bargaining agreement.
The Minnesota government shuts down after budget talks fail between Democratic Governor Mark Dayton and the Republican-controlled Minnesota Legislature.
For the first time, the majority of U.S. children under one year old are racial or ethnic minorities, according to a 2012 U.S. Census Bureau estimate.
Leon Panetta is sworn in as the new Secretary of Defense, succeeding Robert Gates.
July 2 – ExxonMobil workers attempt to contain an oil spill on the Yellowstone River in the US state of Montana.
July 3 – A tourist boat sinks in the Gulf of California off the coast of Baja California in Mexico with 23 people missing.
July 5
The US city of Phoenix, Arizona is hit by a large dust storm, leaving thousands of people without power and grounding flights at Phoenix Sky Harbor International Airport.
Casey Anthony is found not guilty of first degree murder and manslaughter in the death of her daughter Caylee, but found guilty of four misdemeanor counts of giving false information to police.
July 7
The U.S. Supreme Court makes a controversial 5–4 decision that Humberto Leal García, a Mexican national, should be executed in the US state of Texas despite concerns over whether the circumstances of his execution would breach international law.
Casey Anthony is sentenced to four years for lying to law enforcement regarding the death of her child Caylee in the U.S. state of Florida but after credit for time served, will be released on July 17.
Seven people are shot dead by Rodrick Dantzler in Grand Rapids, Michigan.
July 8 – STS-135: Space Shuttle Atlantis is launched for the final time, on a mission added to the schedule; it is the final launch of the entire NASA Space Shuttle program.
July 12
A three judge panel of the U.S. 9th Circuit Court of Appeals rules that Jared Lee Loughner, the suspect in the 2011 Tucson shooting, has the right to refuse antipsychotic medication while he appeals the treatment prescribed by prison mental health authorities.
CNN reports that the U.S. Bureau of Alcohol, Tobacco, Firearms, and Explosives has lost track of 1,400 guns involved in Operation Fast and Furious, an operation aimed at tracing the flow of weapons to Mexican drug cartels.
The United States Coast Guard ends aerial searches for seven Americans still missing after a charter fishing boat sank in the Gulf of California off Mexico on July 3.
July 14
U.S. district court judge Reggie Walton declares a mistrial in the perjury trial of former baseball star Roger Clemens after prosecutors present evidence that Walton had previously ruled inadmissible. Walton will hold a hearing on September 2 to determine whether to hold a new trial.
News International phone hacking scandal: The FBI is investigating reports that News Corporation sought to hack the phones of victims of the September 11 terrorist attacks.
Borders Group, the once-major bookstore chain now in Chapter 11 bankruptcy in the United States, says that its arrangement with stalking horse bidder Najafi Companies has collapsed and that it will seek a modification of bid procedures.
July 15
The Dawn spacecraft arrives and settles into its one-year orbit around the minor planet 4 Vesta.
Walt Disney Animation Studios' 51st feature film, Winnie the Pooh, is released in theaters. A revival of the studio's Winnie the Pooh franchise, it is met with strongly positive reviews but middling box office performance. To date, it is Disney's most recent traditionally animated film.
July 17 – Japan wins the 2011 FIFA Women's World Cup, defeating the United States 3–1 in a penalty shootout after a 2–2 draw.
July 18 – The U.S. city of Phoenix, Arizona, is hit by a haboob or dust storm.
July 19
Northern Mariana Islands Governor Benigno Fitial and Guam Governor Eddie Calvo state that they are in serious talks to potentially merge the U.S. territories of Guam and the Northern Mariana Islands.
Sixteen alleged members of the computer hacking group Anonymous are arrested in FBI raids across several states in the US.
The Federal Bureau of Investigation (FBI) arrests an alleged agent of Pakistan's Inter-Services Intelligence in the US state of Virginia for making illegal campaign contributions.
July 21
Two dozen people die throughout the week in a heat wave in the United States.
STS-135: Space Shuttle Atlantis touches down at the Shuttle Landing Facility at Kennedy Space Center, ending the 30-year shuttle program, which began with the launch of shuttle Columbia on April 12, 1981.
Minnesota Governor Mark Dayton signs a budget agreement with Republicans in the Minnesota Legislature, ending a 20-day government shutdown.
July 22 – Captain America: The First Avenger, directed by Joe Johnston, is released by Marvel Studios as the fifth film of the Marvel Cinematic Universe (MCU). It is the final MCU film to be distributed by Paramount Pictures.
July 23 – Nearly 4,000 employees of the US Federal Aviation Administration are furloughed after Congressional authorization for its programs lapses.
July 24 – Democratic Party leaders call for the United States House Committee on Ethics to investigate claims that Rep. David Wu of Oregon sexually assaulted a teenager.
July 25
In American football, the NFL Players Association executive committee unanimously accepts a 10-year labor deal with team owners in the US National Football League.
Nickelodeon launches a block of reruns of its 1990s programming to cater to its twentysomething viewers. Dubbed The '90s Are All That, the block airs on its older-skewing sister channel, TeenNick.
July 26
The United States Postal Service releases a list of some 3,653 post offices being reviewed for possible closure.
David Wu resigns as a member of the United States House of Representatives following allegations of an unwanted sexual encounter with an 18-year-old.
July 27
The body of Maria Ridulph, a 7-year-old girl murdered in 1957, is exhumed.
The United States Senate, in an exception to the 10-year limit, extends the term of the current FBI director, Robert Mueller.
August
August 1 – The United States Congress votes on a deal to resolve the United States debt-ceiling crisis, with the House of Representatives passing it. U.S. Rep. Gabby Giffords (D-Arizona) casts her first vote since suffering a traumatic brain injury in the January 2011 Tucson shooting.
August 2
The United States Senate passes legislation to raise the debt ceiling in order to avert the 2011 US debt ceiling crisis, and President Barack Obama signs it into law as the Budget Control Act of 2011.
Baruj Benacerraf dies at the age of 90. He was a Venezuelan-born American immunologist, who shared the 1980 Nobel Prize in Physiology or Medicine for the "discovery of the major histocompatibility complex genes which encode cell surface protein molecules important for the immune system's distinction between self and non-self". The MHC genes are critical to organ transplantation medicine.
August 3 – It is announced that Jerry Lewis will no longer host the MDA telethon; earlier in the year, it was announced that Lewis was no longer the national chairman of the MDA.
August 4
Kraft Foods announces that it will split into two operations consisting of its North American grocery business and its global snack foods business.
United States debt-ceiling crisis: The Dow Jones Industrial Average plunges 512 points (−4.3%) on economic worries, becoming the worst day for stocks since December 2008 and, at this time, the 9th largest drop in United States history (See August 8).
August 5
NASA's Juno spacecraft is launched toward Jupiter; orbital insertion is scheduled for August 2016.
United States debt-ceiling crisis: After U.S. trading markets close for the weekend, the Standard & Poor's credit rating agency downgrades the credit rating of the United States from AAA to AA+ with a negative outlook, the first downgrade of the US credit rating since one was first issued in 1917. The Obama administration had told Standard & Poor's that it made a nearly $2 trillion error in its calculations; S&P acknowledged the error but proceeded with the downgrade anyway.
August 6
A NATO Chinook helicopter is shot down by the Taliban with a rocket-propelled grenade and crashes in the Sayd Abad district of Afghanistan's Wardak Province, killing 38. At least 20 of the U.S. Navy SEALs killed were members of SEAL Team Six, the unit that carried out the operation that killed Osama bin Laden; the Associated Press and CNN later report that none of those killed had participated in that raid. It is the single deadliest day for US troops since the Afghanistan War began in 2001.
The computer hacking group Anonymous attacks 70 mostly rural law enforcement websites in the United States. Many of the sheriff's offices had outsourced their websites to the media hosting company Brooks-Jeffrey Marketing; a breach of Brooks-Jeffrey's servers would give hackers access to every website the company hosted.
August 7 – Ohio man Michael Hance kills seven people before being shot dead by police.
August 8 – United States debt-ceiling crisis: The Dow Jones Industrial Average plunges another 635 points (−5.6%) in reaction to Standard and Poor's downgrade on August 5. It is the 6th largest drop in United States history and the largest drop since December 2008.
August 9
United States debt-ceiling crisis: The U.S. Federal Reserve announces it will keep interest rates at "exceptionally low levels" at least through mid-2013, but makes no commitment to further quantitative easing. The Dow Jones Industrial Average and the New York Stock Exchange, as well as other world stock markets, recover after recent falls.
The largest group of simultaneous recall elections in United States history ends with Republicans keeping control of the Wisconsin State Senate, despite Democrats picking up 2 seats.
August 10
The New England Journal of Medicine reports that an experimental therapy destroyed leukemia cells in three patients with advanced chronic lymphocytic leukemia (CLL).
United States debt-ceiling crisis: Stocks dive again on fears over Europe and the economy. The Dow Jones Industrial Average falls 519.83 points, or 4.62%, to 10,719.94, more than wiping out the gains posted in Tuesday's sizable late-day rally. It is the Dow's fourth triple-digit move in five days and brings its decline since its April peak to more than 16%.
August 12
A judge sentences Ohio serial killer Anthony Sowell, believed to be responsible for 11 murders, to death by lethal injection.
The United States Postal Service says it is considering cutting as many as 120,000 jobs to reduce costs.
August 13
Ames Straw Poll: Republican candidates for the party's nomination in the 2012 presidential election face off in the informal Iowa contest. Congresswoman Michele Bachmann of Minnesota finishes first place, ahead of runner-up Rep. Ron Paul of Texas and former Governor of Minnesota Tim Pawlenty, who comes in third.
Seven people are killed and 45 are injured when the main stage collapses at the Indiana State Fair in Indianapolis. The collapse is caused in part by a hurricane-force wind gust ahead of an approaching severe thunderstorm; the scheduled event was a performance by the band Sugarland.
August 14 – The 2011 PGA Championship, played at the Atlanta Athletic Club, is won by American professional golfer Keegan Bradley, who defeats Jason Dufner in a playoff.
August 15 – Google announces a proposed acquisition of Motorola Mobility.
August 17 – University of Miami football scandal: The NCAA investigates claims by former booster Nevin Shapiro that he provided players with prostitutes, cars and other gifts over the past decade.
August 19
U.S. President Obama provides temporary relief for illegal immigrants who are students, veterans, the elderly, crime victims and those with family, including same-sex partners, as part of immigration reform in the United States.
Hewlett-Packard shares drop 20% on news that the company plans to spin off its personal computer division into a separate company.
Doctor Tyron Reece, who over the previous year wrote prescriptions for nearly a million doses of the painkiller hydrocodone, is charged with assisting a Mexican prescription drug smuggling ring.
August 20 – Striking Verizon union workers agree to return to work starting August 22, 2011, although their contract dispute remains unresolved.
August 23 – A rare Eastern-seaboard earthquake of magnitude 5.8 strikes Virginia, rupturing faults in the Central Virginia Seismic Zone; shaking is felt in Washington, D.C., New York City and other cities.
August 24
A Russian Progress resupply vehicle destined for the International Space Station experiences a catastrophic engine failure; the unmanned craft fails to reach orbit and impacts in the Altai Republic.
Steve Jobs, the ailing head of Apple Inc., resigns.
August 26 – The United States Court of Appeals for the First Circuit rules that the filming of government officials while on duty is protected by the First Amendment.
August 28 – Hurricane Irene: A rare hurricane drives north up the Mid-Atlantic and Northeast coasts; 9 million homes lose power. Combined Caribbean and U.S. fatalities and flooding damage total 55 dead and US$10 billion respectively, and the New England state of Vermont suffers its worst flooding in 100 years.
August 30 – While reportedly on his way to surrender to police in the US city of Atlanta to face murder charges, former National Basketball Association player Javaris Crittenton is arrested by the FBI at John Wayne Airport in Orange County, California.
August 31
Solyndra, a California solar panel company, declares bankruptcy. Only two years earlier, Solyndra had received over $500 million in federal loan guarantees supported by the Obama administration.
The United States Department of Justice files a lawsuit in an attempt to stop the $39 billion merger between cell phone giants AT&T and T-Mobile.
September
September 1 – Tropical Storm Lee: Evoking memories of Hurricane Katrina, a Gulf of Mexico storm makes landfall near New Orleans. After tracking across the Southeastern states, it leaves a total of 21 fatalities.
September 2 – An audit report from the United States Treasury Inspector General for Tax Administration finds that in the previous year illegal aliens fraudulently collected $4.2 billion from the Additional Child Tax Credit, a refundable credit meant for working families. The audit attributes the fraud to vague U.S. law.
September 3 – A 47-year-old North Carolina man is convicted of eight counts of second-degree murder in the shooting deaths at a nursing home on March 29, 2009 – the type of conviction means that he will not be eligible for the death penalty.
September 5
Wildfires rage across Texas. A fire near Bastrop, Texas burns some 1,500 homes, breaking the record for most homes destroyed in a single fire in Texas history.
The Muscular Dystrophy Association's telethon debuts in a new prime-time format without Jerry Lewis, the first telethon not to feature him. In six hours, the organization, which leads the fight against progressive muscle diseases, broadcasts its 46th annual MDA Labor Day Telethon. The 2011 telethon raises $61,491,393, up from the $58,919,838 achieved during the prior year's 21½-hour telethon.
September 6 – Gunman Eduardo Sencion opens fire at an IHOP restaurant in Carson City, Nevada, killing three members of the National Guard and one civilian before committing suicide.
September 8 – U.S. President Barack Obama unveils the American Jobs Act to a joint session of Congress. Critics label it a "third stimulus package".
September 11
The National September 11 Memorial in New York City opens; the ceremony commemorates the tenth anniversary of the 9/11 attacks.
In tennis, Samantha Stosur of Australia wins the Women's Singles in the 2011 US Open defeating Serena Williams of the United States 6–2, 6–3.
September 12
Bank of America announces 30,000 layoffs.
In tennis, Novak Djokovic of Serbia wins the Men's Singles at the 2011 US Open defeating Rafael Nadal of Spain 6–2, 6–4, 6–7 (3–7), 6–1.
September 13
In what was called a referendum on U.S. President Barack Obama, Republican Bob Turner defeats Democrat David Weprin in a special election for New York's 9th congressional district, the seat held previously by Anthony Weiner until he resigned amid a sexting scandal. Turner is the first Republican to represent this district in 88 years.
The Fall television season officially kicks off with the first new show, Ringer.
September 14
In a court case concerning the theft of Kevlar-related trade secrets, DuPont is awarded US$920 million in damages.
NASA announces plans for a Space Launch System to replace the Space Shuttle program with the first flight tentatively scheduled for 2017.
September 15
The House passes a bill that would severely limit the power of the National Labor Relations Board with a vote of 238–186. The NLRB had recently come under fire from Republicans for trying to prevent Boeing from opening a new 787 Dreamliner production facility in South Carolina with non-union workers instead of in Washington state.
Questions arise over whether a United States Air Force general was pressured by the Obama administration to approve a plan by telecom company LightSquared, which has backing from Democratic donors, to develop a nationwide wireless network using satellite spectrum. LightSquared's technology may interfere with Global Positioning System signals used for guidance of U.S. missiles and airline air traffic control systems.
Walter Reed Army Medical Center closes; it is merged into the Walter Reed National Military Medical Center in Bethesda, Maryland.
September 16 – 2011 Reno Air Races crash: There are 11 dead and at least 75 injured, 25 critically, when a P-51D Mustang airplane crashes into the crowd at the annual Reno Air Races in Reno, Nevada.
September 17 – Occupy Wall Street: Thousands march on Wall Street in response to high unemployment, record executive bonuses and extensive bailouts of the financial system.
September 18 – The 63rd Primetime Emmy Awards for television programs broadcast in the United States are held in Los Angeles, with Mad Men winning Outstanding Drama Series and Modern Family winning Outstanding Comedy Series.
September 20 – The United States military officially ends its "Don't ask, don't tell" policy, allowing gay and lesbian personnel to publicly declare their sexual orientation.
September 22 – The Federal Bureau of Investigation arrests suspected members of the computer hacking groups LulzSec and Anonymous in the US cities of Phoenix, Arizona and San Francisco, California.
September 23
2011 NBA lockout: The ongoing labor dispute forces the NBA to cancel the first 43 preseason games of the 2011–12 NBA season.
The Dow Jones Industrial Average has its worst week in nearly 3 years, falling 6.41% as new recession fears grow.
September 27 – Andy Rooney announces his retirement from 60 Minutes after 33 years of providing commentary.
September 28 – The United States Centers for Disease Control and Prevention links an outbreak of listeriosis that has caused 23 deaths and 116 illnesses in 25 states to infected cantaloupes from Colorado.
September 30
After a manhunt lasting more than two years, American drones carry out a targeted killing of Anwar al-Awlaki, a senior leader of al-Qaida in the Arabian Peninsula, during a U.S. military operation in northern Yemen's al-Jawf province while he travels in a convoy with his senior aides.
Jessie debuts on Disney Channel.
October
October 1 – About 700 people are arrested while attempting to cross the Brooklyn Bridge during an Occupy Wall Street march.
October 3
Amanda Knox is released from Italian prison following a successful appeal of her murder conviction.
The U.S. Supreme Court announces that it will not hear a much-noted dispute on the scope of the "first sale" doctrine in copyright law. The Court denies Vernor's petition for certiorari, leaving in place the ruling of the United States Court of Appeals for the Ninth Circuit, which held that the transfer of the software to the purchaser was a license rather than a sale, giving rise to no right to resell the copy under the first-sale doctrine. As a result, Autodesk could pursue an action for copyright infringement against Vernor, who sought to resell used versions of its software on eBay. The Ninth Circuit's decision means that the policy considerations involved in the case might affect motion pictures and libraries as well as sales of used software. The net effect of the Ninth Circuit's ruling (and now the Supreme Court's denial of review) is to limit the "You bought it, you own it" principle asserted by organizations that would like to resell items.
American cell phone service provider Sprint Nextel reportedly commits $20 billion to secure the rights to carry Apple's next mobile phone.
October 4
In basketball, the North American National Basketball Association cancels the remainder of the preseason due to the 2011 NBA lockout; regular-season games will be canceled if the lockout continues for another week.
Voters in the U.S. state of West Virginia go to the polls for a gubernatorial special election; acting Governor Earl Ray Tomblin, a Democrat, is elected Governor of West Virginia.
October 5 – Steve Jobs dies at the age of 56. The American computer engineer co-founded Apple Inc. in 1976; the electronics producer has at times been the most valuable company in the world.
October 7 – The NYPD busts a Queens-based identity theft and retail crime ring, arresting over 110 people. It is described as the largest identity theft ring in the history of the United States, making an annual profit of over $13 million.
October 11
The United States Senate passes a bill that would impose economic sanctions on China over its alleged manipulation of the yuan to keep the currency's value low.
The United States Senate rejects the American Jobs Act in a procedural vote.
October 12 – Scott Dekraai opens fire in a hair salon in Seal Beach, California, killing eight, including his ex-wife. He is later arrested.
October 14 – The United States, under President Barack Obama, deploys 100 troops to Uganda to assist in the capture of Lord's Resistance Army leader Joseph Kony amid the ongoing insurgency.
October 16 – British auto racing driver Dan Wheldon dies in a 15-car pileup while participating in the final race of the 2011 IndyCar season at Las Vegas Motor Speedway.
October 22 – Republican Governor Bobby Jindal wins a second term as Governor of Louisiana.
October 28 – The St. Louis Cardinals defeat the Texas Rangers in seven games to win their 11th World Series.
November
November 4 – After announcing his retirement on September 27, Andy Rooney dies at the age of 92.
November 7 – Jerry Sandusky, a former assistant coach for the Penn State University football team, is arrested on nearly 40 counts of molesting eight boys over a 15-year period. The charges come following a grand jury investigation, which also alleges attempts to cover up the incidents and failure to report the incidents to law enforcement. In the days following the report, longtime coach Joe Paterno and university president Graham Spanier (already heavily criticized for alleged inaction) are fired.
November 8 – Election Day
Republican Phil Bryant wins the 2011 Mississippi Gubernatorial election.
Voters in Mississippi reject a life-at-conception proposal.
Incumbent Democrat Steve Beshear wins the 2011 Kentucky Gubernatorial election.
Republicans gain control of the Virginia Senate, with Republican Lieutenant Governor of Virginia Bill Bolling breaking all tie votes.
Arizona Senate majority leader Russell Pearce is recalled from office.
Voters in Ohio reject a law that would ban collective bargaining for government employees.
November 11 – U.S. President Barack Obama arrives in Honolulu, Hawaii with Asia-Pacific leaders to attend the APEC Summit.
November 14 – The United States Supreme Court agrees to hear challenges to the constitutionality of the Patient Protection and Affordable Care Act, with a ruling expected sometime in 2012.
November 21 – The US national debt tops the country's GDP for the first time since the late 1940s.
November 26 – NASA's Mars Science Laboratory launches, carrying the Curiosity rover. The scheduled landing date is August 6, 2012.
November 29 – AMR Corporation, the parent company of American Airlines, unexpectedly files for bankruptcy; the company's stock plunges 84% on the news.
December
December 2 – The U.S. unemployment rate falls to 8.6% – the lowest since early 2009.
December 7 – Former Illinois Governor Rod Blagojevich is sentenced to 14 years in prison for corruption and trying to sell Barack Obama's former Senate seat.
December 9
Republican party nomination hopeful Newt Gingrich fuels controversy by referring to Palestinians as an "invented people".
Joshua Komisarjevsky, one of the suspects in the Cheshire, Connecticut home invasion murders, is sentenced to death by lethal injection.
December 10 – Robert Griffin III, the quarterback with the Baylor Bears college football team, wins the Heisman Trophy.
December 13 – Iran rejects a U.S. request to return an RQ-170 unmanned reconnaissance aircraft that was recently captured by Iranian forces. Iranian officials claimed they used a cyber attack to capture the aircraft and that they are extracting data from it.
December 15 – Impractical Jokers debuts on TruTV.
December 18 – The last American troops are withdrawn from Iraq, ending the Iraq War.
December 31–January 2 – A string of 52 arson fires is set in the Los Angeles area, causing up to $2 million in damage. A foreign national named Harry Burkhart, reportedly angry at Americans, is arrested for the crimes.
Ongoing
War in Afghanistan (2001–2021)
Iraq War (2003–2011)
2010–2012 Southern United States drought
Births
August 10 – Jeremy Maguire, actor
October 6 – Ryan Kaji, YouTube star
Deaths
January
January 1
Charles Fambrough, musician and composer (b. 1950)
Billy Joe Patton, amateur golfer (b. 1922)
January 2
William Richard Ratchford, U.S. Representative from Connecticut from 1979 to 1985 (b. 1934)
Anne Francis, actress (b. 1930)
Peter Hobbs, French-born American actor (b. 1918)
Richard Winters, World War II soldier (b. 1918)
January 5 – John Ertle Oliver, geophysicist (b. 1923)
January 6 – Donald J. Tyson, businessman (b. 1930)
January 7 – Thomas J. White, construction company executive and philanthropist (b. 1920)
January 8
John Roll, US federal judge and 2011 Tucson shooting victim (b. 1947)
Christina-Taylor Green, documentary subject and 2011 Tucson shooting victim (b. 2001)
January 10 – John Dye, actor (b. 1963)
January 18 – Sargent Shriver, Peace Corps founder (b. 1915)
January 21 – Dennis Oppenheim, artist (b. 1938)
January 23 – François Henri "Jack" LaLanne, fitness and dietary health trainer (b. 1914)
January 24 – Bernd Eichinger, German film producer and director, died in Los Angeles (b. 1949)
January 25 – Daniel Bell, sociologist (b. 1919)
January 27 – Charlie Callas, comedian and actor (b. 1927)
January 29 – Milton Babbitt, composer (b. 1916)
January 30 – John Barry, British film score composer, died in Oyster Bay, New York (b. 1933)
January 31 – Charles Kaman, aeronautical engineer (b. 1919)
February
February 3 – LeRoy Grannis, surfing photographer (b. 1917)
February 4 – Tura Satana, Japanese-born American actress (b. 1938)
February 5 – Peggy Rea, actress (b. 1921)
February 6 – Kenneth Harry Olsen, electrical engineer (b. 1926)
February 7 – Maria Altmann, litigant versus Nazi Austria (b. 1916 in Austria)
February 8
Marvin Sease, singer (b. 1946)
Tony Malinosky, baseball player (b. 1909)
February 10 – Bill Justice, cartoonist (b. 1914)
February 12
Betty Garrett, actress (b. 1919)
Kenneth Mars, actor (b. 1935)
February 14 – George Shearing, pianist (b. 1919)
February 15 – Charles Epstein, geneticist and Unabomber victim (b. 1933)
February 16
Neal R. Amundson, chemical engineer and pioneer of mathematical modeling in the field (b. 1916)
Leonard King "Len" Lesser, actor (b. 1922)
February 18 – Walter Seltzer, film producer (b. 1914)
February 19 – Ollie Matson, American football player (b. 1930)
February 21
Edwin D. Kilbourne, physician and vaccine scientist (b. 1920)
Dwayne McDuffie, comic book writer, editor and animator (b. 1962)
Russell W. Peterson, 66th governor of Delaware from 1969 to 1973 (b. 1916)
Haila Stoddard, actress (b. 1913)
Judith Sulzberger, physician and newspaper director (b. 1923)
February 26 – Greg Goossen, baseball player and actor (b. 1945)
February 27
Frank Buckles, soldier, last living U.S. World War I veteran (b. 1901)
Duke Snider, baseball player (b. 1926)
Gary Winick, film director (b. 1961)
February 28
Peter J. Gomes, professor and theologian (b. 1942)
Nick LaTour, actor (b. 1926)
Jane Russell, actress (b. 1921)
March
March 1
Leonard Lomell, soldier (b. 1920)
John M. Lounge, astronaut (b. 1946)
March 4
Charles Jarrott, British film director, died in Woodland Hills, California (b. 1927)
Johnny Preston, singer (b. 1939)
March 6 – Jean Bartel, Miss America Pageant winner and actress (b. 1923)
March 8 – Mike Starr, musician (b. 1966)
March 9 – David S. Broder, journalist (b. 1929)
March 11 – Hugh Martin, film music composer (b. 1914)
March 15
Nate Dogg, rapper (b. 1969)
Marty Marion, baseball player (b. 1917)
March 17
Ferlin Husky, singer and musician (b. 1925)
Mirabelle Thao-Lo, murder victim (b. 2010)
March 18
Warren Christopher, 63rd United States Secretary of State from 1993 to 1997 (b. 1925)
Drew Hill, American football player (b. 1956)
March 19 – Robert Ross, medical school founder (b. 1919)
March 23
Jean Bartik, computer engineer (b. 1924)
Elizabeth Taylor, British-born American actress (b. 1932)
March 24 – Lanford Wilson, writer (b. 1937)
March 25 – Thomas Eisner, biologist, died in Ithaca, New York (b. 1929)
March 26
Paul Baran, internet pioneer, died in Palo Alto, California (b. 1926)
Harry Coover, inventor (b. 1917)
Geraldine Ferraro, United States Representative from New York from 1979 to 1985 (b. 1935)
March 27
Farley Granger, actor (b. 1925)
Dorothea Puente, murderer (b. 1929)
March 28
Lee Hoiby, composer (b. 1926)
Guy M. Townsend, Air Force brigadier general and test pilot (b. 1920)
March 31 – Mel McDaniel, singer-songwriter and musician (b. 1942)
April
April 1 – Lou Gorman, baseball manager (b. 1929)
April 2
Larry Finch, college basketball player and coach (b. 1951)
John C. Haas, chemical engineer (b. 1918)
Bill Varney, film sound editor (b. 1934)
April 3 – William Prusoff, pharmacologist (b. 1920)
April 4 – Ned McWherter, 46th governor of Tennessee from 1987 to 1995 (b. 1930)
April 5
Baruch Samuel Blumberg, Nobel physician (b. 1925)
Larry Shepard, baseball manager (b. 1919)
April 6
Skip O'Brien, actor (b. 1950)
F. Gordon A. Stone, chemist (b. 1925)
April 7 – Edward Edwards, murderer and one-time member of the FBI's most wanted list (b. 1933)
April 9
Jerry Lawson, video game pioneer (b. 1940)
Sidney Lumet, film director (b. 1924)
April 10 – Homer Smith, American football player and coach (b. 1931)
April 12
Sidney Harman, businessman and publisher (b. 1918)
Eddie Joost, baseball player and manager (b. 1916)
April 14
Walter Breuning, oldest living man at the time of his death and third-oldest verified man ever (b. 1896)
Cyrus Harvey Jr., film distributor (b. 1925)
William Lipscomb, Nobel chemist (b. 1919)
Arthur Marx, writer (b. 1921)
April 16
William A. Rusher, magazine publisher (b. 1923)
Sol Saks, television writer (b. 1910)
April 17 – Joel Colton, historian (b. 1918)
April 18 – William Donald Schaefer, 58th governor of Maryland from 1987 to 1995 (b. 1921)
April 19
Lynn Chandnois, American football player (b. 1925)
Norm Masters, American football player (b. 1933)
April 20
Chris Hondros, photojournalist (b. 1970)
Madelyn Pugh, television writer (b. 1921)
Gerard Smith, guitarist (b. 1974)
April 21
Harold Garfinkel, sociologist (b. 1917)
Max Mathews, electrical engineer (b. 1926)
April 22 – Merle Greene Robertson, archaeologist (b. 1913)
April 23 – Phillip Shriver, historian and college president (b. 1922)
April 25 – Joe Perry, American football player (b. 1927)
April 26 – Jim Mandich, American football player (b. 1948)
April 27 – Marian Mercer, actress (b. 1935)
April 28 – William Campbell, actor (b. 1926)
May
May 1
Steven A. Orszag, mathematician (b. 1943)
William O. Taylor II, journalist (b. 1932)
J. Ernest Wilkins Jr., nuclear physicist (b. 1923)
May 2 – David Sencer, physician (b. 1924)
May 3
Robert Brout, American-born Belgian physicist (b. 1928)
Jackie Cooper, actor (b. 1922)
May 4
Mary Murphy, actress (b. 1931)
Sada Thompson, actress (b. 1927)
May 5
Arthur Laurents, screenwriter (b. 1917)
Dana Wynter, actress, died in Ojai, California (b. 1931)
May 6
Lawrence Johnson, inventor (b. 1913)
Horace Freeland Judson, science historian (b. 1931)
Dick Walsh, baseball manager (b. 1925)
May 7
Ross Hagen, actor (b. 1938)
Robert Stempel, automobile executive (b. 1933)
May 8 – Corwin Hansch, chemist (b. 1918)
May 9
Henry Feffer, surgeon (b. 1918)
Jeff Gralnick, journalist (b. 1939)
May 10
Bill Bergesch, baseball manager (b. 1921)
Burt Reinhardt, journalist (b. 1920)
May 11
Maurice Goldhaber, physicist, died in East Setauket, New York (b. 1911 in Austria)
Leo Kahn, businessman (b. 1916)
May 12
Charles F. Haas, film and television director (b. 1913)
Jack Jones, journalist (b. 1924)
Jack Keil Wolf, electrical engineer (b. 1935)
May 13 – Mel Queen, baseball manager (b. 1942)
May 14
Murray Handwerker, businessman (b. 1921)
Joseph Wershba, journalist (b. 1920)
May 15 – Barbara Stuart, actress (b. 1935)
May 16 – Douglas Blubaugh, athlete (b. 1934)
May 17 – Harmon Killebrew, baseball player (b. 1936)
May 19
Phyllis Avery, actress (b. 1924)
David H. Kelley, Canadian archaeologist (b. 1924 in the United States)
Tom West, computer engineer (b. 1939)
May 20
Steve Rutt, early pioneer of video animation (b. 1945)
Randy Savage, wrestler (b. 1952)
May 22 – Joseph Brooks, songwriter (b. 1938)
May 24 – Mark Haines, lawyer and television news anchor (b. 1946)
May 25 – Paul J. Wiedorfer, World War II soldier (b. 1921)
May 26 – Irwin D. Mandel, dentist (b. 1922)
May 27
Jeff Conaway, actor (b. 1950)
Gil Scott-Heron, poet and musician (b. 1949)
May 28
Leo Rangell, physician (b. 1913)
John H. Sinfelt, chemical engineer (b. 1931)
May 29 – Bill Clements, 42nd and 44th governor of Texas from 1979 to 1983 and 1987 to 1991 (b. 1917)
May 30
Clarice Taylor, actress (b. 1917)
Rosalyn Sussman Yalow, Nobel physicist in medicine (b. 1921)
May 31
Pauline Betz, tennis player (b. 1919)
Andy Robustelli, American football player (b. 1925)
Philip Rose, stage and film producer (b. 1921)
June
June 2
Walter R. Peterson Jr., 81st governor of New Hampshire from 1969 to 1973 (b. 1922)
Geronimo Pratt, Black Panther, died in Tanzania (b. 1947)
June 3
James Arness, actor, brother of Peter Graves (b. 1923)
Andrew Gold, singer (b. 1951)
John Henry Johnson, American football player (b. 1929)
Jack Kevorkian, physician (b. 1928)
June 4 – Lawrence Eagleburger, 62nd Secretary of State from 1992 to 1993 (b. 1930)
June 6 – John R. Alison, World War II Air Force pilot (b. 1912)
June 7
Genaro Hernández, boxer (b. 1966)
Leonard B. Stern, television writer, director and producer (b. 1923)
June 8 – Jim Northrup, baseball player (b. 1939)
June 9 – Godfrey Myles, American football player (b. 1968)
June 12
Carl Gardner, singer (b. 1928)
Alan Haberman, businessman (b. 1929)
Kathryn Tucker Windham, writer (b. 1918)
Laura Ziskin, film producer (b. 1950)
June 13 – Betty Neumar, murder suspect (b. 1931)
June 15 – Bob Banner, television producer (b. 1921)
June 16 – Claudia Bryar, actress (b. 1918)
June 17 – George M. White, architect (b. 1920)
June 18
Clarence Clemons, musician (b. 1942)
Bob Pease, electrical engineer (b. 1940)
June 19 – Don Diamond, actor (b. 1921)
June 20
Ryan Dunn, reality television star (b. 1977)
Robert H. Widmer, aeronautical engineer (b. 1916)
June 22
David Rayfiel, film screenwriter (b. 1923)
Robert Miller, art dealer (b. 1939)
June 23
Gene Colan, comic book artist (b. 1926)
Peter Falk, actor (b. 1927)
Fred Steiner, television and film composer (b. 1923)
June 24 – F. Gilman Spencer, newspaper editor (b. 1925)
June 25
Shelby Grant, actress and wife of Chad Everett (b. 1936)
Alice Playten, actress (b. 1947)
June 26
Edith Fellows, actress (b. 1923)
Robert Morris, cryptographer (b. 1932)
June 27
Lorenzo Charles, basketball player (b. 1963)
Elaine Stewart, actress (b. 1930)
June 29 – Billy Costello, boxer (b. 1956)
July
July 1 – Bud Grant, television producer (b. 1934)
July 4
Wes Covington, baseball player (b. 1932)
William G. Thrash, general in the United States Marine Corps (b. 1916)
July 5 – Armen Gilliam, basketball player (b. 1964)
July 6 – John Mackey, American football player (b. 1941)
July 7
Humberto Leal Garcia Jr., murderer, died in Huntsville, Texas (b. 1973 in Mexico)
Dick Williams, baseball player and manager (b. 1929)
July 8
Roberts Blossom, actor (b. 1924)
William R. Corliss, physicist (b. 1926)
Sam Denoff, television writer and producer (b. 1928)
Pete Duranko, American football player (b. 1943)
Betty Ford, wife of Gerald Ford (b. 1918)
July 10 – Deacon Turner, American football player (b. 1955)
July 11 – Tom Gehrels, Dutch-born American astronomer (b. 1925 in the Netherlands)
July 12
Ame Deal, murder victim (b. 2000)
Leiby Kletzky, murder victim (b. 2002)
Sherwood Schwartz, television writer and producer (b. 1916)
July 13 – Jerry Ragovoy, songwriter and producer (b. 1930)
July 14 – Noel Gayler, World War II naval aviator, admiral and bureaucrat (b. 1914)
July 15
Cornell MacNeil, operatic baritone (b. 1922)
John S. Toll, physicist and college administrator (b. 1923)
July 17
Jim Kincaid, television news correspondent (b. 1934)
Alex Steinweiss, album cover artist (b. 1917)
July 18
Nat Allbright, sports commentator (b. 1923)
Rudiger D. Haugwitz, German-born American chemist (b. 1932)
Edson Stroll, actor (b. 1929)
July 21
Franz Alt, mathematician (b. 1910 in Austria)
Elliot Handler, businessman (b. 1916)
Harold J. Kosasky, Canadian-born American physician (b. c. 1928)
Bruce Sundlun, 71st Governor of Rhode Island from 1991 to 1995 (b. 1920)
July 22
Tom Aldredge, actor (b. 1928)
Linda Christian, Mexican-born American actress, first Bond girl and wife of Tyrone Power (b. 1923 in Mexico)
Charles Taylor Manatt, lawyer and political party leader (b. 1936)
July 23
Robert Ettinger, academic, writer and father of cryonics (b. 1918)
John Shalikashvili, Polish-born American 13th Chairman of the Joint Chiefs of Staff (b. 1936)
Elmer B. Staats, 5th Comptroller General of the United States from 1966 to 1981 (b. 1914)
July 24
Dan Peek, singer (b. 1950)
G. D. Spradlin, actor (b. 1920)
Skip Thomas, American football player (b. 1950)
July 26 – Elmer Lower, television journalist and executive (b. 1913)
July 27
Hideki Irabu, Japanese and American baseball player (b. 1969 in Japan)
Jerome Liebling, photographer and film producer (b. 1924)
Polly Platt, film producer, wife of Peter Bogdanovich (b. 1939)
July 28 – John Marburger, physicist (b. 1941)
July 29
John Edward Anderson, businessman (b. 1917)
Max Harry Weil, physician (b. 1927 in Switzerland)
July 30 – Daniel D. McCracken, computer scientist (b. 1930)
August
August 2
Baruj Benacerraf, Nobel immunologist (b. 1920 in Venezuela)
James Ford Seale, murderer (b. 1935)
August 3
Ray Patterson, basketball executive (b. 1922)
Bubba Smith, American football player (b. 1945)
August 4 – Sherman White, American college basketball player and convicted game fixer (b. 1928)
August 5 – Francesco Quinn, actor, son of Anthony Quinn (b. 1963 in Italy)
August 6
Bernadine Healy, physician (b. 1944)
Fe del Mundo, Filipino pediatrician and first woman to attend Harvard Medical School, died in Quezon City, Philippines (b. 1911)
John W. Ryan, college administrator (b. 1929)
August 7
Hugh Carey, 51st Governor of New York from 1975 to 1982 (b. 1919)
Charles C. Edwards, physician (b. 1923)
Mark Hatfield, 29th governor of Oregon from 1959 to 1967 (b. 1922)
Paul Meier, mathematician (b. 1924)
Charles Wyly, businessman (b. 1933)
August 8 – Harry Hillel Wellington, lawyer and college administrator (b. 1926)
August 11
Don Chandler, American football player (b. 1934)
George Devol, first industrial robot inventor (b. 1912)
August 12
Ernie Johnson, baseball player (b. 1924)
Charles P. Murray Jr., World War II soldier (b. 1921)
August 14 – Fritz H. Bach, physician (b. 1934 in Austria)
August 16 – Pete Pihos, American football player (b. 1923)
August 18
Maurice M. Rapport, neuroscience biochemist (b. 1919)
Scotty Robertson, basketball coach (b. 1930)
Jerome J. Shestack, lawyer (b. 1923)
August 20
Reza Badiyi, Iranian-born American television director (b. 1930)
William B. Kannel, physician (b. 1923)
William I. Wolff, physician and Colonoscopy co-developer (b. 1916)
August 22
Nickolas Ashford, singer (b. 1942)
Jerry Leiber, songwriter (b. 1933)
Sanford H. Winston, World War II soldier (b. 1920)
August 24 – Mike Flanagan, baseball player and manager (b. 1951)
August 26
Patrick C. Fischer, computer scientist and Unabomber target (b. 1935)
Donn A. Starry, soldier (b. 1924)
August 27 – Keith Tantlinger, mechanical engineer (b. 1919)
August 29
Pauline Morrow Austin, meteorologist (b. 1916)
David P. Reynolds, businessman (b. 1915)
September
September 3 – Don Fambrough, American college football coach (b. 1922)
September 4 – Lee Roy Selmon, American football player (b. 1954)
September 5 – Charles S. Dubin, television director (b. 1919)
September 6
Bruce B. Dan, physician (b. 1946)
Michael S. Hart, founder of Project Gutenberg (b. 1947)
Malcolm Prine, baseball executive (b. 1928)
September 10 – Cliff Robertson, film actor (b. 1923)
September 13
John Calley, film studio executive (b. 1930)
Sam DeLuca, American football player (b. 1936)
September 14 – Malcolm Wallop, Senator for Wyoming (b. 1933)
September 15
Frances Bay, Canadian film and television actress, died in Tarzana, California (b. 1919)
Bill Taylor, baseball player (b. 1929)
September 16 – Dave Gavitt, basketball coach and administrator (b. 1937)
September 17
Julius Blank, mechanical engineer (b. 1925)
Charles H. Percy, U.S. Senator from Illinois from 1967 to 1985 (b. 1919)
September 18
Bayless Manning, lawyer and college administrator (b. 1923)
Jamey Rodemeyer, suicide victim (b. 1997)
September 19
Thomas Capano, murderer (b. 1949)
Dolores Hope, singer, wife of Bob Hope (b. 1909)
September 20 – Oscar Handlin, historian (b. 1915)
September 21
Troy Davis, convicted murderer (b. 1968; executed)
Michael Julian Drake, astronomer (b. 1946)
September 22 – John H. Dick, basketball player and U.S. Navy admiral (b. 1918)
September 23
Orlando Brown Sr., American football player and successful litigant against the National Football League (b. 1970)
Danny Litwhiler, baseball player and college coach (b. 1916)
September 24
Richard Koch, physician, advocate for phenylketonuria neonate screening (b. 1921)
Tony Knap, American football coach (b. 1914)
September 26
David Zelag Goodman, film screenwriter (b. 1930)
Jerry Haynes, television actor (b. 1927)
September 27
Wilson Greatbatch, electrical engineer and the inventor of the implantable cardiac pacemaker (b. 1919)
Fritz Manes, film producer (b. 1932)
September 28 – Claude R. Kirk Jr., 36th governor of Florida from 1967 to 1971 (b. 1926)
September 30
Anwar al-Awlaki, terrorist, died in al-Jawf Governorate, Yemen (b. 1971)
Lee Davenport, physicist (b. 1915)
Peter Gent, American football player and writer (b. 1942)
Mike Heimerdinger, American football coach, died in Mexico (b. 1952)
Ralph M. Steinman, Canadian Nobel immunologist, died in New York City (b. 1943 in Canada)
Marv Tarplin, guitarist and songwriter (b. 1941)
October
October 1 – J. Willis Hurst, physician (b. 1920)
October 2 – Don Lapre, con artist (b. 1964)
October 3
George Harrison, swimmer (b. 1939)
Aden Meinel, astronomer (b. 1922)
October 4
Joseph R. Aceti, television sports director (b. 1935)
Doris Belack, actress (b. 1926)
Kenneth H. Dahlberg, World War II pilot (b. 1917)
October 5
Derrick Bell, lawyer and college administrator (b. 1930)
Steve Jobs, co-founder and chief executive of Apple Inc. (b. 1955)
Charles Napier, actor (b. 1936)
Fred Shuttlesworth, minister (b. 1922)
October 6 – William S. Dietrich II, executive (b. 1938)
October 7
Paul Kent, actor (b. 1930)
Andrew Laszlo, film cinematographer (b. 1926 in Hungary)
Julio Mario Santo Domingo, Colombian businessman, died in New York City (b. 1919 in Colombia)
Mildred Savage, author (b. 1919)
October 8
Al Davis, American football executive (b. 1929)
David Hess, actor and songwriter (b. 1936)
Milan Puskar, pharmaceutical executive (b. 1934)
Mikey Welsh, bassist (b. 1971)
Roger Williams, pianist (b. 1924)
October 10
Ray Aghayan, costume designer (b. 1928 in Iran)
Albert Rosellini, 15th governor of Washington from 1957 to 1965 (b. 1910)
October 11 – Bob Galvin, electronics executive (b. 1922)
October 12
Patricia Breslin, actress, wife of Art Modell (b. 1931)
Paul Leka, songwriter (b. 1943)
Dennis Ritchie, computer scientist (b. 1941)
October 13 – Barbara Kent, actress (b. 1907 in Canada)
October 14 – Morris Chafetz, psychiatrist (b. 1924)
October 16
Elouise P. Cobell, Native American litigant (b. 1945)
Pete Rugolo, Italian-born American television composer (b. 1915)
Dan Wheldon, British race car driver, died in Las Vegas (b. 1978)
October 17 – Edgar Villchur, audio equipment inventor (b. 1917)
October 18 – Norman Corwin, radio, film and television screenwriter (b. 1910)
October 20 – Barry Feinstein, photographer (b. 1931)
October 22 – Robert Pierpoint, television journalist (b. 1925)
October 23 – Herbert A. Hauptman, mathematician and Nobel laureate in chemistry (b. 1917)
October 25 – Tom McNeeley, boxer (b. 1937)
October 30 – David Utz, surgeon (b. 1923)
November
November 2
John F. Burke, physician (b. 1922)
Sid Melton, actor (b. 1917)
November 3
Matty Alou, baseball player (b. 1938 in the Dominican Republic)
Bob Forsch, baseball player (b. 1950)
Morris Philipson, publisher (b. 1926)
November 4
Andy Rooney, columnist (b. 1919)
Theadora Van Runkle, film costume designer (b. 1928)
November 6 – Hal Kanter, film and television writer (b. 1918)
November 7
Joe Frazier, boxer (b. 1944)
Andrea True, actress and singer (b. 1943)
November 8
Hal Bruno, magazine and television journalist (b. 1928)
Heavy D, Jamaican-born American rapper and actor (b. 1967 in Jamaica)
Bil Keane, cartoonist (b. 1922)
Ed Macauley, basketball player (b. 1928)
November 9 – Roger Christian, ice hockey player (b. 1935)
November 11 – William Aramony, charity organization fraudster (b. 1927)
November 15
Moogy Klingman, keyboardist and songwriter (b. 1950)
Oba Chandler, murderer (b. 1946; executed)
November 19 – Ira Michael Heyman, lawyer and college administrator (b. 1930)
November 21
George Gallup Jr., pollster (b. 1930)
Anne McCaffrey, American-born Irish writer (b. 1926)
November 22
Svetlana Alliluyeva, writer (b. 1926 in the Soviet Union)
Lynn Margulis, biologist (b. 1938)
November 23 – Jim Rathmann, race car driver (b. 1928)
November 24 – Jeno Paulucci, businessman (b. 1918)
November 25
Judy Lewis, actress, psychologist and daughter of Clark Gable and Loretta Young (b. 1935)
Frederik Meijer, businessman (b. 1919)
T. Franklin Williams, physician (b. 1921)
November 26 – Ron Lyle, boxer (b. 1941)
November 27 – Judd Woldin, composer (b. 1925)
November 28
Charles T. Kowal, astronomer (b. 1940)
Lloyd J. Old, physician (b. 1933)
November 30
Carl Robie, swimmer (b. 1945)
Bill Waller, 55th governor of Mississippi from 1972 to 1976 (b. 1926)
December
December 1
Bill McKinney, actor (b. 1931)
Alan Sues, screen actor (b. 1926)
December 4 – Patricia C. Dunn, businesswoman (b. 1953)
December 5
Paul M. Doty, biochemist (b. 1920)
Joe Lonnett, baseball player and coach (b. 1927)
December 6 – Dobie Gray, singer (b. 1940)
December 7
Harry Morgan, film and television actor (b. 1915)
Jerry Robinson, comic book artist (b. 1922)
December 12
Bert Schneider, television and film producer (b. 1933)
Gene Summers, architect (b. 1928)
December 13 – Russell Hoban, writer (b. 1925)
December 14 – Joe Simon, comic book writer, artist, editor and publisher (b. 1913)
December 15
Andy Carey, baseball player (b. 1931)
Christopher Hitchens, English writer, died in Houston, Texas (b. 1949 in the United Kingdom)
December 22 – Bennie Ellender, American football player and coach (b. 1925)
December 24
Cheetah-Mike, notable chimpanzee (b. c. 1931)
Jody Rainwater, musician and radio personality (b. 1920)
December 25
Ben Breedlove, internet personality (b. 1993)
Adrienne Cooper, klezmer and Yiddish vocalist (b. 1946)
Andrew Geller, architect (b. 1924)
Jim Sherwood, musician (b. 1942)
Simms Taback, author, graphic artist and illustrator (b. 1932)
December 26
Houston Antwine, American football player (b. 1939)
Pedro Armendáriz Jr., Mexican actor, died in New York City (b. 1940)
Joe Bodolai, television comedy writer and producer (b. 1948)
Sean Collins, surfer and surf forecaster (b. 1952)
Barbara Lea, singer and actress (b. 1929)
Sam Rivers, musician and composer (b. 1923)
James Rizzi, artist (b. 1950)
December 31 – Glenn Lord, editor (b. 1931)
See also
2011 in American music
2011 in American soccer
2011 in American television
List of American films of 2011
May 2011 tornado outbreak
Timeline of United States history (2010–present)
References
External links
Years of the 21st century in the United States
2010s in the United States
United States |
13389095 | https://en.wikipedia.org/wiki/Floppyfw | Floppyfw | floppyfw was a Linux distribution that used the Linux kernel and BusyBox to provide a firewall/gateway/router on a single bootable floppy disk; it was later also made available in CD format.
Reviews concluded that it was a very simple and reliable gateway/firewall that could be set up on small to medium-sized networks at low cost and with ease. One independent study concluded that it offered "the best possible security provided by a floppy-based firewall" when repurposing old, redundant hardware into Linux-based firewalls or routers.
Requirements
Intel 80386SX or better
Two network interface cards
1.44 MB floppy drive
12 MB of RAM
Features
Floppyfw's features include:
Access lists
IP-masquerading (network address translation)
Connection-tracked (stateful) packet filtering (see the sketch after this list)
Advanced routing
Traffic shaping
PPPoE
A very simple packaging system, used to add editors, PPP, VPN, traffic shaping and other extras
Logging through klogd/syslogd, both local and remote
Console access over a serial port
DHCP server and DNS cache for internal networks
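The masquerading and stateful filtering listed above are provided by the Linux kernel's netfilter/iptables framework. What follows is a minimal, hypothetical sketch of rules in that spirit, applied from Python via subprocess; it is an illustration only, not floppyfw's own configuration mechanism, and the interface names eth0 (external) and eth1 (internal) are assumptions.

# Hypothetical sketch: connection-tracked NAT and forwarding rules of the kind
# a small Linux firewall/router applies at boot. Requires root and iptables;
# interface names eth0 (external) and eth1 (internal) are assumptions.
import subprocess

RULES = [
    # Masquerade (NAT) all traffic leaving through the external interface.
    ["iptables", "-t", "nat", "-A", "POSTROUTING", "-o", "eth0", "-j", "MASQUERADE"],
    # Allow replies to connections initiated from the internal network.
    ["iptables", "-A", "FORWARD", "-i", "eth0", "-o", "eth1",
     "-m", "state", "--state", "ESTABLISHED,RELATED", "-j", "ACCEPT"],
    # Allow the internal network to open new outbound connections.
    ["iptables", "-A", "FORWARD", "-i", "eth1", "-o", "eth0", "-j", "ACCEPT"],
    # Default-deny anything else that would be forwarded.
    ["iptables", "-P", "FORWARD", "DROP"],
]

def apply_rules():
    # Apply each rule in order, stopping on the first failure.
    for rule in RULES:
        subprocess.run(rule, check=True)

if __name__ == "__main__":
    apply_rules()

On floppyfw itself, equivalent rules are set through configuration files on the boot floppy rather than through a script like this.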
References
External links
floppyfw main web page
Free routing software
Gateway/routing/firewall distribution
Discontinued Linux distributions
Linux distributions |