id | url | title | text
---|---|---|---|
36449834 | https://en.wikipedia.org/wiki/Voxeet | Voxeet | Voxeet is VoIP web conferencing software that uses 3D high definition voice technology to produce immersive sound. It is available for Windows, iPhone and Android.
In 2012, it won the DemoGod Award at DEMO Spring '12.
Technology
In a real-life conversation, sounds follow a complex journey before reaching the listener's ears. Listeners' brains then analyse the sounds and their alterations to determine the source's position in the room. This enables them to identify the speaker even without seeing them or recognizing their voice. In a crowded, noisy room, the brain can isolate specific sounds and focus on decoding relevant information while disregarding other sounds, a phenomenon also called the cocktail party effect.
Traditional conferencing software uses one microphone that mixes all sounds, losing this location information and making it impossible for the brain to use spatial information to filter what it is hearing. This increases fatigue by requiring listeners to distinguish who is speaking, as well as what they are saying.
Voxeet uses multiple microphones to reproduce natural location mechanisms, creating a virtual 3D audio space that makes listening easier and less taxing.
Features
Voxeet allows up to 8 participants to have online conferences with high definition voice and 3D sound on Windows computers, and iPhone and Android phones. Participants use headsets to experience immersion. The system provides:
VoIP HD 3D sound
Visual Cues to know who is speaking
Mobile phone integration with one-click transfer
References
VoIP software
Teleconferencing
Web conferencing
Software companies based in California
VoIP services
VoIP companies of the United States
Companies based in Marin County, California
Windows multimedia software
Software companies of the United States |
887806 | https://en.wikipedia.org/wiki/Far%20Manager | Far Manager | Far Manager (short for File and ARchive Manager) is an orthodox file manager for Microsoft Windows and is a clone of Norton Commander. Far Manager uses the Win32 console and has a keyboard-oriented user interface (although limited mouse operation, including drag-and-drop, is possible).
Far Manager was created by Eugene Roshal, and has been under development by the Far Group since 2000. The project's Unicode branches (2.0 and 3.0) are open-source (under the BSD-3-Clause license). All branches are available as 32- and 64-bit builds. Far Manager is often viewed as a very customizable file manager and text editor, and a free alternative to Total Commander.
Features
Far Manager features an internal viewer and editor, customizable user menus, tree views, file search, compare, integrated help, and a task switcher for its tools. Its standard functionality can be expanded with macros (which allow scripting) and plugins.
Far Manager's default interface combines two file panels with a command prompt. Panels may be fully customized as to which columns are shown and in which order, and operations may be done to and from either panel. The file panels support wildcard selection, advanced filtering, sorting and highlighting. The file panels and the command prompt are both active at the same time (they are interacted with using different keys), and most features can be accessed using keyboard shortcuts (the key bar at the bottom displays the function key actions for the currently held down modifier keys).
Extensibility
Far's standard functionality can be greatly extended with macros (written in the Lua scripting language and primarily used to record keypress sequences) and plugins. Standard plugins installed by default include FTP, Windows network, extensible archive file support and temporary panel (sandbox) virtual file systems, a process list, print manager, filename case converter, and several editor plugins to format, wrap, and otherwise alter text.
Third-party plugins are available from the PlugRing repository and plugin announcement forum (in Russian). Some popular plugins include regular expression search and replace (both in the text editor and across multiple files), syntax highlighting and auto-completion for the text editor, SFTP/SCP and Windows Registry virtual file systems, 7-zip integration, a hex editor and a picture viewer (which overlays a DirectX surface over Far's console window). Wrappers are available which allow using some Total Commander plugins with Far Manager, and vice versa. Plugins can be developed using the native C/Pascal API, or using wrappers which permit plugin development in other platforms and languages, such as .NET (including PowerShell), and Lua.
Linux and macOS versions
The far2l project develops Linux and macOS ports of Far Manager. As of February 2021, the port builds successfully and the most common functions work. Among the ported and working plugins are Colorer, MultiArc and TmpPanel. There is also a new NetRocks plugin implementing network connections via FTP, SFTP, SCP, SMB, NFS and WebDAV.
far2l also supports "terminal extensions". Although far2l itself is a TUI application, it can run with either a GUI or a TTY backend. The TTY backend can run in any terminal (for example, xterm), but it can also run inside the built-in terminal of a GUI-mode far2l, gaining capabilities not available in "regular" terminals (such as recognizing all possible keyboard key combinations, even with key-up events). The "host" far2l can also provide shared clipboard access and desktop notifications. These extensions can be used by running a TTY far2l inside an SSH client session opened in a "host" GUI far2l (or by using the SFTP/SCP protocols in NetRocks to run a remote far2l via its "execute remote command" feature).
Licensing
Far Manager is available under the BSD-3-Clause license.
Originally, Far Manager was available as 40-day shareware for everyone except citizens of the former USSR countries, who could use it as freeware for non-commercial use only. On 26 October 2007, the source code for the Unicode development version (1.80, later renamed to 2.0) was released under the BSD-3-Clause license. On 17 May 2010, the 1.x branch was also released under the BSD-3-Clause license, though without source code.
See also
Comparison of file managers
WinSCP plugin
ConEmu
References
External links
1996 software
Free software programmed in C++
Orthodox file managers
Free file managers
Free FTP clients
Windows-only free software
Formerly proprietary software
Lua (programming language)-scriptable software
Software using the BSD license |
9855609 | https://en.wikipedia.org/wiki/PostScript%20fonts | PostScript fonts | PostScript fonts are font files encoded in outline font specifications developed by Adobe Systems for professional digital typesetting. This system uses PostScript file format to encode font information.
"PostScript fonts" may also separately be used to refer to a basic set of fonts included as standards in the PostScript system, such as Times New Roman, Helvetica, and Avant Garde.
History
Type 1 and Type 3 fonts, though introduced by Adobe in 1984 as part of the PostScript page description language, did not see widespread use until March 1985 when the first laser printer to use the PostScript language, the Apple LaserWriter, was introduced.
Even then, in 1985, the outline fonts were resident only in the printer, and the screen used bitmap fonts as substitutes for outline fonts.
Although originally part of PostScript, Type 1 fonts used a simplified set of drawing operations compared to ordinary PostScript (programmatic elements such as loops and variables were removed, much like PDF), but Type 1 fonts added "hints" to help low-resolution rendering. Originally, Adobe kept the details of their hinting scheme undisclosed and used a (simple) encryption scheme to protect Type 1 outlines and hints, which still persists today (although the encryption scheme and key have since been published by Adobe). Despite these measures, Adobe's scheme was quickly reverse-engineered by other players in the industry. Adobe nevertheless required anyone working with Type 1 fonts to license its technology.
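The scheme in question is, as later documented in Adobe's published Type 1 specification, a simple running-key byte cipher. A minimal decryption sketch is shown below; the constants come from the published specification, the function name is hypothetical, and this is only an illustration, not a complete implementation:

```python
def eexec_decrypt(ciphertext: bytes, r: int = 55665, discard: int = 4) -> bytes:
    """Decrypt an eexec-encrypted byte string from a Type 1 font.

    The same routine decrypts individual charstrings when called with r=4330.
    The first `discard` output bytes are random padding and are dropped.
    """
    c1, c2 = 52845, 22719                      # constants from the published spec
    plain = bytearray()
    for c in ciphertext:
        plain.append(c ^ (r >> 8))             # recover the plaintext byte
        r = ((c + r) * c1 + c2) & 0xFFFF       # advance the running key
    return bytes(plain[discard:])
```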
Type 3 fonts allowed for all the sophistication of the PostScript language, but without the standardized approach to hinting (though some companies such as ATF implemented their own proprietary schemes) or an encryption scheme. Other differences further added to the confusion.
The cost of the licensing was considered very high at this time, and Adobe continued to stonewall on more attractive rates. It was this issue that led Apple to design its own system, TrueType, around 1991. Immediately following the announcement of TrueType, Adobe published the Adobe Type 1 Font Format specification, a detailed description of the format. Font development tools such as Fontographer added the ability to create Type 1 fonts. The Type 2 format has since been used as one basis for the modern OpenType format.
Technology
By using PostScript (PS) language, the glyphs are described with cubic Bézier curves (as opposed to the quadratic curves of TrueType), and thus a single set of glyphs can be resized through simple mathematical transformations, which can then be sent to a PostScript-ready printer. Because the data of Type 1 is a description of the outline of a glyph and not a raster image (i.e. a bitmap), Type 1 fonts are commonly referred to as "outline fonts," as opposed to bitmap fonts. For users wanting to preview these typefaces on an electronic display, small versions of a font need extra hints and anti-aliasing to look legible and attractive on screen. This often came in the form of an additional bitmap font of the same typeface, optimized for screen display. Otherwise, in order to preview the Type 1 fonts in typesetting applications, the Adobe Type Manager utility was required.
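For reference, the standard parametric forms of the two curve types are shown below; the notation with control points P_i and parameter t in [0,1] is added here for illustration. Scaling a glyph amounts to applying the same affine transform to every control point, which is why outline fonts resize cleanly:

```latex
% TrueType outlines: quadratic Bezier segments
B_{\mathrm{quad}}(t)  = (1-t)^2 P_0 + 2(1-t)t\,P_1 + t^2 P_2
% Type 1 / CFF outlines: cubic Bezier segments
B_{\mathrm{cubic}}(t) = (1-t)^3 P_0 + 3(1-t)^2 t\,P_1 + 3(1-t)t^2\,P_2 + t^3 P_3
% An affine map A commutes with the curve: transforming the control points
% transforms the whole glyph, i.e. A(B(t)) is the Bezier curve of the points A(P_i).
```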
Font type
Type 0
Type 0 is a "composite" font format - as described in the PostScript Language Reference Manual, 2nd Edition. A composite font is composed of a high-level font that references multiple descendant fonts.
Type 1
Type 1 (also known as PostScript, PostScript Type 1, PS1, T1 or Adobe Type 1) is the font format for single-byte digital fonts for use with Adobe Type Manager software and with PostScript printers. It can support font hinting.
It was originally a proprietary specification, but Adobe released the specification to third-party font manufacturers provided that all Type 1 fonts adhere to it.
Type 1 fonts are natively supported in Mac OS X, and in Windows 2000 and later via the GDI API. (They are not supported in the Windows GDI+, WPF or DirectWrite APIs.)
Adobe announced on 27 January 2021 that they would end support for Type 1 fonts in Adobe products after January 2023. Support for Type 1 fonts in Photoshop is to end in 2021.
Type 2
Type 2 is a character string format that offers a compact representation of the character description procedures in an outline font file. The format is designed to be used with the Compact Font Format (CFF). The CFF/Type2 format is the basis for Type 1 OpenType fonts, and is used for embedding fonts in Acrobat 3.0 PDF files (PDF format version 1.2).
Type 3
Type 3 font (also known as PostScript Type 3 or PS3, T3 or Adobe Type 3) consists of glyphs defined using the full PostScript language, rather than just a subset. Because of this, a Type 3 font can do some things that Type 1 fonts cannot do, such as specify shading, color, and fill patterns. However, it does not support hinting. Adobe Type Manager did not support Type 3 fonts, and they are not supported as native WYSIWYG fonts on any version of Mac OS or Windows.
Type 4
Type 4 is a format that was used to make fonts for printer font cartridges and for permanent storage on a printer's hard disk. The character descriptions are expressed in the Type 1 format. Adobe does not document this proprietary format.
Type 5
Type 5 is similar to the Type 4 format but is used for fonts stored in the ROMs of a PostScript printer. It is also known as CROM font (Compressed ROM font).
Types 9, 10, 11
Types 9, 10, and 11 are CID-keyed fonts for storing Type 1, Type 3, and Type 42 fonts, respectively. Ghostscript refers to them as CID font types 0, 1, and 2, respectively; they are documented in Adobe supplements.
Type 14
Type 14, or the Chameleon font format, is used to represent a large number of fonts in a small amount of storage space such as printer ROM. The core set of Chameleon fonts consists of one Master Font, and a set of font descriptors that specify how the Master Font is to be adjusted to give the desired set of character shapes for a specific typeface.
Adobe does not document the Type 14 format. It was introduced with PostScript 3 in 1997, and de-emphasized in later years as storage became cheaper.
Type 32
Type 32 is used for downloading bitmap fonts to PostScript interpreters with version number 2016 or greater. The bitmap characters are transferred directly into the interpreter's font cache, thus saving space in the printer's memory.
Type 42
The Type 42 font format is a PostScript wrapper around a TrueType font, allowing PostScript-capable printers containing a TrueType rasterizer (which was first implemented in PostScript interpreter version 2010 as an optional feature, later standard) to print TrueType fonts. Support for multibyte CJK TrueType fonts was added in PostScript version 2015. The out-of-sequence choice of the number 42 is said to be a jesting reference to The Hitchhiker's Guide to the Galaxy, where 42 is the Answer to Life, the Universe, and Everything.
Core Font Set
In addition to font types, the PostScript specifications also define the Core Font Set, which dictates the minimum set of fonts, and the character sets to be supported by each font.
In original PostScript, there are 13 base fonts:
Courier (Regular, Oblique, Bold, Bold Oblique)
Helvetica (Regular, Oblique, Bold, Bold Oblique)
Times (Roman, Italic, Bold, Bold Italic)
Symbol
In PostScript Level 2, there are 35 fonts, which is a superset of the 13 base fonts:
ITC Avant Garde Gothic (Book, Book Oblique, Demi, Demi Oblique)
ITC Bookman (Light, Light Italic, Demi, Demi Italic)
Courier (Regular, Oblique, Bold, Bold Oblique)
Helvetica (Regular, Oblique, Bold, Bold Oblique, Condensed, Condensed Oblique, Condensed Bold, Condensed Bold Oblique)
New Century Schoolbook (Roman, Italic, Bold, Bold Italic)
Palatino (Roman, Italic, Bold, Bold Italic)
Symbol
Times (Roman, Italic, Bold, Bold Italic)
ITC Zapf Chancery (Medium Italic)
ITC Zapf Dingbats
As a result, many computer operating systems contain these fonts or clones of them (as does the Ghostscript package).
In PostScript 3, 136 fonts are specified, which includes the standard 35 fonts; core fonts in Windows 95, Windows NT and Macintosh; selected fonts from Microsoft Office and the HP 110 font set. New fonts include:
Albertus (Light, Roman, Italic)
Antique Olive (Roman, Italic, Bold, Compact)
Apple Chancery
Arial (Regular, Italic, Bold, Bold Italic)
Bodoni (Roman, Italic, Bold, Bold Italic, Poster, Poster Compressed)
Carta (a dingbat)
Chicago
Clarendon (Light, Roman, Bold)
Cooper Black, Cooper Black Italic
Copperplate Gothic (32BC, 33BC)
Coronet
Eurostile (Medium, Bold, Extended No.2, Bold Extended No.2)
Geneva
Gill Sans (Light, Light Italic, Book, Book Italic, Bold, Bold Italic, Extra Bold, Condensed, Condensed Bold)
Goudy (Oldstyle, Oldstyle Italic, Bold, Bold Italic, Extra Bold)
Helvetica (Narrow, Narrow Oblique, Narrow Bold, Narrow Bold Oblique)
Hoefler Text (Roman, Italic, Black, Black Italic), Hoefler Ornaments
Joanna (Roman/Regular, Italic, Bold, Bold Italic)
Letter Gothic (Regular, Slanted, Bold, Bold Slanted)
ITC Lubalin Graph (Book, Oblique, Demi, Demi Oblique)
ITC Mona Lisa Recut
Marigold
Monaco
New York
Optima (Roman, Italic, Bold, Bold Italic)
Oxford
Stempel Garamond (Roman, Italic, Bold, Bold Italic)
Tekton (Regular)
Times New Roman (Regular, Italic, Bold, Bold Italic)
Univers (45 Light, 45 Light Oblique, 55, 55 Oblique, 65 Bold, 65 Bold Oblique, 57 Condensed, 57 Condensed Oblique, 67 Condensed Bold, 67 Condensed Bold Oblique, 53 Extended, 53 Extended Oblique, 63 Extended Bold, 63 Extended Bold Oblique)
Wingdings
In PDF, the following 14 Type 1 fonts are defined as the standard fonts:
Courier (Regular, Oblique, Bold, Bold Oblique)
Helvetica (Regular, Oblique, Bold, Bold Oblique)
Symbol
Times (Roman, Italic, Bold, Bold Italic)
ITC Zapf Dingbats
However, in recent versions of Adobe Acrobat Reader, Helvetica and Times were internally replaced by Arial and Times New Roman respectively.
Character sets
Although PostScript fonts can contain any character set, there are character sets specifically developed by Adobe, which are used by fonts developed by Adobe.
Adobe Western 2
It includes a basic character set containing upper and lowercase letters, figures, accented characters, and punctuation. These fonts also contain currency symbols (cent, dollar, euro, florin, pound sterling, yen), standard ligatures (fi, fl), common fractions (1/4, 1/2, 3/4), common mathematics operators, superscript numerals (1,2,3), common delimiters and conjoiners, and other symbols (including daggers, trademark, registered trademark, copyright, paragraph, litre and estimated symbol). Compared to the ISO-Adobe character set, Western 2 also adds 17 additional symbol characters: euro, litre, estimated, omega, pi, partialdiff, delta, product, summation, radical, infinity, integral, approxequal, notequal, lessequal, greaterequal, and lozenge.
Fonts with an Adobe Western 2 character set support most western languages including Afrikaans, Basque, Breton, Catalan, Danish, Dutch, English, Finnish, French, Gaelic, German, Icelandic, Indonesian, Irish, Italian, Norwegian, Portuguese, Sami, Spanish, Swahili and Swedish.
This standard superseded ISO-Adobe as the new minimum character set standard as implemented in OpenType fonts from Adobe.
Adobe CE
Fonts with an Adobe CE character set also include the characters necessary to support the following central European languages: Croatian, Czech, Estonian, Hungarian, Latvian, Lithuanian, Polish, Romanian, Serbian (Latin), Slovak, Slovenian and Turkish.
Adobe-GB1
This Simplified Chinese character collection provides support for the GB 1988–89, GB 2312–80, GB/T 12345–90, GB 13000.1-93, and GB 18030-2005 character set standards. Supported encodings include ISO-2022, EUC-CN, GBK, UCS-2, UTF-8, UTF-16, UTF-32, and the mixed one-, two- and four-byte encoding published in GB 18030-2005.
Adobe-CNS1
This Traditional Chinese character collection provides support for the Big-5 and CNS 11643-1992 character set standards. It also includes support for a number of extensions to Big-5, which contain characters used mainly in the Hong Kong locale. Primary supported Big-5 extensions include HKSCS.
Supported encodings include ISO-2022, EUC-TW, Big Five, UCS-2, UTF-8, UTF-16, and UTF-32.
In Adobe-CNS1-7, 23 additional glyphs were added, with 25 additional mappings for its Unicode CMap resources.
Adobe-Japan1
Adobe-Japan1 is a series of character collections developed for Japanese fonts. Adobe's latest, the Adobe-Japan1-6 set, covers the character sets of JIS X 0208, ISO-2022-JP, Microsoft Windows 3.1 J, JIS X 0213:2004, JIS X 0212-1990, and the Kyodo News U-PRESS character set.
Adobe-Japan2
It was originally an implementation of the JIS X 0212-1990 character set standard and the Macintosh extensions, but with the introduction of the Adobe-Japan1 supplement 6 (Adobe-Japan1-6) standard, Adobe-Japan2-0 became obsolete.
Adobe-Korea1
This Korean character collection provides support for the KS X 1001:1992 and KS X 1003:1992 character set standards, and their selected corporate variations. Supported encodings include ISO-2022-KR, EUC-KR, Johab, UHC, UCS-2, UTF-8, UTF-16, and UTF-32.
ISO-Adobe
Fonts with an ISO-Adobe character set support most western languages including: Afrikaans, Basque, Breton, Catalan, Danish, Dutch, English, Finnish, French, Gaelic, German, Icelandic, Indonesian, Irish, Italian, Norwegian, Portuguese, Sami, Spanish, Swahili and Swedish. This is the standard character set in most PostScript Type 1 fonts from Adobe.
File formats
CID
The CID-keyed font (also known as CID font, CID-based font, short for Character Identifier font) is a font structure, originally developed for PostScript font formats, designed to address a large number of glyphs. It was developed to support pictographic East Asian character sets, as these comprise many more characters than the Latin, Greek and Cyrillic writing systems.
Adobe developed CID-keyed font formats to solve problems with the OCF/Type 0 format, for addressing complex Asian-language (CJK) encoding and very large character sets. CID-keyed internals can be used with the Type 1 font format for standard CID-keyed fonts, or Type 2 for CID-keyed OpenType fonts.
CID-keyed fonts often reference "character collections," static glyph sets defined for different language coverage purposes. Although in principle any font maker may define character collections, Adobe's are the only ones in wide usage. Each character collection has an encoding which maps Character IDs to glyphs. Each member glyph in a character collection is identified by a unique character identifier (CID). Such CIDs are generally supplemental to other encodings or mappings such as Unicode.
Character collections are uniquely named by registry, ordering and supplement, such as "Adobe-Japan1-6." The registry is the developer (such as Adobe). The so-called "ordering" gives the purpose of the collection (for example, "Japan1"). The supplement number (such as 6) indicates incremental additions: for a given language, there may be multiple character collections of increasing size, each a superset of the last, using a higher supplement number. The Adobe-Japan1-0 collection is 8284 glyphs, while Adobe-Japan1-6 is 23,058 glyphs.
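Splitting such a name into its parts is mechanical. A tiny illustrative helper is sketched below; the function name is made up and it assumes the usual Registry-Ordering-Supplement form with no embedded hyphens:

```python
def split_collection_name(name: str) -> tuple[str, str, int]:
    """Split a character-collection name such as 'Adobe-Japan1-6'
    into its (registry, ordering, supplement) components."""
    registry, ordering, supplement = name.split("-")
    return registry, ordering, int(supplement)

# The supplement number grows as glyphs are added to the collection.
assert split_collection_name("Adobe-Japan1-6") == ("Adobe", "Japan1", 6)
```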
CID-keyed fonts may be made without reference to a character collection by using an "identity" encoding, such as Identity-H (for horizontal writing) or Identity-V (for vertical). Such fonts may each have a unique character set, and in such cases the CID number of a glyph is not informative; generally the Unicode encoding is used instead, potentially with supplemental information.
CID-keyed fonts internally have their character sets divided into "rows," with the advantage that each row may have different global hinting parameters applied.
In theory it would be possible to make CID-keyed OpenType versions of western fonts. This would seem desirable for some such fonts because of the hinting advantages. However, according to Adobe, much of the software infrastructure (applications, drivers, operating systems) makes incorrect assumptions about CID-keyed fonts in ways that makes such fonts behave badly in real-world usage.
Adobe ClearScan technology (as of Acrobat 9 Pro) creates custom Type1-CID fonts to match the visual appearance of a scanned document after optical character recognition (OCR). ClearScan does not replace the fonts with system fonts or substitute them with Type1-MM fonts (as in Acrobat 8 and earlier versions), but uses these newly created custom fonts. The custom fonts are embedded in the PDF file (this is mandatory). In Acrobat DC, the feature is no longer called "ClearScan" but "Recognize Text - Editable Text & Images", and it is now possible to edit the text.
Compact Font Format
Compact Font Format (also known as CFF font format, Type 2 font format, or CFF/Type 2 font format) is a lossless compaction of the Type 1 format using Type 2 charstrings. It is designed to use less storage space than Type 1 fonts, by using operators with multiple arguments, various predefined default values, more efficient allotment of encoding values and shared subroutines within a FontSet (family of fonts).
The so-called PostScript or Type 1 flavor of OpenType fonts, also called OpenType CFF, contains glyph outlines and hints in a CFF table.
CFF fonts can be embedded in PDF files, starting with PDF version 1.2. It is the usual approach to representing a Type 1 font within PDF.
CID-keyed fonts can be represented within CFF with Type 2 charstrings for CID-keyed OpenType fonts.
A Type 1 font can be losslessly converted into CFF/Type2 format and back.
Multiple Master
Multiple master fonts (or MM fonts) are an extension to Adobe Systems' Type 1 PostScript fonts. Multiple master fonts contain one or more "masters" — that is, original font styles, e.g. a light, a regular and a bold version — and enable a user to interpolate these font styles along a continuous range of "axes." While Multiple Master fonts are not common in end user fonts anymore, they still play an important role when developing complex font families.
OpenType
PostScript glyph data can be embedded in OpenType font files, but OpenType fonts are not limited to using PostScript outlines. PostScript outlines in OpenType fonts are encoded in the Type2 Compact Font Format (CFF).
OpenType conversion
When Adobe converted PostScript Type 1 and Type 1 multiple master fonts to the OpenType CFF format, the conversions were based on the last Type 1/MM versions of the Adobe Type Library fonts. In addition to the file format change, there were other changes:
All alphabetic fonts had 17 additional characters included: the euro (some had already gotten this in Type 1), litre, estimated, and the 14 Mac "symbol substitution" characters. Symbol substitution was a scheme used on Mac OS to deal with the fact that the standard "ISO-Adobe" character set omitted certain characters which were part of the MacRoman character set. When one of these 14 characters was typed in a Type 1 font with standard encoding, both ATM and the printer driver would get a generic glyph in the Times style from the Symbol font. In the OpenType conversion, these characters were built into every font, getting some degree of font-specific treatment (weight and width).
Fonts that had unkerned accented characters had additional kerning to deal with accented characters.
Font families that included separate Type 1 expert fonts or Cyrillic fonts have these glyphs built into the "base font" in their OpenType counterparts.
Multiple master fonts were converted to individual OpenType fonts; each font consisting of a former Multiple Master instance.
For many Adobe Originals fonts, particularly those designed by Robert Slimbach, Adobe did some degree of redesign along with the conversion to OpenType.
The typeface Helvetica Narrow was not converted to OpenType, because the Type 1 original was a mathematically squished version of Helvetica, rather than an actually designed condensed typeface. This was originally done to conserve ROM space in PostScript printers.
As a result of the above changes, Adobe no longer guarantees metric compatibility between Type 1 and OpenType fonts. However, Adobe claims the change is minimal for Adobe (not Adobe Originals) fonts, if:
Text is written in English
The formatted text contains only non-accented characters
Only characters that were present in the old fonts are used, without the former Symbol substitution characters
Applications are used which base line spacing solely on point size or leading, and not on the bounding box of the font
Original Composite Font
The Original Composite Font (OCF) format (which uses a Type 0 file structure) was Adobe's first effort to implement a format for fonts with large character sets; it debuted with PostScript Level 2.
Adobe then developed the CID-keyed font file format, which was designed to offer better performance and a more flexible architecture for addressing the complex Asian-language encoding and character set issues. Adobe does not document or support the OCF font format.
OCF font metrics are described in Adobe Composite Font Metrics file.
Adobe Font Metrics, Adobe Composite Font Metrics, Adobe Multiple Font Metrics
Adobe Font Metrics (AFM), Adobe Composite Font Metrics (ACFM), Adobe Multiple Font Metrics (AMFM) files contain general font information and font metrics information for the font program. These files are generally used directly only in Unix environments.
An AFM file provides both global metrics for a font program and the metrics of each individual character.
The metrics of a multiple master font are described by one AMFM file, which specifies the control data and global font information, plus one AFM file for each of the master designs in the font.
An ACFM file provides information about the structure of a composite font: specifically, the global metrics of the composite font program and the global metrics of each of its immediately descendent font programs. An ACFM file is not associated with a base font, but acts as the top-level structure of a composite font. The metrics of individual characters in the composite font are described completely by one or more associated AFM files.
The formats are sufficiently similar that a compliant parser can parse AFM, ACFM, and AMFM files.
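As an illustration, a minimal AFM parser is sketched below. It handles only the FontName keyword and the per-character "C ...; WX ...; N ...;" lines inside StartCharMetrics of a plain single-master AFM file; the rest of the format, and the ACFM/AMFM extensions, are omitted:

```python
def parse_afm_widths(path: str) -> tuple[str, dict[str, int]]:
    """Return (font name, {glyph name: advance width}) from an AFM file."""
    font_name, widths, in_metrics = "", {}, False
    with open(path, encoding="latin-1") as fh:
        for raw in fh:
            line = raw.strip()
            if line.startswith("FontName "):
                font_name = line.split(None, 1)[1]
            elif line.startswith("StartCharMetrics"):
                in_metrics = True
            elif line.startswith("EndCharMetrics"):
                in_metrics = False
            elif in_metrics:
                # Each metrics line is a set of ';'-separated key/value items.
                fields = {}
                for item in line.split(";"):
                    parts = item.split()
                    if parts:
                        fields[parts[0]] = parts[1] if len(parts) > 1 else ""
                if "N" in fields and "WX" in fields:
                    widths[fields["N"]] = int(float(fields["WX"]))
    return font_name, widths
```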
Printer Font ASCII
Printer Font ASCII (PFA) is a pure ASCII version of a Type 1 font program, containing in particular a font's glyph data. It is pure PostScript code without any sort of wrapper, and can be copied in full into a PS file to define the font to the PS interpreter. PFA is the preferred format for Type 1 fonts used in UNIX environments, and usually carries a ".PFA" file name extension.
Though these files can syntactically contain arbitrary PostScript code, they usually follow a rather rigid formula in order to allow readers that are less than full PostScript interpreters to process them (for example, to subset the font). The first section of the file is called the clear text portion, and begins constructing the data structures that define the font in the PostScript interpreter; the information here is material Adobe in the 1980s was comfortable making public, and much of it would also be present in the companion AFM file. The last two operators in the clear text portion are currentfile eexec (encrypted exec), which instruct the interpreter to switch to reading the current file as an encrypted stream of instructions. The encrypted portion that follows is again PostScript code that finishes constructing the font data structures. Much of it consists of charstrings, which are essentially a kind of bytecode, but at the font definition stage these are merely data stored in the font; the code is encrypted (which produces arbitrary byte values) and then hex-encoded to preserve the overall ASCII nature of the file. The data structures created here are marked noaccess to make them inaccessible to subsequent PostScript code. The final action in the encrypted portion is to switch back to reading the file normally, but since eexec reads ahead a little, it is impossible to know at exactly which character normal processing will resume. Therefore, PFA files end with a trailer of 512 zeroes followed by a cleartomark operator that discards any operands that might have ended up on the stack as a result of interpreting those zeroes starting from a random position.
Printer Font Binary
Printer Font Binary (PFB) is a binary PostScript font format created by Adobe Systems, usually carrying ".PFB" file name extension. It contains a font's glyph data.
The PFB format is a lightweight wrapper to allow more compact storage of the data in a PFA file. The file consists of a number of blocks, each of which is marked as ASCII or binary. To recreate the corresponding PFA file, one takes the ASCII blocks verbatim and hex-encodes the binary blocks. The binary blocks are those which make up the encrypted portion of the font program.
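In the commonly documented PFB layout, each block begins with a 0x80 marker byte, a type byte (1 for ASCII, 2 for binary, 3 for end of file) and, for data blocks, a 4-byte little-endian length. Under that assumption, the conversion back to PFA described above can be sketched as follows; this is a rough illustration, not a substitute for tools such as t1utils:

```python
import binascii
import struct

def pfb_to_pfa(pfb_bytes: bytes) -> str:
    """Rebuild PFA text from PFB data: ASCII segments are copied verbatim,
    binary segments are hex-encoded (64 hex digits per line)."""
    out, pos = [], 0
    while pos < len(pfb_bytes):
        marker, kind = pfb_bytes[pos], pfb_bytes[pos + 1]
        if marker != 0x80:
            raise ValueError("not a PFB segment header")
        if kind == 3:                                    # end-of-file segment
            break
        length = struct.unpack_from("<I", pfb_bytes, pos + 2)[0]
        data = pfb_bytes[pos + 6:pos + 6 + length]
        pos += 6 + length
        if kind == 1:                                    # ASCII segment: copy verbatim
            out.append(data.decode("latin-1"))
        else:                                            # binary segment: hex-encode
            hexed = binascii.hexlify(data).decode("ascii")
            out.append("\n".join(hexed[i:i + 64] for i in range(0, len(hexed), 64)) + "\n")
    return "".join(out)
```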
LaserWriter Font
LaserWriter Font (LWFN) is a binary PostScript font format used on Classic Mac OS, conceptually similar to the Printer Font Binary format but using the Mac OS resource fork data structure rather than a custom wrapper for the font data. It contains the glyph data for one font.
LWFN is the file type code for this kind of file. It would not carry any extension, and the file name would be an abbreviation of the PostScript name of the font, according to a 5+3+3+... formula: the name is read as being in CamelCase and split into subwords, up to 5 letters are kept from the first subword, and up to 3 letters from each subsequent subword. Palatino-BoldItalic would thus be found in the file PalatBolIta.
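A rough sketch of that abbreviation rule follows; the function name is made up and edge cases of real font names (runs of capitals, digits at word starts, and so on) are ignored:

```python
import re

def lwfn_file_name(postscript_name: str) -> str:
    """Abbreviate a PostScript font name the way Classic Mac OS named
    LWFN files: 5 letters of the first word, 3 of each following word."""
    # Words start at uppercase letters; hyphens only separate words.
    words = re.findall(r"[A-Z][a-z0-9]*|[a-z0-9]+", postscript_name.replace("-", " "))
    parts = [words[0][:5]] + [w[:3] for w in words[1:]]
    return "".join(parts)

assert lwfn_file_name("Palatino-BoldItalic") == "PalatBolIta"
```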
Printer Font Metric
Printer Font Metric (PFM) is a binary version of AFM, usually carrying ".PFM" file name extension. It contains font metric information.
The PFM format is documented in the Windows 3.1 "Printers and Fonts Kit" help file (PFK31WH.HLP). Some details are also covered in the Windows 3.1 "Device Drivers Adaptation Guide" help file (DDAG31WH.HLP). Both of those documents are part of the Windows 3.1 Device Development Kit (DDK), which is still available (October 2008) to MSDN subscribers.
.INF
.inf (INFormation) files contain application-specific information in plain ASCII text, such as font menu names for Windows and DOS-based applications. When a font is installed in Windows, the ATM Installer software takes the AFM and the INF file as input and generates the required PFM file at installation time. The AFM and INF files are not installed in the user's system.
.MMM
.MMM files are used for the metric data needed by multiple master fonts for the Windows environment.
.OFM
.OFM is the extension used by OS/2 for its version of binary font metrics file, starting from version 2.1.
Support for Microsoft Windows
Windows 95, Windows 98, Windows NT 4 and Windows Me do not support Type 1 fonts natively. Adobe Type Manager is needed in order to use these fonts on these operating systems. Windows 2000, Windows XP and Windows Vista support Type 1 fonts natively through GDI calls. The Windows Presentation Foundation, introduced in Windows Vista and also available for Windows XP, however drops support for Type 1 fonts in favor of Type 2 fonts.
For Microsoft Windows platforms that natively support PostScript, only binary PostScript and OpenType file formats are supported.
Windows Presentation Foundation (formerly codenamed Avalon) in Windows Vista supports rasterizing OpenType CFF/Type 2 fonts, whereas Type 1 fonts will still be supported in GDI, but not in GDI+.
PostScript font utilities
The t1utils font utility package by I. Lee Hetherington and Eddie Kohler provides tools for decoding Type 1 fonts into a human-readable and editable format (t1disasm), reassembling them back into fonts (t1asm), converting between the ASCII and binary formats (t1ascii and t1binary), and converting from the Macintosh PostScript format to the Adobe PostScript font format (unpost).
See also
PostScript Standard Encoding
Computer font
OpenType
TrueType
Page description language
References
External links
Font format specifications
Adobe Type 1 Font Format (PDF: 445 KB)
Adobe Technical Note #5015: Type 1 Font Format Supplement (PDF: 225 KB)
Adobe Technical Note #5176: The CFF (Compact Font Format) Specification, (PDF: 251 KB)
Adobe Technical Note #5177: Type 2 Charstring Format (PDF: 212 KB)
Adobe Technical Note #5012: The Type 42 Font Format Specification
Adobe Technical Note #5014: Adobe CMap and CIDFont Files Specification
Adobe Technical Note #5004: Adobe Font Metrics (AFM) File Format Specification
General font information
Font Formats Q&A
Adobe font technical notes
Adobe CID fonts
Adobe Technical Note #5092: CID-Keyed Font Technology Overview
Adobe Technical Note #5178: Building PFM Files for PostScript-Language CJK Fonts
Adobe Technical Note #5641: Enabling PDF Font Embedding for CID-Keyed Fonts
Character set information
Common Character Sets
Adobe Latin Character Sets
Adobe Greek Character Sets
Adobe Cyrillic Character Sets
Adobe Technical Note #5078: Adobe-Japan1-6 Character Collection for CID-Keyed Fonts
Adobe Technical Note #5079: The Adobe-GB1-5 Character Collection
Adobe Technical Note #5080: The Adobe-CNS1-6 Character Collection
Adobe Technical Note #5093: The Adobe-Korea1-2 Character Collection
Adobe Technical Note #5094: Adobe CJKV Character Collections and CMaps for CID-Keyed Fonts
Adobe Technical Note #5097: Adobe-Japan2-0 Character Collection for CID-Keyed Fonts
Core font information
PostScript Type 1 fonts
Adobe Technical Note #5609: PostScript 3 Core Font Set Overview
The Adobe PostScript 3 Font Set
Apache FOP: fonts
Miscellaneous
comp.fonts FAQ: OS/2 2.1 and beyond
comp.lang.postscript FAQ
About Fonts
Fonts, Fonts, and more Fonts!
Font formats
Digital typography |
13430845 | https://en.wikipedia.org/wiki/1984%20Network%20Liberty%20Alliance | 1984 Network Liberty Alliance | 1984 Network Liberty Alliance is a loose group of software programmers, artists, social activists and militants, interested in computers and networks and considering them tools to empower and link the various actors of the social movement around the world. They are part of the hacktivism movement.
History
The group was formed in November 1984, during a "debriefing" workshop of the European Peace Marches on the Hartmannswillerkopf in Alsace, France, following the struggle against the installation of Pershing II and SS-20 nuclear missiles in Germany (Mutlangen). From 1978 to 1985, this European-wide peace movement had mobilized millions of citizens, protesting the arms race, the growth of military spending and joining in the Campaign for Nuclear Disarmament.
In reference to George Orwell's novel 1984 and to the Rebel Alliance of the movie Star Wars, the group chose the (ironic) name 1984 Network Liberty Alliance. Founders are André Gorz, French philosopher, Dov Lerner, MIT computer graduate and disciple of Saul Alinsky, as well as Gregoire Seither, free radio activist, Frauke Hahn who had led the woman's resistance ('Commons Women') at Greenham Common Women's Peace Camp, David Szwarc from the Israeli Peace movement and Adama Drasiweni, computer graduate from the University of London, future founder of N'DA, Africa's first independent telecom company.
Other members, like Australian co-founder of Indymedia Matthew Arnison, south-African anti-apartheid militant Peter Makema and Israeli peace activists Uri Avnery and Michel Warchawsky, joined later on. All were active in various social movements and peace initiatives in Europe and the USA.
When Richard Stallman published the GNU Manifesto in March 1985 and called for participation and support, Dov Lerner and Gregor Seither started organizing regular meetings and workshops in order to train activists in the use of information technology and gather support for the Free Software movement. Adama Drasiweni, owner of a computer business in London, set up similar workshops in Kibera, a giant slum outside of Nairobi, Kenya.
In France, the Alliance used the network of the Maisons de l'Informatique that had been set up under the presidency of François Mitterrand, as well as the computer labs of Paris University, which had access to academic networks and Bulletin Board Systems. The group ran a number of BBSs, among them 'Pom-Pom', devoted to the Apple Macintosh, and 'PeaceNet', an "electronic pow-wow" to help social activists and community organizers exchange information around the world, offering free mail accounts and file hosting services.
Very soon, the issues of free speech, software patents, civil rights and surveillance became some of the major topics addressed by the Alliance, with the group being accused of hacking and forking software. One of the BBSs run by the group, 'Gaia rising', was accused by the German government of being a meeting point for radical environmental activists as well as anarchists.
The Liberty Alliance was particularly active in the popular worldwide resistance to Multilateral Agreement on Investments (MAI) in the mid-1990s, networking multiple groups and providing "open cyberspaces" for activists to share information and experience.
In the summer of 1998 the first alternative media centre was set up in a bus in Birmingham, United Kingdom during the Global Street Party, an international day of protest and festive actions coinciding with the 24th G8 Summit. The alternative media centres also provided interpretation and language services to international militant meetings, such as the July 1999 Global Carnival against Capitalism, or J18, in London, a giant rally and party in the heart of the City of London, meant as a counter-summit to the 25th G8 Summit in Köln, Germany.
Members of the team travelled to the WTO Ministerial Conference of 1999 in Seattle to set up an alternative media centre during the WTO Ministerial Conference of 1999 protest activity. The project joined with that of other media activists and, out of the necessity to bypass the corporate media and report on a WTO conference but also to show how one could bypass corporate software (Windows), the independent media agency Indymedia was born.
Language diversity and the lack of interpreters led a number of activists to start thinking about a way to help militants from around the world to bridge the language barrier. Three years later, during the 27th G8 summit in Genoa, Italy, this would lead to the creation of the Babels network of volunteer interpreters and translators for linguistic diversity and social change.
Members of the Network Liberty Alliance have worked on social IT projects in North America (San Francisco Free Software movement, Chicago community cybercenter), Central America (Nicaragua, Guatemala, Panama), the Middle-East (Egypt, Israeli Civil Administration area) as well as in the Asia Pacific region (Indonesia, Nouvelle-Calédonie, Australia, Papua Niugini) and Africa (Malawi, Mali, Cameroon).
When the Berlin Wall fell in 1989, another member of the Alliance, Stefan Ostrowsky, transferred "NET(te) Bude" (a play on the word NET like network, and 'Nette Bude', nice crashpad in German), a community IT training centre to East-Berlin, thus becoming the first 'Cybercafe' behind the iron curtain.
See also
milw0rm
Anti-nuclear movement
References
Anonymous, J18 1999 Our resistance is as transnational as capital, Days of Dissent, 2004.
Anonymous, Friday June 18th 1999, Confronting Capital And Smashing The State! , article in Do or Die 8.
Wat Tyler (2003), Dancing at the Edge of Chaos: a Spanner in the Works of Global Capitalism, in Notes From Nowhere (Eds.), We Are Everywhere: the Irresistible Rise of Global Anticapitalism, pp. 188-195. Verso, London/New York, 2003.
Complete list of actions worldwide
J18 Timeline London
Anti–Iraq War groups
Campaign for Nuclear Disarmament
Political campaigns
Anti–nuclear weapons movement
Advocacy groups
Hacking (computer security)
Politics and technology
Organizations established in 1984 |
65940597 | https://en.wikipedia.org/wiki/Comparison%20of%20user%20features%20of%20messaging%20platforms | Comparison of user features of messaging platforms | Comparison of user features of messaging platforms refers to a comparison of all the various user features of various electronic instant messaging platforms. This includes a wide variety of resources; it includes standalone apps, platforms within websites, computer software, and various internal functions available on specific devices, such as iMessage for iPhones.
This entry includes only the features and functions that shape the user experience for such apps. A comparison of the underlying system components, programming aspects, and other internal technical information, is outside the scope of this entry.
Overview and background
Instant messaging technology is a type of online chat that offers real-time text transmission over the Internet. A LAN messenger operates in a similar way over a local area network. Short messages are typically transmitted between two parties when each user chooses to complete a thought and select "send". Some IM applications can use push technology to provide real-time text, which transmits messages character by character, as they are composed. More advanced instant messaging can add file transfer, clickable hyperlinks, Voice over IP, or video chat.
Non-IM types of chat include multicast transmission, usually referred to as "chat rooms", where participants might be anonymous or might be previously known to each other (for example collaborators on a project that is using chat to facilitate communication). Instant messaging systems tend to facilitate connections between specified known users (often using a contact list also known as a "buddy list" or "friend list"). Depending on the IM protocol, the technical architecture can be peer-to-peer (direct point-to-point transmission) or client-server (an Instant message service center retransmits messages from the sender to the communication device).
By 2010, instant messaging over the Web was in sharp decline, in favor of messaging features on social networks. The most popular IM platforms were terminated, such as AIM which closed down and Windows Live Messenger which merged into Skype. Instant messaging has since seen a revival in popularity in the form of "messaging apps" (usually on mobile devices) which by 2014 had more users than social networks.
As of 2010, social networking providers often offer IM abilities. Facebook Chat is a form of instant messaging, and Twitter can be thought of as a Web 2.0 instant messaging system. Similar server-side chat features are part of most dating websites, such as OKCupid or PlentyofFish. The spread of smartphones and similar devices in the late 2000s also caused increased competition with conventional instant messaging, by making text messaging services still more ubiquitous.
Many instant messaging services offer video calling features, voice over IP and web conferencing services. Web conferencing services can integrate both video calling and instant messaging abilities. Some instant messaging companies are also offering desktop sharing, IP radio, and IPTV to the voice and video features.
The term "Instant Messenger" is a service mark of Time Warner and may not be used in software not affiliated with AOL in the United States. For this reason, in April 2007, the instant messaging client formerly named Gaim (or gaim) announced that they would be renamed "Pidgin".
In the 2010s, more people started to use messaging apps on modern computers and devices like WhatsApp, WeChat, Viber, Facebook Messenger, Telegram, Signal and Line rather than instant messaging on computers like AIM and Windows Live Messenger. For example, WhatsApp was founded in 2009 and acquired by Facebook in 2014, by which time it already had half a billion users.
Concepts
Backchannel
Backchannel is the practice of using networked computers to maintain a real-time online conversation alongside the primary group activity or live spoken remarks. The term was coined in the field of linguistics to describe listeners' behaviours during verbal communication. (See Backchannel (linguistics).)
The term "backchannel" generally refers to online conversation about the conference topic or speaker. Occasionally backchannel provides audience members a chance to fact-check the presentation.
First growing in popularity at technology conferences, backchannel is increasingly a factor in education where WiFi connections and laptop computers allow participants to use ordinary chat like IRC or AIM to actively communicate during presentation. More recent research includes works where the backchannel is made publicly visible, such as ClassCommons, backchan.nl and Fragmented Social Mirror.
Twitter is also widely used today by audiences to create backchannels during broadcasting of content or at conferences. For example, television drama, other forms of entertainment and magazine programs. This practice is often also called live tweeting. Many conferences nowadays also have a hashtag that can be used by the participants to share notes and experiences; furthermore such hashtags can be user generated.
Features
Various platforms and apps are distinguished by their strengths and features in regards to specific functions.
Group messaging
Official channels
Some apps include a feature known as "official channels" which allows companies, especially news media outlets, publications, and other mass media companies, to offer an official channel, which users can join, and thereby receive regular updates, published articles, or news updates from companies or news outlets. Two apps which have a large number of such channels available are Line and Telegram.
Video group calls
Basic default platforms
Basic platforms which are common across entire categories of mobile devices, computers, or operating systems.
SMS
SMS (short message service) is a text messaging service component of most telephone, Internet, and mobile device systems. It uses standardized communication protocols to enable mobile devices to exchange short text messages. An intermediary service can facilitate a text-to-voice conversion to be sent to landlines.
SMS, as used on modern devices, originated from radio telegraphy in radio memo pagers that used standardized phone protocols. These were defined in 1985 as part of the Global System for Mobile Communications (GSM) series of standards. The first test SMS message was sent on December 3, 1992, when Neil Papworth, a test engineer for Sema Group, used a personal computer to send "Merry Christmas" to the phone of colleague Richard Jarvis. It commercially rolled out to many cellular networks that decade. SMS became hugely popular worldwide as a way of text communication. By the end of 2010, SMS was the most widely used data application, with an estimated 3.5 billion active users, or about 80% of all mobile phone subscribers.
The protocols allowed users to send and receive messages of up to 160 characters (when entirely alpha-numeric) to and from GSM mobiles. Although most SMS messages are sent from one mobile phone to another, support for the service has expanded to include other mobile technologies, such as ANSI CDMA networks and Digital AMPS.
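The 160-character figure follows from the 140-octet payload of a single message encoded with the 7-bit GSM default alphabet (140 × 8 / 7 = 160); longer texts are commonly split into concatenated parts of 153 characters each, the difference being taken up by the user-data header. A rough sketch of the resulting arithmetic (the helper name is hypothetical):

```python
import math

GSM7_SINGLE, GSM7_CONCAT = 160, 153   # 140 octets of payload; the UDH costs 6 octets per part

def sms_parts(text_length: int) -> int:
    """Number of SMS parts needed for a purely GSM-7 (alpha-numeric) text."""
    if text_length <= GSM7_SINGLE:
        return 1
    return math.ceil(text_length / GSM7_CONCAT)

# e.g. 161 characters already need 2 parts, and 459 need exactly 3
assert sms_parts(160) == 1 and sms_parts(161) == 2 and sms_parts(459) == 3
```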
Mobile marketing, a type of direct marketing, uses SMS. According to a 2018 market research report the global SMS messaging business was estimated to be worth over US$100 billion, accounting for almost 50 percent of all the revenue generated by mobile messaging.
A Flash SMS is a type of SMS that appears directly on the main screen without user interaction and is not automatically stored in the inbox. It can be useful in emergencies, such as a fire alarm or cases of confidentiality, as in delivering one-time passwords.
Threaded SMS format
Threaded SMS is a visual styling orientation of SMS message history that arranges messages to and from a contact in chronological order on a single screen.
It was invented by a developer working to implement the SMS client for the BlackBerry, who was looking to make use of the blank screen left below the message on a device with a larger screen capable of displaying far more than the usual 160 characters, and who was inspired by threaded reply conversations in email.
Visually, this style of representation provides a back-and-forth, chat-like history for each individual contact. Hierarchical threading at the conversation level (as is typical in blogs and online message boards) is not widely supported by SMS messaging clients. This limitation is due to the fact that no session identifier or subject line is passed back and forth between sent and received messages in the header data (as specified by the SMS protocol) from which the client device could properly thread an incoming message to a specific dialogue, or even to a specific message within a dialogue.
Most smartphone text-messaging clients are able to create some contextual threading of "group messages", which narrows the context of the thread around the common interests shared by group members. On the other hand, advanced enterprise messaging applications that push messages from a remote server often display a dynamically changing reply number (multiple numbers used by the same sender), which is used along with the sender's phone number to create session-tracking capabilities analogous to the functionality that cookies provide for web browsing. As one pervasive example, this technique is used to extend the functionality of many Instant Messenger (IM) applications such that they are able to communicate over two-way dialogues with the much larger SMS user base. In cases where multiple reply numbers are used by the enterprise server to maintain the dialogue, the visual conversation threading on the client may be separated into multiple threads.
Multimedia Messaging Service
Multimedia Messaging Service (MMS) is a standard way to send messages that include multimedia content to and from a mobile phone over a cellular network. Users and providers may refer to such a message as a PXT, a picture message, or a multimedia message. The MMS standard extends the core SMS (Short Message Service) capability, allowing the exchange of text messages greater than 160 characters in length. Unlike text-only SMS, MMS can deliver a variety of media, including up to forty seconds of video, one image, a slideshow of multiple images, or audio.
The first MMS-capable phones were introduced around 2002 in conjunction with the first GSM network. The Sony Ericsson T68i is widely believed to be the first MMS-capable cell phone, while many more hit North American markets beginning in 2004 and 2005.
The most common use involves sending photographs from camera-equipped handsets. Media companies have utilized MMS on a commercial basis as a method of delivering news and entertainment content, and retailers have deployed it as a tool for delivering scannable coupon codes, product images, videos, and other information.
The 3GPP and WAP groups fostered the development of the MMS standard, which is now continued by the Open Mobile Alliance (OMA).
Content adaptation: Multimedia content created by one brand of MMS phone may not be entirely compatible with the capabilities of the recipient's MMS phone. In the MMS architecture, the recipient MMSC is responsible for providing for content adaptation (e.g., image resizing, audio codec transcoding, etc.), if this feature is enabled by the mobile network operator. When content adaptation is supported by a network operator, its MMS subscribers enjoy compatibility with a larger network of MMS users than would otherwise be available.
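As an illustration of the image-resizing case, a rough sketch using a generic imaging library (Pillow) is shown below; it is not based on any particular MMSC implementation, and the size limit is an assumed example value:

```python
from PIL import Image

def adapt_image(path: str, out_path: str, max_size: tuple[int, int] = (640, 480)) -> None:
    """Downscale an image so it fits a recipient's maximum resolution,
    preserving aspect ratio, and re-encode it as baseline JPEG."""
    with Image.open(path) as img:
        img = img.convert("RGB")            # drop alpha / palette modes
        img.thumbnail(max_size)             # in-place resize, keeps aspect ratio
        img.save(out_path, format="JPEG", quality=80)
```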
Rich Communication Services
Rich Communication Services (RCS) is a communication protocol between mobile telephone carriers and between phone and carrier, aiming at replacing SMS messages with a text-message system that is richer, provides phonebook polling (for service discovery), and can transmit in-call multimedia. It is part of broader IP Multimedia Subsystem.
It is also marketed as Advanced Messaging, Chat, joyn, SMSoIP, Message+, and SMS+.
In early 2020, it was estimated that RCS is available from 88 operators throughout 59 countries in the world. There are approximately 390 million users per month and the business is expected to be worth $71 billion by 2021.
Amnesty International researcher Joe Westby criticized RCS for not allowing end-to-end encryption, because it is treated as a service of carriers and thus subject to lawful interception.
The Verge criticized the inconsistent support of RCS in the United States, with carriers not supporting RCS in all markets, not certifying service on all phones, or not yet supporting the Universal Profile. Concerns were shown over Google's decision to run its own RCS service due to the possibility of antitrust scrutiny, but it was acknowledged that Google had to do so in order to bypass the carriers' inconsistent support of RCS, as it wanted to have a service more comparable to Apple's iMessage service available on Android.
Ars Technica also criticized Google's move to launch a direct-to-consumer RCS service, considering it a contradiction of RCS being native to the carrier to provide features reminiscent of messaging apps, counting it as being among various past and unsuccessful attempts by Google to develop an in-house messaging service (including Google Talk, Google+ Messenger, Hangouts, and Allo), and noting limitations such as its dependencies on phone numbers as the identity, not being capable of being readily synchronized between multiple devices, and the aforementioned lack of end-to-end encryption. In November 2020, Google announced that it would begin to introduce end-to-end encryption in beta.
Internet Relay Chat
Internet Relay Chat (IRC) is an application layer protocol that facilitates communication in the form of text. The chat process works on a client/server networking model. IRC clients are computer programs that users can install on their system or web based applications running either locally in the browser or on a third party server. These clients communicate with chat servers to transfer messages to other clients. IRC is mainly designed for group communication in discussion forums, called channels, but also allows one-on-one communication via private messages as well as chat and data transfer, including file sharing.
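Since the protocol is plain text lines over TCP, a minimal client fits in a few lines. The sketch below (server, nickname and channel are placeholders) registers, joins a channel, sends one message and answers the server's PING keep-alives; real clients add TLS, error handling and full message parsing:

```python
import socket

def irc_session(server: str = "irc.example.net", port: int = 6667,
                nick: str = "demo_nick", channel: str = "#demo") -> None:
    """Bare-bones IRC client: register, join a channel, echo traffic."""
    sock = socket.create_connection((server, port))

    def send(line: str) -> None:
        sock.sendall((line + "\r\n").encode("utf-8"))

    send(f"NICK {nick}")
    send(f"USER {nick} 0 * :Demo user")       # username, mode, unused, real name
    send(f"JOIN {channel}")
    send(f"PRIVMSG {channel} :hello from a minimal client")

    buffer = b""
    while True:
        data = sock.recv(4096)
        if not data:
            break
        buffer += data
        while b"\r\n" in buffer:
            raw, buffer = buffer.split(b"\r\n", 1)
            line = raw.decode("utf-8", errors="replace")
            if line.startswith("PING"):
                send("PONG" + line[4:])        # echo the server's token back
            else:
                print(line)
```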
Client software is available for every major operating system that supports Internet access. As of April 2011, the top 100 IRC networks served more than half a million users at a time, with hundreds of thousands of channels operating on a total of roughly 1,500 servers out of roughly 3,200 servers worldwide. IRC usage has been declining steadily since 2003, losing 60% of its users (from 1 million to about 400,000 in 2012) and half of its channels (from half a million in 2003).
Modern IRC
IRC has changed much over its life on the Internet. New server software has added a multitude of new features.
Services: Network-operated bots to facilitate registration of nicknames and channels, sending messages for offline users and network operator functions.
Extra modes: While the original IRC system used a set of standard user and channel modes, new servers add many new modes for features such as removing color codes from text, or obscuring a user's hostmask ("cloaking") to protect from denial-of-service attacks.
Proxy detection: Most modern servers support detection of users attempting to connect through an insecure (misconfigured or exploited) proxy server, which can then be denied a connection. This proxy detection software is used by several networks, although the real-time list of proxies it relied on has been defunct since early 2006.
Additional commands: New commands range from shorthand commands for issuing instructions to Services, to network-operator-only commands for manipulating a user's hostmask.
Encryption: For the client-to-server leg of the connection TLS might be used (messages cease to be secure once they are relayed to other users on standard connections, but it makes eavesdropping on or wiretapping an individual's IRC sessions difficult). For client-to-client communication, SDCC (Secure DCC) can be used.
Connection protocol: IRC can be connected to via IPv4, the old version of the Internet Protocol, or by IPv6, the current standard of the protocol.
A new standardization effort is under way under a working group called IRCv3, which focuses on more advanced client features like instant notifications, better history support and improved security. So far, no major IRC networks have fully adopted the proposed standard.
After its golden era during the 1990s and early 2000s (240,000 users on QuakeNet in 2004), IRC has seen a significant decline, losing around 60% of users between 2003 and 2012, with users moving to newer social media platforms like Facebook or Twitter, but also to open platforms like XMPP, which was developed in 1999. Certain networks like Freenode have not followed the overall trend and have more than quadrupled in size during the same period. As of 2016, Freenode is the largest IRC network, with around 90,000 users.
The largest IRC networks have traditionally been grouped as the "Big Four"—a designation for networks that top the statistics. The Big Four networks change periodically, but due to the community nature of IRC there are a large number of other networks for users to choose from.
Historically the "Big Four" were:
EFnet
IRCnet
Undernet
DALnet
IRC reached 6 million simultaneous users in 2001 and 10 million users in 2003, dropping to 371k in 2018.
The largest IRC networks are:
freenode – around 90k users at peak hours
IRCnet – around 30k users at peak hours
EFnet – around 18k users at peak hours
Undernet – around 17k users at peak hours
QuakeNet – around 15k users at peak hours
Rizon – around 14k users at peak hours
OFTC – around 13k users at peak hours
DALnet – around 8k users at peak hours
Today, the top 100 IRC networks have around 370k users connected at peak hours.
XMPP
Extensible Messaging and Presence Protocol (XMPP) is a communication protocol for message-oriented middleware based on XML (Extensible Markup Language). It enables the near-real-time exchange of structured yet extensible data between any two or more network entities. Originally named Jabber, the protocol was developed by the eponymous open-source community in 1999 for near real-time instant messaging (IM), presence information, and contact list maintenance. Designed to be extensible, the protocol has been used also for publish-subscribe systems, signalling for VoIP, video, file transfer, gaming, the Internet of Things (IoT) applications such as the smart grid, and social networking services.
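Because XMPP payloads are XML stanzas, even a basic chat message has a well-defined, extensible structure. The sketch below constructs (but does not send) such a stanza in Python purely for illustration; the addresses and message id are placeholder values.

```python
# Builds a basic XMPP <message> stanza to show the XML structure the
# protocol exchanges; the JIDs (addresses) below are placeholders.
import xml.etree.ElementTree as ET

def build_chat_message(sender: str, recipient: str, text: str) -> str:
    # On a client-to-server stream, stanzas inherit the 'jabber:client' namespace.
    msg = ET.Element("message", {
        "from": sender,
        "to": recipient,
        "type": "chat",
        "id": "msg-1",   # illustrative id
    })
    body = ET.SubElement(msg, "body")
    body.text = text
    return ET.tostring(msg, encoding="unicode")

print(build_chat_message("alice@example.org", "bob@example.net",
                         "Hello over XMPP"))
# Produces, roughly:
# <message from="alice@example.org" to="bob@example.net" type="chat" id="msg-1">
#   <body>Hello over XMPP</body>
# </message>
```

In practice a client library would open an XML stream to the server, authenticate, and push stanzas like this one over it; extensions simply add further namespaced child elements to the same stanza types.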
Unlike most instant messaging protocols, XMPP is defined in an open standard and uses an open systems approach of development and application, by which anyone may implement an XMPP service and interoperate with other organizations' implementations. Because XMPP is an open protocol, implementations can be developed using any software license and many server, client, and library implementations are distributed as free and open-source software. Numerous freeware and commercial software implementations also exist.
The Internet Engineering Task Force (IETF) formed an XMPP working group in 2002 to formalize the core protocols as an IETF instant messaging and presence technology. The XMPP Working group produced four specifications (RFC 3920, RFC 3921, RFC 3922, RFC 3923), which were approved as Proposed Standards in 2004. In 2011, RFC 3920 and RFC 3921 were superseded by RFC 6120 and RFC 6121 respectively, with RFC 6122 specifying the XMPP address format. In 2015, RFC 6122 was superseded by RFC 7622. In addition to these core protocols standardized at the IETF, the XMPP Standards Foundation (formerly the Jabber Software Foundation) is active in developing open XMPP extensions.
XMPP-based software is deployed widely across the Internet, and by 2003, was used by over ten million people worldwide, according to the XMPP Standards Foundation.
SMS texting apps
Below are apps that are used for texting via SMS. Generally, these apps offer various features for expanded messaging, or group texts; however, all messages are received by others as regular SMS text messages.
Textfree
Textfree (formerly Pinger) is an application made by Pinger that allows users to text and call over the internet, either for free or for a fee. The application runs on iOS, Android, Microsoft Windows and Macintosh devices. Competitors include GOGII, Optini and WhatsApp.
Stand-alone messaging platforms
Below are stand-alone apps that are generally focused upon instant messaging as their core feature; however, almost all of these also include numerous distinct additional features such as group chats, video calls, emojis, etc.
These apps do not use SMS messaging; rather, users of this app receive messages through the app interface, not through SMS texting.
Tango
Tango is a third-party, cross-platform messaging application for smartphones developed by TangoME, Inc. in 2009. The app is free and was one of the first providers of video calls, voice calls, texting, photo sharing, and games on a 3G network.
As of 2018, Tango has more than 400 million registered users. It was rated by PCMag as "the simplest mobile chat application out there, with a good range of support."
In 2017, Tango entered the live-streaming space, and has become a B2C platform for Live Video Broadcasts. Combining high-quality video streaming, a live messaging chat and a digital economy, Tango is a social community that allows content creators to share their talents and monetize their fans and followers.
Tango is available in many languages including Russian, Spanish, Turkish, Hindi and Vietnamese.
WhatsApp
WhatsApp provides the following features, as detailed below.
Group threads: up to 250 members
Groups and channels: no built-in search function to find official groups and channels. Anyone can join a group if they have the link.
Video calls: up to 3 members.
WhatsApp is an American freeware, cross-platform messaging and Voice over IP (VoIP) service owned by Facebook, Inc. It allows users to send text messages and voice messages, make voice and video calls, and share images, documents, user locations, and other media. WhatsApp's client application runs on mobile devices but is also accessible from desktop computers, as long as the user's mobile device remains connected to the Internet while they use the desktop app. The service requires users to provide a standard cellular mobile number for registering with the service. In January 2018, WhatsApp released a standalone business app targeted at small business owners, called WhatsApp Business, to allow companies to communicate with customers who use the standard WhatsApp client.
The client application was created by WhatsApp Inc. of Mountain View, California, which was acquired by Facebook in February 2014 for approximately US$19.3 billion. It became the world's most popular messaging application by 2015, and has over 2 billion users worldwide. It has become the primary means of electronic communication in multiple countries and locations, including Latin America, the Indian subcontinent, and large parts of Europe and Africa.
Telegram
Telegram provides the following features, as detailed below.
Group threads: up to 200,000 members
Groups and channels: provides numerous official channels for various organizations. Has an internal search feature to enable searches to find various official outlets.
Telegram is a cross-platform cloud-based instant messaging, video calling, and VoIP service. It was initially launched for iOS on 14 August 2013 in Russia, and is currently based in Dubai. Telegram client apps are available for Android, iOS, Windows Phone, Windows, macOS and Linux; a web interface is also available. As of April 2020, Telegram had reached 400 million monthly active users.
Telegram provides end-to-end encrypted calls and optional end-to-end encrypted "secret" chats between two online users on smartphone clients, whereas cloud chats use client-server/ server-client encryption.
Users can send text and voice messages, animated stickers, make voice and video calls, and share an unlimited number of images, documents (2 GB per file), user locations, contacts, music, links, etc.
In March 2017, Telegram introduced its own voice calls. According to Telegram, a neural network learns various technical parameters about a call in order to provide better quality of service for future uses. After a brief initial trial in Western Europe, voice calls are now available for use in most countries.
Telegram announced in April 2020 that they would include group video calls by the end of the year. On 15 August 2020, Telegram added video calling with end-to-end encryption like Signal and WhatsApp, which Zoom does not have yet. Currently offering one-to-one video calls, Telegram has plans to introduce secure group video calls later in 2020. Picture-in-picture mode is also available so that users have the option to simultaneously use the other functions of the app while still remaining on the call and are even able to turn their video off.
Telegram's video and voice calls are secure and end-to-end encrypted.
Google Voice
Google Voice is a telephone service that provides call forwarding and voicemail services, as well as voice and text messaging.
Google Voice provides a U.S. telephone number, chosen by the user from available numbers in selected area codes, free of charge to each user account. Calls to this number are forwarded to telephone numbers that each user must configure in the account web portal. Multiple destinations may be specified that ring simultaneously for incoming calls. Service establishment requires a United States telephone number. A user may answer and receive calls on any of the ringing phones as configured in the web portal. During a received call the user may switch between the configured telephones.
Users may place outbound calls to domestic and international destinations. Calls may be initiated from any of the configured telephones, as well as from a mobile device app, or from the account portal. As of August 2011, users in many other countries also may place outbound calls from the web-based application to domestic and international phone numbers.
Many other Google Voice services—such as voicemail, free text messaging, call history, call screening, blocking of unwanted calls, and voice transcription to text of voicemail messages—are also available. In terms of product integration, transcribed and audio voicemails, missed call notifications, and/or text messages can optionally be forwarded to an email account of the user's choice. Additionally, text messages can be sent and received via the familiar email or IM interface by reading and writing text messages to numbers in Google Talk respectively (PC-to-Phone texting). Google Voice multi-way videoconferencing (with support for document sharing) is now integrated with Google Hangouts.
The service is configured and maintained by the user in a web-based application, styled after Google's e-mail service, Gmail, or with Android and iOS apps on smart phones or tablets. Google Voice provides free PC-to-phone calling within the United States and Canada, and PC-to-PC voice and video calling worldwide between users of the Google+ Hangouts browser plugin (available for Windows, Intel-based Mac OS X, and Linux).
GroupMe
GroupMe works by downloading the app or accessing the service online, and then creating an account by providing a name, cell phone number and password, or by connecting through a Facebook or Twitter account. The service then syncs with the user's contacts, and from that point forward the user can make groups, limited to 500 members. An individual who is part of an active group has the ability to turn off notifications for the app; users will still receive the message, but will not be notified about it. Each group is given a label and assigned a unique number. Some of the features of the app include the ability to share photos, videos, locations, create events, and emojis from various packs.
GroupMe has a web client as well as apps for iOS, Android, Windows Phone, and Windows 10.
Those who do not wish to use the app can still send and receive GroupMe messages through SMS (only available in the United States).
Users begin by creating a “group” and adding contacts. When someone sends a message, everyone in the group can see and respond to it. The app allows users to easily attach and send pictures, documents, videos and web-links as well. Users can also send private messages, but only to users who are also active on the GroupMe app.
GroupMe has been used as a means for studying the usage of messaging clients in educational settings. Use cases include facilitating online course discussions, small group work, and other course communications for both in-person and online sections. Though unconventional, using GroupMe to facilitate discussion in an environment where students already interact has been found to encourage rhetorical thinking and overall engagement. Researchers have found alternatives for literacy learning as a "legitimate academic genre", given a student population that communicates in a variety of modes. Research around GroupMe furthers the argument that computer-mediated communication is a valuable space for learning in an increasingly globalized society.
Hike Messenger
Hike Messenger, also called Hike Sticker Chat, is an Indian freeware, cross-platform instant messaging (IM) and Voice over IP (VoIP) application which was launched on 12 December 2012 by Kavin Bharti Mittal and is now owned by Hike Private Limited. Hike can work offline through SMS and has multi-platform support. The app registration uses a standard one-time password (OTP) based authentication process. With an abundance of low-cost data, Hike decided to go from a single super-app strategy to a multiple-app approach, so that it can focus more on the core messaging capabilities. It has numerous Hikemoji stickers which can be customized accordingly. From version 6, the user interface was revised and the app no longer supports features like news, mobile payment, games or jokes. According to CB Insights, Hike is valued at $1.4 billion, with more than 100 million registered users as of August 2016 and 350 employees working from Bengaluru and Delhi.
KakaoTalk
KakaoTalk, commonly referred to as "KaTalk" in South Korea, is a free mobile instant messaging application for smartphones with free text and free call features, operated by Kakao Corporation. It was launched on March 18, 2010 and is currently available on iOS, Android, Bada OS, BlackBerry, Windows Phone, Nokia Asha, Windows and macOS.
As of May 2017, KakaoTalk had 220 million registered and 49 million monthly active users. It is available in 15 languages. The app is also used by 93% of smartphone owners in South Korea, where it is the number one messaging app.
In addition to free calls and messages, users can share photos, videos, voice messages, location, URL links as well as contact information. Both one-on-one and group chats are available over WiFi, 3G or LTE, and there are no limits to the number of people on a group chat.
Airlines such as Southwest that allow free in-flight WhatsApp messaging also support KakaoTalk, even though their literature does not mention it.
The app automatically synchronizes the user’s contact list on their smartphones with the contact list on the app to find friends who are on the service. Users can also search for friends by KakaoTalk ID without having to know their phone numbers. The KakaoTalk service also allows its users to export their messages and save them.
KakaoTalk began as a messenger service but has become a platform for the distribution of various third-party content and apps, including hundreds of games, which users can download and play with their friends through the messaging platform. Through the "Plus Friend" feature, users can follow brands, media and celebrities to receive exclusive messages, coupons and other real-time information through KakaoTalk chatrooms. Users can also purchase real-life goods through the messenger's "Gifting" platform.
Besides those listed above, the app has these additional features:
VoiceTalk, free calls and conference calls (with support for up to five people)
Photo, video, location, and contact information sharing
Polling and scheduling feature for members in the chatroom
K-pop & Local Star Friends (Plus Friends)
Walkie-talkie
Customizable themes (for iOS and Android)
Game platform
Stickers and animated emoticons
Plus Mate: users can add their favorite brand, star, or media outlet as a friend to receive a variety of content and benefits.
Kik Messenger
Kik Messenger, commonly called Kik, is a freeware instant messaging mobile app from the Canadian company Kik Interactive, available free of charge on iOS and Android operating systems. It uses a smartphone's data plan or Wi-Fi to transmit and receive messages, photos, videos, sketches, mobile web pages, and other content after users register a username. Kik is known for its features preserving users' anonymity, such as allowing users to register without the need to provide a telephone number or valid email address. However, the application does not employ end-to-end encryption, and the company also logs user IP addresses, which could be used to determine the user's ISP and approximate location. This information, as well as "reported" conversations are regularly surrendered upon request by law enforcement organizations, sometimes without the need for a court order.
Kik was originally intended to be a music-sharing app before transitioning to messaging, briefly offering users the ability to send a limited number of SMS text messages directly from the application. During the first 15 days after Kik's re-release as a messaging app, over 1 million accounts were created. In May 2016, Kik Messenger announced that they had approximately 300 million registered users, and was used by approximately 40% of United States' teenagers.
Kik Messenger announced in October 2019 they had signed a letter of intent with MediaLab AI, followed by the announcement Kik Interactive would be reducing their staff from 100 to just 19. MediaLab owns several mobile apps, most notably Whisper.
A main attraction of Kik that differentiates it from other messaging apps is its anonymity. To register for the Kik service, a user must enter a first and last name, e-mail address, and birth date (which must show that the user is at least 13 years old), and select a username. The Kik registration process does not request or require the entry of a phone number (although the user has the option to enter one), unlike some other messaging services that require a user to provide a functioning mobile phone number.
The New York Times has reported that, according to law enforcement, Kik's anonymity features go beyond those of most widely used apps. As of February 2016, Kik's guide for law enforcement said that the company cannot locate user accounts based on first and last name, e-mail address and/or birth date; the exact username is required to locate a particular account. The guide further said that the company does not have access to content or "historical user data" such as photographs, videos, and the text of conversations, and that photographs and videos are automatically deleted shortly after they are sent. A limited amount of data from a particular account (identified by exact username), including first and last name, birthdate, e-mail address, link to a current profile picture, device-related information, and user location information such as the most recently used IP address, can be preserved for a period of 90 days pending receipt of a valid order from law enforcement. Kik's anonymity has also been cited as a protective safety measure for good faith users, in that "users have screennames; the app doesn't share phone numbers or email addresses."
Kik introduced several new user features in 2015, including a full-screen in-chat browser that allows users to find and share content from the web; a feature allowing users to send previously recorded videos in Kik Messenger for Android and iOS; and "Kik Codes", which assigns each user a unique code similar to a QR code, making it easier to connect and chat with other users. Kik joined the Virtual Global Taskforce, a global anti-child-abuse organization, in March 2015. Kik began using Microsoft's PhotoDNA in March 2015 to premoderate images added by users. That same month, Kik released native video capture allowing users to record up to 15 seconds in the chat window. In October 2015, Kik partnered with the Ad Council as part of an anti-bullying campaign. The campaign was featured on the app and Kik released stickers in collaboration with the campaign. Kik released a feature to send GIFs as emojis in November 2015. Kik added SafePhoto to its safety features in October 2016 which "detects, reports, and deletes known child exploitation images" sent through the platform. Kik partnered with ConnectSafely in 2016 to produce a "parents handbook" and joined The Technology Coalition, an anti-sexual exploitation group including Facebook, Google, Twitter and LinkedIn.
Line
Line (styled in all caps as LINE) is a freeware app for instant communications on electronic devices such as smartphones, tablet computers, and personal computers. Line users exchange texts, images, video and audio, and conduct free VoIP conversations and video conferences. In addition, Line is a platform providing various services including digital wallet as Line Pay, news stream as Line Today, video on demand as Line TV, and digital comic distribution as Line Manga and Line Webtoon. The service is operated by Line Corporation, a Tokyo-based subsidiary of South Korean internet search engine company Naver Corporation.
Line is an application that works on multiple platforms and has access via multiple personal computers (Windows or macOS). The application also offers address book syncing, and friends can be added through the use of QR codes, by Line ID, or by shaking phones simultaneously. The application has a direct pop-out message box for reading and replying, making it easy for users to communicate. It can also share photos, videos and music with other users, send the current or any specific location, voice audio, emojis, stickers and emoticons to friends. Users can see a real-time confirmation when messages are sent and received, or use a hidden chat feature, which can hide and delete a chat history (from both involved devices and Line servers) after a time set by the user. The application also makes free voice and video calls.
Users can also chat and share media in a group by creating and joining groups that have up to 500 people. Chats also provide bulletin boards on which users can post, like, and comment. The application also has timeline and homepage features that allow users to post pictures, text and stickers on their homepages. Users can change their Line theme to free themes provided in the theme shop, or buy themes featuring other popular cartoon characters. Line also has a feature, called a Snap movie, that users can use to record a stop-motion video and add in provided background music.
In January 2015, Line Taxi was released in Tokyo as a competitor to Uber. Line launched a new Android app called "Popcorn Buzz" in June 2015. The app facilitates group calls with up to 200 members. In June a new emoji keyboard was also released for iOS devices, which provides a Line-like experience with the possibility to add stickers. In September 2015 a new Android launcher was released on the Play Store, helping the company to promote its own services through the new user interface.
Signal
Signal is a cross-platform encrypted messaging service developed by the Signal Foundation and Signal Messenger LLC. It uses the Internet to send one-to-one and group messages, which can include files, voice notes, images and videos. It can also be used to make one-to-one voice and video calls, and the Android version can optionally function as an SMS app.
Signal uses standard cellular telephone numbers as identifiers and secures all communications to other Signal users with end-to-end encryption. The apps include mechanisms by which users can independently verify the identity of their contacts and the integrity of the data channel.
Snapchat
Snapchat sends messages referred to as "snaps"; snaps can consist of a photo or a short video, and can be edited to include filters and effects, text captions, and drawings. Snaps can be directed privately to selected contacts, or to a semi-public "Story" or a public "Story" called "Our Story." The ability to send video snaps was added as a feature option in December 2012. By holding down on the photo button while inside the app, a video of up to ten seconds in length can be captured. Spiegel explained that this process allowed the video data to be compressed into the size of a photo. A later update added the ability to record up to 60 seconds, although videos are still segmented into 10-second intervals. After a single viewing, the video disappears by default. On May 1, 2014, the ability to communicate via video chat was added. Direct messaging features were also included in the update, allowing users to send ephemeral text messages to friends and family while saving any needed information by clicking on it. According to CIO, Snapchat uses real-time marketing concepts and temporality to make the app appealing to users. According to Marketing Pro, Snapchat attracts interest and potential customers by combining the AIDA marketing model with modern digital technology.
Private message photo snaps can be viewed for a user-specified length of time (1 to 10 seconds, as determined by the sender) before they become inaccessible. Users were previously required to hold down on the screen in order to view a snap; this behavior was removed in July 2015. The requirement to hold on the screen was intended to frustrate the ability to take screenshots of snaps; the Snapchat app does not prevent screenshots from being taken, but can notify the sender if it detects that one has been saved. However, these notifications can be bypassed either through unauthorized modifications to the app or by obtaining the image through external means. One snap per day can be replayed for free. In September 2015, Snapchat introduced the option to purchase additional replays through in-app purchases. The ability to purchase extra replays was removed in April 2016.
Friends can be added via usernames and phone contacts, using customizable "Snapcodes," or through the "Add Nearby" function, which scans for users near their location who are also in the Add Nearby menu. Spiegel explained that Snapchat is intended to counteract the trend of users being compelled to manage an idealized online identity of themselves, which he says has "taken all of the fun out of communicating."
Viber
Rakuten Viber, or simply Viber, is a cross-platform voice over IP (VoIP) and instant messaging (IM) software application operated by Japanese multinational company Rakuten, provided as freeware for the Android, iOS, Microsoft Windows, macOS and Linux platforms. Users are registered and identified through a cellular telephone number, although the service is accessible on desktop platforms without needing mobile connectivity. In addition to instant messaging it allows users to exchange media such as images and video records, and also provides a paid international landline and mobile calling service called Viber Out. As of 2018, there are over a billion registered users on the network.
Wonder video chat
Wonder is a shared video chat service that uses a virtual space in which users can move between virtual rooms and initiate conversations either with a large group or within a spontaneous "circle." The chat platform is entirely browser-based and does not require the use of any specific app.
WeChat
WeChat is a Chinese multi-purpose messaging, social media and mobile payment app developed by Tencent. First released in 2011, it became the world's largest standalone mobile app in 2018, with over 1 billion monthly active users. WeChat has been described as China's "app for everything" and a "super app" because of its wide range of functions.
WeChat provides text messaging, hold-to-talk voice messaging, broadcast (one-to-many) messaging, video calls and conferencing, video games, photograph and video sharing, as well as location sharing. WeChat also allows users to exchange contacts with people nearby via Bluetooth, as well as providing various features for contacting people at random if desired (if people are open to it). It can also integrate with other social networking services such as Facebook and Tencent QQ. Photographs may also be embellished with filters and captions, and automatic translation service is available.
WeChat supports different instant messaging methods, including text message, voice message, walkie talkie, and stickers. Users can send previously saved or live pictures and videos, profiles of other users, coupons, lucky money packages, or current GPS locations with friends either individually or in a group chat. WeChat's character stickers, such as Tuzki, resemble and compete with those of LINE, a Japanese-South Korean messaging application.
WeChat users can register as a public account, which enables them to push feeds to subscribers, interact with subscribers and provide them with services. Users can also create an official account, which falls under the service, subscription, or enterprise account types. Once users as individuals or organizations set up a type of account, they cannot change it to another type. By the end of 2014, the number of WeChat official accounts had reached 8 million. Official accounts of organizations can apply to be verified (at a cost of 300 RMB, or about US$45). Official accounts can be used as a platform for services such as hospital pre-registrations, visa renewal or credit card services. To create an official account, the applicant must register with Chinese authorities, which discourages "foreign companies".
"Moments" () is WeChat's brand name for its social feed of friends' updates. "Moments" is an interactive platform that allows users to post images, text, and short videos taken by users. It also allows users to share articles and music (associated with QQ Music or other web-based music services). Friends in the contact list can give thumbs up to the content and leave comments. Moments can be linked to Facebook and Twitter accounts, and can automatically post Moments content directly on these two platforms.
In 2017 WeChat had a policy of a maximum of two advertisements per day per Moments user.
Platforms for combining multiple apps
Platforms specifically designed to combine messages from multiple other mobile apps.
Trillian
Trillian is a proprietary multiprotocol instant messaging application created by Cerulean Studios. It is currently available for Microsoft Windows, Mac OS X, Linux, Android, iOS, BlackBerry OS, and the Web. It can connect to multiple IM services, such as AIM, Bonjour, Facebook Messenger, Google Talk (Hangouts), IRC, XMPP (Jabber), VZ, and Yahoo! Messenger networks; as well as social networking sites, such as Facebook, Foursquare, LinkedIn, and Twitter; and email services, such as POP3 and IMAP.
Trillian no longer supports Windows Live Messenger or Skype as these services have combined and Microsoft chose to discontinue Skypekit. They also no longer support connecting to MySpace, and no longer support a distinct connection for Gmail, Hotmail or Yahoo! Mail although these can still be connected to via POP3 or IMAP. Currently, Trillian supports Facebook, Google, Jabber (XMPP), and Olark.
Initially released July 1, 2000, as a freeware IRC client, the first commercial version (Trillian Pro 1.0) was published on September 10, 2002. The program was named after Trillian, a fictional character in The Hitchhiker's Guide to the Galaxy by Douglas Adams. A previous version of the official web site even had a tribute to Douglas Adams on its front page. On August 14, 2009, Trillian "Astra" (4.0) for Windows was released, along with its own Astra network. Trillian 5 for Windows was released in May 2011, and Trillian 6.0 was initially released in February 2017.
Trillian connects to multiple instant messaging services without the need of running multiple clients. Users can create multiple connections to the same service, and can also group connections under separate identities to prevent confusion. All contacts are gathered under the same contact list. Contacts are not bound to their own IM service groups, and can be dragged and dropped freely.
Trillian represents each service with a different-colored sphere. Prior versions used the corporate logos for each service, but these were removed to avoid copyright issues, although some skins still use the original icons. The Trillian designers chose a color-coding scheme based on the underground maps used by the London Underground that uses different colors to differentiate between different lines.
Platforms for specific operating systems
Empathy
Empathy is an instant messaging (IM) and voice over IP (VoIP) client which supports text, voice, video, file transfers, and inter-application communication over various IM communication protocols. It is specifically designed for BSD, Linux, and other Unix-like operating systems. It was initially completely XMPP-based (similar to Google Talk and Facebook's chat implementations), but demand for the Telepathy stack led to a fork under the new name Empathy.
Empathy also provides a collection of reusable graphical user interface widgets for developing instant messaging clients for the GNOME desktop. It is written as extension to the Telepathy framework, for connecting to different instant messaging networks with a unified user interface.
Empathy has been included in the GNOME desktop since its version 2.24, in Ubuntu since version 9.10 (Karmic Koala), and in Fedora since version 12 (Constantine); Empathy has replaced Pidgin as their default messenger application.
Messages for macOS
Messages (Apple) is an instant messaging software application developed by Apple Inc. for its macOS, iOS, iPadOS, and watchOS operating systems.
The mobile version of Messages on iOS used on iPhone and iPad also supports SMS and MMS due to replacing the older text messaging Text app since iPhone OS 3. Users can tell the difference between a message sent via SMS and one sent over iMessage as the bubbles will appear either green (SMS) or blue (iMessage).
The desktop Messages application replaced iChat as the native OS X instant messaging client with the release of OS X Mountain Lion in July 2012. While it inherits the majority of iChat's features, Messages also brings support for iMessage, Apple's messaging service for iOS, as well as FaceTime integration.
Messages was announced for OS X as a beta application on February 16, 2012 for Macs running Mac OS X 10.7 "Lion". The stable release of Messages was released on July 25, 2012 with OS X Mountain Lion, replacing iChat. In addition to supporting Apple's new iMessage protocol, Messages retained its support for AIM, Yahoo Messenger, Google Talk and Jabber.
Messages utilizes the newly added Notification Center to notify users of incoming messages. The introduction of a new Share button in applications like Safari, Finder and Preview gave users the ability to share links to webpages, photos, and files. Messages also supported dragging and dropping files and photos for sharing. It also supports video calling through Apple's FaceTime and the third-party IM services it supports. With the release of OS X Mountain Lion 10.8.2, Messages gained the ability to send and receive iMessages using an iPhone phone number.
Messages received a major redesign in OS X Yosemite, following the flat design aesthetic introduced in iOS 7. As a part of the new Continuity feature, users can send and receive SMS and MMS messages through paired iPhones running iOS 8 or later.
Social networking mobile apps
A social networking service (also social networking site or social media) is an online platform which people use to build social networks or social relationships with other people who share similar personal or career interests, activities, backgrounds or real-life connections.
Social networking services vary in format and the number of features. They can incorporate a range of new information and communication tools, operating on desktops, laptops, and mobile devices such as tablet computers and smartphones. They may feature digital photo/video sharing and diary entries online (blogging). Online community services are sometimes considered social-network services by developers and users, though in a broader sense, a social-network service usually provides an individual-centered service whereas online community services are group-centered. Defined as "websites that facilitate the building of a network of contacts in order to exchange various types of content online," social networking sites provide a space for interaction to continue beyond in-person interactions. These computer-mediated interactions link members of various networks and may help to both maintain and develop new social and professional relationships.
Social networking sites allow users to share ideas, digital photos and videos, posts, and to inform others about online or real-world activities and events with people in their network. While in-person social networking – such as gathering in a village market to talk about events – has existed since the earliest development of towns, the web enables people to connect with others who live in different locations, ranging from across a city to across the world. Depending on the social media platform, members may be able to contact any other member. In other cases, members can contact anyone they have a connection to, and subsequently anyone that contact has a connection to, and so on. The success of social networking services can be seen in their dominance in society today, with Facebook having 2.13 billion monthly active users and an average of 1.4 billion daily active users in 2017. LinkedIn, a career-oriented social-networking service, generally requires that a member personally know another member in real life before they contact them online. Some services require members to have a preexisting connection to contact other members.
MeWe
MeWe is an American alt-tech social media and social networking service owned by Sgrouples, a company based in Culver City, California. MeWe's light approach to content moderation has made it popular among conspiracy theorists, particularly the anti-vaccine movement, as well as American conservatives. The site's interface has been described as similar to that of Facebook, though the company describes MeWe as the "anti-Facebook" due to its focus on data privacy.
By 2015, as MeWe neared the end of its beta testing cycle, the press called MeWe's software "not dissimilar to Facebook". Mashable described MeWe as replicating Facebook's features in 2020.
The MeWe site and application has features common to most social media and social networking sites: users can post text and images to a feed, react to others' posts using emoji, post animated GIFs, create specialized groups, post disappearing content, and chat.
Online chat may occur between two or more people or among members of a group. Person-to-person online chat is similar to that in most other social media and social networking sites, and supports text, video calling, and voice calling. "Secret Chat" is limited to the paid subscription tier of MeWe, and uses double ratchet encryption to ensure that chats are private and not visible even to MeWe employees.
MeWe reported in June 2018 that the site had 90,000 active groups, 60,000 of which were "public" and open to all users. Following the influx of Hong Kong users in 2020, MeWe CEO Weinstein announced that the website would provide a Traditional Chinese language version by the end of the year.
User base and content
Although MeWe has not intentionally positioned itself as a social network for conservatives, Mashable noted in November 2020 that its active userbase trends conservative. The platform's choice not to moderate misinformation has attracted conservatives who felt mainstream social networks were censoring their posts, as well as those who have been banned from those platforms.
MeetMe
The Meet Group (formerly MeetMe) owns several mobile social networking services including MeetMe, hi5, LOVOO, Growlr, Skout, and Tagged.
The company has millions of mobile daily active users. Its mobile apps are available on iOS and Android in multiple languages. Through these apps, users can stream live video, send gifts, chat, and share photos. The Meet Group derives revenue from in-app purchases, subscriptions, and advertising. The company has offices in New Hope, Pennsylvania, Philadelphia, San Francisco, Dresden, and Berlin.
The Meet Group has transformed its business from being a predominantly advertising model to now generating the majority of revenue from user pay sources, which include subscriptions and in-app purchases for virtual gifts as part of its video live-streaming product. The company also derives revenue from advertising. In the second quarter of 2018, 60% of revenue was derived from user-pay, versus 26% in the second quarter of 2017. Livestreaming video revenue has become an increasingly important component of revenue and growth, and the product has been rolled out to all of the Company's main apps.
myYearbook (MeetMe's predecessor) derives its revenue from three sources: advertising, virtual-currency sales, and monthly subscriptions. Advertising makes up two-thirds of its revenue, with the other sources making up the rest. It has established sales offices in New York City and Los Angeles.
Nextdoor
Nextdoor is a hyperlocal social networking service for neighborhoods. The company was founded in 2008 and is based in San Francisco, California. Nextdoor launched in the United States in October 2011, and is currently available in 11 countries. Users of Nextdoor are required to submit their real names and addresses (or street without the exact number) to the website; posts made to the website are available only to other Nextdoor members living in the same neighborhood.
Typical platform uses include neighbors reporting on news and events in their "neighborhood" and members asking each other for local service-provider recommendations. "Neighborhood" borders were initially established with Maponics, a provider of geographical information. According to the platform's rules, members whose addresses fall outside the boundaries of existing neighborhoods can establish their own neighborhoods. "Founding" members of neighborhoods determine the name of the neighborhood and its boundaries, although Nextdoor retains the authority to change either of these. A member must attract a minimum of 10 households to establish a new "neighborhood", as of November 2016.
While allowing for "civil debate", the platform prohibits canvassing for votes on forums. The service does however allow separate forums just for political discussions. According to The New York Times, these discussions are "separated from [a user's regular] neighborhood feeds". The company had established these separate forums in 12 markets by 2018. The company has stated it "has no plans" to accept political advertising.
Special-use platforms
U-Report
U-Report is a social messaging tool and data collection system developed by UNICEF to improve citizen engagement, inform leaders, and foster positive change. The program sends SMS polls and alerts to its participants, collecting real-time responses, and subsequently publishes gathered data. Issues polled include health, education, water, sanitation and hygiene, youth unemployment, HIV/AIDS, and disease outbreaks. The program currently has three million participants in forty-one countries.
Platforms that are internal features within major websites
Facebook
Facebook Messenger is an instant messaging service and software application. It began as Facebook Chat in 2008, was revamped in 2010 and eventually became a standalone mobile app in August 2011, while remaining part of the user page on browsers.
Complementing regular conversations, Messenger lets users make one-to-one and group voice and video calls. Its Android app has integrated support for SMS and "Chat Heads", which are round profile photo icons appearing on-screen regardless of what app is open, while both apps support multiple accounts, conversations with optional end-to-end encryption and "Instant Games". Some features, including sending money and requesting transportation, are limited to the United States. In 2017, Facebook added "Messenger Day", a feature that lets users share photos and videos in a story-format with all their friends with the content disappearing after 24 hours; Reactions, which lets users tap and hold a message to add a reaction through an emoji; and Mentions, which lets users in group conversations type @ to give a particular user a notification.
Businesses and users can interact through Messenger with features such as tracking purchases and receiving notifications, and interacting with customer service representatives. Third-party developers can integrate apps into Messenger, letting users enter an app while inside Messenger and optionally share details from the app into a chat. Developers can build chatbots into Messenger, for uses such as news publishers building bots to distribute news. The M virtual assistant (U.S.) scans chats for keywords and suggests relevant actions, such as its payments system for users mentioning money. Group chatbots appear in Messenger as "Chat Extensions". A "Discovery" tab allows finding bots, and enabling special, branded QR codes that, when scanned, take the user to a specific bot.
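As with other chatbot platforms, the typical integration pattern is a webhook: the platform delivers incoming message events to a developer-hosted HTTPS endpoint, and the bot replies by calling a send API. The sketch below illustrates that general pattern in Python with Flask; the endpoint path, payload field names, send URL, and access token are illustrative assumptions rather than Facebook's exact schema.

```python
# Illustrative webhook pattern for a chatbot: receive an incoming message
# event via HTTP POST and reply by calling a send API. The URL, token and
# payload field names below are placeholders, not Facebook's exact schema.
import requests
from flask import Flask, request

app = Flask(__name__)

SEND_API_URL = "https://api.example.com/send"   # hypothetical send endpoint
ACCESS_TOKEN = "PAGE_ACCESS_TOKEN"              # placeholder credential

@app.route("/webhook", methods=["POST"])
def webhook():
    event = request.get_json(force=True)
    # Assumed payload shape: {"sender_id": "...", "text": "..."}
    sender = event.get("sender_id")
    text = event.get("text", "")

    # Echo the message back; a real bot would route this to business logic.
    reply = {"recipient": sender, "message": f"You said: {text}"}
    requests.post(SEND_API_URL, json=reply,
                  params={"access_token": ACCESS_TOKEN}, timeout=10)
    return "ok", 200

if __name__ == "__main__":
    app.run(port=8080)
```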
Instagram
In December 2013, Instagram announced Instagram Direct, a feature that lets users interact through private messaging. Users who follow each other can send private messages with photos and videos, in contrast to the public-only requirement that was previously in place. When users receive a private message from someone they don't follow, the message is marked as pending and the user must accept to see it. Users can send a photo to a maximum of 15 people. The feature received a major update in September 2015, adding conversation threading and making it possible for users to share locations, hashtag pages, and profiles through private messages directly from the news feed. Additionally, users can now reply to private messages with text, emoji or by clicking on a heart icon. A camera inside Direct lets users take a photo and send it to the recipient without leaving the conversation. A new update in November 2016 let users make their private messages "disappear" after being viewed by the recipient, with the sender receiving a notification if the recipient takes a screenshot.
In April 2017, Instagram redesigned Direct to combine all private messages, both permanent and ephemeral, into the same message threads. In May, Instagram made it possible to send website links in messages, and also added support for sending photos in their original portrait or landscape orientation without cropping.
In April 2020, Direct became accessible from the Instagram website.
In August 2020, Facebook started merging Instagram Direct into Facebook Messenger. After the update (which was rolled out to a segment of the user base), the Instagram Direct icon transforms into the Facebook Messenger icon.
LinkedIn
The LinkedIn website includes a feature that allows direct messaging by a user to any other user who is on their list of Connections. Additionally, users with Premium membership can send messages to anyone on LinkedIn.
Reddit
In 2017, Reddit developed its own real-time chat software for the site. While some established subreddits have used third-party software to chat about their communities, the company built chat functions that it hopes will become an integral part of Reddit. Individual chat rooms were rolled out in 2017 and community chat rooms for members of a given subreddit were rolled out in 2018.
Twitter
Tweets are publicly visible by default, but senders can restrict message delivery to only their followers. Users can mute users they do not wish to interact with and block accounts from viewing their tweets. Users can tweet via the Twitter website, compatible external applications (such as for smartphones), or by Short Message Service (SMS) available in certain countries. Users may subscribe to other users' tweets—this is known as "following" and subscribers are known as "followers" or "tweeps", a portmanteau of Twitter and peeps. Individual tweets can be forwarded by other users to their own feed, a process known as a "retweet". Users can also "like" (formerly "favorite") individual tweets. Twitter allows users to update their profile via their mobile phone either by text messaging or by apps released for certain smartphones and tablets. Twitter has been compared to a web-based Internet Relay Chat (IRC) client. In a 2009 Time magazine essay, technology author Steven Johnson described the basic mechanics of Twitter as "remarkably simple".
Video conference platforms
Jitsi
Jitsi is a collection of free and open-source multiplatform voice (VoIP), video conferencing and instant messaging applications for the web platform, Windows, Linux, macOS, iOS and Android. The Jitsi project began with the Jitsi Desktop (previously known as SIP Communicator). It is free to use and can be hosted on an organization's own server.
With the growth of WebRTC, the project team focus shifted to the Jitsi Videobridge for allowing web-based multi-party video calling. Later the team added Jitsi Meet, a full video conferencing application that includes web, Android, and iOS clients. Jitsi also operates meet.jit.si, a version of Jitsi Meet hosted by Jitsi for free community use. Other projects include: Jigasi, lib-jitsi-meet, Jidesha, and Jitsi.
Jitsi has received support from various institutions such as the NLnet Foundation, the University of Strasbourg and the Region of Alsace and it has also had multiple participations in the Google Summer of Code program.
Jitsi Meet is an open source JavaScript WebRTC application used primarily for video conferencing. In addition to audio and video, screen sharing is available, and new members can be invited via a generated link. The interface is accessible via web browser or with a mobile app. The Jitsi Meet server software can be downloaded and installed on Linux-based computers. Jitsi owner 8x8 maintains a free public-use server for up to 50 participants at meet.jit.si.
Key features of Jitsi Meet
Encrypted communication (secure communication): As of April 2020, one-to-one calls use the P2P mode, which is end-to-end encrypted via DTLS-SRTP between the two participants. Group calls also use DTLS-SRTP encryption, but rely on the Jitsi Videobridge (JVB) as video router, where packets are decrypted temporarily. The Jitsi team emphasizes that "they are never stored to any persistent storage and only live in memory while being routed to other participants in the meeting", and that this measure is necessary due to current limitations of the underlying WebRTC technology.
No installation of new client software is needed.
Skype
Skype is a proprietary telecommunications application that specializes in providing video chat and voice calls between computers, tablets, mobile devices, the Xbox One console, and smartwatches over the Internet. Skype also provides instant messaging services. Users may transmit text, video, audio and images. Skype allows video conference calls.
In March 2020, Skype was used by 100 million people on a monthly basis and by 40 million people on a daily basis, which was a 70% increase in the number of daily users from the previous month, due to the COVID-19 pandemic.
Registered users of Skype are identified by a unique Skype ID and may be listed in the Skype directory under a Skype username. Skype allows these registered users to communicate through both instant messaging and voice chat. Voice chat allows telephone calls between pairs of users and conference calling, and uses a proprietary audio codec. Skype's text chat client allows group chats, emoticons, storing chat history, and editing of previous messages. Offline messages were implemented in a beta build of version 5 but removed after a few weeks without notification. The usual features familiar to instant messaging users—user profiles, online status indicators, and so on—are also included.
The Online Number, a.k.a. SkypeIn, service allows Skype users to receive calls on their computers dialed by conventional phone subscribers to a local Skype phone number; local numbers are available for Australia, Belgium, Brazil, Chile, Colombia, Denmark, the Dominican Republic, Estonia, Finland, France, Germany, Hong Kong, Hungary, India, Ireland, Japan, Mexico, Nepal, New Zealand, Poland, Romania, South Africa, South Korea, Sweden, Switzerland, Turkey, the Netherlands, the United Kingdom, and the United States. A Skype user can have local numbers in any of these countries, with calls to the number charged at the same rate as calls to fixed lines in the country.
Skype supports conference calls, video chats, and screen sharing between 25 people at a time for free, which then increased to 50 on 5 April 2019.
Skype does not provide the ability to call emergency numbers, such as 112 in Europe, 911 in North America, 999 in the UK or 100 in India and Nepal. However, as of December 2012, there is limited support for emergency calls in the United Kingdom, Australia, Denmark, and Finland. The U.S. Federal Communications Commission (FCC) has ruled that, for the purposes of section 255 of the Telecommunications Act, Skype is not an "interconnected VoIP provider". As a result, the U.S. National Emergency Number Association recommends that all VoIP users have an analog line available as a backup.
Skype allows users to send instant messages to other users in their contact list. Messages sent to offline users are stored on Skype servers and will be delivered to their recipients as soon as they come online on Skype. Chat history along with the message status will be synchronized across all user devices supported by Skype whenever the user signs in with the same Skype account.
Although Skype allows sending SMS messages, it is not possible to receive SMS messages on Skype, so users need a different way to receive responses to the messages they send using Skype. This has been a cause of frustration among users who purchase Skype as an alternative to a mobile phone, because Microsoft will not refund purchases even for users who discover this missing feature only after purchasing multi-year contracts. Other than in user complaints on the Microsoft Skype forums, there is no mention on the Microsoft or Skype websites that "Send SMS messages" means exactly that: users can send but cannot receive SMS messages.
Skype keeps user instant messaging history on the user's local computer and on Skype's cloud for 30 days. Users cannot control how long their chat histories are stored on Skype's servers, but can configure that option individually for each of their devices. Once a user signs into Skype on a new device, the conversation history is synced with Skype's cloud and stored locally. Skype allows users to remove or edit individual messages for one hour after sending; this affects messages already received by chat partners as well as messages not yet delivered to them. Skype allows users to delete all saved conversation histories for a device.
FaceTime
FaceTime is a proprietary videotelephony product developed by Apple Inc. It is available on supported iOS mobile devices running iOS 4 and later and on Mac computers running supported versions of Mac OS X and macOS. FaceTime supports any iOS device with a forward-facing camera and any Mac computer equipped with a FaceTime Camera. FaceTime Audio, an audio-only version, is available on any iOS device that supports iOS 7 or newer, and on any Mac with a forward-facing camera running a supported version of the operating system. FaceTime is included for free in iOS and in macOS from Mac OS X Lion (10.7) onwards.
Apple bought the "FaceTime" name from FaceTime Communications, which changed its name to Actiance in January 2011. On June 7, 2010, Apple CEO Steve Jobs announced FaceTime in conjunction with the iPhone 4 in a keynote speech at the 2010 Apple Worldwide Developers Conference. Support for the fourth generation iPod Touch (the first model of iPod Touch equipped with cameras) was announced in conjunction with the device's release on September 8, 2010. FaceTime for Mac was announced on October 20, 2010.
In May 2011, it was found that FaceTime would work seamlessly over 3G on all iPhone, iPad, and iPod Touch models that supported it. Even though FaceTime worked only over 3G at that time, it now supports 4G LTE calls on networks all over the world, availability being limited to operators' GSM plans.
In 2018, Apple added group video and audio support to FaceTime which can support up to 32 people in iOS 12 and macOS Mojave.
Zoom
Zoom is a videotelephony software program developed by Zoom Video Communications. The free version provides a video chatting service that allows up to 100 devices at once, with a 40-minute time restriction for free accounts having meetings of three or more participants. Users have the option to upgrade by subscribing to one of its plans, with the highest allowing up to 1,000 people concurrently, with no time restriction.
Zoom is compatible with Windows, macOS, iOS, Android, Chrome OS, and Linux. It is noted for its simple interface and usability, specifically for non-tech people. Features include one-on-one meetings, group video conferences, screen sharing, plugins, browser extensions, and the ability to record meetings and have them automatically transcribed. On some computers and operating systems, users are able to select a virtual background, which can be downloaded from different sites, to use as a backdrop behind themselves.
Use of the platform is free for video conferences of up to 100 participants at once, with a 40-minute time limit if there are more than two participants. For longer or larger conferences with more features, paid subscriptions are available, costing $15–20 per month. Features geared towards business conferences, such as Zoom Rooms, are available for $50–100 per month. Up to 49 people can be seen on a screen at once. Zoom has several tiers: Basic, Pro, Business, and Enterprise. Participants do not have to download the app if they are using Google Chrome or Firefox; they can click on a link and join from the browser. Zoom is not compatible with Safari for Macs.
Zoom security features include password-protected meetings, user authentication, waiting rooms, locked meetings, disabling participant screen sharing, randomly generated IDs, and the ability for the host to remove disruptive attendees. As of June 2020, Zoom began offering end-to-end encryption to business and enterprise users, with AES 256 GCM encryption enabled for all users. In October 2020, Zoom added end-to-end encryption for free and paid users. It's available on all platforms, except for the official Zoom web client.
Zoom also offers a transcription service using Otter.ai software that allows businesses to store transcriptions of the Zoom meetings online and search them, including separating and labeling different speakers.
As of July 2020, Zoom Rooms and Zoom Phone also became available as hardware as a service products. Zoom Phone is available for domestic telephone service in 40 countries as of August 2020. Zoom for Home, a category of products designed for home use, became available in August 2020.
Google Duo
Google Duo is a video chat mobile app developed by Google, available on the Android and iOS operating systems. It was announced at Google's developer conference on May 18, 2016, and began its worldwide release on August 16, 2016. It is also available to use via Google's Chrome web browser on desktop and laptop computers.
Google Duo lets users make video calls in high definition. It is optimized for low-bandwidth networks. End-to-end encryption is enabled by default. Duo is based on phone numbers, allowing users to call someone from their contact list. The app automatically switches between Wi-Fi and cellular networks. A "Knock Knock" feature lets users see a live preview of the caller before answering. An update in April 2017 lets users worldwide make audio-only calls.
As of December 1, 2016, Google Duo replaced Hangouts within the suite of Google apps device manufacturers must install in order to gain access to the Google Play Store, with Hangouts instead becoming optional.
In August 2020, it was reported that Google was planning to eventually replace Google Duo with Google Meet, but would continue to support Duo and "invest in building new features" in the long term.
Google Hangouts
Google Hangouts is a cross-platform messaging app developed by Google. Originally a feature of Google+, Hangouts became a stand-alone product in 2013, when Google also began integrating features from Google+ Messenger and Google Talk into Hangouts. In 2017, Google began developing Hangouts into a product aimed at enterprise communication, splitting into two products: Google Meet and Google Chat.
Google has also begun integrating features of Google Voice, its IP telephony product, into Hangouts, stating that Hangouts is designed to be "the future" of Voice. Google began transitioning users from the "classic" version of Hangouts to Meet and Chat in June 2020, and announced in October 2020 that Google Chat would eventually be made free to consumers and fully replace Hangouts, shortly after Google Meet became free as well. Google Hangouts will remain a consumer-level product for people using standard Google accounts.
Google Hangouts has a unique feature in that it allows video calls to be streamed live via YouTube.
Google Meet
Google Meet (formerly known as Hangouts Meet) is a video-communication service developed by Google. It is one of two apps that constitute the replacement for Google Hangouts, the other being Google Chat.
User features of Google Meet include:
Two-way and multi-way audio and video calls with a resolution up to 720p
An accompanying chat
Call encryption between all users
Noise cancelling audio filter
Low-light mode for video
Ability to join meetings through a web browser or through Android or iOS apps
Integration with Google Calendar and Google Contacts for one-click meeting calls
Screen-sharing to present documents, spreadsheets, presentations, or (if using a browser) other browser tabs
Ability to call into meetings using a dial-in number in the US
Hosts being able to deny entry and remove users during a call.
Google Meet uses proprietary protocols for video, audio and data transcoding. However, Google has partnered with the company Pexip to provide interoperability between Google Meet and SIP/H.323-based conferencing equipment and software.
Features for users who use Google Workspace accounts include:
Up to 100 members per call for Google Workspace Starter users, up to 150 for Google Workspace Business users, and up to 250 for Google Workspace Enterprise users
Ability to call into meetings with a dial-in number from selected countries
Password-protected dial-in numbers for Google Workspace Enterprise edition users
Real-time closed captioning based on speech recognition
Background blurring
In March 2020, Google temporarily extended advanced features present in the enterprise edition to anyone using Google Workspace or G Suite for Education editions.
In March 2020, Google rolled out Meet to personal (free) Google accounts.
Free Meet calls can only have a single host and up to 100 participants, compared to the 250-caller limit for Google Workspace users and the 25-participant limit for Hangouts. Unlike business calls with Meet, consumer calls are not recorded and stored, and Google states that consumer data from Meet will not be used for advertisement targeting. While call data is reportedly not being used for advertising purposes, based on an analysis of Meet's privacy policy, Google reserves the right to collect data on call duration, who is participating, and participants' IP addresses.
Users need a Google account to initiate calls and, like Google Workspace users, anyone with a Google account is able to start a Meet call from within Gmail.
Marco Polo
Marco Polo is a video messaging and video hosting mobile app. The app was created in 2014 by Joya Communications, founded by Vlada Bortnik and Michael Bortnik. The app markets itself as a video walkie-talkie.
Device-specific platforms
iMessage for iPhones
iMessage is an instant messaging service developed by Apple Inc. and launched in 2011. iMessage functions exclusively on Apple platforms: macOS, iOS, iPadOS, and watchOS.
Core features of iMessage, available on all supported platforms, include sending texts, images, videos, and documents; getting delivery and read statuses (read receipts); and end-to-end encryption (which means no one, including Apple itself, is able to intercept or tamper with sent messages). On all platforms except macOS, the service also allows sending location data and stickers. On iOS and iPadOS, third-party developers can extend iMessage capabilities with custom extensions (an example being quick sharing of recently played songs).
Launched on iOS in 2011, iMessage arrived on macOS (then called OS X) in 2012. In 2020, Apple announced an entirely redesigned version of the macOS Messages app which adds some of the features previously unavailable on the Mac, including location sharing and message effects.
Messages by Google
Messages is an SMS and instant messaging application developed by Google for its Android mobile operating system. A web interface is also available. Launched in 2014, it has supported Rich Communication Services (RCS) messaging since 2018, marketed as "Chat" features. By April 2020, the app had more than a billion installs which was most likely due to Google's wider roll out of RCS to many different countries without carrier support.
Palringo
Palringo, or The World’s Online Festival (WOLF), is a community-oriented messaging and gaming app for iOS and Android. The platform allows users to chat, entertain, and perform on a Stage (live microphone slots for up to 5 people), form and join large groups based on common interests, send instant messages, and drop images and voice recordings into conversations. Launched under its original name of Palringo in 2006, the app has 80 million accounts worldwide and offers a range of games along with more than 380,000 groups, some of which have up to 2,500 members. Headquartered in London, WOLF has offices in Newcastle and London, UK, and Amman, Jordan.
Groupware
"Groupware" refers to a number of varied applications that are designed to enable communication amongst members of a team, either within a company, a project, or some other group effort. these applications may incorporate a vast range of features and functions, rather than a single specialized function. Such platforms may include instant messaging, document sharing, visual diagrams, voice conference, and many other team-oriented features.
Microsoft Yammer
Yammer is a freemium enterprise social networking service used for private communication within organizations. Access to a Yammer network is determined by a user's Internet domain so that only individuals with approved email addresses may join their respective networks.
The service began as an internal communication system for the genealogy website Geni.com, and was launched as an independent product in 2008. Microsoft later acquired Yammer in 2012 for US$1.2 billion. Currently Yammer is included in all enterprise plans of Office 365 and Microsoft 365.
Adobe Connect
Adobe Connect (formerly Presedia Publishing System, Macromedia Breeze, and Adobe Acrobat Connect Pro) is a suite of software for remote training, web conferencing, presentation, and desktop sharing. All meeting rooms are organized into 'pods', with each pod performing a specific role (e.g. chat, whiteboard, notes). Adobe Connect was formerly part of the Adobe Acrobat family and has changed names several times.
Google Workspace
Google Workspace, formerly known as G Suite, is a collection of cloud computing, productivity and collaboration tools, software and products developed and marketed by Google. It was first launched in 2006 as Google Apps for Your Domain and rebranded as G Suite in 2016. Google Workspace consists of Gmail, Contacts, Calendar, Meet and Chat for communication; Currents for employee engagement; Drive for storage; and the Google Docs suite for content creation. An Admin Panel is provided for managing users and services. Depending on the edition, Google Workspace may also include the digital interactive whiteboard Jamboard and an option to purchase such add-ons as the telephony service Voice. The education edition adds the learning platform Google Classroom and, as of October 2020, retains the name G Suite for Education.
While most of these services are individually available at no cost to consumers who use their free Google (Gmail) accounts, Google Workspace adds enterprise features such as custom email addresses at a domain (e.g. @yourcompany.com), an option for unlimited Drive storage, additional administrative tools and advanced settings, as well as 24/7 phone and email support.
Being based in Google's data centers, data and information are saved directly and then synchronized to other data centers for backup purposes. Unlike the free, consumer-facing services, Google Workspace users do not see advertisements while using the services, and information and data in Google Workspace accounts do not get used for advertisement purposes. Furthermore, Google Workspace administrators can fine-tune security and privacy settings.
Google Chat
Google Chat is communication software developed by Google for teams, providing direct messages and team chat rooms, similar to competitors Slack and Microsoft Teams, along with a group messaging function that allows Google Drive content sharing. It is one of two apps that constitute the replacement for Google Hangouts, the other being Google Meet. Google planned to begin retiring Google Hangouts in October 2019.
The current version is for Google Workspace, (formerly G Suite until October 2020) customers only, with identical features in all packages except a lack of Vault data retention in the Basic package. However, in October 2020, Google announced plans to open Google Chat up to consumers as early as 2021, once Hangouts has been officially retired.
Slack
Slack offers many IRC-style features, including persistent chat rooms (channels) organized by topic, private groups, and direct messaging. Content, including files, conversations, and people, is all searchable within Slack. Users can add emoji buttons to their messages, on which other users can then click to express their reactions to messages.
Slack's free plan allows only the 10,000 most recent messages to be viewed and searched. On March 18, 2020, Slack redesigned its platform to simplify and customize the user experience.
Slack teams allow communities, groups, or teams to join a "workspace" via a specific URL or invitation sent by a team admin or owner. Although Slack was developed for professional and organizational communication, it has been adopted as a community platform, replacing message boards or social media groups.
Public channels allow team members to communicate without the use of email or group SMS (texting). Public channels are open to everyone in the workspace. Private channels allow for private conversation between smaller sub-groups. These private channels can be used to organize large teams. Direct messages allow users to send private messages to specific users rather than a group of people. Direct messages can include up to nine people. Once started, a direct message group can be converted into a private channel.
Slack integrates with many third-party services and supports community-built integrations, including Google Drive, Trello, Dropbox, Box, Heroku, IBM Bluemix, Crashlytics, GitHub, Runscope, Zendesk and Zapier. In December 2015, Slack launched their software application ("app") directory, consisting of over 150 integrations that users can install.
In March 2018, Slack announced a partnership with financial and human capital management firm Workday. This integration allows Workday customers to access Workday features directly from the Slack interface.
Discord
Discord is built to create and manage private and public communities. It gives users access to communication tools such as voice and video calls, persistent chat rooms, and integrations with other gamer-focused services.
Discord communities are organized into discrete collections of channels called servers. A user can create servers for free, manage their public visibility and create one or more channels within that server.
Starting in October 2017, Discord has allowed game developers and publishers to verify their servers. Verified servers, like verified accounts on social media sites, have a badge to mark them as official communities. Verified servers are moderated by the developer's or publisher's own moderation team. Verification was later extended in February 2018 to include esports teams and musical artists.
By the end of 2017, about 450 servers were verified. Approximately 1790 servers are verified as of December 2020.
Discord users can improve the quality of the servers they reside in via the "Server Boost" feature, which improves the quality of audio channels and streaming channels and increases the number of emoji slots, among other perks, across three levels. Users can buy boosts to support the servers they choose, for a monthly amount. Possession of "Discord Nitro", the platform's paid subscription, gives a user two extra boosts to use on any server they like.
Channels may be either used for voice chat and streaming or for instant messaging and file sharing. The visibility and access to channels can be customized to limit access from certain users, for example marking a channel "NSFW" (Not Safe For Work) requires that first-time viewers confirm they are over 18 years old and willing to see such content.
Kune
Kune is a free/open source distributed social network focused on collaboration rather than just on communication. It centers on online real-time collaborative editing, decentralized social networking and web publishing, and is aimed at workgroups rather than just individuals. It aims to allow for the creation of online spaces for collaborative work where organizations and individuals can build projects online, coordinate common agendas, set up virtual meetings, publish on the web, and join organizations with similar interests. It has a special focus on Free Culture and social movements' needs. Kune is a project of the Comunes Collective.
All the functionalities of Apache Wave, that is collaborative federated real-time editing, plus
Communication
Chat and chatrooms compatible with Gmail and Jabber through XMPP (with several XEP extensions), as it integrates Emite
Social networking (federated)
Real-time collaboration for groups in:
Documents: as in Google Docs
Wikis
Lists: as in Google Groups but minimizing emails, through waves
Group Tasks
Group Calendar: as in Google Calendar, with ical export
Group Blogs
Web-creation: aiming to publish contents directly on the web (as in WordPress, with a dashboard and public view) (in development)
Bartering: aiming to decentralize bartering as in eBay
Advanced email
Waves: aims to replace most uses of email
Inbox: as in email, all your conversations and documents in all kunes are controlled from your inbox
Email notifications (Projected: replies from email)
Multimedia & Gadgets
Image or Video galleries integrated in any doc
Maps, mindmaps, Twitter streams, etc.
Polls, voting, events, etc.
and more via Apache Wave extensions, easy to program (as in Facebook apps, they run on top of Kune)
See also
Comparison of cross-platform instant messaging clients
Comparison of instant messaging protocols
Comparison of Internet Relay Chat clients
Comparison of LAN messengers
Comparison of VoIP software
List of SIP software
List of video telecommunication services and product brands
References
External links
Comparison articles and overviews
15 Group Messaging Mobile Apps, November 5, 2019, by Sig Ueland, practicalecommerce.com.
The Ultimate Group Text App Guide, Last updated February 17, 2020, snapdesk.app website.
Wikimedia pages
Wikimedia list of conference platforms
Social media
Android Auto software
VoIP software
Mobile applications
Android (operating system) software
IOS software
Instant messaging clients
Cross-platform software
Communication software
Computer-mediated communication
Groupware
Collaborative software
Network software comparisons
user features of messaging platforms
Messaging platforms |
466494 | https://en.wikipedia.org/wiki/POKEY | POKEY | The Pot Keyboard Integrated Circuit (POKEY) is a digital I/O chip designed for the Atari 8-bit family of home computers and found in Atari arcade games of the 1980s. POKEY combines functions for sampling potentiometers (such as game paddles) and scan matrices of switches (such as a computer keyboard) as well as sound generation. It produces four voices of distinctive square wave sound, either as clear tones or modified with a number of distortion settings.
POKEY chips are used for audio in many arcade games including Centipede, Missile Command, Asteroids Deluxe, and Gauntlet. Some of Atari's arcade systems use multi-core versions with 2 or 4 POKEYs in a single package for more sound voices. The Atari 7800 console allows a game cartridge to contain a POKEY, providing better sound than the system's audio chip. Only two licensed games make use of this: the ports of Ballblazer and Commando.
The LSI chip has 40 pins and is identified as C012294. POKEY was designed by Atari employee Doug Neubauer, who also programmed the original Star Raiders. The USPTO granted U.S. Patent 4,314,236 to Atari on February 2, 1982 for an "Apparatus for producing a plurality of audio sound effects". The inventors listed are Steven T. Mayer and Ronald E. Milner.
No longer manufactured, POKEY is emulated in software by arcade and Atari 8-bit emulators and also via the Atari SAP music format and associated player.
Features
Audio
4 semi-independent audio channels
Channels may be configured as one of:
Four 8-bit channels
Two 16-bit channels
One 16-bit channel and two 8-bit channels
Per-channel volume, frequency, and waveform (square wave with variable duty cycle or pseudorandom noise)
15 kHz or 64 kHz frequency divider.
Two channels may be driven at the CPU clock frequency.
High-pass filter
Keyboard scan (up to 64 keys) + 2 modifier bits (Shift, Control) + Break
Potentiometer ports (8 independent ports, each with 8-bit resolution)
High Resolution Timers (audio channels 1, 2, and 4 can be configured to cause timer interrupts when they cross zero)
Random number generator (8 bits of a 17-bit polynomial counter can be read)
Serial I/O port
Eight IRQ interrupts
Versions
By part number:
C012294 — Used in all Atari 8-bit family computers, including the Atari XEGS, as well as the Atari 5200 console. The suffix on the chip refers to the manufacturer:
C012294B-01 — AMI
C012294-03 — Signetics
C012294-19 — National Semiconductor
C012294-22 — OKI
C012294-31 — IMP
137430-001 — Part number sometimes used in Atari arcade machines for POKEY.
137324-1221 — Quad-Core POKEY used in Atari arcade machines Major Havoc, I, Robot, Firefox, and Return of the Jedi.
Pinout
Registers
The Atari 8-bit computers map POKEY to the $D2xx page and the Atari 5200 console maps it to the $E8xx page.
POKEY provides 29 Read/Write registers controlling Sound, Paddle input, keyboard input, serial input/output, and interrupts. Many POKEY register addresses have dual purposes performing different functions as a Read vs a Write register. Therefore, no code should read Hardware registers expecting to retrieve the previously written value.
This problem is solved for some registers by Operating System "Shadow" registers implemented in regular RAM that mirror the values of hardware registers. During the Vertical Blank the Operating System copies the Shadow registers in RAM for Write registers to the corresponding hardware register, and updates Shadow values for Read registers from the hardware accordingly. Therefore, writes to hardware registers which have corresponding shadow registers will be overwritten by the value of the Shadow registers during the next vertical blank.
Reading values directly from hardware at an unknown stage in the display cycle may return inconsistent results (an example: reading potentiometers). Operating System Shadow registers for Read registers would usually be the preferred source of information.
Some Write hardware registers do not have corresponding Shadow registers. They can be safely written by an application without the value being overwritten during the vertical blank. If the application needs to know the last value written to the register then it is the responsibility of the application to implement its own shadow value to remember what it wrote.
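As an illustration of that last point, the following minimal C sketch shows one way an application can keep its own shadow of a write-only register; it is written as it might appear in a cross-compiled program, the audio control register AUDCTL ($D208) is taken from the listing further below, and the helper names are invented for illustration.
#include <stdint.h>
#define AUDCTL_HW (*(volatile uint8_t *)0xD208)  /* write-only audio control register */
static uint8_t audctl_shadow;                    /* application-maintained copy of the last value written */
static void audctl_set_bits(uint8_t mask)
{
    audctl_shadow |= mask;        /* update the private copy first ... */
    AUDCTL_HW = audctl_shadow;    /* ... then write the whole value to the hardware */
}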
Audio
POKEY contains a programmable sound generator: four audio channels with separate frequency, noise and volume level controls.
Each channel has an 8-bit frequency divider and an 8-bit register to select noise and volume.
AUDF1 to AUDF4 – frequency register (AUDio Frequency)
AUDC1 to AUDC4 – volume and noise register (AUDio Control)
AUDCTL – general register, which controls generators (AUDio ConTroL)
POKEY's sound is distinctive: when the four channels are used independently, there is noticeable detuning of parts of the 12-tone equal temperament scale, due to lack of pitch accuracy. Channels may be paired for higher accuracy; in addition, multiple forms of distortion are available, allowing a thicker sound. The distortion is primarily used in music for bass parts.
One of the sound-engines developed for the Atari 8-bit family was called the AMP engine (Advanced Music Processor). This was used by the musician Gary Gilbertson.
Audio Channel Frequency
The AUDF* registers control the frequency or pitch of the corresponding sound channels. The AUDF* values also control the POKEY hardware timers, which are useful for code that must run at precise intervals more frequent than the vertical blank.
Each AUDF* register is an 8-bit value providing a countdown timer or divisor for the pulses from the POKEY clock. Smaller values produce more frequent output pulses from POKEY, and larger values less frequent ones. The register values $00 to $FF (0 to 255) are incremented by one inside POKEY, giving effective divisors of 1 to 256 ($100). The actual audible pitch depends on the POKEY clock frequency and distortion values chosen. See Audio Channel Control and Audio Control.
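As a rough worked example of how the divisor maps to pitch, the following C sketch uses a commonly cited approximation for a pure-tone channel driven by the 64 kHz divider clock, f = clock / (2 * (AUDF + 1)); the exact clock value and the formula itself are assumptions for illustration and should be checked against the data sheet.
#include <stdio.h>
/* Approximate output frequency of a pure-tone channel; clock_hz is the divider input selected in AUDCTL. */
static double pokey_tone_hz(unsigned audf, double clock_hz)
{
    return clock_hz / (2.0 * (audf + 1));   /* AUDF is incremented by one inside POKEY */
}
int main(void)
{
    /* AUDF = 121 with the roughly 64 kHz clock lands near 262 Hz, close to middle C. */
    printf("%.1f Hz\n", pokey_tone_hz(121, 63921.0));
    return 0;
}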
AUDF1 $D200 Write
Audio Channel 1 Frequency
AUDF2 $D202 Write
Audio Channel 2 Frequency
AUDF3 $D204 Write
Audio Channel 3 Frequency
AUDF4 $D206 Write
Audio Channel 4 Frequency
Audio Channel Control
The Audio Channel Control registers provide volume and distortion control over individual sound channels. Audio may also be generated independently of the POKEY clock by direct volume manipulation of a sound channel, which is useful for playing back digital samples.
AUDC1 $D201 Write
Audio Channel 1 Control
AUDC2 $D203 Write
Audio Channel 2 Control
AUDC3 $D205 Write
Audio Channel 3 Control
AUDC4 $D207 Write
Audio Channel 4 Control
Bit 0-3: Control over volume level, from 0 to F.
Bit 4: Forced volume-only output. When this bit is set the channel ignores the AUDF timer, noise/distortion controls, and high-pass filter. Sound is produced only by setting volume bits 0-3. This feature was used to create digital audio via pulse-code modulation.
Bit 5-7: Shift register settings for noise/distortion.
Random noise is generated by reading 8 bits from the top of a 17-bit shift register. That register is driven at 1.79 MHz for NTSC or 1.77 MHz for PAL. Its output can be sampled independently by each audio channel at that channel's divider rate.
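To make the bit layout concrete, the following C sketch sets up channel 1 for a steady tone and for direct volume-only (sample) output, using the AUDF1/AUDC1 addresses listed above; the "pure tone" encoding ($A in the upper bits) is an assumption based on common POKEY documentation rather than something stated here.
#include <stdint.h>
#define AUDF1 (*(volatile uint8_t *)0xD200)   /* channel 1 frequency */
#define AUDC1 (*(volatile uint8_t *)0xD201)   /* channel 1 control */
void play_tone(void)
{
    AUDF1 = 121;    /* divider value; see the frequency sketch above */
    AUDC1 = 0xA8;   /* upper bits: "pure tone" setting (assumed), low nibble: volume 8 */
}
void play_sample_level(uint8_t level)
{
    AUDC1 = (uint8_t)(0x10 | (level & 0x0F));   /* bit 4 forces volume-only output, the PCM technique mentioned above */
}
void silence(void)
{
    AUDC1 = 0x00;   /* volume 0 turns the channel off */
}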
AUDCTL $D208 Write
Audio Control allows the choice of clock input used for the audio channels, control over the high-pass filter feature, merging two channels together allowing 16-bit frequency accuracy, selecting a high frequency clock for specific channels, and control over the "randomness" of the polynomial input.
"1" means "on", if not described:
Bit 0 $01: (15 kHz), choice of frequency divider rate "0" - 64 kHz, "1" - 15 kHz
Bit 1 $02: (FI2 + 4), high-pass filter for channel 2, clocked by the frequency of channel 4
Bit 2 $04: (FI1 + 3), high-pass filter for channel 1, clocked by the frequency of channel 3
Bit 3 $08: (CH4 + 3), connection of dividers 4+3 to obtain 16-bit accuracy
Bit 4 $10: (CH2 + 1), connection of dividers 2+1 to obtain 16-bit accuracy
Bit 5 $20: (CH3 1.79), set channel 3 frequency "0" is 64 kHz. "1" is 1.79 MHz NTSC or 1.77 MHz PAL
Bit 6 $40: (CH1 1.79), set channel 1 frequency "0" is 64 kHz. "1" is 1.79 MHz NTSC or 1.77 MHz PAL
Bit 7 $80: (POLY 9), switch shift register "0" - 17-bit, "1" – 9-bit
All frequency dividers (AUDF) can be driven at the same time by the 64 kHz or 15 kHz rate.
Frequency dividers 1 and 3 can alternatively be driven by the CPU clock (1.79 MHz NTSC, 1.77 MHz PAL).
Frequency dividers 2 and 4 can alternatively be driven by the output of dividers 1 and 3.
In this way, POKEY makes it possible to connect two 8-bit channels to create sound with 16-bit accuracy (a register-setup sketch follows the list of possible configurations below).
Possible channel configurations:
four 8-bit channels
two 8-bit channels and one 16-bit channel
two 16-bit channels
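The following C sketch illustrates the third configuration by pairing channels 3 and 4 into one 16-bit voice, using the AUDCTL bits listed above; which half of the 16-bit divider belongs in AUDF3 versus AUDF4 is an assumption here (low byte in the first channel of the pair) and should be verified against the data sheet.
#include <stdint.h>
#define AUDF3  (*(volatile uint8_t *)0xD204)
#define AUDC3  (*(volatile uint8_t *)0xD205)
#define AUDF4  (*(volatile uint8_t *)0xD206)
#define AUDC4  (*(volatile uint8_t *)0xD207)
#define AUDCTL (*(volatile uint8_t *)0xD208)
void play_16bit(uint16_t divider)
{
    AUDCTL = 0x08 | 0x20;                 /* bit 3: join dividers 4+3, bit 5: clock channel 3 at 1.79 MHz */
    AUDF3  = (uint8_t)(divider & 0xFF);   /* low byte of the 16-bit divider (assumed ordering) */
    AUDF4  = (uint8_t)(divider >> 8);     /* high byte (assumed ordering) */
    AUDC3  = 0x00;                        /* keep the helper channel silent */
    AUDC4  = 0xA8;                        /* pure tone, volume 8, on the paired output (assumed encoding) */
}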
Potentiometers
POKEY has eight analog-to-digital converter ports most commonly used for potentiometers, also known as Paddle Controllers. The analog inputs are also used for the Touch Tablet controller and the 12-button video game Keyboard Controllers. Each input has a drop transistor, which can be set on or off from software. The timers can also be used to support a light pen, by connecting a photodiode to the drop transistor, which captures the timer when the electron beam in the television passes the pen. The vertical position of the pen had to be read separately.
POT0 $D200 Read
SHADOW: PADDL0 $0270
Paddle Controller 0 Input
POT1 $D201 Read
SHADOW: PADDL1 $0271
Paddle Controller 1 Input
POT2 $D202 Read
SHADOW: PADDL2 $0272
Paddle Controller 2 Input
POT3 $D203 Read
SHADOW: PADDL3 $0273
Paddle Controller 3 Input
POT4 $D204 Read
SHADOW: PADDL4 $0274
Paddle Controller 4 Input
POT5 $D205 Read
SHADOW: PADDL5 $0275
Paddle Controller 5 Input
POT6 $D206 Read
SHADOW: PADDL6 $0276
Paddle Controller 6 Input
POT7 $D207 Read
SHADOW: PADDL7 $0277
Paddle Controller 7 Input
Each input has an 8-bit timer that counts while each TV line is being displayed. This has the added advantage of allowing the value read to be fed directly into the screen coordinates of objects being driven by the paddles. The Atari Paddle values range from 0 to 228, though the maximum possible is 244. The Paddle controller reads 0 when turned to its maximum clockwise position and returns increasing values as it is turned counter-clockwise, ending at its maximum value.
The Paddle reading process begins by writing to POTGO which resets the POT* values to 0, the ALLPOT value to $FF, and discharges the potentiometer read capacitors. The POT* values increment as they are being scanned until reaching the resistance value of the potentiometer. When the Paddle reading is complete the corresponding bit in ALLPOT is reset to 0.
The Paddle scanning process can take the majority of a video frame to complete. The Atari Operating System takes care of Paddle reading automatically. The Paddles are read and paddle scanning initiated during the stage 2 vertical blank. Paddle values are copied to shadow registers. (Note that Paddle triggers are actually joystick direction input read from PIA.)
A faster mode of scanning the Paddles is possible by setting a bit in SKCTL. The reading sequence then completes in only a couple of scan lines, but the value is less accurate.
ALLPOT $D208 Read
Potentiometer Scanning Status
Each bit corresponds to one potentiometer input (the POT* registers). When paddle scanning is started by writing to POTGO each paddle's bit in ALLPOT is set to 1. When a paddle's scan is complete the corresponding bit in ALLPOT is reset to 0 indicating the value in the associated POT* register is now valid to read.
POTGO $D20B Write
Start Potentiometer Scan
Writing to POTGO initiates the potentiometer (Paddle) scanning process. This resets the POT* values to 0, the ALLPOT value to $FF, and discharges the potentiometer read capacitors. As each potentiometer scan completes the bit corresponding to the potentiometer in ALLPOT is cleared indicating the value of the associated POT* register is valid for reading.
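The read sequence described above can be summarised in a short C sketch: start a scan by writing POTGO, wait for the paddle's bit in ALLPOT to clear, then read the counter. The assumption that ALLPOT bit 0 corresponds to POT0 follows the register descriptions above; a real program would normally use the OS shadow PADDL0 instead of busy-waiting.
#include <stdint.h>
#define POT0   (*(volatile uint8_t *)0xD200)
#define ALLPOT (*(volatile uint8_t *)0xD208)
#define POTGO  (*(volatile uint8_t *)0xD20B)
uint8_t read_paddle0(void)
{
    POTGO = 0;                   /* any write starts a new scan */
    while (ALLPOT & 0x01) {      /* bit 0 clears when paddle 0 has been read */
        /* wait; a full scan can take most of a video frame in the slow mode */
    }
    return POT0;                 /* typically 0 to 228 for a standard paddle */
}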
Serial input output port
Contains:
serial input line
serial output line
serial clock output line
two-way serial clock data line
registers SKREST, SEROUT, SERIN, SKCTL, SKSTAT
POKEY is a sort of UART. Usually one of the doubled audio channels is used as baud rate generator. The standard baud rate is 19.2 kbit/s, the maximum possible baud rate is 127 kbit/s. A byte put into the SEROUT register is automatically sent over the serial bus. The data frame contains 10 bits: 1 start bit, 8 data bits, 1 stop bit. The voltage levels are 0 V (logical 0) and +4 V (logical 1). It is possible to connect the Atari serial port with an RS-232 port by means of a simple voltage converter.
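As a quick worked example of the framing above, the usable byte rate at the standard speed can be estimated with a few lines of C (back-of-the-envelope arithmetic only, not a figure from the data sheet).
#include <stdio.h>
int main(void)
{
    const double baud = 19200.0;        /* standard SIO bit rate */
    const double bits_per_byte = 10.0;  /* 1 start + 8 data + 1 stop */
    printf("%.0f bytes/s, %.2f ms per byte\n",
           baud / bits_per_byte, 1000.0 * bits_per_byte / baud);   /* about 1920 bytes/s, 0.52 ms per byte */
    return 0;
}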
Each input/output operation causes POKEY's internal shift registers to change value, so when programming for POKEY, it is necessary to re-initialise some values after each operation is carried out.
SKREST $D20A Write
Reset Serial Port Status (SKSTAT).
A write to this register resets bits 5 through 7 of SKSTAT, which are latches, to 1. The latches flag keyboard overrun, serial data input overrun, and serial data input frame error.
SEROUT $D20D Write
Serial port data output byte.
This is a parallel "holding" register for the eight bit (one byte) value that will be transferred to the serial shift register for output one bit at a time. When the port is ready to accept data for output a Serial Data Out interrupt informs the Operating System that it can write a byte to this output register.
SERIN $D20D Read
Serial port data input byte.
Like SEROUT, also a parallel "holding" register. This holds the eight bit (one byte) value assembled by the serial shift register reading the data input one bit at a time. When a full byte is read a Serial Data In interrupt occurs informing the Operating System that it can read the byte from this register.
SKCTL $D20F Write
Serial Port Control
Bit 0: Enable "debounce" scanning which is intended to eliminate noise or jitter from mechanical switches. A value of 1 enables POKEY to use an internal comparison register while scanning keys. A key must be detected in two simultaneous scans before it is identified as pressed, and it must be seen released for two consecutive scans to be considered released. This should be enabled to maintain normal keyboard handling with the Operating System.
Bit 1: Set to 1 to enable keyboard scanning. This should be enabled to maintain normal keyboard handling with the Operating System.
Bit 2: Set to 1 to enable fast, though less accurate, Potentiometer scanning. Fast Pot scanning increments the counter on every cycle and returns a usable result within two scan lines. The Operating System uses slow Pot scanning, which increments the counter once every 114 cycles (one scan line), taking a frame (1/60th of a second) to produce a result. The OS reads the Pot values during its Vertical Blank Interrupt (VBI) and copies the result to the potentiometer Shadow registers in RAM. It then resets POTGO for the next read during the next VBI.
Bit 3: Enable Serial port two-tone mode. When enabled, 1 and 0 bits output to the SIO bus are replaced by tones set by timers 1 and 2. This is ordinarily used for writing analog tones representing digital data to cassette tape.
Bit 4-6: Clock timing control for serial port operation.
Bit 7: Forces a known 0 output, so that timer 2 can reset timer 1 in two-tone serial output mode.
SKSTAT $D20F Read
Serial Port Status
KBCODE $D209 Read
SHADOW: CH $02FC
Keyboard Code
Eight IRQ interrupts
BREAK: BREAK key interrupt
K: Keyboard interrupt
SIR: Serial Input Ready (serial bus read interrupt)
ODN: Output Data Needed (serial bus write interrupt)
XD: eXmitted Data (serial transmission end interrupt)
T1: Timer 1 interrupt
T2: Timer 2 interrupt
T4: Timer 4 interrupt
Interrupts can be enabled or disabled from software via the IRQEN register.
The IRQSTAT register contains the interrupt status.
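A minimal C sketch of enabling one of these interrupts follows. The IRQEN address ($D20E) and its bit layout (bit 0 = Timer 1) are not given in this article and are assumptions based on common POKEY documentation; a real program would also install an interrupt handler and keep the OS interrupt mask in sync.
#include <stdint.h>
#define AUDF1 (*(volatile uint8_t *)0xD200)
#define IRQEN (*(volatile uint8_t *)0xD20E)   /* assumed address, not listed above */
void enable_timer1_irq(uint8_t divisor)
{
    AUDF1 = divisor;   /* Timer 1 fires when the channel 1 counter crosses zero */
    IRQEN = 0x01;      /* bit 0: Timer 1 interrupt enable (assumed bit position) */
}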
Keyboard
A six-bit key code register (K0-K5) holds the code of the currently pressed key, with values from $00 to $3F. Two additional control values are provided: one acts as a decoder for all six key-code bits, and the other is used to decode the special keys CTRL, SHIFT and BREAK.
References
External links
ASMA — Atari SAP Music Archive A collection of POKEY chip-music (SAP) players and SAP music from various Atari 8-bit games.
POKEY chip data sheet scanned to PDF.
POKEY made from small-scale logic chips
Video of Atari 8-bit (using POKEY) emulating Commodore SID chip.
Atari 8-bit family
Computer-related introductions in 1979
Sound chips
Input/output integrated circuits
Integrated circuits |
51524555 | https://en.wikipedia.org/wiki/Winmark | Winmark | Winmark Corporation is an American franchisor of five retail businesses that specialize in buying and selling used goods. The company is based in Minneapolis, Minnesota. Winmark was founded in 1988 as Play It Again Sports Franchise Corporation by Ron Olson and Jeffrey Dahlberg after they purchased the Play It Again Sports franchise rights from Martha Morris. They renamed the company to Grow Biz International Inc. in June 1993. Grow Biz went public in August 1993. In 2000, John Morgan replaced Dahlberg as CEO and renamed the company to Winmark in 2001. Morgan rescued Winmark from the verge of bankruptcy by selling financially failing franchise concepts and stores and replacing the management team. The company's strategy was to move from owning stores itself to having franchisees own all the stores.
Winmark Corporation owns five franchise-based retail companies that focus on used goods: Music Go Round (musical instruments), Once Upon a Child (children's clothes and toys), Plato's Closet (adolescent and young adult clothes), Play It Again Sports (sports equipment), and Style Encore (women's clothing). Winmark also owned but subsequently sold four franchise-based retailed companies: Computer Renaissance (computer equipment), Disc Go Round (CDs), It's About Game (computer games and video games), and ReTool (tools). Its subsidiary Wirth Business Credit is a small-business supplies leasing company.
Around 2013, research company IBISWorld reported that in the used goods outlet market, Goodwill Industries was first with a 21.5% share, Winmark was second with nearly 6%, and The Salvation Army was third with nearly 4%. In 2016, Winmark had a $1 billion market share in the $17 billion resell industry through its 1,170 franchisees.
History
Ron Olson and Jeffrey Dahlberg started a consulting firm, Franchise Business Systems, in 1986. Olson had been the president of R.J. Brandon Galleries and Dahlberg had been the chief executive officer of his father Kenneth H. Dahlberg's company, Dahlberg Inc. (now Miracle-Ear). Martha Morris was an initial customer of Olson and Dahlberg's consulting company. Morris, who started Play It Again Sports in 1983 in Uptown, Minneapolis, had purchased camping and backpacking supplies, found out she was not interested in camping, and decided to sell her used goods. She had attempted to sell a costly, lightly used backpack through making ads and visiting a sports shop, where an employee told her, "We don't sell used equipment." Morris decided to start her own store since she believed other people might have used sports equipment they would like to sell.
Morris expressed a desire to make her idea a franchise. Although Olson and Dahlberg were at first concerned about the idea's outlook for success, their worries disappeared after they dropped by her outlet one Saturday morning and found a line of 10 customers before Morris' store had even opened. Their strategy for attracting franchisees was to add polish to what they called a "garage sale-looking environment" without harming the initial idea. Olson and Dahlberg quickly realized they preferred to be the owners of a company instead of being advisers. Morris sold her Play It Again Sports franchise rights to Olson and Dahlberg in 1988. She sold her stores to them in 1990. Play It Again Sports became Winmark's first division.
The company was incorporated as Play It Again Sports Franchise Corporation in 1988 and was renamed to Grow Biz International Inc. in June 1993. It went public in August 1993. The company was listed on NASDAQ as GBIZ; it is now listed on NASDAQ as WINA. In 1995, a significant number of the company's franchises were on Entrepreneur's annual "Franchise 500" list. In 2001, Grow Biz was renamed to Winmark Corporation. Winmark Corporation is based in Minneapolis, Minnesota.
In March 2000, John Morgan took over as CEO from Jeff Dahlberg. By a year after joining the company as CEO, Morgan rescued Winmark from the precipice of bankruptcy by introducing stringent review of franchisee finances, shuttering failing Play It Again Sports stores, and appointing his own people to executive and board positions. Morgan chose Steve Briggs, who had been at Valspar, as the company's president. He selected as board members Kirk MacKenzie, whom he had worked with at Winthrop Resources, and Paul Reyelts, the chief financial officer at Valspar. In June 2000, Winmark sold its corporate headquarters building to Koch Trucking. The company had lost $350,700 in 2000; in 2001, it had a net income of $3.2 million. Morgan said in a 2009 interview with the Star Tribune about the state of Winmark before he joined, "The company was very good at selling franchises, but it was still losing money." Around 2002, Winmark sold the franchises Retool, Computer Renaissance and Disc-Go-Round.
In 2011, Winmark was ranked the 11th company on Forbes's "The Top 20 Small Public Companies In America". Around 2013, research company IBISWorld found that in the used goods outlet market, Goodwill Industries was first with a 21.5% share, Winmark was second with nearly 6%, and The Salvation Army was third with nearly 4%. In 2016, the company had a $1 billion market share in the $17 billion resell industry through its 1,170 franchisees.
In February 2016, President Brett Heffes was chosen as Winmark's next CEO, succeeding John Morgan, who became the executive chairman. According to a 2014 article in The Toronto Star, Morgan holds the most shares in the company.
Franchises
Winmark Corporation owns five franchise-based retail companies that focus on used goods: Music Go Round (musical instruments), Once Upon a Child (children's clothes and toys), Plato's Closet (adolescent and young adult clothes), Play It Again Sports (sports equipment), and Style Encore (women's clothing). Winmark sold four franchise-based retailed companies: Computer Renaissance (computer equipment), Disc Go Round (CDs), It's About Game (computer games and video games), and ReTool (tools).
The cost to become a franchisee in 2009 was $25,000 and five percent of the franchisee's gross revenue. Franchisees further are required to pay Winmark for advertising and miscellaneous assistance. Although Winmark's contract with franchisees does not allot them territories, the contract ensures that any rival outlets must be located five or more miles away.
Despite Winmark's specializing in used goods, its stores sell new goods too. Through its many stores, Winmark uses its buying power to negotiate competitive prices for the stores' new goods. It also teaches franchisees about the used goods industry and offers pricing software to establish standards for the used goods they purchase.
Between 2006 and 2010, Winmark started around 50 stores annually. In 2013, Winmark had over 1,000 franchised stores—none of which it owned—that in total had sales of over $900 million.
Current franchises
Music Go Round
Music Go Round purchases, sells, and exchanges used musical instruments and paraphernalia. Founded as Hi-Tech Consignments in Minneapolis by Bill Shell in 1986, Winmark purchased it in 1993 and renamed it to Music Go Round. In 2009, roughly 30% of Music Go Round's musical instruments purchased were new. In 2011, it had 35 locations in the United States. In 2010, the complete cost to start a Music Go Round was $300,000 and the average yearly sales were between $650,000 and $725,000.
Once Upon a Child
Once Upon a Child purchases and sells used children's attire and toys. The first Once Upon a Child store was opened in 1985 in Perrysburg, Ohio, by Dennis and Lynn Blum after they observed Goodwill Industries accepting and selling used baby attire. Prior to opening a store, Lynn Blum had been selling her three sons' and friends' and neighbors' clothing in a garage sale from her house every week. Her husband resigned from his employment in 1989 to work with Lynn. Winmark purchased the company in 1992. In 2009, roughly 10% of Once Upon a Child's children's attire and playthings purchased were new. In 2011, it had 240 locations in the United States and 24 in Canada. In 2010, the complete cost to start a Once Upon a Child was between $200,000 and $250,000 and the average yearly sales were between $650,000 and $725,000.
Plato's Closet
Plato's Closet purchases and sells used brand-name children's and teenagers' clothes, shoes, and paraphernalia. It focuses on clothes for people ages 12 to 24. Winmark purchased Plato's Closet from Dennis and Lynn Blum, the founders of Once Upon a Child, in 1998. The store's name was inspired by a Blum son's schoolwork about Plato, who had been an early advocate of recycling, which paralleled the aim to recycle used clothes. In 2011, there were over 280 franchisees in the United States and Canada. In 2010, the complete cost to start a Plato's Closet was between $200,000 and $250,000 and the average yearly sales were $825,000.
A 2001 article in the Star Tribune noted that Plato's Closet in 2001 stocked up on brand names such as Abercrombie & Fitch, Gap Inc., Silver Jeans Co., Sean John, Express, Inc., and Dr. Martens that they sold at a markdown of between 50% and 75%. Unlike consignment shops, Plato's Closet pays sellers on the spot. Used clothes are purchased at between 30% and 40% of what Plato's Closet intends to sell them at. In a 2009 interview with Star Tribune, CEO John Morgan said Plato's Closet did the best during the Great Recession among Winmark's franchises because people were more likely to sell used clothing to make money and to buy used clothing to save money.
Play It Again Sports
Play It Again Sports purchases and sells used sports goods and is Winmark's largest chain. Roughly 70% of Play It Again Sports' sports equipment is new. In 2011, it had over 330 locations in the United States and Canada. In 2010, the complete cost to start a Play It Again Sports was $300,000 and the average yearly sales were between $650,000 and $725,000.
Style Encore
Style Encore buys and sells used women's clothing. In January 2013, Winmark announced that it would start a new franchise, Style Encore, that would focus on used women's clothing. The first store opened in Texas in August 2013. Style Encore immediately pays cash to people looking to sell used women's attire, footwear, handbags, and jewelry.
Former franchises
Computer Renaissance
Computer Renaissance bought and sold new and used computer supplies such as computers, computer memory, computer monitors, and printers. The store also sold computer games and books and helped customers build custom computers. Computer Renaissance was started in 1993. On July 7, 2000, Winmark sold Computer Renaissance, which had 209 stores, to Jack Hollis' Hollis Technologies LLC in Lakeland, Florida, for $3 million. Hollis had been a Computer Renaissance franchisee. CEO John Morgan said that whereas Music Go Round's used musical equipment did not depreciate, Computer Renaissance's aging computers did.
Disc Go Round
Disc Go Round bought and sold new and used compact discs (CDs). In July 1994, Winmark spent $2.3 million to purchase CDX Audio, which used the name CD Exchange in Green Bay, Wisconsin. When Winmark purchased the company, it had 42 stores. Winmark renamed the used CD store to Disc Go Round because they could not nationally trademark the name "CD Exchange". The outlet had an electronic system that recorded all the CDs a store had so customers did not have to browse the shelves trying to determine whether a particular CD was present. Disc Go Round also had "listening stations" for customers to listen to CDs. Winmark sold Disc Go Round, which had increased to 137 stores, to CD Warehouse on June 26, 1998, for $7.4 million.
It's About Games
It's About Games bought and sold used PC games, video games, and board games. Winmark spent roughly $6.8 million to acquire Video Game Exchange Inc. in 1997 and renamed it to It's About Games. It's About Games locations were largely in Ohio, Pennsylvania, Kentucky, Georgia and Maryland. It made money for Winmark until 1998. During its last year under Winmark management, It's About Games lost $3.4 million because of excessive inventory, a buggy computer system, poorly chosen products, and poorly trained employees. Winmark owned 60 of the 64 It's About Games stores in the franchise and would have been profitable without It's About Games. Winmark sold or closed all its It's About Games stores in 1999 to reduce losses. Winmark also aimed to cut down on the number of stores it owned because it wanted to focus on franchisee-owned stores.
ReTool
ReTool bought and sold used tools. Winmark opened the first ReTool on November 10, 1999, in Chicago. Roughly 65% of ReTool's goods were purchased from people, while the remainder was purchased from factories that produced too many tools or from closing businesses. CEO John Morgan said in an October 2000 interview that many people did not sell tools they did not use so it was hard to accumulate used tools at ReTool. Winmark sold ReTool around 2002.
Subsidiaries
Wirth Business Credit
Wirth Business Credit is a small-business supplies leasing company owned by Winmark. In a 2008 interview with the Star Tribune, CEO John Morgan said Winmark plowed the $9 million to $10 million in profit from its franchisees into Wirth Business Credit because they believed leasing supplies would become a profitable business even though growth had so far been slow. In 2008, there were 27 Wirth Business Credit franchises.
In 2004, Winmark created Winmark Business Solutions, a website for small-business owners. Winmark Business Solutions was intended to help franchisees and clients of Wirth Business Credit. The website hosted 6,000 pages related to business such as how to found a company and how to sell a company. It had a forum for people to discuss small business issues and its articles also discussed finance, insurance, and technology topics with a focus on business.
References
External links
American companies established in 1988
Companies based in Minneapolis
Companies listed on the Nasdaq
Franchises
Retail companies established in 1988 |
25034980 | https://en.wikipedia.org/wiki/Apache%20OpenMeetings | Apache OpenMeetings | OpenMeetings is software used for presenting, online training, web conferencing, collaborative whiteboard drawing and document editing, and user desktop sharing. The product is based on Red5 media server, HTML5 and Flash which in turn are based on a number of open source components. Communication takes place in virtual "meeting rooms" which may be set to different communication, security and video quality modes. The recommended database engine for backend support is MySQL. The product can be set up as an installed server product, or used as a hosted service.
Work on OpenMeetings was started in 2007 by Sebastian Wagner. In 2009 the project became open, which helped to involve other developers from different countries. Starting in 2011, main project development and technical support moved to Russia. In the meantime, web conferencing services based on OpenMeetings are formally offered by about a dozen companies around the world. Since 2012, the project has been developed under the auspices of the open-source-devoted Apache Software Foundation (ASF) and is licensed under the Apache License, which allows it to be used in commercial projects. Since 2012 OpenMeetings progress has been presented regularly at ApacheCon.
Public facilities include the educational intranet "Koblenzer Schulnetz" in Koblenz, Germany and two public demo-servers.
Articles about OpenMeetings have been published on ZDNet Blogs, in LinuxMag France (pages 40-44), and in Ajax Magazine.
OpenMeetings is used for web conferencing in the FOSS e-learning solutions Moodle and ATutor. OpenMeetings is now integrated with several CMS, CRM and other systems. The project has been downloaded over 250,000 times. OpenMeetings is available in 31 languages.
Features
OpenMeetings implements the following features:
Audio communication
Video conferencing
Meeting recording
Screen sharing
Collaborative document editing
Chat and white boarding
User and room management
Mobile client for Android
No encryption protocol (end-to-end or otherwise)
See also
Comparison of web conferencing software
References
External links
Release archives on weblog.openlaszlo.org
Remote desktop
Web conferencing
Free software
Apache Software Foundation
Apache Software Foundation projects
Software using the Apache license |
22259076 | https://en.wikipedia.org/wiki/Koei%20Tecmo | Koei Tecmo | is a Japanese video game and anime holding company created in 2009 by the merger of Koei and Tecmo. Koei Tecmo Holdings owns several companies, the biggest one of those being its flagship game developer and publisher Koei Tecmo Games that was founded in 1978 as Koei.
Koei Europe was the first subsidiary to change its name to Tecmo Koei Europe, Ltd and to release video games under the new moniker. In January 2010, Tecmo, Inc. and Koei Corporation followed suit by merging to form Tecmo Koei America Corporation.
On April 1, 2010, Tecmo was declared disbanded in Japan. Its sister company Koei survived but was renamed Tecmo Koei Games (today Koei Tecmo Games) and is now the main publishing arm of the group in Japan. The former development divisions of Tecmo and Koei were briefly spun-off as separate companies in March 2010, but folded into Tecmo Koei Games in April 2011. In addition to its primary trademark, Koei Tecmo Games occasionally used until 2016 the "Tecmo" and "Koei" brand names on new video games for marketing purposes.
History
Independent era
Koei
Koei Co., Ltd. (株式会社コーエー Kabushiki gaisha Kōē, formerly 光栄 (Kōei)) was founded in July 1978 by husband-and-wife duo Yoichi and Keiko Erikawa. Yoichi was a student at Keio University, and when his family's rural dyestuffs business failed he decided to pursue his interest in programming. The company was (and, as Koei Tecmo, still is) located in the Hiyoshi area of Yokohama along with Yoichi's alma mater, and the company's name is simply a spoonerism of the school's.
Kō Shibusawa and Eiji Fukuzawa, whose names are supposed to have made up the name of the company, do not really exist and are names used by the company to avoid giving credit to individual contributors, effectively acting as pen names for the Erikawas.
The company initially focused on personal computer sales and made-to-order business software. In 1983 it released Nobunaga's Ambition (信長の野望 Nobunaga no Yabō), a historical strategy game set during the Sengoku period of Japanese history. The game went on to receive numerous awards, and Koei produced several more such games set against the backdrop of world history, including Romance of the Three Kingdoms, set during the Three Kingdoms period of Chinese history, and Uncharted Waters (大航海時代 Dai Kōkai Jidai; lit. Great Navigation Era), set in Portugal during the Age of Exploration.
In 1988, Koei established a North American subsidiary, Koei Corporation, in California. This subsidiary localized Koei games for export to all territories outside Japan, as well as producing original games and concepts under the leadership of designer Stieg Hedlund, such as Liberty or Death, Celtic Tales: Balor of the Evil Eye, and Gemfire. After Hedlund's departure, this subsidiary ceased game development in 1995, focusing instead on localization, sales and marketing.
A Canadian subsidiary, Koei Canada, Inc. was established in early 2001, and a European subsidiary, Koei Limited was established in early 2003 in Hertfordshire, United Kingdom. In 2004, a Lithuanian subsidiary was formed.
Tecmo
Tecmo, formerly known as Tehkan, was founded by Yoshihito Kakihara on July 31, 1967, as a supplier of cleaning equipment. Two years later, in 1969, it started to sell amusement equipment. Tecmo had its headquarters in Kudankita, Chiyoda, Tokyo. Tecmo's United States offices were located in Torrance, California.
In March 1981, a U.S. division was inaugurated as U.S. Tehkan, Inc. A month later, in April 1981, Tehkan released its first arcade video game in Japan, titled Pleiades (which was distributed in America by Centuri). When it was still called Tehkan, the company also released such classic games as Bomb Jack and Tehkan World Cup. On January 8, 1986, Tehkan officially changed its name to Tecmo. In 1989, Tecmo was named as a co-defendant in a lawsuit when Indianapolis Colts running back Eric Dickerson sued the NFLPA over use of his likeness in the game Tecmo Bowl.
In 2006, Founder, President and Chairman Yoshihito Kakihara died of interstitial pneumonia.
On June 3, 2008, Team Ninja head Tomonobu Itagaki resigned from the company and filed a 145 million yen ($1.3 million) lawsuit for "unpaid completion bonuses" and "emotional distress". This was followed by another lawsuit filed on 16 June by two plaintiffs on behalf of Tecmo's 300 employees for unpaid wages amounting to ¥8.3 million.
Merger and reorganization
On August 20, 2008, Tecmo announced the resignation of president Yoshimi Yasuda, to be replaced by current Chairman of the Board Yasuharu Kakihara as of September 1. On August 28, Square Enix announced plans for a friendly takeover of Tecmo by purchasing shares at a 30 percent premium with a total bid of ¥22.3 billion. They gave Tecmo until September 4 to either accept or reject the proposal. Upon hearing this news on August 31, Kenji Matsubara, President and COO of Koei, called a board meeting for the next day, September 1. The board discussed the possibility of a merger with Tecmo, and began discussions with Tecmo that same day. On September 4, 2008 Tecmo officially declined Square Enix's proposal, and later that same day announced plans to merge with Koei.
In November, the companies announced their specific plan of action, to complete the merger on April 1, 2009, forming Tecmo Koei Holdings. Koei stock was to be exchanged for Tecmo Koei stock at a rate of 1:1, and Tecmo stock exchanged at .9:1, giving Koei shareholders, in total, a three-quarter stake in the new company. Though the combined profits in 2007 were 8.5 billion yen, they anticipated that the merged company would net over 16 billion yen in the fiscal year ending March 2012. Effissimo Capital Management Pte, Tecmo's second-largest shareholder at 17.6%, openly opposed the merger. On January 26, 2009, the shareholders for both Koei and Tecmo reached separate agreements in favor of the merger. Effissimo raised some dissent during the meeting, and implied they may seek to sell their shares. Effissimo's director Takashi Kosaka stated “We have not had sufficient information from the company to make a judgment on the merger, such as the feasibility of their plan to raise shareholder value.” On February 12, Kenji Matsubara liquidated KOEI France SAS. On February 13, Tecmo announced it had received a repurchase claim (a request for the company to buy stock back) from a major shareholder, 15.64% of the stock (3,890,700 shares) from a shareholder that stood in opposition to the firm's upcoming merger with Koei. While the requesting shareholder was not mentioned, Reuters stated that it was likely Effissimo.
Despite these misgivings, the holding company formed on April 1, 2009 as planned. The development divisions of both companies were spun-out into separate subsidiaries, created specifically for the planning and development of software, operating directly under the holding company. Kenji Matsubara became CEO of the new company, and former Tecmo CEO Yasuharu Kakihara became board chairman. As of May 26, Tecmo Koei had still not reached an agreement with Effissimo, prompting the investment fund to seek mediation with the Tokyo District Court. While Tecmo Koei favored a stock value in the mid-600 yen range, Effissimo was expected to ask for at least 900, in part due to the rejected Square Enix offer of 920 per share.
On June 23, 2009, Tecmo Koei announced a planned restructure of its international subsidiaries. Koei Europe was renamed Tecmo Koei Europe in 2009 and became the first subsidiary to publish games under the new moniker, starting with Ninja Gaiden Sigma 2. In August 2009 Tecmo Koei announced that it was setting up a subsidiary in Hanoi, Vietnam. In January 2010, Tecmo's sole subsidiary, the American Tecmo Inc., and Koei's American branch, Koei Corporation, were moved under a newly formed Tecmo Koei America Corporation, itself a direct subsidiary to Tecmo Koei Holdings. Koei's Canadian, Korean, and Taiwanese subsidiaries were re-branded Tecmo Koei, and also moved to direct subsidiaries of the holding company. Later that month the Entertainment Software Association (ESA) announced that Tecmo Koei was now a member.
On April 1, 2010, Koei absorbed Tecmo in Japan to become Tecmo Koei Games, which renamed itself to Koei Tecmo Games in 2014. Koei Singapore was also re-branded as Tecmo Koei.
Post-merger
On February 8, 2011, Tecmo Koei Holdings announced that the new individual developers Tecmo and Koei that were formed in March 2010 would be merged into Tecmo Koei Games in April 2011, though the company would continue to develop under the Tecmo and Koei brands.
The continued operating loss prompted Kenji Matsubara, then president and CEO of both Tecmo Koei Holdings and the Tecmo Koei Games label, to tender his resignation in November 2010. Yoichi Erikawa, co-founder of Koei, took over the four positions vacated by Matsubara.
On July 1, 2014, the company and its related subsidiaries were renamed from Tecmo Koei to Koei Tecmo.
On February 18, 2016, Koei Tecmo announced a second reorganization of the company, to support the expansion of the company. Brand names Team Tachyon, Koei and Tecmo, amongst others, were dropped.
Current Subsidiaries/Divisions
Gust
Gust Co. Ltd. was founded in 1993 and is best known for its long-running Atelier series. Koei Tecmo bought Gust Co. Ltd. in 2011 and absorbed it in 2014.
Kou Shibusawa
On February 18, 2016, as part of the company's reorganization, Koei Tecmo announced the establishment of Kou Shibusawa, named after the pen name of Koei's founder. It has handled historically based titles such as the Nobunaga's Ambition, Romance of the Three Kingdoms, Uncharted Waters and Nioh series, as well as the horse racing simulation series Winning Post. The division also worked on Fire Emblem: Three Houses.
Koei Tecmo Singapore
A development support studio in Singapore.
Koei Tecmo Tianjin Software
A development support studio in Tianjin, China.
Koei Tecmo Beijing Software
A development support studio in Beijing, China.
Koei Tecmo Software Vietnam
A development support studio in Vietnam.
Midas
"midas" is a new division aiming to produce titles for smartphones and to create new IPs.
Omega Force
Omega Force (ω-Force) is a division of Koei. Omega Force is best known for its Dynasty Warriors series and spin-offs such as Samurai Warriors and Warriors Orochi, as well as for non-Warriors titles such as Dragon Quest Heroes, WinBack, Attack on Titan and Toukiden.
Ruby Party
Ruby Party specializes in games labeled as Neoromance: GxB visual novels and dating sims, usually with extra side-quests. Of the three Neoromance series, the best known is the Angelique series, which has been in production since 1994. The first Angelique game was the world's first otome game (a visual novel or dating sim aimed at women). Harukanaru Toki no Naka de is a newer Neoromance hit, with many sequels and an anime television series based on it. The newest game in the series, Kin'iro no Corda, gained popularity partially because the manga series it was based on was licensed by Viz for English-language publishing, and an anime television series based on it began airing in October 2006. A sequel was also released on the PlayStation 2 in March 2007.
Team Ninja
Team Ninja (stylised as Team NINJA) is a video game development studio of Tecmo founded in 1995. It was formerly led by Tomonobu Itagaki and is best known for the Dead or Alive and Ninja Gaiden series.
International offices
Koei Tecmo Europe, Ltd.
Koei Tecmo America Corporation
Koei Tecmo Taiwan Co., Ltd
Former Subsidiaries
Team Tachyon
Team Tachyon is a Japanese video game development department of Koei Tecmo founded in 2007. Similar to Team Ninja, the group was formed to develop high-profile games, some of which relate to Tecmo Koei's classic franchises. The company says that they chose the name, "Team Tachyon", because a tachyon is a particle that exceeds the speed of light. Key members include Tecmo producers Keisuke Kikuchi (Rygar, Fatal Frame) and Kohei Shibata.
Team Tachyon aided in the development of the 2008 Wii games Fatal Frame IV: Mask of the Lunar Eclipse and Rygar: The Battle of Argus, released Undead Knights for the PlayStation Portable, and released Quantum Theory (2010) for the PlayStation 3 and Xbox 360. Spirit Camera: The Cursed Memoir was developed for the Nintendo 3DS and released in 2012, Project Zero 2: Wii Edition was released in the same year for the Wii, and Fatal Frame: Maiden of Black Water was released in 2015 for the Wii U.
As of February 18, 2016, Team Tachyon was absorbed into Team Ninja, with some staff now moved to Gust.
Koei Tecmo Canada
Founded in 2001 as Koei Canada, Koei Tecmo Canada was the North American development arm of the company based in Toronto. It started out as a CG studio for Koei games but expanded into video game development in 2005, developing Fatal Inertia, Prey the Stars, and Warriors: Legends of Troy. The studio was closed at the end of March 2013.
Notable games published
Atelier series
Dead or Alive series
Dynasty Warriors series
Fatal Frame series
Monster Rancher series
Ninja Gaiden series
Nioh series
Nobunaga's Ambition series
Romance of the Three Kingdoms series
Samurai Warriors series
Tamashii no Mon Dante no Shinkyoku yori (魂の門 ダンテ「神曲」より, Literally: Gate of Souls ~ From Dante's Divine Comedy)
Uncharted Waters series
Winning Post series
IP collaboration:
Fire Emblem: Three Houses
Fire Emblem Warriors
Hyrule Warriors series
Marvel Ultimate Alliance 3: The Black Order
Metroid: Other M
One Piece: Pirate Warriors series
Persona 5 Strikers
Pokemon Conquest
Stranger of Paradise: Final Fantasy Origin
Notes
References
External links
Koei Tecmo America Corp.
Koei Tecmo Europe Ltd.
Koei Tecmo Singapore Pte. Ltd.
Koei Tecmo Holdings Co., Ltd.
Koei Tecmo Taiwan Co., Ltd
Companies based in Yokohama
Japanese companies established in 2009
Video game companies established in 2009
Anime companies
Holding companies established in 2009
Video game companies of Japan
Video game development companies
Video game publishers
Companies listed on the Tokyo Stock Exchange
Holding companies of Japan |
44382509 | https://en.wikipedia.org/wiki/Example-centric%20programming | Example-centric programming | Example-centric programming is an approach to software development that helps the user to create software by locating and modifying small examples into a larger whole. That approach can be helped by tools that allow an integrated development environment (IDE) to show code examples or API documentation related to coding behaviors occurring in the IDE. “Borrow” tactics are often employed from online sources, by programmers leaving the IDE to troubleshoot.
The purpose of example-centric programming is to reduce the time spent by developers searching online. Ideally, in example-centric programming, the user interface integrates with help module examples for assistance without programmers leaving the IDE. The idea for this type of “instant documentation” is to reduce programming interruptions. The usage of this feature is not limited to experts, as some novices reap the benefits of an integrated knowledge base, without resorting to frequent web searches or browsing.
Background
The growth of the web has fundamentally changed the way software is built. A vast increase in information resources and the democratization of access and distribution are the main factors in the development of example-centric programming for end-user development. Tutorials are available on the web within seconds, broadening the range of people who write software: designers, scientists, or hobbyists. By 2012, 13 million people programmed as part of their job, yet only three million of those were professional programmers.
The prevalence of online code repositories, documentation, blogs and forums enables programmers to build applications by iteratively searching for, modifying, and combining examples.
Using the web is integral to an opportunistic approach to programming that favors speed and ease of development over code robustness and maintainability. The web is widely used by programmers, novices and experts alike, to prototype, ideate, and discover.
To develop software quickly, programmers often mash up various existing systems. As part of this process, they must often search for suitable components and learn new skills, so they turned to the web for this purpose.
When developing software, programmers spend 19% of their programming time on the web. Individuals use the web to accomplish several different kinds of activities, and the intentions behind web use vary in form and time spent. Programmers spend the most time learning a new concept, the least time reminding themselves of details of a concept they already know, and in between they use the web to clarify their existing knowledge.
Example-centric programming tries to solve the issue of having to get out of the development environment to look for references and examples while programming. For instance, traditionally, to find API documentation and sample code, programmers will either visit the language reference website or go to search engines and make API specific queries. When trying to learn something new, programmers use web tutorials for just-in-time learning. Additionally, programmers deliberately choose not to remember complicated syntax and instead use the web as an external memory that can be accessed when needed.
Benefits
Some of the benefits of example-centric programming include:
Prevention of usage errors
Reduction of time searching for code examples
Reduction of time searching for API documentation
Clarification of existing knowledge and reminding of forgotten details
Emergent programming
Emergence can be defined as a process whereby larger entities, patterns, and regularities arise through interactions among smaller or simpler entities that themselves do not exhibit such properties. The extensive amount of code publicly available on the web can be used to find these kinds of patterns and regularities. By modeling how developers use programming languages in practice, algorithms for finding common idioms and detecting unlikely code can be created.
This process is limited to the amount of code that programmers are willing and able to share. Because people write more code than they share online there is a lot of duplicated effort. To fully use the power of the crowd, the effort required to publish code online should be reduced.
Examples
Blueprint
Blueprint is a plugin for Adobe Flash Builder that automatically augments queries with code context, presents a code-centric view of search results, embeds the search experience into the editor, and retains a link between copied code and its source. It is designed to help programmers with web searches and allow them to easily remember forgotten details and clarify existing knowledge.
It displays results from a varied set of web pages enabling users to browse and evaluate search results rapidly.
Blueprint is task-specific, meaning that it will specifically search for examples in the programming language.
Redprint
Redprint is a browser-based development environment for PHP that integrates API-specific "instant example" and "instant documentation" display interfaces. The prototype IDE was developed by Anant Bhardwaj, then at Stanford University, on the premise that task-specific example interfaces still leave programmers having to understand the example code that has been found; Redprint therefore also includes an API-specific search interface, which searches for relevant API-specific examples and documentation.
Codex
Codex is a knowledge base that records common practices for Ruby. It uses crowdsourced data from developers and searches all code looking for patterns; that way, if someone is coding in an unusual way, Codex lets them know that they may be doing something wrong.
Codex uses statistical linting to find poorly written code (code which is syntactically different from well-written code) and warn the user; pattern annotation to automatically discover common programming idioms and annotate them with metadata using crowdsourcing; and library generation to construct a utility package that encapsulates emergent software practice.
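Codex's actual models are not described here; the following toy sketch only illustrates the general statistical-linting idea of mining token patterns from a corpus and flagging patterns that are rare relative to it (the tokenizer, corpus and threshold are all invented for illustration):

from collections import Counter
from itertools import chain

def tokenize(line):
    # Extremely naive tokenizer, for illustration only.
    return line.replace("(", " ( ").replace(")", " ) ").split()

def bigrams(tokens):
    return list(zip(tokens, tokens[1:]))

# Toy "corpus" of idiomatic snippets gathered from many developers.
corpus = [
    "items.each do |item|",
    "items.map do |item|",
    "File.open(path) do |f|",
]
counts = Counter(chain.from_iterable(bigrams(tokenize(s)) for s in corpus))

def lint(snippet, threshold=1):
    # Warn about token pairs seen fewer than `threshold` times in the corpus.
    for bg in bigrams(tokenize(snippet)):
        if counts[bg] < threshold:
            print("unusual construct:", " ".join(bg))

lint("items.each { |item|")   # brace-block style absent from the toy corpus

A real system would of course learn from millions of snippets and use far richer syntactic features, but the principle of comparing new code against statistically common patterns is the same.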
Codelets
A codelet is a block of example code paired with an interactive helper widget that assists the user in understanding and integrating the example.
Bing Code Search
Bing Code Search is an extension to Microsoft Visual Studio developed by a team made of people from Visual Studio, Bing and Microsoft Research that allows developers to search code examples and documentation from Bing directly from IntelliSense.
Bing Code Search gathers its code samples from MSDN, StackOverflow, Dotnetperls and CSharp411.
Codota
Codota helps developers find typical Java code examples by analyzing millions of code snippets available on sites such as GitHub and StackOverflow. Codota ranks these examples by criteria such as commonality of the coding patterns, credibility of the origin and clarity of the code.
The Codota plugin for the IntelliJ IDEA and Android Studio IDEs allows developers to get code examples for using Java and android APIs without having to leave their editor.
UpCodeIn
UpCodeIn is a source code search engine that allows developers to find and reuse software components from the Internet. A unique feature of UpCodeIn compared to other source code search engines is its ability to find code by syntax element; for example, one can find methods with a specific parameter type, annotation, or variables.
UpCodeIn understands the syntax of many programming languages, such as Java, JavaScript, Python and C#.
See also
Emergence
List of human–computer interaction topics
User experience
User experience design
Web usability
Crowdsourcing
References
External links
Joel Brandt Talk
Human–computer interaction
Computer programming
Software features
Software design |
26123 | https://en.wikipedia.org/wiki/Real-time%20operating%20system | Real-time operating system | A real-time operating system (RTOS) is an operating system (OS) for real-time applications that processes data and events that have critically defined time constraints. An RTOS is distinct from a time-sharing operating system, such as Unix, which manages the sharing of system resources with a scheduler, data buffers, or fixed task prioritization in a multitasking or multiprogramming environment. Processing time requirements need to be fully understood and bounded rather than just kept as a minimum. All processing must occur within the defined constraints. Real-time operating systems are event-driven and preemptive, meaning the OS can monitor the relevant priority of competing tasks and change task priorities. Event-driven systems switch between tasks based on their priorities, while time-sharing systems switch tasks based on clock interrupts.
Characteristics
A key characteristic of an RTOS is the level of its consistency concerning the amount of time it takes to accept and complete an application's task; the variability is 'jitter'. A 'hard' real-time operating system (hard RTOS) has less jitter than a 'soft' real-time operating system (soft RTOS). A late answer is a wrong answer in a hard RTOS while a late answer is acceptable in a soft RTOS. The chief design goal is not high throughput, but rather a guarantee of a soft or hard performance category. An RTOS that can usually or generally meet a deadline is a soft real-time OS, but if it can meet a deadline deterministically it is a hard real-time OS.
An RTOS has an advanced algorithm for scheduling. Scheduler flexibility enables a wider, computer-system orchestration of process priorities, but a real-time OS is more frequently dedicated to a narrow set of applications. Key factors in a real-time OS are minimal interrupt latency and minimal thread switching latency; a real-time OS is valued more for how quickly or how predictably it can respond than for the amount of work it can perform in a given period of time.
See the comparison of real-time operating systems for a comprehensive list. Also, see the list of operating systems for all types of operating systems.
Design philosophies
An RTOS is an operating system in which the time taken to process an input stimulus is less than the time lapsed until the next input stimulus of the same type.
The most common designs are:
Event-driven – switches tasks only when an event of higher priority needs servicing; called preemptive priority, or priority scheduling.
Time-sharing – switches tasks on a regular clocked interrupt, and on events; called round robin.
Time sharing designs switch tasks more often than strictly needed, but give smoother multitasking, giving the illusion that a process or user has sole use of a machine.
Early CPU designs needed many cycles to switch tasks during which the CPU could do nothing else useful. Because switching took so long, early OSes tried to minimize wasting CPU time by avoiding unnecessary task switching.
Scheduling
In typical designs, a task has three states:
Running (executing on the CPU);
Ready (ready to be executed);
Blocked (waiting for an event, I/O for example).
Most tasks are blocked or ready most of the time because generally only one task can run at a time per CPU. The number of items in the ready queue can vary greatly, depending on the number of tasks the system needs to perform and the type of scheduler that the system uses. On simpler non-preemptive but still multitasking systems, a task has to give up its time on the CPU to other tasks, which can cause the ready queue to have a greater number of overall tasks in the ready to be executed state (resource starvation).
Usually, the data structure of the ready list in the scheduler is designed to minimize the worst-case length of time spent in the scheduler's critical section, during which preemption is inhibited, and, in some cases, all interrupts are disabled, but the choice of data structure depends also on the maximum number of tasks that can be on the ready list.
If there are never more than a few tasks on the ready list, then a doubly linked list of ready tasks is likely optimal. If the ready list usually contains only a few tasks but occasionally contains more, then the list should be sorted by priority. That way, finding the highest priority task to run does not require iterating through the entire list. Inserting a task then requires walking the ready list until reaching either the end of the list, or a task of lower priority than that of the task being inserted.
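As an illustration of the insertion strategy just described (a host-side sketch with invented task names, not code from any particular RTOS), the ready list below is kept ordered by priority and a new task is inserted by walking the list until a lower-priority entry or the end of the list is reached:

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    priority: int          # higher number = higher priority in this sketch

def insert_ready(ready_list, task):
    # Walk the list until the end, or until a lower-priority task is reached.
    for i, queued in enumerate(ready_list):
        if queued.priority < task.priority:
            ready_list.insert(i, task)
            return
    ready_list.append(task)

def pick_next(ready_list):
    # The highest-priority ready task is always at the head; no full scan needed.
    return ready_list[0] if ready_list else None

ready = []
for t in (Task("logger", 1), Task("motor_ctrl", 5), Task("ui", 2)):
    insert_ready(ready, t)
print([t.name for t in ready])   # ['motor_ctrl', 'ui', 'logger']
print(pick_next(ready).name)     # motor_ctrl

The cost of insertion grows with the list length, which is exactly why the text above notes that a plain sorted list is only appropriate when the ready list stays short.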
Care must be taken not to inhibit preemption during this search. Longer critical sections should be divided into small pieces. If an interrupt occurs that makes a high priority task ready during the insertion of a low priority task, that high priority task can be inserted and run immediately before the low priority task is inserted.
The critical response time, sometimes called the flyback time, is the time it takes to queue a new ready task and restore the state of the highest priority task to running. In a well-designed RTOS, readying a new task will take 3 to 20 instructions per ready-queue entry, and restoration of the highest-priority ready task will take 5 to 30 instructions.
In more advanced systems, real-time tasks share computing resources with many non-real-time tasks, and the ready list can be arbitrarily long. In such systems, a scheduler ready list implemented as a linked list would be inadequate.
Algorithms
Some commonly used RTOS scheduling algorithms are:
Cooperative scheduling
Preemptive scheduling
Rate-monotonic scheduling
Round-robin scheduling
Fixed priority pre-emptive scheduling, an implementation of preemptive time slicing
Fixed-Priority Scheduling with Deferred Preemption
Fixed-Priority Non-preemptive Scheduling
Critical section preemptive scheduling
Static time scheduling
Earliest Deadline First approach
Stochastic digraphs with multi-threaded graph traversal
Intertask communication and resource sharing
A multitasking operating system like Unix is poor at real-time tasks. The scheduler gives the highest priority to jobs with the lowest demand on the computer, so there is no way to ensure that a time-critical job will have access to enough resources. Multitasking systems must manage sharing data and hardware resources among multiple tasks. It is usually unsafe for two tasks to access the same specific data or hardware resource simultaneously. There are three common approaches to resolve this problem:
Temporarily masking/disabling interrupts
General-purpose operating systems usually do not allow user programs to mask (disable) interrupts, because the user program could control the CPU for as long as it wishes. Some modern CPUs do not allow user mode code to disable interrupts as such control is considered a key operating system resource. Many embedded systems and RTOSs, however, allow the application itself to run in kernel mode for greater system call efficiency and also to permit the application to have greater control of the operating environment without requiring OS intervention.
On single-processor systems, an application running in kernel mode and masking interrupts is the lowest overhead method to prevent simultaneous access to a shared resource. While interrupts are masked and the current task does not make a blocking OS call, the current task has exclusive use of the CPU since no other task or interrupt can take control, so the critical section is protected. When the task exits its critical section, it must unmask interrupts; pending interrupts, if any, will then execute. Temporarily masking interrupts should only be done when the longest path through the critical section is shorter than the desired maximum interrupt latency. Typically this method of protection is used only when the critical section is just a few instructions and contains no loops. This method is ideal for protecting hardware bit-mapped registers when the bits are controlled by different tasks.
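Interrupt masking is a hardware mechanism and cannot be shown in portable code; the toy host-side model below, with invented function names, only mimics the behaviour described above: while "interrupts" are masked, incoming events are held pending and run as soon as the mask is lifted.

pending = []      # "interrupts" that arrived while masked
masked = False
counter = 0       # shared data protected by the critical section

def raise_interrupt(handler):
    if masked:
        pending.append(handler)    # held until interrupts are unmasked
    else:
        handler()

def mask_interrupts():
    global masked
    masked = True

def unmask_interrupts():
    global masked
    masked = False
    while pending:                 # pending interrupts now execute
        pending.pop(0)()

def tick():                        # an interrupt handler
    global counter
    counter += 1

mask_interrupts()                  # enter the critical section
raise_interrupt(tick)              # an interrupt arrives; it is held pending
snapshot = counter                 # shared data is read consistently
unmask_interrupts()                # leave the critical section; tick() runs now
print(snapshot, counter)           # 0 1

On real hardware the mask and unmask steps would each be a single CPU instruction, which is why this method has such low overhead.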
Mutexes
When the shared resource must be reserved without blocking all other tasks (such as waiting for Flash memory to be written), it is better to use mechanisms also available on general-purpose operating systems, such as a mutex and OS-supervised interprocess messaging. Such mechanisms involve system calls, and usually invoke the OS's dispatcher code on exit, so they typically take hundreds of CPU instructions to execute, while masking interrupts may take as few as one instruction on some processors.
A (non-recursive) mutex is either locked or unlocked. When a task has locked the mutex, all other tasks must wait for the mutex to be unlocked by its owner - the original thread. A task may set a timeout on its wait for a mutex. There are several well-known problems with mutex based designs such as priority inversion and deadlocks.
In priority inversion a high priority task waits because a low priority task has a mutex, but the lower priority task is not given CPU time to finish its work. A typical solution is to have the task that owns a mutex 'inherit' the priority of the highest waiting task. But this simple approach gets more complex when there are multiple levels of waiting: task A waits for a mutex locked by task B, which waits for a mutex locked by task C. Handling multiple levels of inheritance causes other code to run in high priority context and thus can cause starvation of medium-priority threads.
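A minimal sketch of the one-level priority-inheritance rule described above, with invented class and task names; real kernels must also handle the multi-level chains mentioned here:

class Task:
    def __init__(self, name, priority):
        self.name = name
        self.base_priority = priority
        self.priority = priority          # effective (possibly inherited) priority

class InheritingMutex:
    def __init__(self):
        self.owner = None

    def lock(self, task):
        if self.owner is None:
            self.owner = task
            return True
        # Contended: boost the owner to the waiter's priority if it is higher.
        if task.priority > self.owner.priority:
            self.owner.priority = task.priority
        return False                      # the caller would block here in a real RTOS

    def unlock(self):
        self.owner.priority = self.owner.base_priority   # drop any inherited priority
        self.owner = None

low = Task("low", 1)
high = Task("high", 9)
m = InheritingMutex()
m.lock(low)              # the low-priority task takes the mutex first
m.lock(high)             # the high-priority task blocks; the owner is boosted
print(low.priority)      # 9 - "low" can now finish its critical section quickly
m.unlock()
print(low.priority)      # 1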
In a deadlock, two or more tasks lock mutexes without timeouts and then wait forever for the other task's mutex, creating a cyclic dependency. The simplest deadlock scenario occurs when two tasks each lock two mutexes, but in the opposite order. Deadlock is prevented by careful design.
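The classic two-mutex deadlock and its usual design-time remedy, a single global lock ordering, can be sketched as follows (illustrative only; the helper function and names are not from any real RTOS API):

import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

def in_global_order(*locks):
    # Always acquire locks in one fixed (here: id-based) order.
    return sorted(locks, key=id)

def worker(name, first, second):
    ordered = in_global_order(first, second)
    for lock in ordered:
        lock.acquire()
    try:
        print(name, "holds both locks")
    finally:
        for lock in reversed(ordered):
            lock.release()

# Each thread names the locks in the opposite textual order, but the helper
# imposes a single global ordering, so the cyclic wait can never form.
t1 = threading.Thread(target=worker, args=("task1", lock_a, lock_b))
t2 = threading.Thread(target=worker, args=("task2", lock_b, lock_a))
t1.start(); t2.start(); t1.join(); t2.join()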
Message passing
The other approach to resource sharing is for tasks to send messages in an organized message passing scheme. In this paradigm, the resource is managed directly by only one task. When another task wants to interrogate or manipulate the resource, it sends a message to the managing task. Although their real-time behavior is less crisp than semaphore systems, simple message-based systems avoid most protocol deadlock hazards, and are generally better-behaved than semaphore systems. However, problems like those of semaphores are possible. Priority inversion can occur when a task is working on a low-priority message and ignores a higher-priority message (or a message originating indirectly from a high priority task) in its incoming message queue. Protocol deadlocks can occur when two or more tasks wait for each other to send response messages.
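A sketch of this pattern, assuming a hypothetical manager task that owns a single integer resource and serves "add" and "read" requests arriving as messages; real RTOS message queues differ in detail:

import queue, threading

requests = queue.Queue()

def resource_manager():
    value = 0                          # the resource, owned by this task alone
    while True:
        cmd, arg, reply = requests.get()
        if cmd == "stop":
            break
        if cmd == "add":
            value += arg
        if cmd == "read":
            reply.put(value)

mgr = threading.Thread(target=resource_manager)
mgr.start()

requests.put(("add", 5, None))         # other tasks never touch `value` directly
reply_box = queue.Queue()
requests.put(("read", None, reply_box))
print(reply_box.get())                 # 5
requests.put(("stop", None, None))
mgr.join()

Because only the manager task ever touches the resource, no lock is needed around it; the serialization happens at the message queue instead.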
Interrupt handlers and the scheduler
Since an interrupt handler blocks the highest priority task from running, and since real-time operating systems are designed to keep thread latency to a minimum, interrupt handlers are typically kept as short as possible. The interrupt handler defers all interaction with the hardware if possible; typically all that is necessary is to acknowledge or disable the interrupt (so that it won't occur again when the interrupt handler returns) and notify a task that work needs to be done. This can be done by unblocking a driver task through releasing a semaphore, setting a flag or sending a message. A scheduler often provides the ability to unblock a task from interrupt handler context.
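The split between a short interrupt handler and a driver task can be modelled on a host system as follows; the semaphore stands in for whatever unblocking primitive the RTOS provides, and all names are invented for illustration:

import threading, time

data_ready = threading.Semaphore(0)
rx_buffer = []

def isr(byte):
    rx_buffer.append(byte)       # minimal work inside the "interrupt handler"
    data_ready.release()         # unblock the driver task and return quickly

def driver_task():
    for _ in range(3):
        data_ready.acquire()     # sleeps until the ISR signals
        byte = rx_buffer.pop(0)
        print("processing", hex(byte))   # the lengthy work happens at task level

t = threading.Thread(target=driver_task)
t.start()
for b in (0x10, 0x20, 0x30):     # pretend three interrupts arrive
    isr(b)
    time.sleep(0.01)
t.join()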
An OS maintains catalogues of objects it manages such as threads, mutexes, memory, and so on. Updates to this catalogue must be strictly controlled. For this reason, it can be problematic when an interrupt handler calls an OS function while the application is in the act of also doing so. The OS function called from an interrupt handler could find the object database to be in an inconsistent state because of the application's update. There are two major approaches to deal with this problem: the unified architecture and the segmented architecture. RTOSs implementing the unified architecture solve the problem by simply disabling interrupts while the internal catalogue is updated. The downside of this is that interrupt latency increases, potentially losing interrupts. The segmented architecture does not make direct OS calls but delegates the OS related work to a separate handler. This handler runs at a higher priority than any thread but lower than the interrupt handlers. The advantage of this architecture is that it adds very few cycles to interrupt latency. As a result, OSes which implement the segmented architecture are more predictable and can deal with higher interrupt rates compared to the unified architecture.
Similarly, the System Management Mode on x86 compatible Hardware can take a lot of time before it returns control to the operating system.
Memory allocation
Memory allocation is more critical in a real-time operating system than in other operating systems.
First, for stability there cannot be memory leaks (memory that is allocated but not freed after use). The device should work indefinitely, without ever needing a reboot. For this reason, dynamic memory allocation is frowned upon. Whenever possible, all required memory allocation is specified statically at compile time.
Another reason to avoid dynamic memory allocation is memory fragmentation. With frequent allocation and releasing of small chunks of memory, a situation may occur where available memory is divided into several sections and the RTOS is incapable of allocating a large enough continuous block of memory, although there is enough free memory. Secondly, speed of allocation is important. A standard memory allocation scheme scans a linked list of indeterminate length to find a suitable free memory block, which is unacceptable in an RTOS since memory allocation has to occur within a certain amount of time.
Because mechanical disks have much longer and more unpredictable response times, swapping to disk files is not used for the same reasons as RAM allocation discussed above.
The simple fixed-size-blocks algorithm works quite well for simple embedded systems because of its low overhead.
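A sketch of the fixed-size-blocks idea, with arbitrary block size and count: because every block is the same size, allocation and release are constant-time operations on a free list, and no fragmentation can occur between blocks.

class BlockPool:
    def __init__(self, block_size, block_count):
        self.block_size = block_size
        self.memory = bytearray(block_size * block_count)
        self.free_list = list(range(block_count))   # indices of free blocks

    def alloc(self):
        if not self.free_list:
            return None              # pool exhausted; the caller must handle this
        return self.free_list.pop()  # O(1), no search through variable-size holes

    def free(self, index):
        self.free_list.append(index) # O(1)

    def view(self, index):
        start = index * self.block_size
        return memoryview(self.memory)[start:start + self.block_size]

pool = BlockPool(block_size=32, block_count=4)
blk = pool.alloc()
pool.view(blk)[:5] = b"hello"
print(bytes(pool.view(blk)[:5]))     # b'hello'
pool.free(blk)

The trade-off is internal waste: a request smaller than the block size still consumes a whole block, which is why pool sizes are chosen statically to match the application's needs.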
See also
Adaptive Partition Scheduler
Comparison of real-time operating systems
Data General RDOS
DO-178B
Earliest deadline first scheduling
Firmware
FreeRTOS
Interruptible operating system
Least slack time scheduling
OSEK
POSIX
Rate-monotonic scheduling
Robot Operating System
SCADA
Synchronous programming language
Time-triggered system
Time-utility function
References
Operating systems
Real-time computing |
201460 | https://en.wikipedia.org/wiki/Stereoscopy | Stereoscopy | Stereoscopy (also called stereoscopics, or stereo imaging) is a technique for creating or enhancing the illusion of depth in an image by means of stereopsis for binocular vision. The word stereoscopy derives from Greek roots meaning 'solid' and 'to look at'. Any stereoscopic image is called a stereogram. Originally, stereogram referred to a pair of stereo images which could be viewed using a stereoscope.
Most stereoscopic methods present two offset images separately to the left and right eye of the viewer. These two-dimensional images are then combined in the brain to give the perception of 3D depth. This technique is distinguished from 3D displays that display an image in three full dimensions, allowing the observer to increase information about the 3-dimensional objects being displayed by head and eye movements.
Background
Stereoscopy creates the illusion of three-dimensional depth from given two-dimensional images. Human vision, including the perception of depth, is a complex process, which only begins with the acquisition of visual information taken in through the eyes; much processing ensues within the brain, as it strives to make sense of the raw information. One of the functions that occur within the brain as it interprets what the eyes see is assessing the relative distances of objects from the viewer, and the depth dimension of those objects. The cues that the brain uses to gauge relative distances and depth in a perceived scene include
Stereopsis
Accommodation of the eye
Overlapping of one object by another
Subtended visual angle of an object of known size
Linear perspective (convergence of parallel edges)
Vertical position (objects closer to the horizon in the scene tend to be perceived as farther away)
Haze or contrast, saturation, and color, greater distance generally being associated with greater haze, desaturation, and a shift toward blue
Change in size of textured pattern detail
(All but the first two of the above cues exist in traditional two-dimensional images, such as paintings, photographs, and television.)
Stereoscopy is the production of the illusion of depth in a photograph, movie, or other two-dimensional image by the presentation of a slightly different image to each eye, which adds the first of these cues (stereopsis). The two images are then combined in the brain to give the perception of depth. Because all points in the image produced by stereoscopy focus at the same plane regardless of their depth in the original scene, the second cue, focus, is not duplicated and therefore the illusion of depth is incomplete. There are also mainly two effects of stereoscopy that are unnatural for human vision: (1) the mismatch between convergence and accommodation, caused by the difference between an object's perceived position in front of or behind the display or screen and the real origin of that light; and (2) possible crosstalk between the eyes, caused by imperfect image separation in some methods of stereoscopy.
Although the term "3D" is ubiquitously used, the presentation of dual 2D images is distinctly different from displaying an image in three full dimensions. The most notable difference is that, in the case of "3D" displays, the observer's head and eye movement do not change the information received about the 3-dimensional objects being viewed. Holographic displays and volumetric display do not have this limitation. Just as it is not possible to recreate a full 3-dimensional sound field with just two stereophonic speakers, it is an overstatement to call dual 2D images "3D". The accurate term "stereoscopic" is more cumbersome than the common misnomer "3D", which has been entrenched by many decades of unquestioned misuse. Although most stereoscopic displays do not qualify as real 3D display, all real 3D displays are also stereoscopic displays because they meet the lower criteria also.
Most 3D displays use this stereoscopic method to convey images. It was first invented by Sir Charles Wheatstone in 1838,
and improved by Sir David Brewster who made the first portable 3D viewing device.
Wheatstone originally used his stereoscope (a rather bulky device) with drawings because photography was not yet available, yet his original paper seems to foresee the development of a realistic imaging method:
For the purposes of illustration I have employed only outline figures, for had either shading or colouring been introduced it might be supposed that the effect was wholly or in part due to these circumstances, whereas by leaving them out of consideration no room is left to doubt that the entire effect of relief is owing to the simultaneous perception of the two monocular projections, one on each retina. But if it be required to obtain the most faithful resemblances of real objects, shadowing and colouring may properly be employed to heighten the effects. Careful attention would enable an artist to draw and paint the two component pictures, so as to present to the mind of the observer, in the resultant perception, perfect identity with the object represented. Flowers, crystals, busts, vases, instruments of various kinds, &c., might thus be represented so as not to be distinguished by sight from the real objects themselves.
Stereoscopy is used in photogrammetry and also for entertainment through the production of stereograms. Stereoscopy is useful in viewing images rendered from large multi-dimensional data sets such as are produced by experimental data. Modern industrial three-dimensional photography may use 3D scanners to detect and record three-dimensional information. The three-dimensional depth information can be reconstructed from two images using a computer by correlating the pixels in the left and right images. Solving the Correspondence problem in the field of Computer Vision aims to create meaningful depth information from two images.
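As a toy illustration of correlating pixels between the two views (far simpler than real stereo-matching pipelines), the sketch below searches, for each small window of one image row, for the best-matching window in the other row; the horizontal offset found, the disparity, is larger for nearer objects:

def row_disparity(left_row, right_row, win=3, max_disp=4):
    disparities = []
    for x in range(len(left_row) - win):
        patch = left_row[x:x + win]
        best, best_cost = 0, float("inf")
        for d in range(min(max_disp, x) + 1):           # candidate shifts
            cand = right_row[x - d:x - d + win]
            cost = sum(abs(a - b) for a, b in zip(patch, cand))   # sum of absolute differences
            if cost < best_cost:
                best, best_cost = d, cost
        disparities.append(best)
    return disparities

# A bright "object" occupying columns 6-8 of the left row appears at columns 4-6
# of the right row, i.e. shifted by 2 pixels: it lies nearer than the background.
left  = [0, 0, 0, 0, 0, 0, 9, 9, 9, 0, 0, 0]
right = [0, 0, 0, 0, 9, 9, 9, 0, 0, 0, 0, 0]
print(row_disparity(left, right))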
Visual requirements
Anatomically, there are 3 levels of binocular vision required to view stereo images:
Simultaneous perception
Fusion (binocular 'single' vision)
Stereopsis
These functions develop in early childhood. In some people strabismus disrupts the development of stereopsis; however, orthoptic treatment can be used to improve binocular vision. A person's stereoacuity determines the minimum image disparity they can perceive as depth. It is believed that approximately 12% of people are unable to properly see 3D images, due to a variety of medical conditions. According to another experiment, up to 30% of people have very weak stereoscopic vision, preventing them from perceiving depth based on stereo disparity. This nullifies or greatly decreases the immersion effects of stereo for them.
Stereoscopic viewing may be artificially created by the viewer's brain, as demonstrated with the Van Hare Effect, where the brain perceives stereo images even when the paired photographs are identical. This "false dimensionality" results from the developed stereoacuity in the brain, allowing the viewer to fill in depth information even when few if any 3D cues are actually available in the paired images.
Side-by-side
Traditional stereoscopic photography consists of creating a 3D illusion starting from a pair of 2D images, a stereogram. The easiest way to enhance depth perception in the brain is to provide the eyes of the viewer with two different images, representing two perspectives of the same object, with a minor deviation equal or nearly equal to the perspectives that both eyes naturally receive in binocular vision.
To avoid eyestrain and distortion, each of the two 2D images should be presented to the viewer so that any object at infinite distance is perceived by the eye as being straight ahead, the viewer's eyes being neither crossed nor diverging. When the picture contains no object at infinite distance, such as a horizon or a cloud, the pictures should be spaced correspondingly closer together.
The advantages of side-by-side viewers are the lack of diminution of brightness, allowing the presentation of images at very high resolution and in full-spectrum color, simplicity of creation, and the fact that little or no additional image processing is required. Under some circumstances, such as when a pair of images is presented for freeviewing, no device or additional optical equipment is needed.
The principal disadvantage of side-by-side viewers is that large image displays are not practical and resolution is limited by the lesser of the display medium or human eye. This is because as the dimensions of an image are increased, either the viewing apparatus or viewer themselves must move proportionately further away from it in order to view it comfortably. Moving closer to an image in order to see more detail would only be possible with viewing equipment that adjusted to the difference.
Freeviewing
Freeviewing is viewing a side-by-side image pair without using a viewing device.
Two methods are available to freeview:
The parallel viewing method uses an image pair with the left-eye image on the left and the right-eye image on the right. The fused three-dimensional image appears larger and more distant than the two actual images, making it possible to convincingly simulate a life-size scene. The viewer attempts to look through the images with the eyes substantially parallel, as if looking at the actual scene. This can be difficult with normal vision because eye focus and binocular convergence are habitually coordinated. One approach to decoupling the two functions is to view the image pair extremely close up with completely relaxed eyes, making no attempt to focus clearly but simply achieving comfortable stereoscopic fusion of the two blurry images by the "look-through" approach, and only then exerting the effort to focus them more clearly, increasing the viewing distance as necessary. Regardless of the approach used or the image medium, for comfortable viewing and stereoscopic accuracy the size and spacing of the images should be such that the corresponding points of very distant objects in the scene are separated by the same distance as the viewer's eyes, but not more; the average interocular distance is about 63 mm. Viewing much more widely separated images is possible, but because the eyes never diverge in normal use it usually requires some previous training and tends to cause eye strain.
The cross-eyed viewing method swaps the left and right eye images so that they will be correctly seen cross-eyed, the left eye viewing the image on the right and vice versa. The fused three-dimensional image appears to be smaller and closer than the actual images, so that large objects and scenes appear miniaturized. This method is usually easier for freeviewing novices. As an aid to fusion, a fingertip can be placed just below the division between the two images, then slowly brought straight toward the viewer's eyes, keeping the eyes directed at the fingertip; at a certain distance, a fused three-dimensional image should seem to be hovering just above the finger. Alternatively, a piece of paper with a small opening cut into it can be used in a similar manner; when correctly positioned between the image pair and the viewer's eyes, it will seem to frame a small three-dimensional image.
Prismatic, self-masking glasses are now being used by some cross-eyed-view advocates. These reduce the degree of convergence required and allow large images to be displayed. However, any viewing aid that uses prisms, mirrors or lenses to assist fusion or focus is simply a type of stereoscope, excluded by the customary definition of freeviewing.
Stereoscopically fusing two separate images without the aid of mirrors or prisms while simultaneously keeping them in sharp focus without the aid of suitable viewing lenses inevitably requires an unnatural combination of eye vergence and accommodation. Simple freeviewing therefore cannot accurately reproduce the physiological depth cues of the real-world viewing experience. Different individuals may experience differing degrees of ease and comfort in achieving fusion and good focus, as well as differing tendencies to eye fatigue or strain.
Autostereogram
An autostereogram is a single-image stereogram (SIS), designed to create the visual illusion of a three-dimensional (3D) scene within the human brain from an external two-dimensional image. In order to perceive 3D shapes in these autostereograms, one must overcome the normally automatic coordination between focusing and vergence.
Stereoscope and stereographic cards
The stereoscope is essentially an instrument in which two photographs of the same object, taken from slightly different angles, are simultaneously presented, one to each eye. A simple stereoscope is limited in the size of the image that may be used. A more complex stereoscope uses a pair of horizontal periscope-like devices, allowing the use of larger images that can present more detailed information in a wider field of view. One can buy historical stereoscopes such as Holmes stereoscopes as antiques. Many stereo photography artists like Jim Naughten and Rebecca Hackemann also make their own stereoscopes.
Transparency viewers
Some stereoscopes are designed for viewing transparent photographs on film or glass, known as transparencies or diapositives and commonly called slides. Some of the earliest stereoscope views, issued in the 1850s, were on glass. In the early 20th century, 45x107 mm and 6x13 cm glass slides were common formats for amateur stereo photography, especially in Europe. In later years, several film-based formats were in use. The best-known formats for commercially issued stereo views on film are Tru-Vue, introduced in 1931, and View-Master, introduced in 1939 and still in production. For amateur stereo slides, the Stereo Realist format, introduced in 1947, is by far the most common.
Head-mounted displays
The user typically wears a helmet or glasses with two small LCD or OLED displays with magnifying lenses, one for each eye. The technology can be used to show stereo films, images or games, but it can also be used to create a virtual display. Head-mounted displays may also be coupled with head-tracking devices, allowing the user to "look around" the virtual world by moving their head, eliminating the need for a separate controller. Performing this update quickly enough to avoid inducing nausea in the user requires a great amount of computer image processing. If six axis position sensing (direction and position) is used then wearer may move about within the limitations of the equipment used. Owing to rapid advancements in computer graphics and the continuing miniaturization of video and other equipment these devices are beginning to become available at more reasonable cost.
Head-mounted or wearable glasses may be used to view a see-through image imposed upon the real world view, creating what is called augmented reality. This is done by reflecting the video images through partially reflective mirrors. The real world view is seen through the mirrors' reflective surface. Experimental systems have been used for gaming, where virtual opponents may peek from real windows as a player moves about. This type of system is expected to have wide application in the maintenance of complex systems, as it can give a technician what is effectively "x-ray vision" by combining computer graphics rendering of hidden elements with the technician's natural vision. Additionally, technical data and schematic diagrams may be delivered to this same equipment, eliminating the need to obtain and carry bulky paper documents.
Augmented stereoscopic vision is also expected to have applications in surgery, as it allows the combination of radiographic data (CAT scans and MRI imaging) with the surgeon's vision.
Virtual retinal displays
A virtual retinal display (VRD), also known as a retinal scan display (RSD) or retinal projector (RP), not to be confused with a "Retina Display", is a display technology that draws a raster image (like a television picture) directly onto the retina of the eye. The user sees what appears to be a conventional display floating in space in front of them. For true stereoscopy, each eye must be provided with its own discrete display. To produce a virtual display that occupies a usefully large visual angle but does not involve the use of relatively large lenses or mirrors, the light source must be very close to the eye. A contact lens incorporating one or more semiconductor light sources is the form most commonly proposed. As of 2013, the inclusion of suitable light-beam-scanning means in a contact lens is still very problematic, as is the alternative of embedding a reasonably transparent array of hundreds of thousands (or millions, for HD resolution) of accurately aligned sources of collimated light.
3D viewers
There are two categories of 3D viewer technology, active and passive. Active viewers have electronics which interact with a display. Passive viewers filter constant streams of binocular input to the appropriate eye.
Active
Shutter systems
A shutter system works by openly presenting the image intended for the left eye while blocking the right eye's view, then presenting the right-eye image while blocking the left eye, and repeating this so rapidly that the interruptions do not interfere with the perceived fusion of the two images into a single 3D image. It generally uses liquid crystal shutter glasses. Each eye's glass contains a liquid crystal layer which has the property of becoming dark when voltage is applied, being otherwise transparent. The glasses are controlled by a timing signal that allows the glasses to alternately darken over one eye, and then the other, in synchronization with the refresh rate of the screen. The main drawback of active shutters is that most 3D videos and movies were shot with simultaneous left and right views, so that it introduces a "time parallax" for anything side-moving: for instance, someone walking at 3.4 mph will be seen 20% too close or 25% too remote in the most current case of a 2x60 Hz projection.
Passive
Polarization systems
To present stereoscopic pictures, two images are projected superimposed onto the same screen through polarizing filters or presented on a display with polarized filters. For projection, a silver screen is used so that polarization is preserved. On most passive displays every other row of pixels is polarized for one eye or the other. This method is also known as being interlaced. The viewer wears low-cost eyeglasses which also contain a pair of opposite polarizing filters. As each filter only passes light which is similarly polarized and blocks the opposite polarized light, each eye only sees one of the images, and the effect is achieved.
Interference filter systems
This technique uses specific wavelengths of red, green, and blue for the right eye, and different wavelengths of red, green, and blue for the left eye. Eyeglasses which filter out the very specific wavelengths allow the wearer to see a full-color 3D image. It is also known as spectral comb filtering, wavelength multiplex visualization, or super-anaglyph. Dolby 3D uses this principle. The Omega 3D/Panavision 3D system also used an improved version of this technology.
In June 2012 the Omega 3D/Panavision 3D system was discontinued by DPVO Theatrical, who marketed it on behalf of Panavision, citing ″challenging global economic and 3D market conditions″.
Color anaglyph systems
Anaglyph 3D is the name given to the stereoscopic 3D effect achieved by means of encoding each eye's image using filters of different (usually chromatically opposite) colors, typically red and cyan. Red-cyan filters can be used because our vision processing systems use red and cyan comparisons, as well as blue and yellow, to determine the color and contours of objects. Anaglyph 3D images contain two differently filtered colored images, one for each eye. When viewed through the "color-coded" "anaglyph glasses", each of the two images reaches one eye, revealing an integrated stereoscopic image. The visual cortex of the brain fuses this into perception of a three dimensional scene or composition.
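In its simplest form, the red-cyan encoding amounts to taking the red channel from the left-eye image and the green and blue channels from the right-eye image; the sketch below ignores the color-correction steps that practical anaglyph tools apply:

def anaglyph(left_pixels, right_pixels):
    # Images are represented here as equal-length lists of (r, g, b) tuples.
    return [(lr, rg, rb)
            for (lr, _lg, _lb), (_rr, rg, rb) in zip(left_pixels, right_pixels)]

left  = [(200, 120, 40), (10, 10, 10)]
right = [(190, 130, 60), (12, 11, 9)]
print(anaglyph(left, right))   # [(200, 130, 60), (10, 11, 9)]

Viewed through red-cyan glasses, the red filter passes only the left-derived channel to one eye and the cyan filter passes only the right-derived channels to the other, which is what restores the two separate views.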
Chromadepth system
The ChromaDepth procedure of American Paper Optics is based on the fact that a prism separates colors by varying degrees. The ChromaDepth eyeglasses contain special view foils, which consist of microscopically small prisms. These shift the image by an amount that depends on its color. If a prism foil is used on one eye but not the other, the two perceived pictures are, depending on the color, separated to a greater or lesser extent. The brain produces the spatial impression from this difference. The main advantage of this technology is that ChromaDepth pictures can also be viewed without eyeglasses (as ordinary two-dimensional images) without problems, unlike two-color anaglyphs. However, the colors can only be chosen within limits, since they carry the depth information of the picture. If the color of an object is changed, its observed distance also changes.
Pulfrich method
The Pulfrich effect is based on the phenomenon of the human eye processing images more slowly when there is less light, as when looking through a dark lens. Because the Pulfrich effect depends on motion in a particular direction to instigate the illusion of depth, it is not useful as a general stereoscopic technique. For example, it cannot be used to show a stationary object apparently extending into or out of the screen; similarly, objects moving vertically will not be seen as moving in depth. Incidental movement of objects will create spurious artifacts, and these incidental effects will be seen as artificial depth not related to actual depth in the scene.
Over/under format
Stereoscopic viewing is achieved by placing an image pair one above the other. Special viewers made for the over/under format tilt the right line of sight slightly up and the left line of sight slightly down. The most common one, which uses mirrors, is the View Magic. Another, with prismatic glasses, is the KMQ viewer. A recent usage of this technique is the openKMQ project.
Other display methods without viewers
Autostereoscopy
Autostereoscopic display technologies use optical components in the display, rather than worn by the user, to enable each eye to see a different image. Because headgear is not required, it is also called "glasses-free 3D". The optics split the images directionally into the viewer's eyes, so the display viewing geometry requires limited head positions that will achieve the stereoscopic effect. Automultiscopic displays provide multiple views of the same scene, rather than just two. Each view is visible from a different range of positions in front of the display. This allows the viewer to move left-right in front of the display and see the correct view from any position. The technology includes two broad classes of displays: those that use head-tracking to ensure that each of the viewer's two eyes sees a different image on the screen, and those that display multiple views so that the display does not need to know where the viewers' eyes are directed. Examples of autostereoscopic display technology include lenticular lenses, parallax barriers, volumetric displays, holography and light field displays.
Holography
Laser holography, in its original "pure" form of the photographic transmission hologram, is the only technology yet created which can reproduce an object or scene with such complete realism that the reproduction is visually indistinguishable from the original, given the original lighting conditions. It creates a light field identical to that which emanated from the original scene, with parallax about all axes and a very wide viewing angle. The eye differentially focuses objects at different distances and subject detail is preserved down to the microscopic level. The effect is exactly like looking through a window. Unfortunately, this "pure" form requires the subject to be laser-lit and completely motionless—to within a minor fraction of the wavelength of light—during the photographic exposure, and laser light must be used to properly view the results. Most people have never seen a laser-lit transmission hologram. The types of holograms commonly encountered have seriously compromised image quality so that ordinary white light can be used for viewing, and non-holographic intermediate imaging processes are almost always resorted to, as an alternative to using powerful and hazardous pulsed lasers, when living subjects are photographed.
Although the original photographic processes have proven impractical for general use, the combination of computer-generated holograms (CGH) and optoelectronic holographic displays, both under development for many years, has the potential to transform the half-century-old pipe dream of holographic 3D television into a reality; so far, however, the large amount of calculation required to generate just one detailed hologram, and the huge bandwidth required to transmit a stream of them, have confined this technology to the research laboratory.
In 2013, a Silicon Valley company, LEIA Inc, started manufacturing holographic displays well suited for mobile devices (watches, smartphones or tablets) using a multi-directional backlight and allowing a wide full-parallax angle view to see 3D content without the need of glasses.
Volumetric displays
Volumetric displays use some physical mechanism to display points of light within a volume. Such displays use voxels instead of pixels. Volumetric displays include multiplanar displays, which have multiple display planes stacked up, and rotating panel displays, where a rotating panel sweeps out a volume.
Other technologies have been developed to project light dots in the air above a device. An infrared laser is focused on the destination in space, generating a small bubble of plasma which emits visible light.
Integral imaging
Integral imaging is a technique for producing 3D displays which are both autostereoscopic and multiscopic, meaning that the 3D image is viewed without the use of special glasses and different aspects are seen when it is viewed from positions that differ either horizontally or vertically. This is achieved by using an array of microlenses (akin to a lenticular lens, but an X–Y or "fly's eye" array in which each lenslet typically forms its own image of the scene without assistance from a larger objective lens) or pinholes to capture and display the scene as a 4D light field, producing stereoscopic images that exhibit realistic alterations of parallax and perspective when the viewer moves left, right, up, down, closer, or farther away.
Wiggle stereoscopy
Wiggle stereoscopy is an image display technique achieved by quickly alternating the display of the left and right sides of a stereogram. Often encountered as animated GIFs on the web, online examples can be seen in the New York Public Library stereogram collection. The technique is also known as "Piku-Piku".
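Because the effect is nothing more than rapid alternation of the two views, it can be reproduced with a few lines of code. The sketch below, which assumes a same-size left/right pair and the Pillow library (the filenames are placeholders), writes a looping animated GIF of the kind found in such online collections.

```python
# Wiggle stereoscopy: alternate the two views of a stereogram in a looping GIF.
from PIL import Image

left = Image.open("left.jpg")    # placeholder filenames for the two views
right = Image.open("right.jpg")

left.save(
    "wiggle.gif",
    save_all=True,           # write an animated, multi-frame file
    append_images=[right],   # second frame: the right-eye view
    duration=120,            # milliseconds per frame
    loop=0,                  # loop forever
)
```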
Stereo photography techniques
For general purpose stereo photography, where the goal is to duplicate natural human vision and give a visual impression as close as possible to actually being there, the correct baseline (distance between where the right and left images are taken) would be the same as the distance between the eyes. When images taken with such a baseline are viewed using a viewing method that duplicates the conditions under which the picture is taken, then the result would be an image much the same as that which would be seen at the site the photo was taken. This could be described as "ortho stereo."
However, there are situations in which it might be desirable to use a longer or shorter baseline. The factors to consider include the viewing method to be used and the goal in taking the picture. The concept of baseline also applies to other branches of stereography, such as stereo drawings and computer generated stereo images, but it involves the point of view chosen rather than actual physical separation of cameras or lenses.
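For the idealised case of parallel (untoed-in) cameras, the effect of the baseline can be made concrete with the standard triangulation relation: the distance Z to a point is approximately the focal length times the baseline divided by the disparity measured between the two images. The sketch below is a minimal illustration under those simplifying assumptions, with purely illustrative values.

```python
# Idealised parallel-camera triangulation: depth Z = focal length * baseline / disparity.
def depth_from_disparity(focal_length_mm, baseline_mm, disparity_mm):
    """Distance to a point (in mm) from its disparity measured on the image plane."""
    if disparity_mm <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_length_mm * baseline_mm / disparity_mm

# Example: a 35 mm lens with a 65 mm ("ortho") baseline and a measured disparity of
# 0.5 mm places the point at 35 * 65 / 0.5 = 4550 mm, i.e. about 4.5 m away.
print(depth_from_disparity(35, 65, 0.5))
```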
Stereo window
The concept of the stereo window is always important, since the window is the stereoscopic image of the external boundaries of the left and right views constituting the stereoscopic image. If an object that is cut off by the lateral sides of the window is placed in front of it, the result is an unnatural and undesirable effect called a "window violation". This can best be understood by returning to the analogy of an actual physical window: there is a contradiction between two different depth cues, since some elements of the image are hidden by the window, so that the window appears closer than these elements, while the same elements of the image appear closer than the window. The stereo window must therefore always be adjusted to avoid window violations.
Some objects can be shown in front of the window, as long as they do not reach the lateral sides of the window. However, such objects cannot be brought too close to the viewer, since there is always a limit to the parallax range for comfortable viewing.
If a scene is viewed through a window, the entire scene would normally be behind the window; if the scene is distant, it would be some distance behind the window, and if it is nearby, it would appear to be just beyond the window. An object smaller than the window itself could even pass through the window and appear partially or completely in front of it. The same applies to a part of a larger object that is smaller than the window. The goal of setting the stereo window is to duplicate this effect.
Therefore, the location of the window relative to the whole of the image must be adjusted so that most of the image is seen beyond the window. In the case of viewing on a 3D TV set, it is easier to place the whole image behind the window and to leave the window in the plane of the screen.
By contrast, in the case of projection on a much larger screen, it is much better to set the window in front of the screen (a so-called "floating window"), for instance so that it is seen about two meters away by the viewers sitting in the first row. These viewers will then normally see the background of the image at infinity. The viewers seated further back will of course see the window as more remote, but if the image is made under normal conditions, so that the first-row viewers see the background at infinity, the viewers seated behind them will also see it at infinity, since the parallax of this background is equal to the average human interocular distance.
The entire scene, including the window, can be moved backwards or forwards in depth, by horizontally sliding the left and right eye views relative to each other. Moving either or both images away from the center will bring the whole scene away from the viewer, whereas moving either or both images toward the center will move the whole scene toward the viewer. This is possible, for instance, if two projectors are used for this projection.
In stereo photography, window adjustment is accomplished by shifting or cropping the images; in other forms of stereoscopy, such as drawings and computer-generated images, the window is built into the design of the images as they are generated.
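The cropping operation itself is straightforward: trimming columns from the left edge of the left view and from the right edge of the right view increases the parallax of the scene relative to the frame, pushing the scene back behind the window (equivalently, "floating" the window toward the viewer). Below is a minimal sketch using the Pillow library, with placeholder filenames and an arbitrary crop width.

```python
# Stereo-window adjustment by cropping, as described above.
from PIL import Image

def float_window(left, right, pixels):
    """Crop `pixels` columns from the left edge of the left view and from the right
    edge of the right view; re-aligning the cropped pair gives the scene extra
    parallax, so it recedes behind the window."""
    w, h = left.size
    left_adj = left.crop((pixels, 0, w, h))       # drop the leftmost columns of the left view
    right_adj = right.crop((0, 0, w - pixels, h)) # drop the rightmost columns of the right view
    return left_adj, right_adj

left, right = Image.open("left.jpg"), Image.open("right.jpg")  # placeholder filenames
left_adj, right_adj = float_window(left, right, 20)            # 20 px is an arbitrary example
```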
The images can be cropped creatively to create a stereo window that is not necessarily rectangular or lying on a flat plane perpendicular to the viewer's line of sight. The edges of the stereo frame can be straight or curved and, when viewed in 3D, can flow toward or away from the viewer and through the scene. These designed stereo frames can help emphasize certain elements in the stereo image or can be an artistic component of the stereo image.
Uses
While stereoscopic images have typically been used for amusement (including stereographic cards, 3D films, 3D television, stereoscopic video games, anaglyph prints, and pictures, posters and books of autostereograms), there are also other uses of this technology.
Art
Salvador Dalí created some impressive stereograms in his exploration in a variety of optical illusions. Other stereo artists include Zoe Beloff, Christopher Schneberger, Rebecca Hackemann, William Kentridge, and Jim Naughten. Red-and-cyan anaglyph stereoscopic images have also been painted by hand.
Education
In the 19th century, it was realized that stereoscopic images provided an opportunity for people to experience places and things far away, and many tour sets were produced, and books were published allowing people to learn about geography, science, history, and other subjects. Such uses continued till the mid-20th century, with the Keystone View Company producing cards into the 1960s.
Space exploration
The Mars Exploration Rovers, launched by NASA in 2003 to explore the surface of Mars, are equipped with unique cameras that allow researchers to view stereoscopic images of the surface of Mars.
The two cameras that make up each rover's Pancam are situated 1.5m above the ground surface, and are separated by 30 cm, with 1 degree of toe-in. This allows the image pairs to be made into scientifically useful stereoscopic images, which can be viewed as stereograms, anaglyphs, or processed into 3D computer images.
The ability to create realistic 3D images from a pair of cameras at roughly human-height gives researchers increased insight as to the nature of the landscapes being viewed. In environments without hazy atmospheres or familiar landmarks, humans rely on stereoscopic clues to judge distance. Single camera viewpoints are therefore more difficult to interpret. Multiple camera stereoscopic systems like the Pancam address this problem with unmanned space exploration.
Clinical uses
Stereogram cards and vectographs are used by optometrists, ophthalmologists, orthoptists and vision therapists in the diagnosis and treatment of binocular vision and accommodative disorders.
Mathematical, scientific and engineering uses
Stereopair photographs provided a way to visualise aerial photographs in three dimensions (3D); since about 2000, 3D aerial views have mainly been based on digital stereo imaging technologies. One issue related to stereo images is the amount of disk space needed to save such files, since a stereo image usually requires twice as much space as a normal image. Computer vision scientists have therefore investigated techniques that exploit the visual redundancy of stereopairs in order to define compressed versions of stereopair files. Today, cartographers generate stereopairs using computer programs in order to visualise topography in three dimensions. Computerised stereo visualisation applies stereo matching programs.
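As an illustration of what such a stereo matching program does, the sketch below uses OpenCV's block matcher to compute a dense disparity map from a rectified stereo pair; the filenames and matcher parameters are placeholders rather than recommended settings.

```python
# Dense stereo matching with OpenCV's block matcher, producing a disparity map.
import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # placeholder filenames for a
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # rectified stereo pair

# numDisparities must be a multiple of 16; blockSize must be odd.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right)   # fixed-point disparities (scaled by 16)

# Rescale to 0-255 for inspection and save as an image.
vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
cv2.imwrite("disparity.png", vis)
```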
In biology and chemistry, complex molecular structures are often rendered in stereopairs. The same technique can also be applied to any mathematical (or scientific, or engineering) parameter that is a function of two variables, although in these cases it is more common for a three-dimensional effect to be created using a 'distorted' mesh or shading (as if from a distant light source).
See also
Cloud stereoscopy
References
Bibliography
Further reading
Scott B. Steinman, Barbara A. Steinman and Ralph Philip Garzia. (2000). Foundations of Binocular Vision: A Clinical perspective. McGraw-Hill Medical.
External links
Archival collections
Guide to the Edward R. Frank Stereograph Collection. Special Collections and Archives, The UC Irvine Libraries, Irvine, California.
Niagara Falls Stereo Cards RG 541 Brock University Library Digital Repository
Other
Durham Visualization Laboratory stereoscopic imaging methods and software tools
University of Washington Libraries Digital Collections Stereocard Collection
Stereographic Views of Louisville and Beyond, 1850s–1930 from the University of Louisville Libraries
American University in Cairo Rare Books and Special Collections Digital Library Underwood & Underwood Egypt Stereoviews Collection
Views of California and the West, ca. 1867–1903, The Bancroft Library
Museum exhibition on the history of stereographs and stereoscopes (1850–1930)
Two stereoscopic selfies from 1890
3D imaging
Binocular rivalry |
15154670 | https://en.wikipedia.org/wiki/Japanese%20language%20education%20in%20India | Japanese language education in India | Japanese language education in India has experienced a boom in the early 21st century, helping it to begin to catch up with foreign languages more traditionally popular among Indians, such as French and German. A 2006 survey by the Japan Foundation showed 369 teachers teaching 11,011 students at 106 different institutions; the number of students nearly doubled since the 2005 survey.
History
Origins
The earliest Japanese language courses in India were established in the 1950s. The Ministry of Defence began offering a course through its affiliated School of Foreign Languages, Lodhi Road, New Delhi, in 1954; Visva-Bharati (Santiniketan) established a Japanese department the same year, making it the first university in India to introduce Japanese language courses. Bhartiya Vidya Bhawan's J N Academy of Languages, New Delhi, started Japanese courses in 1958, while the Japan-India Cooperation Association in Mumbai also set up a Japanese class in 1958. The University of Delhi established its Japan Studies Centre in 1969, the University of Pune established a course in the language in 1977, and New Delhi's Jawaharlal Nehru University began to offer a doctorate in the language in 1982. However, the language did not enjoy much popularity until the late 1990s. The growth of interest in the Japanese language took place in a short time, in spite of government inaction on both the Japanese and Indian sides. The Japanese government-funded Japan Foundation, an organisation for the promotion of Japanese culture, opened an office in New Delhi in 1993, its first on the Indian subcontinent; however, its budget constituted only 2% of the Foundation's global expenditures, as compared to 15.1% for East Asia and 20.4% for Southeast Asia. Then-Finance Minister of India Manmohan Singh suggested as early as 1997 that India needed 10,000 of its citizens to be fluent in Japanese; however, little concrete action was taken to achieve this goal.
Education and industry
As a result of the lack of government action, the private sector was forced to take the lead in Japanese language education. The majority of Japanese language teaching in the country is conducted by non-school institutions, while government schools have lagged behind the demand for the language; only 20% of Japanese language students study it in the course of their primary or secondary education, or in university. Business process outsourcing and information technology companies are responsible for much of this; as companies in India take aim at the Japan market, they have increased their recruitment of Japanese-speaking individuals and begun offering internal training courses in the language to their employees.
Pune has grown to become a major centre of Japanese language education in India, surpassing larger cities such as Mumbai and Kolkata despite its late start relative to the rest of the country. The first Japanese language teachers came to the city in the 1970s; the University of Pune established a Japanese language course in 1977 and upgraded it to a full department in 1978. As such, the city was well-positioned to begin capturing Japanese business when India's information technology boom began. As early as 2004, software exports to Japan made up 12% of Pune's then-US$1 billion software industry. As of 2007, an estimated 70 Japanese teachers work in the city; it is also the home of the country's branch of the Japanese Language Teachers' Association. The similarity between Japanese grammar and that of Marathi is mentioned by some Pune residents as a factor easing their study of the language.
South India, though traditionally a leader in the information technology sector which is driving so much of the demand for Japanese speakers, has actually lagged behind the rest of the country when it comes to teaching the language. Bangalore University established a course in the language, but unlike the University of Pune or Jawaharlal Nehru University, it has done little to promote the language. The Japanese Language Proficiency Test was not even offered in south India until 2000, when a test centre was established in Chennai; the test was first offered in Bangalore in 2007. The entire Japanese teaching and translation industry in Southern India was estimated to produce revenues of only Rs. 1 million (US$21,000 at then-current exchange rates) as of 2003, with only 12 schools teaching the language. Bangalore has a few schools, such as Stonehill International School and Trio World Academy, which offer after-school Japanese language classes for expatriates' children.
Shortfall
Despite the increase in the number of Indians studying Japanese, there remains a major shortfall relative to the needs of industry; for example, in the translation business, there are 100 jobs available for every 20 candidates. The Japanese government is working with organisations in India to address the shortfall, and aims to expand the number of Indian students learning Japanese to 30,000 by 2012. In 2006, the Central Board of Secondary Education announced plans to introduce a syllabus for Japanese language teaching, making Japanese the first East Asian language to be offered as part of the curriculum in Indian secondary schools. The emphasis will be on the spoken language, rather than the written; according to the syllabus, kanji would not be taught until class VIII.
Standardised testing
The Japanese Language Proficiency Test is offered in eight Indian cities as of October 2020; the most recently added test site was that in Salem in Tamil Nadu. The Level 4 examination, aimed at beginning students with fewer than 150 contact hours of instruction, is the most widely attempted; numbers decrease at higher levels. The number of examinees quadrupled between 1998 and 2006. Chennai had the fastest growth in the number of examinees during that period, while Kolkata was slowest. JETRO also offers its Business Japanese Language Test in Bangalore, Mumbai, and New Delhi; in 2006, 147 people attempted the examination, forming about 7.7% of all overseas examinees. 94% of all Indian examinees scored 410 points or less out of 800, as compared to 70% of all overseas examinees.
See also
Japanese people in India
References
Further reading
India
Languages of India
Language education in India |
51982853 | https://en.wikipedia.org/wiki/Spanish%20Network%20of%20Excellence%20on%20Cybersecurity%20Research | Spanish Network of Excellence on Cybersecurity Research | The Spanish Network of Excellence on Cybersecurity Research (RENIC), is a research initiative to promote cybersecurity interests in Spain.
Members
Board of Directors (2018)
President: Universidad de Málaga
Vicepresident: CSIC
Treasurer: Universidad Politécnica de Madrid
Secretary: Universidad de Granada
Ordinary members: Tecnalia, Universidad de La Laguna and Universidad de Mondragón
Board of Directors (2016)
President: Universidad Carlos III de Madrid
Vicepresident: Universidad Politécnica de Madrid
Treasurer: Universidad de Granada
Secretary: Universidad de León
Ordinary members: Gradiant, Tecnalia, Universidad de Málaga
Founding Members
Centro Andaluz de Innovación y Tecnologías de la Información y las Comunicaciones (CITIC).
Consejo Superior de Investigaciones Científicas (CSIC).
Centro Tecnolóxico de Telecomunicaciones de Galicia (Gradiant).
Instituto Imdea Software.
Instituto Nacional de Ciberseguridad (INCIBE).
Mondragón Unibertsitatea.
Tecnalia.
Universidad Carlos III de Madrid.
Universidad Castilla la Mancha.
Universidad de Granada.
Universidad de la Laguna.
Universidad de León.
Universidad de Málaga.
Universidad de Murcia.
Universidad de Vigo.
Universidad Internacional de la Rioja.
Universidad Politécnica de Madrid.
Universidad Rey Juan Carlos.
Members
Consejo Superior de Investigaciones Científicas (CSIC).
Centro Tecnolóxico de Telecomunicaciones de Galicia (Gradiant).
Instituto Imdea Software.
Instituto Nacional de Ciberseguridad (INCIBE).
Mondragón Unibertsitatea.
Tecnalia.
Universidad Carlos III de Madrid.
Universidad de Castilla-La Mancha.
Universidad de Granada.
Universidad de la Laguna.
Universidad de León.
Universidad de Málaga.
Universidad de Murcia.
Universidad de Vigo.
Universidad Politécnica de Madrid.
Universidad Rey Juan Carlos.
Universitat Oberta de Catalunya.
IKERLAN.
Honorary Members
Centre for the Development of Industrial Technology (CDTI). (2017)
Instituto Nacional de Ciberseguridad (INCIBE). (2016)
Initiatives and Participations
RENIC is an ECSO member and also sits on its board of directors.
A collaboration agreement between RENIC and the Innovative Business Cluster on Cybersecurity (AEI Cybersecurity) has been signed.
RENIC sponsored the JNIC2017 edition of the Cybersecurity Research National Conferences (JNIC), organized by Universidad Rey Juan Carlos.
RENIC has published the online version of the Catalog and Knowledge Map of Cybersecurity Research.
References
Computer security
Development |
50539820 | https://en.wikipedia.org/wiki/The%20Hurricane%20Heist | The Hurricane Heist | The Hurricane Heist is a 2018 American disaster heist action film directed by Rob Cohen, written by Jeff Dixon and Scott Windhauser, and starring Toby Kebbell, Maggie Grace, Ryan Kwanten, Ralph Ineson, Melissa Bolona, James Cutler, and Ben Cross. It was released in the UK as a Sky Cinema Original Film. The film is about a maintenance worker, his meteorologist brother, and a treasury agent contending with band of rogue treasury agents who plan to use a Category 5 hurricane to cover their tracks of a bank robbery. The film was released on March 9, 2018, received negative reviews and was a box-office bomb, making just $32 million against its estimated $35 million budget.
Plot
In 1992, a category 5 hurricane named "Andrew" hits the town of Gulfport, Alabama. Will and Breeze Rutledge are evacuating from the destructive hurricane with their dad. However, their truck gets stuck after avoiding a toppling tree in front of them, and they are forced to take refuge in a nearby house. While their father tries to save the truck from blowing away, strong winds blow a water tank onto him, crushing him.
In 2018, another destructive category 5 hurricane named "Tammy" approaches Gulfport. Federal Reserve Treasury agent Casey Corbyn is ordered by fellow employee Randy Moreno to summon Breeze, who now works in maintenance and whose brother Will is a National Weather Service meteorologist, to fix the generator at a cash storage facility.
While she is out of the facility, rogue Treasury agents led by Connor Perkins infiltrate the facility and hold Moreno hostage. Their plan is to steal $600 million, and Perkins enlists computer hackers Sasha and Frears to crack the code of the vault. Failing to decrypt it, Perkins realizes that Corbyn may have changed it, so he has his men find her. Sasha and Frears have to use a brute-force attack using the town's transmission tower.
As Corbyn and Breeze drive back to the facility, they encounter the mercenaries, and Corbyn engages in a shootout with them. Will helps her escape with his Storm Research Vehicle called the Dominator, but Breeze is left behind and is captured and taken hostage, forced to repair the generator.
Will is upset when he learns that his brother is in danger. Determined to save him, he and Corbyn meet Sheriff Jimmy Dixon at his station. However, Dixon reveals himself to be one of Perkins' cohorts and tries to take Will and Corbyn hostage. Corbyn shoots the sheriff, and they escape. When Dixon and one of his deputies chase them, Will manages to ram their car with his Dominator. Realizing that the tower is being used to crack the vault's code, Will and Corbyn manage to topple it moments before the decryption is completed. Perkins' men spot them and engage in a gunfight with them, but they escape. Dixon turns on Perkins, confronting him over a botched heist in the previous hurricane. When Dixon wants to claim all the money, Perkins shoots him dead and persuades Dixon's men to find Corbyn.
While looting a mall, Corbyn calls Perkins and makes a deal for the release of Moreno and Breeze as long as she opens the vault and gets the money. When Perkins asks where the trade will be conducted, Corbyn tells him to meet them at the Gulfport mall. Meanwhile, Will and Corbyn make a plan to shoot the roof glass, causing the mercenaries to be sucked out through the roof. After Will talks to Breeze, who has arrived with the mercenaries, Corbyn shoots the glass roof, sucking the mercenaries out into the storm as planned. Corbyn, Will, and Breeze manage to hold on. After the storm surge, Corbyn gives herself up while Breeze rescues the stranded Will. Back at the Treasury facility as Corbyn and the remaining mercenaries arrive, Perkins breaks his deal to release Moreno and kills him as revenge for the deaths of Jaqi and Xander.
As the eye of the storm passes, Perkins and his men take the money, using three of the facility's truck trailers, along with Corbyn. Will and Breeze follow them. With the eye wall approaching on their tail, Will and Breeze take over a truck. After a struggle with Perkins, the eye wall sucks the money out of one of the trucks and then the truck itself. Perkins is then killed after his own detached trailer crushes him. When Breeze's truck engine backfires and burns, Will and Corbyn transfer him to their truck. However, when they attempt to rescue Sasha and Frears, they are sucked into the storm.
William, Breeze and Corbyn manage to outrun the storm safely, and drive away into the sunshine, having saved $200 million.
Cast
Toby Kebbell as William "Will" Rutledge, a meteorologist
Leonardo Dickens as Young Will
Maggie Grace as Casey Corbyn, a Treasury agent
Ryan Kwanten as Breeze Rutledge, Will's older ex-Marine brother who works as a repairman
Patrick McAuley as Young Breeze
Ralph Ineson as Connor Perkins, a corrupt Treasury agent
Melissa Bolona as Sasha Van Dietrich, a computer hacker and Frears' girlfriend
James Cutler as Clement Rice, Xander's brother
Ben Cross as Sheriff Jimmy Dixon, a corrupt local sheriff who is associated with Perkins
Christian Contreras as Randy Moreno, a Treasury agent
Jimmy Walker as Xander, a minion of Perkins and Clement Rice's brother
Ed Birch as Frears, another computer hacker and Sasha's boyfriend
Moyo Akande as Jaqi, Perkins' lover
James Barriscale as Deputy Michaels, a deputy who works for Sheriff Dixon
Mark Basnight as Deputy Gabriel, a deputy who works for Sheriff Dixon
Keith D. Evans as Deputy Rothilsberg, a deputy who works for Sheriff Dixon
Mark Rhino Smith as Deputy Baldwin, a deputy who works for Sheriff Dixon
Brooke Johnston as Deputy Diamond, a deputy who works for Sheriff Dixon
Production
In January 2016, it was announced that Rob Cohen had signed on to write and direct the film, then titled Category 5, with casting underway and a Summer 2016 principal production start set. In February 2016, it was announced that the film had been acquired for distribution in a large number of international locations via the European Film Market. In May 2016, it was revealed that Toby Kebbell had been set to star in the film. In June 2016, the rest of the cast was announced.
Principal photography on the film began in Bulgaria on August 29, 2016. In July 2017, the completed film, now titled The Hurricane Heist, was acquired for domestic distribution by Entertainment Studios with an early 2018 release date slated.
In the UK, the film was released by Altitude Film Distribution and was the second Sky Cinema Original Film.
Reception
Box office
The Hurricane Heist grossed $6.1 million in the United States and Canada, and $26.4 million in other territories, for a worldwide total of $32.5 million against a production budget of $35 million.
In the United States and Canada, The Hurricane Heist was released on March 9, 2018 alongside The Strangers: Prey at Night, Gringo, and A Wrinkle in Time, and was initially projected to gross around $7 million from 2,402 theaters in its opening weekend. However, after making just $950,000 on its first day, weekend estimates were lowered to $3 million. It ended up grossing $3 million, finishing ninth.
Critical response
On review aggregator website Rotten Tomatoes, the film holds an approval rating of based on reviews, and an average rating of . The website's critical consensus reads, "The Hurricane Heist is a throwback to the overblown action thrillers of yesteryear—and a thoroughly middling example of why they don't make 'em like this anymore." On Metacritic, the film has a weighted average score of 35 out of 100, based on 12 critics, indicating "generally unfavorable reviews". Audiences polled by CinemaScore gave the film an average grade of "B–" on an A+ to F scale.
Alonso Duralde of TheWrap criticized the film's direction, acting and overbearing musical score, saying, "Critics often lament that worthy films released early in the year are too often forgotten during awards season, so let's be very clear up front: For your Best of the Worst of 2018 consideration, in all categories, The Hurricane Heist." Andrew Barker, writing for Variety, gave the film an ironic recommendation, calling it the best worst movie of 2018 and saying: "All three of our heroes take time out in the middle of survival situations to discuss their undying love of football and the Second Amendment, but they also believe in climate change. If our divided country can't come together over a movie this wonderfully terrible, what hope do we really have?"
There are some factual inconsistencies relating to the purported location of the events in the film. Several scenes show mountains in the distance, but the entire Gulf Coast is completely flat. Also, the film places Gulfport in Alabama, whereas the real Gulfport is in Mississippi.
References
External links
2018 films
English-language films
2018 crime action films
2010s disaster films
2010s heist films
American films
American crime action films
American disaster films
American heist films
Entertainment Studios films
Films about hurricanes
Films about tropical cyclones
Films directed by Rob Cohen
Films scored by Lorne Balfe
Films set in 1992
Films set in 2018
Films set in Alabama
Films shot in Bulgaria |
52901775 | https://en.wikipedia.org/wiki/N%C2%BA3 | Nº3 | Nº3 is the third studio album by alternative rock band Dot Hacker. The album was released on January 20, 2017 on ORG Music label.
The album cover art was created by Josh Klinghoffer although it was not originally designed for this purpose.
Track listing
Personnel
Dot Hacker
Josh Klinghoffer – lead vocals, guitar, keyboards, synthesizers
Clint Walsh – guitar, backing vocals, synthesizers
Jonathan Hischke – bass guitar
Eric Gardner – drums
References
2017 albums
Dot Hacker albums |
23548810 | https://en.wikipedia.org/wiki/Manycore%20processor | Manycore processor | Manycore processors are special kinds of multi-core processors designed for a high degree of parallel processing, containing numerous simpler, independent processor cores (from a few tens of cores to thousands or more). Manycore processors are used extensively in embedded computers and high-performance computing.
Contrast with multicore architecture
Manycore processors are distinct from multi-core processors in being optimized from the outset for a higher degree of explicit parallelism, and for higher throughput (or lower power consumption) at the expense of latency and lower single-thread performance.
The broader category of multi-core processors, by contrast, is usually designed to efficiently run both parallel and serial code, and therefore places more emphasis on high single-thread performance (e.g. devoting more silicon to out-of-order execution, deeper pipelines, more superscalar execution units, and larger, more general caches), and shared memory. These techniques devote runtime resources toward figuring out implicit parallelism in a single thread. They are used in systems where they have evolved continuously (with backward compatibility) from single-core processors. They usually have a 'few' cores (e.g. 2, 4, 8), and may be complemented by a manycore accelerator (such as a GPU) in a heterogeneous system.
Motivation
Cache coherency is an issue limiting the scaling of multicore processors. Manycore processors may bypass this with methods such as message passing, scratchpad memory, DMA, partitioned global address space, or read-only/non-coherent caches. A manycore processor using a network on a chip and local memories gives software the opportunity to explicitly optimise the spatial layout of tasks (e.g. as seen in tooling developed for TrueNorth).
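As an illustration of the message-passing style (as opposed to coherent shared memory), the sketch below uses mpi4py, a Python binding of the Message Passing Interface; it is a generic example of the programming model rather than code for any particular processor, and the work divided among the ranks is purely illustrative.

```python
# Message passing instead of coherent shared memory: each rank owns its data and
# results are combined with explicit messages, so no cache-coherency traffic is needed.
# Run with, for example: mpiexec -n 4 python reduce_example.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each process computes a partial result on the data it owns...
partial = sum(range(rank * 1000, (rank + 1) * 1000))

# ...and the partial results are combined by an explicit reduction message.
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print("sum over all ranks:", total)
```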
Manycore processors may have more in common (conceptually) with technologies originating in high performance computing such as clusters and vector processors.
GPUs may be considered a form of manycore processor having multiple shader processing units, and only being suitable for highly parallel code (high throughput, but extremely poor single thread performance).
Suitable programming models
Message passing interface
OpenCL or other APIs supporting compute kernels
Partitioned global address space
Actor model
OpenMP
Dataflow
Classes of manycore systems
GPUs, which can be described as manycore vector processors
Massively parallel processor array
Asynchronous array of simple processors
Specific manycore architectures
ZettaScaler, Japanese PEZY Computing 2048-core modules
Xeon Phi coprocessor, which has MIC (Many Integrated Cores) architecture
Tilera
Adapteva Epiphany Architecture, a manycore chip using PGAS scratchpad memory
Coherent Logix hx3100 Processor, a 100-core DSP/GPP processor based on HyperX Architecture
Movidius Myriad 2, a manycore vision processing unit (VPU)
Kalray, a manycore PCI-e accelerator for data-intensive tasks
Teraflops Research Chip, a manycore processor using message passing
TrueNorth, an AI accelerator with a manycore network on a chip architecture
GreenArrays, a manycore processor using message passing aimed at low-power applications
Sunway SW26010, a 260-core manycore processor used in the then top-ranked supercomputer Sunway TaihuLight
SW52020, an improved 520-core variant of the SW26010 with 512-bit SIMD (also adding support for half-precision), used in a prototype intended for an exascale system (and, in the future, a 10-exaflops system); according to DataCenterDynamics, China is rumoured to already have two separate exascale systems operating in secret
Eyeriss, a manycore processor designed for running convolutional neural nets for embedded vision applications
Graphcore, a manycore AI accelerator
Specific manycore computers with 1M+ CPU cores
A number of computers built from multicore processors have one million or more individual CPU cores. Examples include:
Sunway TaihuLight, a massively parallel (10M CPU cores) Chinese supercomputer, once one of the fastest supercomputers in the world, using a custom manycore architecture. As of November 2018, the world's third fastest supercomputer (as ranked by the TOP500 list), the Chinese Sunway TaihuLight, obtains its performance from 40,960 SW26010 manycore processors, each containing 256 cores.
Gyoukou (Japanese: 暁光 Hepburn: gyōkō, dawn light), a supercomputer developed by ExaScaler and PEZY Computing, with 20,480,000 processing elements total plus the 1250 Intel Xeon D host processors.
SpiNNaker, a massively parallel (1M CPU cores) manycore processor (ARM-based) built as part of the Human Brain Project.
Specific computers with 5M+ CPU cores
Quite a few supercomputers have over a million or even over 5 million CPU cores. When coprocessors such as GPUs are used as well, their cores are not included in the core count; if they were, quite a few more computers would reach these totals.
Fugaku, a Japanese supercomputer using Fujitsu A64FX ARM-based cores, 7,630,848 in total.
Sunway TaihuLight, a massively parallel (10M CPU cores) Chinese supercomputer.
See also
Multicore
Vector processor
SIMD
High performance computing
Computer cluster
Multiprocessor system on a chip
Vision processing unit
Memory access pattern
Cache coherency
Embarrassingly parallel
Massively parallel
CUDA
References
External links
Architecting solutions for the Manycore future, published on Feb 19, 2010 (more than one dead link in the slide)
Eyeriss architecture
Computer architecture
Manycore processors
Parallel computing |
2089390 | https://en.wikipedia.org/wiki/Osborne%20%28computer%20retailer%29 | Osborne (computer retailer) | Osborne was the name of one of the largest and most successful computer wholesalers and resellers in Australia. Osborne Corporation in Australia was originally registered by President Computers founded by Tom Cooper, the name was transferred back to the US HQ founder Adam Osborne and his Osborne Corporation Inc as a good will gesture by Cooper in the early stages of a highly successful launch of the Osborne 1 across the Australian marketplace. Adam Osborne at this time visited Sydney Australia at the invitation of Cooper to witness first hand the phenomenal success the Osborne 1 was experiencing in the Australian market.
The agency would in time change hands, ending with Stanley Falinsky as the final Australian distributor of the famous original Osborne 1 "luggable" computer, which featured a Z-80 processor and ran CP/M as its operating system. When Osborne Corporation USA collapsed, the Australian business entity was reorganised; Falinsky's company retained the Osborne company name, transitioned into IBM PC compatibles in the mid-1980s, and had great success with both business and government clients.
A number of entities were involved in the complex trading relationship of the brand in Australia. A search of the ASIC names database returns 28 entries for "Osborne Computers". Telnet Pty Ltd, Peak Pacific, Computer Manufacturing Services Pty Ltd, System Support Services Pty Ltd, Osborne Computers (UK) Ltd and Osborne Computers (NZ) are a few of the other related entities of the time.
As cited, in 1982 Osborne Corporation was originally represented in Australia exclusively by President Computers Pty Ltd, headed by Tom Cooper, a captain of industry in the emerging Australian PC era. With the outstanding success of Osborne 1 sales in Australia, President Computers was lauded at the time by Osborne Corp USA as the largest global distributor of Osborne 1 luggable computers outside of Computerland USA. However, with that success, Osborne's visiting CFO set his own sights on the Australian marketplace and convinced Adam Osborne to split the agency, over Cooper's objection. This move saw President Computers' dealership arrangement divided equally, with Osborne Corporation setting up locally to hold half the dealership agency. Upon this decision, President Computers exclusively signed on Osborne's luggable rival, the Kaypro computer, manufactured in Del Mar, California, by Non-Linear Systems, which boasted a larger-format built-in screen.
Cooper, who had a strong relationship with Adam Osborne, was also given an early view of the new Osborne II model. Having seen warehouses full of the first model, Cooper cautioned Osborne that the Osborne II should only be announced once the original Osborne 1 stock holdings had been cleared; the dismissal of this advice ultimately contributed to Osborne Corporation's demise and its Chapter 11 filing in September 1983. Osborne Corporation in Australia was then restructured and independently moved towards the PC sector, but retained use of the Osborne product brand. By this time President Computers, having enjoyed strong success with Kaypro luggable computer sales, had itself moved into the PC sector under its own private-label brand; Cooper's President Computers PC assembly plant at the Gold Coast Technology Park, 1 Computer Street, Labrador, Queensland, was officially opened in 1986 by the Queensland Minister for Industry, Small Business and Technology, Hon. Mike Ahern (https://en.wikipedia.org/wiki/Mike_Ahern_(Australian_politician)).
Around the time of the US collapse, Osborne Australia appointed John Linton (now deceased) as the new CEO of a combined Osborne Australia entity. Linton was determined to double the company's already substantial market share, largely by massive discounting without reducing the traditional good quality of an Osborne machine. The marketing push was financed by demanding that customers place a 100% deposit and then wait six weeks before picking up their new system, and by buying components on ever more generous credit terms from major suppliers like Micronics and Seagate. For about six months the new policy was remarkably effective: Osborne sales boomed and competitors were unable to match their prices. Osborne were selling well below cost, but their retail losses were made up for by currency fluctuations, in particular the steadily rising value of the Australian dollar against the United States dollar.
Inevitably, the currency movement eventually swung back the other way, and Osborne were placed on credit hold by several of their major suppliers: unable to secure more components until at least some of the previous shipments had been paid for, and unable to ship the promised new computers to the many customers who had long since paid in full for them, Osborne went into Voluntary Administration on 25 June 1995. The notification was passed to the company's employees on 26 June 1995.
Star Dean-Willcocks were appointed administrators to the company in June 1995, resulting in the sale of the business assets to the Gateway 2000 computer company. As a result of the sale, employees received all entitlements and customers who had pre-paid for computers received a new computer from the new Osborne-Gateway company. Some employees were transitioned to the Osborne Gateway 2000 organisation.
In the course of the Voluntary Administration, new legal precedent was created regarding the ownership of prepaid customer goods in Osborne Computer Corp Pty Ltd v Airroad Distribution Pty Ltd 13 ACLC 1129, 17 ACSR 614.
Relaunched at the PC96 show as Osborne Gateway 2000, the company later traded as Gateway 2000 Australia for several years, but were unable to recover Osborne's former dominant position and were unsuccessful in the Australian market. Gateway withdrew from Australia in August 2001.
The trading entity continues to be registered as an Australian company ACN 003 677 272
References
Defunct computer companies of Australia |
181348 | https://en.wikipedia.org/wiki/University%20of%20St%20Andrews | University of St Andrews | The University of St Andrews (, ; abbreviated as St And, from the Latin Sancti Andreae, in post-nominals) is a public university in St Andrews, Fife, Scotland. It is the oldest of the four ancient universities of Scotland and, following Oxford and Cambridge universities, the third-oldest university in the English-speaking world. St Andrews was founded in 1413 when the Avignon Antipope Benedict XIII issued a papal bull to a small founding group of Augustinian clergy. Along with the universities of Glasgow, Edinburgh, and Aberdeen, St Andrews was part of the Scottish Enlightenment during the 18th century.
St Andrews is made up of a variety of institutions, comprising three colleges — United College (a union of St Salvator's and St Leonard's Colleges), St Mary's College, and St Leonard's College, the last named being a non-statutory revival of St Leonard's as a post-graduate society. There are 18 academic schools organised into four faculties. The university occupies historic and modern buildings located throughout the town. The academic year is divided into two semesters, Martinmas and Candlemas. In term time, over one-third of the town's population are either staff members or students of the university. The student body is notably diverse: over 145 nationalities are represented with 45% of its intake from countries outside the UK; about one-eighth of the students are from the EU and the remaining third are from overseas—15% from North America alone. The university's sport teams compete in BUCS competitions, and the student body is known for preserving ancient traditions such as Raisin Weekend, May Dip, and the wearing of distinctive academic dress.
It has been twice named "University of the Year" by The Times and Sunday Times Good University Guide, one of only two UK universities to achieve this. In the 2022 Good University Guide, St Andrews was ranked as the best university in the UK, the first university to ever top Oxford and Cambridge in British rankings. In 2021, St Andrews had the highest entry standards for undergraduate admission in the UK, attaining an average UCAS Entry Tariff of 208 points. St Andrews has many notable alumni and affiliated faculty, including eminent mathematicians, scientists, theologians, philosophers, and politicians. Recent alumni include the former First Minister of Scotland Alex Salmond; Cabinet Secretary and Head of the Civil Service Mark Sedwill; Chief of the Secret Intelligence Service (MI6) Alex Younger; former Secretary of State for Defence Sir Michael Fallon; Olympic cycling gold medalist Chris Hoy; Permanent Representative of the United Kingdom to the United Nations and former British Ambassador to China (2015-2020) Dame Barbara Woodward; and royals Prince William, Duke of Cambridge, and Catherine, Duchess of Cambridge. Five Nobel Laureates are among St Andrews' alumni and former staff: three in Chemistry and two in Physiology or Medicine.
History
Foundation
The university was founded in 1410 when a group of Augustinian clergy, driven from the University of Paris by the Avignon schism and from the universities of Oxford and Cambridge by the Anglo-Scottish Wars, formed a society of higher learning in St Andrews, which offered courses of lectures in divinity, logic, philosophy, and law. A charter of privilege was bestowed upon the society of masters and scholars by the Bishop of St Andrews, Henry Wardlaw, on 28 February 1411. Wardlaw then successfully petitioned the Avignon Pope Benedict XIII to grant the school university status by issuing a series of papal bulls, which followed on 28 August 1413. King James I of Scotland confirmed the charter of the university in 1432. Subsequent kings supported the university, with King James V of Scotland "confirming privileges of the university" in 1532.
A college of theology and arts, called St John's College, was founded in 1418 by Robert of Montrose and Lawrence of Lindores. St Salvator's College was established in 1450 by Bishop James Kennedy. St Leonard's College was founded in 1511 by Archbishop Alexander Stewart, who intended it to have a far more monastic character than either of the other colleges. St John's College was refounded by Cardinal James Beaton under the name St Mary's College in 1538 for the study of divinity and law. It was intended to encourage traditional Catholic teachings in opposition to the emerging Scottish Reformation, but once Scotland had formally split with the Papacy in 1560, it became a teaching institution for Protestant clergy. At its foundation in 1538 St Mary's was intended to be a college for instruction in divinity, law, and medicine, as well as in Arts, but its career on this extensive scale was short-lived. Under a new foundation and erection, confirmed by Parliament in 1579, it was set apart for the study of Theology only, and it has remained a Divinity College ever since.
Some university buildings that date from this period are still in use today, such as St Salvator's Chapel, St Leonard's College Chapel and St Mary's College quadrangle. At this time, the majority of the teaching was of a religious nature and was conducted by clerics associated with the cathedral.
Development
During the 17th and 18th centuries, the university had mixed fortunes and was often beset by civil and religious disturbances. In a particularly acute depression in 1747, severe financial problems triggered the dissolution of St Leonard's College, whose properties and staff were merged into St Salvator's College to form the United College of St Salvator and St Leonard. Throughout this period student numbers were very low; for instance, when Samuel Johnson visited the university in 1773, the university had fewer than 100 students, and was in his opinion in a steady decline. He described it as "pining in decay and struggling for life". The poverty of Scotland during this period also damaged St Andrews, as few were able to patronise the university and its colleges, and with state support being improbable, the income they received was scarce.
Modern period
Women
In the second half of the 19th century, pressure was building upon universities to open up higher education to women. In 1876, the university senate decided to allow women to receive an education at St Andrews at a level roughly equal to the Master of Arts degree that men were able to take at the time. The scheme came to be known as the 'LLA examination' (Lady Literate in Arts). It required women to pass five subjects at an ordinary level and one at honours level and entitled them to hold a degree from the university. Not being required to attend the university in person, the women were essentially learning by correspondence. They were both examined and assisted in their studies by educationalists in the town or city in which they lived in the UK. In 1889 the Universities (Scotland) Act made it possible to formally admit women to St Andrews and to receive an education equal to that of male students. Agnes Forbes Blackadder became the first woman to graduate from St Andrews on the same level as men in October 1894, gaining her MA. She entered the university in 1892, making St Andrews the first university in Scotland to admit female undergraduates on the same level as men. In response to the increasing number of female students attending the university, the first women's hall was built in 1896 and was named University Hall.
Dundee
Up until the start of the 20th century, St Andrews offered a traditional education based on classical languages, divinity and philosophical studies, and was slow to embrace more practical fields such as science and medicine that were becoming more popular at other universities. In response to the need for modernisation and in order to increase student numbers and alleviate financial problems, the university had, by 1883, established a university college in Dundee which formally merged with St Andrews in 1897. From its inception, the Dundee college had a focus on scientific and professional subjects; the college's students of both sexes read Classics and English at St Andrews. The union was fraught with difficulties; in 1894, The Educational Times reported in the article The Quarrel between St Andrews and Dundee that University College, Dundee was "forbidden" to give such instruction in the Arts "as he [the Dundee student] might require". After the incorporation of University College Dundee, St Andrews' various problems generally receded. For example, it was able to offer medical degrees. Of note is that, up until 1967, many students who obtained a degree from the University of St Andrews had in fact spent most, and sometimes all, of their undergraduate career based in Dundee.
In 1967, the union with Queen's College Dundee (formerly University College Dundee) ended, when that College became an independent institution under the name of the University of Dundee. As a result of this, St Andrews lost its capacity to provide degrees in many areas such as Medicine, Dentistry, Law, Accountancy, and Engineering. As well as losing the right to confer the undergraduate medical degree MBChB, it was also deprived of the right to confer the postgraduate degree MD. St Andrews was eventually able to continue to offer the opportunity to study medicine through a new arrangement with the University of Manchester in England.
In 1972, the College of St Leonard was reconstituted as a postgraduate institute.
Links with the United States
St Andrews' historical links with the United States predate the country's independence. James Wilson, a signer of the Declaration of Independence, attended (but did not graduate from) St Andrews. Wilson was one of six original justices appointed by George Washington to the Supreme Court of the United States and was a founder of the University of Pennsylvania Law School. Other prominent American figures associated with St Andrews include Scottish American industrialist Andrew Carnegie, who was elected Rector in 1901 and whose name is given to the prestigious Carnegie Scholarship, and Edward Harkness, an American philanthropist who in 1930 provided for the construction of St Salvator's Hall. American Bobby Jones, co-founder of the Augusta National Golf Club and the Masters Tournament, was named a Freeman of the City of St Andrews in 1958, becoming only the second American to be so honoured, the other being Benjamin Franklin in 1759. Today a highly competitive scholarship exchange, The Robert T. Jones Scholarship, exists between St Andrews and Emory University in Atlanta. An undergraduate joint degree programme has been in place with the College of William & Mary in Virginia, offering studies in some major areas.
Links with the United States have been maintained into the present day and continue to grow. In 2009, Louise Richardson, an Irish-American political scientist specialising in the study of terrorism, was drawn from Harvard to serve as the first female Principal and Vice Chancellor of St Andrews. She later went on to her next appointment as the Vice Chancellor to the University of Oxford.
Active recruitment of students from North America first began in 1984, with Americans now making up around 1 in 6 of the student population in 2017. Students from almost every state in the United States and province in Canada are represented. This is the highest proportion and absolute number of American students amongst all British universities. Media reports indicate growing numbers of American students are attracted to the university's academics, traditions, prestige, internationalism, and comparatively low tuition fees. The university also regularly features as one of the few non-North American universities in the Fiske Guide to Colleges, an American college guide, as a 'Best Buy'. St Andrews has developed a sizable alumni presence in the United States, with over 8000 alumni spread across all 50 states. Most major cities host alumni clubs, the largest of which is in New York. Both London and New York also host the St Andrews Angels, an alumni led angel investment network, which centres upon the wider university communities in both the United Kingdom and United States. St Andrews has also established relationships with other university alumni clubs and private membership clubs in the United States to provide alumni with social and networking opportunities. For example, alumni are eligible for membership at the Princeton Club of New York, the Penn Club of New York City and the Algonquin Club in Boston.
In 2013, Hillary Clinton, former United States Secretary of State, took part in the academic celebration marking the 600th anniversary of the founding of the University of St Andrews. Clinton received an honorary degree of Doctor of Laws and delivered the graduation address.
Governance and administration
As with the other ancient universities of Scotland, the governance of the university is determined by the Universities (Scotland) Act 1858. This act created three bodies: the General Council, University Court and Academic Senate (Senatus Academicus).
General Council
The General Council is a standing advisory body of all the graduates, academics and former academics of the university. It meets twice a year and appoints a business committee to manage business between these meetings. Its most important functions are to appoint two assessors to the University Court and elect the university's chancellor.
University Court
The University Court is the body responsible for administrative and financial matters, and is in effect the governing body of the university. It is chaired by the rector, who is elected by the matriculated students of the university. Members are appointed by the General Council, Academic Senate and Fife Council. The President of the Students' Association and Director of Education are ex officio members of the Court. Several lay members are also co-opted and must include a fixed number of alumni of the university.
Senatus Academicus
The Academic Senate (Latin Senatus Academicus) is the supreme academic body for the university. Its members include all the professors of the university, certain senior readers, a number of senior lecturers and lecturers and three elected student senate representatives – one from the arts and divinity faculty, one from the science and medicine faculty and one postgraduate student. It is responsible for authorising degree programmes and issuing all degrees to graduates, and for managing student discipline. The President of the Senate is the University Principal.
Office of the Principal
The Principal is the chief executive of the university and is assisted in that role by several key officers, including the Deputy Principal, Master of the United College and Quaestor. The principal has responsibility for the overall running of the university and presides over the University Senate.
Rector
In Scotland, the position of rector exists at the four ancient universities (St Andrews, Glasgow, Aberdeen and Edinburgh) – as well as the University of Dundee. The post was made an integral part of these universities by the Universities (Scotland) Act 1889. The Rector of the University of St Andrews chairs meetings of the University Court, the governing body of the university; and is elected by the matriculated student body to ensure that their needs are adequately considered by the university's leadership. Through St Andrews' history a number of notable people have been elected to the post, including the actor John Cleese, industrialist and philanthropist Andrew Carnegie, author and poet Rudyard Kipling and the British Prime Minister Archibald Primrose, 5th Earl of Rosebery.
Colleges
The university encompasses three colleges: United College, St Mary's College and St Leonard's College. The purpose of the colleges at St Andrews is mainly ceremonial, as students are housed in separate residential halls or private accommodations. United College is responsible for all students in the faculties of arts, sciences and medicine, and is based around St Salvator's Quadrangle. St Mary's College is responsible for all students studying in the Faculty of Divinity, and has its own dedicated site in St Mary's Quadrangle. St Leonard's College is now responsible for all postgraduate students.
Faculties and schools
The four academic faculties collectively encompass 18 schools. A dean is appointed by the Master of the United College to oversee the day-to-day running of each faculty. Students apply to become members of a particular faculty, as opposed to the school within which teaching is based. The faculties and their affiliated schools are:
Faculty of Arts: art history, classics, economics, English, film studies, history, international relations, management, modern languages, philosophy, social anthropology.
Faculty of Divinity: divinity.
Faculty of Medicine: medicine.
Faculty of Science: biology, chemistry, computer science, geography and geosciences, mathematics, physics and astronomy, psychology and neuroscience.
Certain subjects are offered within both the Faculty of Arts and the Faculty of Science; the six subjects are economics, geography, management, mathematics, psychology and sustainable development. The content of the subject is the same regardless of the faculty.
Academics
Semesters
The academic year at St Andrews is divided into two semesters, Martinmas and Candlemas, named after two of the four Scottish Term and Quarter Days. Martinmas, on 11 November, was originally the feast of Saint Martin of Tours, a fourth-century bishop and hermit. Candlemas originally fell on 2 February, the day of the feast of the Purification, or the Presentation of Christ. Martinmas semester runs from early September until mid-December, with examinations taking place just before the Christmas break. There follows an inter-semester period when Martinmas semester business is concluded and preparations are made for the new Candlemas semester, which starts in January and concludes with examinations at the end of May. Graduation is celebrated at the end of June.
Rankings and reputation
In the 2022 edition of The Times and Sunday Times Good University Guide, St Andrews ranked as the best university in the UK, becoming the first university ever to top Oxford and Cambridge in a British ranking.
In a ranking conducted by The Guardian in 2009, St Andrews placed fifth in the UK for national reputation behind Oxford, Cambridge, Imperial & LSE. When size is taken into account, St Andrews ranks second in the world out of all small to medium-sized fully comprehensive universities (after Brown University) using metrics from the QS Intelligence Unit in 2015.
The 2014 Research Excellence Framework ranked St Andrews 14th in the UK, and second in Scotland, amongst multi-faculty institutions for the research quality (GPA) of its output profile. St Andrews was ranked ninth overall in The Sunday Times 10-year (1998–2007) average ranking of British universities based on consistent league table performance, and is a member of the 'Sutton 13' of top ranked Universities in the UK.
Nearly 86% of its graduates obtain a First Class or an Upper Second Class Honours degree. The ancient Scottish universities award Master of Arts degrees (except for science students who are awarded a Bachelor of Science degree) which are classified upon graduation, in contrast to Oxbridge where one becomes a Master of Arts after a certain number of years, and the rest of the UK, where graduates are awarded BAs. These can be awarded with honours; the majority of students graduate with honours.
In 2017, St Andrews was named as the university with the joint second highest graduate employment rate of any UK university (along with Warwick), with 97.7 per cent of its graduates in work or further study three and a half years after graduation. St Andrews is placed seventh in the UK (1st in Scotland) for the employability of its graduates as chosen by recruiters from the UK's major companies with graduates expected to have the best graduate prospects and highest starting salaries in Scotland as ranked by The Times and Sunday Times Good University Guide 2016 and 2017. According to data released by the Department for Education in 2018, St Andrews was rated as the fifth best university in the UK for boosting male graduate earnings with male graduates seeing a 24.5% increase in earnings compared to the average graduate, and the ninth best university for females, with female graduates seeing a 14.8% increase in earnings compared to the average graduate. An independent report conducted by Swedish investment firm, Skandia found that despite its small undergraduate body, St Andrews is the joint-5th best university in the UK for producing millionaires. A study by High Fliers confirmed this by reporting that the university also features in the top 5 of UK universities for producing self-made millionaires. According to a study by the Institute of Employment Research, St Andrews has produced more directors of FTSE 100 companies in proportion to its size than any other educational institution in Britain.
In the 2019 Complete University Guide, 24 out of the 25 subjects offered by St Andrews rank within the top 10 nationally, making St Andrews one of only three multi-faculty universities (along with Cambridge and Oxford) in the UK to have over 95% of their subjects in the top 10. The Times and Sunday Times Good University Guide 2017 revealed that 24 of the 26 subjects offered by St Andrews ranked within the top 6 nationally with 10 subjects placing within the top 3 including English, Management, Philosophy, International Relations, Italian, Physics and Astronomy and Classics and Ancient History. The Guardian University Guide 2019 ranked Biosciences, Computer Science, International Relations, Physics and Psychology first in the UK. Earth and Marine Sciences, Economics, English, Management, Mathematics, Philosophy and Theology placed within the top three nationally. In the 2015-16 Times Higher Education World University Rankings, St Andrews is ranked 46th in the world for Social Sciences, 50th in the world for Arts and Humanities and 74th in the world for Life Sciences. The 2014 CWTS Leiden rankings, which "aims to provide highly accurate measurements of the scientific impact of universities", placed St Andrews 39th in the world, ranking it fifth domestically. The philosophy department is ranked sixth worldwide (3rd in Europe) in the 2020 QS World University Rankings whilst the graduate programme was ranked 17th worldwide (2nd in the UK) by the 2009 Philosophical Gourmet's biennial report on Philosophy programs in the English-speaking world.
Admissions
The university receives applications mainly through UCAS and the Common Application with the latest figures showing that there are generally 12 applications per undergraduate place available. Overall, the university is one of the most competitive universities in the UK, with 2016-17 having an acceptance rate of 8.35% and offer rate of 22.5% for Scottish/EU applicants where places are capped by the Scottish Government. In 2017, the most competitive courses for Scottish/EU applicants were those within the Schools of International Relations, Management, and Economics and Finance with offer rates of 8.0%, 10.9% and 11.5% respectively. The standard offer of a place tends to require five best Highers equivalent to AAAAB, three best A-levels equivalent to AAA or a score of at least 38 points on the International Baccalaureate. Successful entrants have, on average, 525 UCAS points (the equivalent of just above A*A*AA at A Level) ranking it as the fifth highest amongst higher education institutions in the UK for the 2015 admissions cycle with The Telegraph naming it as the hardest university into which to gain admission in Scotland.
The university has one of the smallest percentages of students (13%) from lower-income backgrounds out of all higher education institutions in the UK. Around 40% of the student body is from independent schools, and the university hosts the highest proportion of financially independent students (58%) in the UK. The university participates in widening access schemes such as the Sutton Trust Summer School, First Chances Programme, REACH & SWAP Scotland, and Access for Rural Communities (ARC) in order to promote wider uptake among groups traditionally under-represented at university. In the seven-year period between 2008 and 2015, the number of pupils engaged with annual outreach programmes at the university increased about tenfold, while the number of students arriving at St Andrews from the most deprived backgrounds increased by almost 50 per cent in 2015 alone. The university has a higher proportion of female than male students, with a female ratio of 59.7% in the undergraduate population.
Lecture series
To commemorate the university's 600th anniversary the 600th Lecture Series was commissioned in 2011, which brought diverse speakers such as former Prime Minister Gordon Brown, naturalist David Attenborough and linguist Noam Chomsky to St Andrews.
As part of the celebration of the 400th anniversary of the establishment of the King James Library, the King James Library lectures were initiated in 2009 on the subject of 'The Meaning of the Library'.
The Andrew Lang Lecture series was initiated in 1927, and named for alumnus and poet Andrew Lang. The most famous lecture in this series is that given by J. R. R. Tolkien in March 1939, entitled 'Fairy Stories', but published subsequently as 'On Fairy-Stories'.
The computing Distinguished Lecture Series was initiated in 1969 by Jack Cole.
Exchange programmes
St Andrews has developed student exchange partnerships with universities around the globe, though offerings are largely concentrated in North America, Europe, and Asia. Exchange opportunities vary by School and eligibility requirements are specific to each exchange program.
In North America, the highly competitive Bachelor of Arts International Honours program, run in conjunction with The College of William and Mary in Williamsburg, Virginia, allows students studying Classical Studies, Film Studies, International Relations, English, History, or Economics to spend two years at each institution and earn a joint degree from both. The Robert T. Jones Memorial Trust funds the Robert T. Jones Jr. Scholarship, which allows select St Andrews students to study, fully funded, for a year at Emory University in Atlanta, and Western University and Queen's University in Canada. The Robert Lincoln McNeil Scholarship allows students to study at the University of Pennsylvania.
One of the largest North American exchanges is with the University of California system, in which students can study at UC Berkeley, UC Los Angeles (UCLA), UC Santa Cruz (UCSC) and UC San Diego (UCSD).
Other North American partners offering multiple exchanges include the University of Virginia, the University of North Carolina at Chapel Hill, Washington University in St. Louis, Washington and Lee University, Elon University, and the University of Toronto. Some exchanges are offered within specific research institutes at St Andrews, rather than across entire Schools. For example, the Handa Centre for the Study of Terrorism and Political Violence (CSTPV), within the School of International Relations, offers student exchanges in partnership with the School of Foreign Service at Georgetown University.
St Andrews participates in the Erasmus Programme and has direct exchanges with universities across Europe. For example, in France exchanges are offered at the Sorbonne, Sciences Po, and University of Paris VI. In the Netherlands students can study at Leiden University and Utrecht University. Narrower exchanges include those with the University of Copenhagen, the University of Oslo, and Trinity College Dublin. Exchanges are also available for postgraduate research students, such as the opportunity for social scientists to study at the European University Institute in Florence, Italy.
More recently, St Andrews has developed exchanges with partners in Asia and Australia. Notable partners include the University of Hong Kong and Renmin University of China, National University of Singapore, and the University of Melbourne in Australia.
Buildings, collections and facilities
The University of St Andrews is situated in the small town of St Andrews in rural Fife, Scotland. The university has teaching facilities, libraries, student housing and other buildings spread throughout the town. Generally, university departments and buildings are concentrated on North Street, South Street, The Scores, and the North Haugh. The university has two major sites within the town. The first is the United College, St Andrews (also known as the Quad or St Salvator's) on North Street, which functions both as a teaching space and venue for student events, incorporating the Departments of Social Anthropology and Modern Languages. The second is St Mary's College, St Andrews, based on South Street, which houses the Schools of Divinity, Psychology and Neuroscience, as well as the King James Library. Several schools are located on The Scores including Classics, English, History, Philosophy, the School of Economics and Finance, and International Relations, as well as the Admissions department, the Museum of the University of St Andrews, and the Principal's residence, University House. North Street is also the site of several departments including, the Principal's Office, Younger Hall, Department of Film Studies, and the University Library. The North Haugh is principally home to the Natural Sciences such as Chemistry, Physics, Biology, as well as Mathematics, Computer Science, Medicine and the School of Management.
Libraries and museums
The University of St Andrews maintains one of the most extensive university library collections in the United Kingdom, which includes significant holdings of books, manuscripts, muniments and photographs. The library collection contains over a million volumes and over two hundred thousand rare and antique books.
The university library was founded by King James VI in 1612, with the donation of 350 works from the royal collection, at the urging of George Gledstanes, the then chancellor of St Andrews, although the libraries of the colleges of St Leonard's College, St Salvator's College and St Mary's College had existed prior to this. From 1710 to 1837 the library functioned as a legal deposit library, and as a result has an extensive collection of 18th-century literature.
The library's main building is located on North Street, and houses over 1,000,000 books. The library was designed by the architects Faulkner-Brown Hendy Watkinson Stonor, based at Killingworth in North East England. Faulkner-Brown specialised in libraries and leisure facilities and also designed the National Library of Canada in Ottawa and the Robinson Library at Newcastle University. In 2011 the main library building underwent a £7 million re-development. The historic King James library, built in 1643, houses the university's Divinity and Medieval history collections.
In 2012 the university purchased the vacant Martyrs' Kirk on North Street, with the purpose of providing reading rooms for the Special Collections department and university postgraduate research students and staff.
The university maintains several museums and galleries, open free to the public. The Museum of the University of St Andrews (MUSA) opened in 2008 and displays some highlights of the university's extensive collection of over 100,000 artefacts. It displays objects relating both to the history of the university, such as its collection of 15th-century maces, and also unrelated objects, such as paintings by John Opie, Alberto Morrocco and Charles Sims. Several of the university's collections have been recognised as being of 'national significance for Scotland' by Museums Galleries Scotland.
The Bell Pettigrew Museum houses the university's natural history collections. Founded in 1912, it is housed in the old Bute Medical School Building in St. Mary's Quad. Among its collections are the remains of several extinct species such as the dodo and Tasmanian tiger as well as fossilised fish from the nearby Dura Den, Fife, which when found in 1859 stimulated the debate on evolution.
Chapels
The university has two collegiate chapels. The chapel of St Salvator's was founded in 1450 by Bishop James Kennedy, and today it is a centre of university life. St Salvator's has a full peal of six bells, and is therefore the only university chapel in Scotland suitable for change ringing. The Chapel of St Leonard's is located in the grounds of the nearby St Leonards School. It is the university's oldest building, with some parts dating from 1144, and is the smaller of the two chapels. St Salvator's and St Leonard's both have their own choirs, whose members are drawn from the student body.
Student halls
St Andrews is characterised amongst Scottish universities as having a significant number of students who live in university-maintained accommodation. As of 2012, 52% of the student population live in university halls. The halls vary widely in age and character; the oldest, Deans Court, dates from the 12th century, and the newest, Whitehorn Hall, was built in 2018. They are built in styles from Gothic revival to brutalist. All are now co-educational and non-smoking, and several are catered. The university guarantees every first-year student a place of accommodation, and many students return to halls in their second, third and final years at St Andrews. From September 2015 onward, students have had the option of living in alcohol-free flats in David Russell Apartments on the grounds of medical conditions that do not allow drinking or for religious reasons.
Halls of residence include:
Agnes Blackadder Hall
Albany Park (demolished 2019–2021)
Andrew Melville Hall
David Russell Apartments
Fife Park Apartments
Gannochy House
Hamilton Hall
John Burnet Hall
McIntosh Hall
Powell Hall (Postgraduate only)
St Regulus Hall
St Salvator's Hall
University Hall
Whitehorn Hall (addition to University Hall, 2018)
Angus and Stanley Smith Houses (Postgraduate only)
Deans Court (Postgraduate only)
St Gregory's (Postgraduate only)
Hepburn Hall
Renewable energy projects
Since 2013, the university's endowment has been invested under the United Nations Principles of Responsible Investment (UNPRI) initiative with a sustainable ethical policy enforced since 2007. The university has the target of being the UK's first carbon neutral university and has invested in creating two new macro-scale renewable energy sites.
The Guardbridge Biomass Energy Centre will generate power using locally sourced wood-fuelled biomass; hot water will be transported to the university through underground pipes to heat and cool laboratories and student residences. The £25 million project is expected to save 10,000 tonnes of carbon annually, and the university aims to establish the site as a knowledge exchange hub which would provide "missing link" facilities to allow research and discoveries made in university labs to be translated into working prototypes. Work began onsite in 2014 and the centre is expected to be operational by December 2015.
In October 2013, the university received permission to build six medium-sized turbines at Kenly Wind Farm, near Boarhills. The wind farm is expected to be operational by 2017 and to bring an estimated £22 million boost to the local and national economy, with 19,000 tonnes of carbon saved annually.
Student life
Students' Association
The University of St Andrews Students' Association is the organisation which represents the student body of the University of St Andrews.
It was founded in 1885 and comprises the students' representative council (SRC) and the Student Activities Forum (SAF) (previously known as the Students' Services Council (SSC)). The Students' Association has 10 SRC subcommittees and 11 SAF subcommittees: SRC: Accommodation, Alumni, BAME Students' Network, Community Relations (ComRels), Disabled Students Network (DSN), Environment, Equal Opportunities (EqualOps), Life Long and Flexible Learners (Lifers), SaintsLGBT+, and Wellbeing. SAF: The Entertainments 'Ents' Committee, Charities Campaign, Union Debating Society, STAR (St Andrews Radio), Mermaids Performing Arts Fund, Design Team, SVS (Student Voluntary Service), the Music Fund (prev. Music is Love), On the Rocks (an annual arts festival), Societies Committee, and the Postgraduate Society. Every matriculated student is automatically a member of each subcommittee.
The Students' Association Building (informally known as the Union) is located on St Mary's Place, St Andrews. Union facilities include several bars (Main, Beacon, and Sandy's) and the university's Student Support Services. In 2013 the Students' Association Building underwent a refurbishment. The Students' Association is affiliated to, and a founding member of, the Coalition of Higher Education Students in Scotland but unlike many other students' unions in the UK is not a member of the National Union of Students, having most recently rejected membership in a referendum in November 2012.
Societies
St Andrews is home to over 200 student societies which cover a wide range of interests.
The oldest student society in St Andrews is the University of St Andrews Celtic Society which has run continuously without mergers since 1796. It promotes Scottish culture to students of the university and the wider community. Currently it does this through Scottish Country Dance and Scottish Gaelic Language Classes. Its Scottish Country Dance activities are affiliated with the Royal Scottish Country Dance Society (RSCDS).
All matriculated students are members of the "Union Debating Society", a student debating society that holds weekly public debates in Lower Parliament Hall, often hosts notable speakers, and participates in competitive debating in both national and international competitions. Its origins go back to the 1794 founding of the Literary Society; however, its current form dates only to the 1890 merger with the Classical Society. Since its roots can be traced back to 1794, it claims to be the oldest continuously run student debating society in the world.
There is a strong tradition of student media at St Andrews. The university's two newspapers are The Saint, a fortnightly publication, and The Stand, an online publication founded in 2011. There is also the Foreign Affairs Review, run by the Foreign Affairs Society; the first legal publication in the town, the St Andrews Law Review, was launched in 2020. There are also a number of smaller student publications, including The Wynd, a student-run magazine, and The Regulus, a student magazine focusing on politics and current affairs. In addition, there are several student-led academic journals, most notably Stereoscope Magazine, which focuses on student photography and raising awareness of the university's historic photographic collection, Ha@sta, an annual journal for those interested in art history, Aporia, the journal of the Philosophy Society, and the Postgraduate Journal of Art History and Museum Studies. The university's radio station is STAR radio, an online station that broadcasts 24/7 during term time. Scoot Around is a literary-cultural magazine based in St Andrews with contributors from universities around the world. The Sinner is an independent website and discussion forum set up by students of the university.
The university's Music Society comprises many student-run musical groups, including the university's flagship symphony orchestra, wind band, and chorus. One of the oldest choirs in the university is the St Andrews University Madrigal Group, which performs a concert each term and has an annual summer tour. The A Cappella Society represents all four a cappella groups at St Andrews: The Other Guys, The Alleycats, The Accidentals and The Hummingbirds. From 2009 to 2011, all four of these groups participated in The Voice Festival UK (VF-UK) competition, and The Other Guys, The Accidentals and The Alleycats all reached the London final.
Student theatre at the University of St Andrews is funded by the Mermaids Performing Arts fund. There are regular dramatic and comedic performances staged at the Barron theatre. Blind Mirth is the university's improvisational theatre troupe, which performs weekly in the town, and annually takes a production to the Edinburgh Fringe Festival.
The Kate Kennedy Club plays a significant role in the life of the university, maintaining university traditions such as the Kate Kennedy Procession, in which students parade through the town dressed as eminent figures from the university's history, and organising social events such as the Opening and May balls. Founded in 1926, the club is composed of around thirty matriculated students, who are selected by the club's members. The club has received criticism from the university's former principal, Louise Richardson, and alumna the Duchess of Cambridge, Kate Middleton, over its previously male-only admission policy. In 2012, the club decided to allow female students to join.
St Andrews is home to several other private clubs, such as The Kensington Club, an all-male dining club founded in 1739 by Alexander Laird Balgonie that organises private events for members. The St Andrews Fight Club hosts an annual boxing match, training 20 amateur boxers in an intensive course.
Sports clubs and the Athletic Union
The University of St Andrews Athletic Union is the student representative body for sport. Established in 1901, it is affiliated to BUCS and encompasses around sixty sport clubs, who compete at both a recreational and high-performance level. A notable club is the University of St Andrews Rugby Football Club, which played a pivotal role in shaping the sport and has produced Scottish international players such as J. S. Thomson and Alfred Clunies-Ross.
The university is currently going through a £14 million five-phase development of the student sports centre which will include a new 400-seat eight-court sports hall, a new reception area and expanded gym facilities.
The Scottish Varsity, also known as the 'world's oldest varsity match', is played annually against the University of Edinburgh.
Traditions
Sponsio Academica
In order to become a student at the university a person must take an oath in Latin at the point of matriculation, called the Sponsio Academica, although this tradition now has been digitised and is agreed to as part of an online matriculation process.
In English:
We students who set down our names hereunder in all good faith make a solemn promise that we shall show due deference to our teachers in all matters relating to order and good conduct, that we shall be subject to the authority of the Senatus Academicus and shall, whatever be the position we attain hereafter, promote, so far as lies in our power, the profit and the interest in our University of St Andrews. Further, we recognise that, if any of us conducts ourselves in an unbecoming or disorderly manner or shows insufficient diligence in their studies and, though admonished, does not improve, it is within the power of the Senatus Academicus to inflict on such students a fitting penalty or even expel them from the University.
Gowns
One of the most conspicuous traditions at St Andrews is the wearing of academic dress, particularly the distinctive red undergraduate gown of the United College. Undergraduates in Arts and Science subjects can be seen wearing these garments at the installation of a rector or chancellor, at chapel services, on 'Pier Walks', at formal hall dinners, at meetings of the Union Debating Society, and giving tours to prospective students and visitors as well as on St Andrews Day. Divinity students wear a black undergraduate gown with a purple saltire cross on the left facing. Postgraduates wear the graduate gown or, as members of St Leonard College, may wear a black gown trimmed with burgundy, introduced for graduate students whose original university is without academic dress. (See Academic dress of the University of St Andrews.) St Mary's College Post Graduates, however, wear their graduate gown with a purple saltire cross on the left facing.
Bejant
Bejant is a term used to refer to first-year male students, with females described as Bejantines. Second-year students are known as Semis, a student in their third year may be referred to as a Tertian, and a student in their final year as a Magistrand. These terms are thought to be unique to St Andrews. When wearing their traditional red gowns, students in each year may be identified according to the way they wear their gowns. In the first year, the gown is worn on the shoulders; in the second year it is worn slightly off the shoulders. In the third year arts students wear their gowns off their left shoulders, and science students off their right shoulders. Finally, fourth years wear their gowns right down to their elbows, ready to shed their scarlet gowns for the black graduation gown. The gown is never to be joined at the top, as this is considered bad luck.
Academic parents
The students of the university enjoy an unusual family tradition designed to make new students feel at home and build relationships within the student body. Traditionally, a Bejant or Bejantine acquires academic parents who are at least in their third year as students. These older students act as informal mentors in academic and social matters and it is not uncommon for such academic family ties to stretch well beyond student days. Tradition has it that a Bejant may ask a man to be his Senior Man but must be invited by a woman who is prepared to be his Senior Woman. Similarly, a Bejantine may ask a male to be her Senior Man but there is no overt rule regarding how she acquires a Senior Woman. The establishment of these relationships begins at the very start of the first semester – with the aim of being in place ahead of Raisin Weekend.
Raisin Weekend
Raisin Weekend celebrates the relationship between the Bejants/Bejantines (first-year students) and their respective academic parents who, in St Andrews' tradition, guide and mentor them in their time at the university. It is traditionally said that students went up to study with a sack of oatmeal and a barrel of salt-herring as staple foods to last them a term and that, therefore, anything more exotic was seen as a luxury. In return for the guidance from academic parents a further tradition sprang up of rewarding these "parents" with a pound of raisins. Since the 19th century the giving of raisins was steadily transformed into the giving of a more modern alternative, such as a bottle of wine (although presents are now rarely expected). In return for the raisins or equivalent present, the parents give their "children" a formal receipt — the Raisin Receipt — composed in Latin. Over time this receipt progressively became more elaborate and often humorous. The receipt can be written on anything and is to be carried everywhere by the Bejant/Bejantine on the morning of Raisin Monday until midday.
Raisin Weekend is held annually over the last weekend of October. Affairs often begin with a tea party (or similar) thrown by the mother(s) and then a pub-crawl or house party led by the father(s). It is fairly common for several academic families to combine in the latter stages of the revels. At midday all the First-Years gather in Quad of St Salvator's College to compare their receipts and also to be open to challenge from older students who may look for errors in the Latin of the receipt (an almost inevitable occurrence). Upon detection of such error(s) the bearer may be required to sing the Gaudie. In more recent years the gathering has culminated in a shaving foam fight. Since 2014, the foam fight has been moved from St Salvator's Quad to the adjacent Lower College Lawn. Raisin Weekend has also become synonymous with binge drinking and a certain amount of humiliation of "academic children", commonly involving embarrassing costumes or drinking games. The University Students' Association provides a special First Aid hotline for Raisin Weekend.
Cobblestones
Situated around the town of St Andrews are cobblestone markings denoting where Protestant martyrs were burnt at the stake. To students, the most notable of these is the cobblestone initials "PH" located outside the main gate of St Salvator's College. These cobblestones denote where Patrick Hamilton was martyred in 1528. According to student tradition, stepping on the "PH" will cause a student to become cursed, with the effect that the offender will fail his or her degree and so students are known to jump over the cobblestones when passing. The 'curse' is said to be lifted by participating in the May Dip.
May Dip
The May Dip is a student tradition held annually at dawn on May Day. Students usually stay awake until dawn, at which time they collectively run into the North Sea to the sound of madrigals sung by the University Madrigal Group. Students purportedly do so to cleanse themselves of any academic sins (which they may have acquired by stepping on the PH cobblestone) before they sit exams in May. In 2011, the event was "officially" moved by the Students' Association to East Sands in response to concerns for health and safety in its former location on Castle Sands.
Publications
The Centre for the Study of Terrorism and Political Violence (CSTPV), within the School of International Relations, publishes the online open-access journal Contemporary Voices: St Andrews Journal of International Relations (formerly Journal of Terrorism Research).
Notable people
Alumni
Notable University of St Andrews alumni include King James II of Scotland; United States Declaration of Independence signatory James Wilson (1761); Governor General of Canada John Campbell; discoverer of logarithms John Napier (1563); founder of the Church of Scotland and leader of the Protestant Reformation John Knox (1531); notable Leader of the Church of Scotland Thomas Chalmers; founder of and the first Chancellor of the University of Glasgow William Turnbull; founder of the University of Edinburgh Robert Reid; founder of the world's first commercial savings bank Henry Duncan (1823); journalist and politician during the French Revolution Jean-Paul Marat (1775 MD); inventor of beta-blockers, H2 receptor antagonists and Nobel Prize in Medicine winner James W. Black (1946 MB ChB); the 'father of military medicine' Sir John Pringle, 1st Baronet; pioneer of the smallpox vaccine Edward Jenner (1792 MD); Prince William, Duke of Cambridge (2005) and Catherine, Duchess of Cambridge (2005).
Alumni in the fields of academia and education have gone on to found the University of Melbourne Medical School (Anthony Brownless) and the Scottish Church College in Calcutta (Alexander Duff was also the first Scottish missionary to India), become the first Regent and first Principal of the University of Edinburgh (Robert Rollock), Dean of Harvard Divinity School (David Hempton), the Vice Chancellors of Aberdeen University (Ian Diamond), University of Nottingham (Shearer West), Open University (Walter Perry was also the first Vice-Chancellor) and Sydney University (Gavin Brown), Chancellor of the University of Maine system (James H. Page), provost of Eton College (Eric Anderson), discoverer of the Berry Phase (Sir Michael Berry) and inventor of the Leslie cube John Leslie.
In business and finance, St Andrews graduates have become the CEOs of multinational companies including the Bank of Russia, BHP (Andrew Mackenzie), BP (Robert Horton), FanDuel (Nigel Eccles co-founded the company with fellow St Andrews graduate, Lesley Eccles), Rolls-Royce Holdings (John Rose), Royal Dutch Shell (Robert Paul Reid), Tate & Lyle (Iain Ferguson) and Royal Bank of Scotland (George Mathewson). Other notable businesspeople include Banker Olivier Sarkozy, Director of the Edinburgh Festival Fringe Alistair Moffat and the CEO of Scottish Rugby Union and ATP World Tour Finals Phil Anderton.
Former St Andrews students active in politics and national intelligence include two Chiefs of MI6 Alex Younger and John Sawers, two deputy directors of the Secret Intelligence Service (MI6), George Kennedy Young and J. M. Bruce Lockhart, Secretary of State for Scotland Lord Forsyth (Forsyth is also former Deputy Chairman of JP Morgan), former First Minister of Scotland and leader of the SNP for over 20 years Alex Salmond, former Cabinet Secretary and Head of the Civil Service Sir Mark Sedwill, former Secretary of State for Defence Sir Michael Fallon, Deputy Leader of the Liberal Democrats Malcolm Bruce and leader of the Christian Party James George Hargreaves. Outside of the UK, alumni include the Financial Secretary of Hong Kong credited with laying the foundations for Hong Kong's economic success John James Cowperthwaite, former Senior Director for European and Russian Affairs on the United States National Security Council, Fiona Hill, David Holmes (both were involved in the Impeachment inquiry against Donald Trump), and the first female cabinet minister in Egypt Hikmat Abu Zayd. Alumni have also gone on to serve as diplomats including the current Permanent Representative of the United Kingdom to the United Nations and former British Ambassador to China (2015-2020) Dame Barbara Woodward, former Ambassador to Russia (2008-2011) Dame Anne Pringle and Thomas Bruce who is known for the removal of the Elgin Marbles from the Parthenon.
Alumni from the media and the arts include founder of Forbes magazine B. C. Forbes, founder of The Week Jolyon Connell, current Downing Street Director of Communications and former Controller of BBC World News Craig Oliver, Political Editor of BBC Scotland Brian Taylor, BBC News presenter Louise Minchin, BBC Sport TV presenter Hazel Irvine, Primetime Emmy Award winning screenwriter David Butler, Pulitzer Prize winning author James Michener, feminist writer Fay Weldon, musician The Pictish Trail and actors Siobhan Redmond, Crispin Bonham-Carter, Ian McDiarmid and Jonathan Taylor Thomas.
Other notable alumni include 'father of the poll tax' Douglas Mason, founders of the Adam Smith Institute, Madsen Pirie and Eamonn Butler, former Lord Justice General Lord Cullen, two currently sitting members of the Inner House, Lord Eassie and Baroness Clark of Calton, one of the leading figures in the formation of the United States Golf Association Charles B. Macdonald, the captain of Tottenham Hotspur F.C. during its double-winning season Danny Blanchflower, and the wildlife conservationist Saba Douglas-Hamilton.
The university also boasts of a rich roll of honorary graduates whose members vary from Benjamin Franklin to Hillary Clinton, from Bob Dylan to Arvo Pärt, from Maggie Smith to Sean Connery, from Nora K. Chadwick to Noam Chomsky, from Joseph Stevenson to Lisa Jardine, from Seamus Heaney to Bahram Beyzai, from Georg Cantor to David Attenborough.
Academics
Notable University of St Andrews faculty include Nobel Prize in Medicine winner Maurice Wilkins (lecturer in physics 1945–1946) and discoverer of herring bodies Percy Theodore Herring (Chandos Chair of Medicine and Anatomy 1908–1948). The Morris water navigation task was developed by Richard Morris at the university's Gatty Marine Laboratory.
Anthropology
Paloma Gay y Blasco
Peter Gow
Ladislav Holý
Joanna Overing
Biology
Struther Arnott
Maria Dornelas
Patrick Geddes
Tracey Gloster
Adrian Horridge
Susan D. Healy
D'Arcy Wentworth Thompson
Business and Management
Meaghan Delahunt
Robert Gray
Chemistry
Peter Bruce
Rebecca Goss
Norman Haworth
James Irvine
Russell E Morris
James H Naismith
Catherine Steele
Michael Bühl
Classics
Walter Burkert
Lewis Campbell
Chris Carey
John Craig
James Donaldson
Stephen Halliwell
Wallace Lindsay
William Lorimer
Computer Science
Jack Cole
Ian Gent
Divinity
John Adamson
Mario Aguilar
Robert Arnot
Donald Macpherson Baillie
Robert Baron
Richard Bauckham
Matthew Black
Ian Bradley
David Brown
Thomas Chalmers
Nicol Dalgleish
Ivor Davidson
George Duncan
Philip Esler
Timothy Gorringe
James Haldenston
Robert Halliday
Daphne Hampson
Alexander Henderson
George Hill
Nicholas Thomas Wright
Economics
Ralph Harris, Baron Harris of High Cross
David A. Jaeger
Clara Ponsatí i Obiols
Engineering
Angus Robertson Fulton
English, Literature, and Poetry
Michael J. Alexander
Meg Bateman
John Burnside
Robert Crawford
Douglas Dunn
Roger Lancelyn Green
Robert Irwin
Kathleen Jamie
John Johnston
A. L. Kennedy
William Angus Knight
Don Paterson
Languages and Linguistics
Peter Branscombe
George Hadow
Geology
Christopher Hawkesworth
History and Art History
G.W.S. Barrow
Robert Bartlett
Alison Beach
Paul Bibire
Michael Brown
George Buchanan
Nora K. Chadwick
Barrie Dobson
Norman Gash
John Guy
Robert Kerr Hannay
John Hudson
Martin Kemp
John Philipps Kenyon
Hamish Scott
Alex Woolf
Tomasz Kamusella
International Relations and Politics
Bruce Hoffman
John Lindsay of Balcarres, Lord Menmuir
Hew Strachan
David Veness
Paul Wilkinson
Mathematics and Astronomy
John Couch Adams
Rosemary A. Bailey
Kenneth Falconer
Eric Priest
James Gregory
John Mackintosh Howie
Douglas Samuel Jones
Peter Cameron
Media and Film Studies
Dina Iordanova
Medicine and Physiology
John Adamson
Oswald Taylor Brown
George Edward Day
Margaret Fairlie
John Forfar
Percy Theodore Herring
Robert Hunter
Joseph Fairweather Lamb
Philosophy and Logic
Thomas Spencer Baynes
Piers Benn
Bernard Bosanquet
C. D. Broad
Sarah Broadie
Herman Cappelen
Gershom Carmichael
Laurence Jonathan Cohen
James Main Dixon
James Drever
James Frederick Ferrier
John Joseph Haldane
Bob Hale
Geoffrey Hunter
Malcolm Knox
John Major
Graham Priest
John Skorupski
George Stout
Crispin Wright
Physics and Astronomy
H. Stanley Allen
John F. Allen
Adam Anderson
Sir Michael Berry
David Brewster
Charles Coulson
Dirk ter Haar
Emilios T. Harlaftis
Alan Hood
Thomas F Krauss
Johannes Kuenen
Andrew P. Mackenzie FRS
Psychology
William Fitch
Kay Redfield Jamison
Malcolm Jeeves
Zoology
Ian L. Boyd
H. G. Callan
William Thomas Calman
In popular culture
The University of St Andrews has appeared in or been referenced by a number of popular media works, in film and literature.
Film
West Sands Beach in St Andrews was used as a location for the film Chariots of Fire (1981); the scene, in which several of the main characters run along the beach, has become widely recognised as one of the most famous scenes in British film history.
The student hall, Andrew Melville Hall, was used for location shooting of the 2010 film adaptation of Kazuo Ishiguro's novel, Never Let Me Go starring Keira Knightley and Carey Mulligan.
Literature
In Enid Blyton's Malory Towers novel series, the main heroine Darrell Rivers plans to attend the University of St Andrews after Sixth Form with some of her fellow characters.
St Andrews appeared in Samuel Johnson's travel narrative A Journey to the Western Islands of Scotland (1775), in which he visited the university.
Bruce Marshall's romance novel, Girl in May (1956), is set in St Andrews.
Adam Nevill's horror novel Banquet for the Damned (2004) takes place in St Andrews.
Jay Parini's memoir Borges and Me (2020) recounts the author's road trip from St Andrews to the Highlands with Jorge Luis Borges.
See also
:Category:Academics of the University of St Andrews
Chancellor of the University of St Andrews
St Andrews Cathedral
List of medieval universities
Gaudy
Town and gown
Notes
References
Sources
R.G. Cant The University of St Andrews, A Short History (Oliver and Boyd Ltd. 1946)
External links
University of St Andrews Students' Association Website
Research@StAndrews:FullText, the university's digital repository of research output
BBC Your Paintings, Public Catalogue Foundation
1413 establishments in Scotland
Education in Fife
Educational institutions established in the 15th century
Universities in Scotland
15th-century establishments in Scotland
Universities UK |
35534550 | https://en.wikipedia.org/wiki/AV-Comparatives | AV-Comparatives | AV-Comparatives is an Austrian independent organization that tests and assesses antivirus software, regularly releasing charts and reports that are freely available to the public and the media. Antivirus vendors have to meet various requirements regarding trustworthiness and reliability in order to take part in the tests.
AV-Comparatives issues relevant awards, based on antivirus software's comprehensive performance according to multiple testing criteria. It is also supported by the University of Innsbruck and other academic bodies from around the world, as well as by the Austrian Federal Government and the regional government of Tirol.
Real World Protection Test
The AV-Comparatives "Real World Protection Test" is a test environment that closely approximates how well an antivirus product will protect real-world users. Test results are released monthly (from March to June and August to November). Two detailed overall result reports are released in June and December. The Real World Protection Test framework was recognized by the "Standortagentur Tirol" with the 2012 Cluster Award for innovation in computer science.
Latest Test Series
In 2018 AV-Comparatives launched a large-scale Enterprise Security Software Test Series, consisting of a Real-World Test, a False Alarm Test, a Malware Protection Test and a Performance Test, as well as a review.
Listing of tests and reviews by AV-Comparatives
Real-World Protection Tests
File Detection Tests
Malware Protection Test
Heuristic / Behaviour Test
False Alarm Test
Performance Test
Malware Removal Test
Anti-Phishing Test
Parental Control Test
Mac Security Reviews / Tests
Mobile Security Review
Corporate / Enterprise Security Reviews
PowerShell-based File-less Attacks and File-based Exploits Test
Operating systems used for antivirus tests
Microsoft Windows
macOS
iOS
Android
Linux
AVC UnDroid Analyzer
AV-Comparatives has provided "UnDroid APK Analyzer" as a free service for its website's users since May 2013. Designed for Android smartphone users, it provides a static analysis of Android apps. Users can upload an Android application package (APK) and receive a quick online analysis containing the file hashes, graphical danger level and additional information.
Awards and certifications given to AV-Comparatives
2016: EN ISO 9001:2015 for the Scope "Independent Tests of Anti-Virus Software"
2015: EICAR trusted IT-security testing lab
2013: Constantinus Award in Computer Science, the highest award/certification given by Austrian Government (Chamber of Commerce) for projects in computer science.
2012: Austrian eAward
2012: Cluster Award 2012
References
External links
Constantinus
University of Innsbruck, Quality Engineering (Laura Bassi Centre)
University of Innsbruck, Databases and Information Systems (DBIS)
Standort Agentur Tirol
Antivirus software
Non-profit organisations based in Austria
Organisations based in Innsbruck |
6975953 | https://en.wikipedia.org/wiki/CDC%20SCOPE | CDC SCOPE | SCOPE (Supervisory Control of Program Execution) is a series of Control Data Corporation operating systems developed in the 1960s.
Variants
SCOPE for the CDC 3000 series
SCOPE for the CDC 6000 series
SCOPE and SCOPE-2 for the CDC 7600/Cyber-76
SCOPE for the CDC 3000 series
SCOPE for the CDC 6000 series
This operating system was based on the original Chippewa Operating System. In the early 1970s, it was renamed NOS/BE for the CDC Cyber machines. The SCOPE operating system is a file-oriented system using mass-storage, random-access devices. It was designed to make use of all capabilities of CDC 6000 computer systems and fully exploits the multiple operating modes of all segments of the computer. The main tasks of SCOPE are controlling job execution, assigning storage, and performing segment and overlay loading. Its features include comprehensive input/output functions and library maintenance routines. The dayfile chronologically records all jobs run and any problems encountered. To aid debugging, dumps and memory maps are available. Under control of SCOPE, a variety of assemblers (COMPASS), compilers (ALGOL, FORTRAN, COBOL), and utility programs (SORT/MERGE, PERT/TIME, EXPORT/IMPORT, RESPOND, SIMSCRIPT, APT, OPTIMA etc.) may be operated. The computer emulation community has made repeated attempts to recover and preserve this software. It is now running under a CDC CYBER and 6000 series emulator.
Competition
SCOPE was written by a programming team in Sunnyvale, California, about 2,000 miles from the CDC hardware division. It was considered by them a buggy and inefficient piece of software, though not much different than many operating systems of the era. At the CDC Arden Hills, Minnesota laboratories (where they referred to SCOPE as Sunnyvale's Collection Of Programming Errors) they had a competing operating system, MACE. This was the Mansfield And Cahlander Executive (from Greg Mansfield and Dave Cahlander, the authors of the system). It had started as an engineering test executive, but eventually developed into a complete operating system — a modularized rewrite and enhancement of the original Chippewa Operating System (COS). While never an official CDC product, a copy was freely given to any customer who asked for one. Many customers did, especially the more advanced ones (like University and research sites).
When Control Data decided to write its next operating system, Kronos, it considered both the current SCOPE system and the unofficial MACE alternative. It chose to abandon the SCOPE system and base Kronos on the MACE software. Eventually, Kronos was replaced by the new Network Operating System (NOS), though many smaller CDC customers continued to use the SCOPE system rather than Kronos. When NOS became the primary Control Data operating system, some customers running mainly batch operations were reluctant to switch to the NOS system, as they saw no benefit for their shop. The SCOPE system was therefore maintained and renamed NOS/BE (Batch Environment), primarily so that CDC Marketing could say that all mainframe customers were using the NOS operating system.
See also
CDC Kronos
NOS
SCOPE
Discontinued operating systems
1964 software |
6002040 | https://en.wikipedia.org/wiki/FreeSWITCH | FreeSWITCH | FreeSWITCH is a free and open-source application server for real-time communication, WebRTC, telecommunications, video and Voice over Internet Protocol (VoIP). Multiplatform, it runs on Linux, Windows, macOS and FreeBSD. It is used to build PBX systems, IVR services, videoconferencing with chat and screen sharing, wholesale least-cost routing, Session Border Controller (SBC) and embedded communication appliances. It has full support for encryption, ZRTP, DTLS, SIPS. It can act as a gateway between PSTN, SIP, WebRTC, and many other communication protocols. Its core library, libfreeswitch, can be embedded into other projects. It is licensed under the Mozilla Public License (MPL), a free software license.
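As an illustration of how external applications can drive FreeSWITCH, the following minimal sketch (not drawn from this article) uses the Event Socket Library (ESL) Python binding to query a running instance; it assumes a default installation with the event socket listening on 127.0.0.1 port 8021 and the default password "ClueCon".

import ESL  # FreeSWITCH's Event Socket Library binding (assumed to be installed)

# Connect to a locally running FreeSWITCH instance using default event socket settings.
con = ESL.ESLconnection("127.0.0.1", "8021", "ClueCon")
if con.connected():
    # Run an API command on the switch and print its reply, e.g. the overall status.
    reply = con.api("status")
    print(reply.getBody())
else:
    print("Could not connect to the FreeSWITCH event socket")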
History
The FreeSWITCH project was first announced in January 2006 at O'Reilly Media's ETEL Conference. In June 2007, FreeSWITCH was selected by Truphone for use, and in August 2007, Gaboogie announced that it selected FreeSWITCH as its conferencing platform.
FreeSWITCH's first official 1.0.0 release (Phoenix) was on May 26, 2008. A minor 1.0.1 patch release came out on July 24, 2008. At ClueCon 2012 Anthony Minessale announced the release of FreeSWITCH version 1.2.0 and that the FreeSWITCH development team had adopted separate stable (version 1.2) and development (version 1.3) branches.
FreeSWITCH 1.4, released in early 2014, was the first version to support SIP over WebSocket and WebRTC.
FreeSWITCH 1.6 added support for video transcoding and video conferencing, Verto protocol for WebRTC, and all WebRTC codecs and standards.
FreeSWITCH 1.8 was released at ClueCon in 2018 with further updates and stability improvements to the project.
SignalWire Inc was founded in 2018 to provide commercial cloud telecommunication services utilizing an elastic FreeSWITCH core, and provide a permanent commercial sponsor for the open source project that was controlled by the founders of FreeSWITCH. It then acquired FreeSWITCH Solutions.
Design
According to the lead designer, Anthony Minessale, FreeSWITCH is intended to be a softswitch that is built on top of a solid core, driven by a state machine. The stated goals of the project include stability, scalability, and abstraction.
Derived products
FreeSWITCH is a core component in many PBX in a box commercial products and open-source projects. Some of the commercial products are hardware and software bundles, for which the manufacturer supports and releases the software as open source.
BigBlueButton is built on top of FreeSWITCH
See also
List of free and open-source software packages
List of SIP software – other SIP related programs
References
Free VoIP software
Lua (programming language)-scriptable software
Software using the Mozilla license |
58208079 | https://en.wikipedia.org/wiki/Timeline%20of%20women%20in%20science | Timeline of women in science | This is a timeline of women in science, spanning from ancient history up to the 21st century. While the timeline primarily focuses on women involved with natural sciences such as astronomy, biology, chemistry and physics, it also includes women from the social sciences (e.g. sociology, psychology) and the formal sciences (e.g. mathematics, computer science), as well as notable science educators and medical scientists. The chronological events listed in the timeline relate to both scientific achievements and gender equality within the sciences.
Ancient history
c. 2700 BCE: In Ancient Egypt, Merit-Ptah practised medicine in the pharaoh's court.
1900 BCE: Aganice, also known as Athyrta, was an Egyptian princess during the Middle Kingdom (about 2000–1700 BCE) working on astronomy and natural philosophy.
c. 1500 BCE: Hatshepsut, also known as the Queen Doctor, promoted a botanical expedition searching for officinal plants.
1200 BCE: The Mesopotamian perfume-maker Tapputi-Belatekallim was referenced in the text of a cuneiform tablet. She is often considered the world's first recorded chemist.
500 BCE: Theano was a Pythagorean philosopher.
c. 150 BCE: Aglaonice became the first female astronomer to be recorded in Ancient Greece.
1st century BCE: A woman known only as Fang became the earliest recorded Chinese woman alchemist. She is credited with "the discovery of how to turn mercury into silver" – possibly the chemical process of boiling off mercury in order to extract pure silver residue from ores.
1st century CE: Mary the Jewess was among the world's first alchemists.
3rd century CE: Cleopatra the Alchemist, an early figure in chemistry and practical alchemy, is credited with inventing the alembic.
c. 300–350 CE: Greek mathematician Pandrosion developed a numerical approximation for cube roots.
c. 355–415 CE: Greek astronomer, mathematician and philosopher Hypatia became renowned as a respected academic teacher, commentator on mathematics, and head of her own science academy.
Middle Ages
c. 975 CE: Chinese alchemist Keng Hsien-Seng was employed by the Royal Court. She distilled perfumes, utilized an early form of the Soxhlet process to extract camphor into alcohol, and gained recognition for her skill in using mercury to extract silver from ores.
10th century: Al-ʻIjliyyah manufactured astrolabes for the court of Sayf al-Dawla in Aleppo.
Early 12th century: Dobrodeia of Kiev (died 1131), a Rus' princess, was the first woman to write a treatise on medicine.
Early 12th century: The Italian medical practitioner Trota of Salerno compiled medical works on women's ailments and skin diseases.
12th century: Adelle of the Saracens taught at the Salerno School of Medicine.
12th century: Hildegard of Bingen (1098–1179) was a founder of scientific natural history in Germany.
1159: The Alsatian nun Herrad of Landsberg (1130–1195) compiled the scientific compendium Hortus deliciarum.
1220s: Zulema the Astrologist was a Muslim astronomer in Medina Mayurqa.
Early 14th century: Adelmota of Carrara was a physician in Padua, Italy.
16th century
1561: Italian alchemist Isabella Cortese published her popular book The Secrets of Lady Isabella Cortese. The work included recipes for medicines, distilled oils and cosmetics, and was the only book published by a female alchemist in the 16th century.
1572: Italian botanist Loredana Marcello died from the plague – but not before developing several effective palliative formulas for plague sufferers, which were used by many physicians.
1572: Danish scientist Sophia Brahe (1556–1643) assisted her brother Tycho Brahe with his astronomical observations.
1590: After her husband's death, Caterina Vitale took over his position as chief pharmacist to the Order of St John, becoming the first woman chemist and pharmacist in Malta.
17th century
1609: French midwife Louise Bourgeois Boursier became the first woman to write a book on childbirth practices.
1636: Anna Maria van Schurman became the first woman to attend university lectures. She had to sit behind a screen so that her male fellow students would not see her.
1642: Martine Bertereau, the first recorded woman mineralogist, was imprisoned in France on suspicion of witchcraft. Bertereau had published two written works on the science of mining and metallurgy before being arrested.
1650: Silesian astronomer Maria Cunitz published Urania Propitia, a work that both simplified and substantially improved Johannes Kepler's mathematical methods for locating planets. The book was published in both Latin and German, an unconventional decision that made the scientific text more accessible for non-university educated readers.
1656: French chemist and alchemist Marie Meurdrac published her book La Chymie Charitable et Facile, en Faveur des Dames (Useful and Easy Chemistry, for the Benefit of Ladies).
1667: Margaret Lucas Cavendish, Duchess of Newcastle upon Tyne (1623 – 15 December 1673) was an English aristocrat, philosopher, poet, scientist, fiction-writer, and playwright during the 17th century. She was the first woman to attend a meeting at the Royal Society of London, in 1667, and she criticised and engaged with members and philosophers Thomas Hobbes, René Descartes, and Robert Boyle.
1668: After separating from her husband, French polymath Marguerite de la Sablière established a popular salon in Paris. Scientists and scholars from different countries visited the salon regularly to discuss ideas and share knowledge, and Sablière studied physics, astronomy and natural history with her guests.
1680: French astronomer Jeanne Dumée published a summary of arguments supporting the Copernican theory of heliocentrism. She wrote "between the brain of a woman and that of a man there is no difference".
1685: Frisian poet and archaeologist Titia Brongersma supervised the first excavation of a dolmen in Borger, Netherlands. The excavation produced new evidence that the stone structures were graves constructed by prehistoric humans – rather than structures built by giants, which had been the prior common belief.
1690: German-Polish astronomer Elisabetha Koopman Hevelius, widow of Johannes Hevelius, whom she had assisted with his observations (and, probably, computations) for over twenty years, published in his name Prodromus Astronomiae, the largest and most accurate star catalog to that date.
1693–1698: German astronomer and illustrator Maria Clara Eimmart created more than 350 detailed drawings of the moon phases.
1699: German entomologist Maria Sibylla Merian, the first scientist to document the life cycle of insects for the public, embarked on a scientific expedition to Suriname, South America. She subsequently published Metamorphosis insectorum Surinamensium, a groundbreaking illustrated work on South American plants, animals and insects.
18th century
1702: Pioneering English entomologist Eleanor Glanville captured a butterfly specimen in Lincolnshire, which was subsequently named the Glanville fritillary in her honour. Her extensive butterfly collection impressed fellow entomologist William Vernon, who called Glanville's work "the noblest collection of butterflies, all English, which has sham'd us". Her butterfly specimens became part of early collections in the Natural History Museum.
1702: German astronomer Maria Kirch became the first woman to discover a comet.
c. 1702–1744: In Montreal, Canada, French botanist Catherine Jérémie collected plant specimens and studied their properties, sending the specimens and her detailed notes back to scientists in France.
1732: At the age of 20, Italian physicist Laura Bassi became the first female member of the Bologna Academy of Sciences. One month later, she publicly defended her academic theses and received a PhD. Bassi was awarded an honorary position as professor of physics at the University of Bologna. She was the first female physics professor in the world.
1738: French polymath Émilie du Châtelet became the first woman to have a paper published by the Paris Academy, following a contest on the nature of fire.
1740: French polymath Émilie du Châtelet published Institutions de Physique (Foundations of Physics) providing a metaphysical basis for Newtonian physics.
1748: Swedish agronomist Eva Ekeblad became the first woman member of the Royal Swedish Academy of Sciences. Two years earlier, she had developed a new process of using potatoes to make flour and alcohol, which subsequently lessened Sweden's reliance on wheat crops and decreased the risk of famine.
1751: 19-year-old Italian physicist Cristina Roccati received her PhD from the University of Bologna.
1753: Jane Colden, an American, was the only female biologist mentioned by Carl Linnaeus in his masterwork Species Plantarum.
1755: After the death of her husband, Italian anatomist Anna Morandi Manzolini took his place at the University of Bologna, becoming a professor of anatomy and establishing an internationally known laboratory for anatomical research.
1757: French astronomer Nicole-Reine Lepaute worked with mathematicians Alexis Clairaut and Joseph Lalande to calculate the next arrival of Halley's Comet.
1760: American horticulturalist Martha Daniell Logan began corresponding with botanic specialist and collector John Bartram, regularly exchanging seeds, plants and botanical knowledge with him.
1762: French astronomer Nicole-Reine Lepaute calculated the time and percentage of a solar eclipse that had been predicted to occur in two years' time. She created a map to show the phases, and published a table of her calculations in the 1763 edition of Connaissance des Temps.
1766: French chemist Geneviève Thiroux d'Arconville published her study on putrefaction. The book presented her observations from more than 300 experiments over the span of five years, during which she attempted to discover factors necessary for the preservation of beef, eggs, and other foods. Her work was recommended for royal privilege by fellow chemist Pierre-Joseph Macquer.
c. 1775: Herbalist/botanist Jeanne Baret becomes the first woman to circumnavigate the globe.
1776: At the University of Bologna, Italian physicist Laura Bassi became the first woman appointed as chair of physics at a university.
1776: Christine Kirch received a respectable salary of 400 Thaler for calendar-making; see also her sister Margaretha Kirch.
1782–1791: French chemist and mineralogist Claudine Picardet translated more than 800 pages of Swedish, German, English and Italian scientific papers into French, enabling French scientists to better discuss and utilize international research in chemistry, mineralogy and astronomy.
c. 1787–1797: Self-taught Chinese astronomer Wang Zhenyi published at least twelve books and multiple articles on astronomy and mathematics. Using a lamp, a mirror and a table, she once created a famous scientific exhibit designed to accurately simulate a lunar eclipse.
1789: French astronomer Louise du Pierry, the first Parisian woman to become an astronomy professor, taught the first astronomy courses specifically open to female students.
1794: Scottish chemist Elizabeth Fulhame invented the concept of catalysis and published a book on her findings.
c. 1796–1820: During the reign of the Jiaqing Emperor, astronomer Huang Lü became the first Chinese woman to work with optics and photographic images. She developed a telescope that could take simple photographic images using photosensitive paper.
1797: English science writer and schoolmistress Margaret Bryan published A Compendious System of Astronomy, including an engraving of herself and her two daughters. She dedicated the book to her students.
Early 19th century
1808: Anna Sundström began assisting Jacob Berzelius in his laboratory, becoming one of the first Swedish women chemists.
1809: Sabina Baldoncelli earned her university degree in pharmacy but was allowed to work only in the Italian orphanage where she resided.
1815: English archaeologist Lady Hester Stanhope used a medieval Italian manuscript to locate a promising archaeological site in Ashkelon, becoming the first archaeologist to begin an excavation in the Palestinian region. It was one of the earliest examples of the use of textual sources in field archaeology.
1816: French mathematician and physicist Sophie Germain became the first woman to win a prize from the Paris Academy of Sciences for her work on elasticity theory.
1823: English palaeontologist and fossil collector Mary Anning discovered the first complete Plesiosaurus.
1831: Italian botanist Elisabetta Fiorini Mazzanti published her best-known work Specimen Bryologiae Romanae.
1830–1837: Belgian botanist Marie-Anne Libert published her four-volume Plantae cryptogamicae des Ardennes, a collection of 400 species of mosses, ferns, lichen, algae and fungi from the Ardennes region. Her contributions to systemic cryptogamic studies were formally recognized by Prussian emperor Friedrich Wilhelm III, and Libert received a gold medal of merit.
1832: French marine biologist Jeanne Villepreux-Power invented the first glass aquarium, using it to assist in her scientific observations of Argonauta argo.
1833: English phycologists Amelia Griffiths and Mary Wyatt published two books on local British seaweeds. Griffiths had an internationally respected reputation as a skilled seaweed collector and scholar, and Swedish botanist Carl Agardh had earlier named the seaweed genus Griffithsia in her honour.
1833: Orra White Hitchcock (March 8, 1796 – May 26, 1863) was one of America's earliest women botanical and scientific illustrators and artists, best known for illustrating the scientific works of her husband, geologist Edward Hitchcock (1793–1864), but also notable for her own artistic and scientific work. Her most widely known illustrations appear in her husband's seminal works: the 1833 Report on the Geology, Mineralogy, Botany, and Zoology of Massachusetts and its successor, the 1841 Final Report, produced when he was State Geologist. For the 1833 edition, Pendleton's Lithography (Boston) lithographed nine of Hitchcock's Connecticut River Valley drawings and printed them as plates for the work. In 1841, B. W. Thayer and Co., Lithographers (Boston) printed revised lithographs and an additional plate. The hand-colored plate "Autumnal Scenery. View in Amherst" is Hitchcock's most frequently seen work.
1835: Scottish polymath Mary Somerville and German astronomer Caroline Herschel were elected the first female members of the Royal Astronomical Society.
1836: Early English geologist and paleontologist Etheldred Benett, known for her extensive collection of several thousand fossils, was appointed a member of the Imperial Natural History Society of Moscow. The society – which only admitted men at the time – initially mistook Benett for a man due to her reputation as a scientist and her unusual first name, addressing her diploma of admission to "Dominum" (Master) Benett.
1840: Scottish fossil collector and illustrator Lady Eliza Maria Gordon-Cumming invited geologists Louis Agassiz, William Buckland and Roderick Murchison to examine her collection of fish fossils. Agassiz confirmed several of Gordon-Cumming's discoveries as new species.
1843: During a nine-month period in 1842–43, English mathematician Ada Lovelace translated Luigi Menabrea's article on Charles Babbage's newest proposed machine, the Analytical Engine. With the article, she appended a set of notes. Her notes were labelled alphabetically from A to G. In note G, she describes an algorithm for the Analytical Engine to compute Bernoulli numbers. It is considered the first published algorithm ever specifically tailored for implementation on a computer, and Ada Lovelace has often been cited as the first computer programmer for this reason. The engine was never completed, so her program was never tested.
1843: British botanist and pioneering photographer Anna Atkins self-published her book Photographs of British Algae, illustrating the work with cyanotypes. Her book was the first book on any subject to be illustrated by photographs.
1846: British zoologist Anna Thynne built the first stable, self-sustaining marine aquarium.
1848: American astronomer Maria Mitchell became the first woman elected to the American Academy of Arts and Sciences; she had discovered a new comet the year before.
1848–1849: English scientist Mary Anne Whitby, a pioneer in western silkworm cultivation, collaborated with Charles Darwin in researching the hereditary qualities of silkworms.
1850: The American Association for the Advancement of Sciences accepted its first women members: astronomer Maria Mitchell, entomologist Margaretta Morris, and science educator Almira Hart Lincoln Phelps.
Late 19th century
1854: Mary Horner Lyell was a conchologist and geologist, best known for her scientific work of 1854, in which she studied her collection of land snails from the Canary Islands. She was married to the notable British geologist Charles Lyell and assisted him in his scientific work; historians believe that she likely made major contributions to her husband's work.
1854–1855: Florence Nightingale organized care for wounded soldiers during the Crimean War. She was an English social reformer and statistician, and the founder of modern nursing. Her pie charts clearly showed that most deaths resulted from disease rather than battle wounds or "other causes," which led the general public to demand improved sanitation at field hospitals.
1855: Working with her father, Welsh astronomer and photographer Thereza Dillwyn Llewelyn produced some of the earliest photographs of the moon.
1856: American atmospheric scientist Eunice Newton Foote presented her paper "Circumstances affecting the heat of the sun's rays" at an annual meeting of the American Association for the Advancement of Sciences. She was an early researcher of the greenhouse effect.
1862: Belgian botanist Marie-Anne Libert became the first woman to join the Royal Botanical Society of Belgium. She was named an honorary member.
1863: German naturalist Amalie Dietrich arrived in Australia to collect plant, animal and anthropological specimens for the German Godeffroy Museum. She remained in Australia for the next decade, discovering a number of new plant and animal species in the process, but also became notorious in later years for her removal of Aboriginal skeletons – and the possible incitement of violence against Aboriginal people – for anthropological research purposes.
1865: English geologist Elizabeth Carne was elected the first female Fellow of the Royal Geological Society of Cornwall.
1870s
1869/1870: American beekeeper Ellen Smith Tupper became the first female editor of an entomological journal.
1870: Katharine Murray Lyell was a British botanist, author of an early book on the worldwide distribution of ferns (A Geographical Handbook of All the Known Ferns: With Tables to Show Their Distribution, John Murray, 1870), and editor of volumes of the correspondence of several of the era's notable scientists.
1870: Ellen Swallow Richards became the first American woman to earn a degree in chemistry.
1870: Russian chemist Anna Volkova became the first woman member of the Russian Chemical Society.
1874: Julia Lermontova became the first Russian woman to receive a PhD in chemistry.
1875: Hungarian archaeologist Zsófia Torma excavated the site of Turdaș-Luncă in Hunedoara County, today in Romania. The site, which uncovered valuable prehistoric artifacts, became one of the most important archaeological discoveries in Europe.
1876–1878: American naturalist Mary Treat studied insectivorous plants in Florida. Her contributions to the scientific understanding of how these plants caught and digested prey were acknowledged by Charles Darwin and Asa Gray.
1878: English entomologist Eleanor Anne Ormerod became the first female Fellow of the Royal Meteorological Society. A few years afterwards, she was appointed as Consulting Entomologist to the Royal Agricultural Society.
1880s
1880: Self-taught German chemist Agnes Pockels began investigating surface tension, becoming a pioneering figure in the field of surface science. The measurement equipment she developed provided the basic foundation for modern quantitative analyses of surface films.
1883: American ethnologist Erminnie A. Smith, the first woman field ethnographer, published her collection of Iroquois legends Myths of the Iroquois.
1884: English zoologist Alice Johnson's paper on newt embryos became the first paper authored by a woman to appear in the Proceedings of the Royal Society.
1885: British naturalist Marian Farquharson became the first female Fellow of the Royal Microscopical Society.
1886: Botanist Emily Lovira Gregory became the first woman member of the American Society of Naturalists.
1887: Rachel Lloyd became the first American woman to receive a PhD in chemistry, completing her research at the Swiss University of Zurich.
1888: Russian scientist Sofia Kovalevskaya discovered the Kovalevskaya top, one of a short list of known examples of rigid-body motion that are tractable by manipulating equations by hand (E. T. Whittaker, A Treatise on the Analytical Dynamics of Particles and Rigid Bodies, Cambridge University Press, 1952).
1888: American chemist Josephine Silone Yates was appointed head of the Department of Natural Sciences at Lincoln Institute (later Lincoln University), becoming the first black woman to head a college science department.
1889: Geologist Mary Emilie Holmes became the first female Fellow of the Geological Society of America.
1890s
1890: Austrian-born chemist Ida Freund became the first woman to work as a university chemistry lecturer in the United Kingdom. She was promoted to full lecturer at Newnham College, Cambridge.
1890: Popular science educator and author Agnes Giberne co-founded the British Astronomical Association. Subsequently, English astronomer Elizabeth Brown was appointed the Director of the association's Solar Section, well known for her studies in sunspots and other solar phenomena.
1890: Mathematician Philippa Fawcett became the first woman to obtain the highest score in the Cambridge Mathematical Tripos examinations, a score "above the Senior Wrangler". (At the time, women were ineligible to be named Senior Wrangler.)
1891: American-born astronomer Dorothea Klumpke was appointed as Head of the Bureau of Measurements at the Paris Observatory. For the next decade, in addition to completing her doctorate of science, she worked on the Carte du Ciel mapping project. She was recognized for her work with the first Prix de Dames award from the Société astronomique de France and named an Officier of the Paris Academy of Sciences.
1892: American psychologist Christine Ladd-Franklin presented her evolutionary theory on the development of colour vision to the International Congress of Psychology. Her theory was the first to emphasize colour vision as an evolutionary trait.
1893: Florence Bascom became the second woman to earn her PhD in geology in the United States, and the first woman to receive a PhD from Johns Hopkins University. Geologists consider her to be the "first woman geologist in this country (America)".
1893: American botanist Elizabeth Gertrude Britton became a charter member of the Botanical Society of America.
1894: American astronomer Margaretta Palmer became the first woman to earn a doctorate in astronomy.
1895: English physiologist Marion Bidder became the first woman to speak and present her own paper at a meeting of the Royal Society.
1896: Florence Bascom became the first woman to work for the United States Geological Survey.
1896: English mycologist and lichenologist Annie Lorrain Smith became a founding member of the British Mycological Society. She later served as president twice.
1897: American cytologists and zoologists Katharine Foot and Ella Church Strobell started working as research partners. Together, they pioneered the practice of photographing microscopic research samples and invented a new technique for creating thin material samples in colder temperatures.
1897: American physicist Isabelle Stone became the first woman to receive a PhD in physics in the United States. She wrote her dissertation "On the Electrical Resistance of Thin Films" at the University of Chicago.
1898: Danish physicist Kirstine Meyer was awarded the gold medal of the Royal Danish Academy of Sciences and Letters.
1898: Italian malacologist Marianna Paulucci donated her collection of specimens to the Royal Museum of Natural History in Florence, Italy (Museo di Storia Naturale di Firenze). Paulucci was the first scientist to compile and publish a species list of Italian malacofauna.
1899: American physicists Marcia Keith and Isabelle Stone became charter members of the American Physical Society.
1899: Irish physicist Edith Anne Stoney was appointed a physics lecturer at the London School of Medicine for Women, becoming the first woman medical physicist. She later became a pioneering figure in the use of X-ray machines on the front lines of World War I.
Early 20th century
1900s
1900: American botanist Anna Murray Vail became the first librarian of the New York Botanical Garden. A key supporter of the institution's establishment, she had earlier donated her entire collection of 3000 botanical specimens to the garden.
1900: Physicists Marie Curie and Isabelle Stone attended the first International Congress of Physics in Paris, France. They were the only two women out of 836 participants.
1901: American Florence Bascom became the first female geologist to present a paper before the Geological Survey of Washington.
1901: Czech botanist and zoologist Marie Zdeňka Baborová-Čiháková became the first woman in the Czech Republic to receive a PhD.
1901: American astronomer Annie Jump Cannon published her first catalog of stellar spectra, which classified stars by temperature. This method was universally and permanently adopted by other astronomers.
1903: Grace Coleridge Frankland (née Toynbee), an English microbiologist, published her most notable work, Bacteria in Daily Life. She was one of the nineteen women scientists who signed the 1904 petition asking the Chemical Society to admit women as Fellows.
1903: Polish-born physicist and chemist Marie Curie became the first woman to receive a Nobel Prize when she received the Nobel Prize in Physics along with her husband, Pierre Curie, "for their joint researches on the radiation phenomena discovered by Professor Henri Becquerel", and Henri Becquerel, "for his discovery of spontaneous radioactivity".
1904: American geographer, geologist and educator Zonia Baber published her article "The Scope of Geography", in which she laid out her educational theories on the teaching of geography. She argued that students required a more interdisciplinary, experiential approach to learning geography: instead of a reliance on textbooks, students needed field-trips, lab work and map-making knowledge. Baber's educational ideas transformed the way schools taught geography.
1904: British chemists Ida Smedley, Ida Freund and Martha Whiteley organized a petition asking the Chemical Society to admit women as Fellows. A total of 19 female chemists became signatories, but their petition was denied by the society.
1904: Marie Charlotte Carmichael Stopes (15 October 1880 – 2 October 1958) was a British author, palaeobotanist and campaigner for women's rights. She made significant contributions to plant palaeontology and coal classification. She held the post of Lecturer in Palaeobotany at the University of Manchester from 1904 to 1910; in this capacity she became the first female academic of that university. In 1909 she was elected to the Linnean Society of London. She was 26 at the time of her election to Fellowship (the youngest woman admitted at that time).
1904: In a December meeting, the Linnean Society of London elected its first women Fellows. These initial women included horticulturalist Ellen Willmott, ornithologist Emma Turner, biologist Lilian Jane Gould, mycologists Gulielma Lister and Annie Lorrain Smith, and botanists Mary Anne Stebbing, Margaret Jane Benson and Ethel Sargant.
1905: American geneticist Nettie Stevens discovered sex chromosomes.
1906: Following the San Francisco earthquake, American botanist and curator Alice Eastwood rescued almost 1500 rare plant specimens from the burning California Academy of Sciences building. Her curation system of keeping type specimens separate from other collections – unconventional at the time – allowed her to quickly find and retrieve the specimens.
1906: Russian chemist Irma Goldberg published a paper on two newly discovered chemical reactions involving the presence of copper and the creation of a nitrogen-carbon bond to an aromatic halide. These reactions were subsequently named the Goldberg reaction and the Jourdan-Ullman-Goldberg reaction.
1906: English physicist, mathematician and engineer Hertha Ayrton became the first female recipient of the Hughes Medal from the Royal Society of London. She received the award for her experimental research on electric arcs and sand ripples.
1906: After her death, English lepidopterist Emma Hutchinson's collection of 20,000 butterflies and moths was donated to the London Natural History Museum. She had published little during her lifetime, and was barred from joining local scientific societies due to her gender, but was honoured for her work when a variant form of the comma butterfly was named hutchinsoni.
1909: Alice Wilson became the first female geologist hired by the Geological Survey of Canada. She is widely credited as being the first Canadian woman geologist.
1909: Danish physicist Kirstine Meyer became the first Danish woman to receive a doctorate degree in natural sciences. She wrote her dissertation on the topic of "the development of the temperature concept" within the history of physics.
1910s
1911: Polish-born physicist and chemist Marie Curie became the first woman to receive the Nobel Prize in Chemistry, which she received "[for] the discovery of the elements radium and polonium, by the isolation of radium and the study of the nature and compounds of this remarkable element". This made her the first person ever to win the Nobel Prize twice. As of 2021, she is the only woman to win it twice and the only person to win the Nobel Prize in two scientific fields.
1911: Norwegian biologist Kristine Bonnevie became the first woman member of the Norwegian Academy of Science and Letters.
1912: American astronomer Henrietta Swan Leavitt studied the bright-dim cycle periods of Cepheid stars, then found a way to calculate the distance from such stars to Earth.
1912: Canadian botanist and geneticist Carrie Derick was appointed a professor of morphological botany at McGill University. She was the first woman to become a full professor in any department at a Canadian university.
1913: Regina Fleszarowa became the first Polish woman to receive a PhD in natural sciences.
1913: Izabela Textorisová, the first Slovakian woman botanist, published "Flora Data from the County of Turiec" in the journal Botanikai Közlemények. Her work uncovered more than 100 previously unknown species of plants from the Turiec area.
1913: Canadian physician and chemist Maude Menten co-authored a paper on enzyme kinetics, leading to the development of the Michaelis–Menten kinetics equation.
1914–1918: During World War I, a team of seven British women chemists conducted pioneering research on chemical antidotes and weaponized gases. The project leader, Martha Whiteley, was awarded the Order of the British Empire for her wartime contributions.
1914–1918: Dame Helen Gwynne-Vaughan (née Fraser) was a prominent English botanist and mycologist. For her wartime service she became, in January 1918, the first woman to be awarded a military DBE. She served as Commandant of the Women's Royal Air Force (WRAF) from September 1918 until December 1919.
1914: British-born mycologist Ethel Doidge became the first woman in South Africa to receive a doctorate in any subject, receiving her doctorate of science degree from the University of the Cape of Good Hope. She wrote her thesis on "A bacterial disease of mango".
1916: Isabella Preston became the first female professional plant hybridist in Canada, producing the George C. Creelman trumpet lily. Her lily later received an Award of Merit from the Royal Horticultural Society.
1916: Chika Kuroda became the first Japanese woman to earn a bachelor of science degree, studying chemistry at the Tohoku Imperial University. After graduation, she was subsequently appointed an assistant professor at the university.
1917: American zoologist Mary J. Rathbun received her PhD from the George Washington University. Despite never having attended college – or any formal schooling beyond high school – Rathbun had authored more than 80 scientific publications, described over 674 new species of crustacean, and developed a system for crustacean-related records at the Smithsonian Museum.
1917: Dutch biologist and geneticist Jantina Tammes became the first female university professor in the Netherlands. She was appointed an extraordinary professor of phytopathology at the University of Utrecht.
1918: German physicist and mathematician Emmy Noether created Noether's theorem explaining the connection between symmetry and conservation laws.
1919: Kathleen Maisey Curtis became the first New Zealand woman to earn a Doctorate of Science degree (DSc), completing her thesis on Synchytrium endobioticum (potato wart disease) at the Imperial College of Science and Technology. Her research was cited as "the most outstanding result in mycological research that had been presented for ten years".
1920s
1920: Louisa Bolus was elected a Fellow of the Royal Society of South Africa for her contributions to botany. Over the course of her lifetime, Bolus identified and named more than 1,700 new South African plant species – more species than any other botanist in South Africa.
1923: María Teresa Ferrari, an Argentine physician, earned the first diploma awarded to a woman by the Faculty of Medicine at the University of Paris for her studies of the urinary tract.
1924: Florence Bascom became the first woman elected to the Council of the Geological Society of America.
1925: Mexican-American botanist Ynes Mexia embarked on her first botanical expedition into Mexico, collecting over 1500 plant specimens. Over the course of the next thirteen years, Mexia collected more than 145,000 specimens from Mexico, Alaska, and multiple South American countries. She discovered 500 new species.
1925: American medical scientist Florence Sabin became the first woman elected to the National Academy of Sciences.
1925: British-American astronomer and astrophysicist Cecilia Payne-Gaposchkin established that hydrogen is the most common element in stars, and thus the most abundant element in the universe.
1927: Kono Yasui became the first Japanese woman to earn a doctorate in science, studying at the Tokyo Imperial University and completing her thesis on "Studies on the structure of lignite, brown coal, and bituminous coal in Japan".
1928: Alice Evans became the first woman elected president of the Society of American Bacteriologists.
1928: Helen Battle became the first woman to earn a PhD in marine biology in Canada.
1928: British biologist Kathleen Carpenter published the first English-language textbook devoted to freshwater ecology: Life in Inland Waters.
1929: American botanist Margaret Clay Ferguson became the first woman president of the Botanical Society of America.
1929: Scottish-Nigerian Agnes Yewande Savage became the first West African woman to graduate from medical school, obtaining her degree at the University of Edinburgh.
1930s
1930: Concepción Mendizábal Mendoza became the first woman in Mexico to earn a civil engineering degree.
1932: Michiyo Tsujimura became the first Japanese woman to earn a doctorate in agriculture. She studied at the Tokyo Imperial University, and her doctoral thesis was entitled "On the Chemical Components of Green Tea".
1933: Hungarian scientist Elizabeth Rona received the Haitinger Prize from the Austrian Academy of Sciences for her method of extracting polonium.
1933: American bacteriologist Ruth Ella Moore became the first African-American woman to receive a PhD in the natural sciences, completing her doctorate in bacteriology at Ohio State University.
1935: French chemist Irène Joliot-Curie received the Nobel Prize in Chemistry along with Frédéric Joliot-Curie "for their synthesis of new radioactive elements".
1935: American plant hybridist Grace Sturtevant, the "First Lady of Iris", received the American Iris Society's gold medal for her lifetime's work.
1936: Edith Patch became the first female president of the Entomological Society of America.
1936: Mycologist Kathleen Maisey Curtis was elected the first female Fellow at the Royal Society of New Zealand.
1936: Danish seismologist and geophysicist Inge Lehmann discovered that the Earth has a solid inner core distinct from its molten outer core.
1937: Canadian forensic pathologist Frances Gertrude McGill assisted the Royal Canadian Mounted Police in establishing their first forensic detection laboratory.
1937: Suzanne Comhaire-Sylvain became the first female Haitian anthropologist and the first Haitian person to complete a PhD, receiving her doctoral degree from the University of Paris.
1937: Marietta Blau and her student Hertha Wambacher, both Austrian physicists, received the Lieben Prize of the Austrian Academy of Sciences for their work on cosmic ray observations using the technique of nuclear emulsions.
1938: Elizabeth Abimbola Awoliyi became the first woman to be licensed to practise medicine in Nigeria after graduating from the University of Dublin and the first West African female medical officer with a license of the Royal Surgeon (Dublin).
1938: Geologist Alice Wilson became the first woman appointed as Fellow to the Royal Society of Canada.
1938: South African naturalist Marjorie Courtenay-Latimer discovered a living coelacanth fish caught near the Chalumna river. The species had been believed to be extinct for over 60 million years. It was named Latimeria chalumnae in her honour.
1939: Austrian-Swedish physicist Lise Meitner, along with Otto Hahn, led the small group of scientists who first discovered nuclear fission of uranium when it absorbed an extra neutron; the results were published in early 1939 (Frisch, O. R., "Physical Evidence for the Division of Heavy Nuclei under Neutron Bombardment", Nature 143 (3616): 276, 1939, Bibcode:1939Natur.143..276F, doi:10.1038/143276a0; the experiment for this letter to the editor was conducted on 13 January 1939, see Richard Rhodes, The Making of the Atomic Bomb, pp. 263, 268, Simon and Schuster, 1986).
1939: French physicist Marguerite Perey discovered francium.
1940s
1940: Turkish archaeologist, Sumerologist, Assyriologist and writer Muazzez İlmiye Çığ received her degree and began a multi-decade career at the Museum of the Ancient Orient, one of the three institutions comprising the Istanbul Archaeology Museums, as a resident specialist in cuneiform tablets, thousands of which were stored untranslated and unclassified in the facility's archives. In the intervening years, due to her efforts in deciphering and publishing the tablets, the museum became a Middle Eastern languages learning center attended by ancient history researchers from every part of the world.
1941: American scientist Ruth Smith Lloyd became the first African-American woman to receive a PhD in anatomy.
1942: Austrian-American actress and inventor Hedy Lamarr and composer George Antheil developed a radio guidance system for Allied torpedoes that used spread spectrum and frequency hopping technology to defeat the threat of jamming by the Axis powers. Although the US Navy did not adopt the technology until the 1960s, the principles of their work are incorporated into Bluetooth technology and are similar to methods used in legacy versions of CDMA and Wi-Fi. This work led to their induction into the National Inventors Hall of Fame in 2014.
1942: American geologist Marguerite Williams became the first African-American woman to receive a PhD in geology in the United States. She completed her doctorate, entitled A History of Erosion in the Anacostia Drainage Basin, at Catholic University.
1942: Native American aerospace engineer Mary Golda Ross became employed at Lockheed Aircraft Corporation, where she provided troubleshooting for military aircraft. She went on to work for NASA, developing operational requirements, flight plans, and a Planetary Flight Handbook for spacecraft missions such as the Apollo program.
1943: British geologist Eileen Guppy was promoted to the rank of assistant geologist, therefore becoming the first female geology graduate appointed to the scientific staff of the British Geological Survey.
1944: Indian chemist Asima Chatterjee became the first Indian woman to receive a doctorate of science, completing her studies at the University of Calcutta. She went on to establish the Department of Chemistry at Lady Brabourne College.
1945: American physicists and mathematicians Frances Spence, Ruth Teitelbaum, Marlyn Meltzer, Betty Holberton, Jean Bartik and Kathleen Antonelli programmed the electronic general-purpose computer ENIAC, becoming some of the world's first computer programmers. (The first were uncredited operators, mostly members of the Women's Royal Naval Service, of the Colossus computer in 1943–1945, but that machine was not a stored-program computer and its existence was a state secret until the 1970s.)
1945: Marjory Stephenson and Kathleen Lonsdale were elected as the first female Fellows of the Royal Society.
1947: Austrian-American biochemist Gerty Cori became the first woman to receive the Nobel Prize in Physiology or Medicine, which she received along with Carl Ferdinand Cori "for their discovery of the course of the catalytic conversion of glycogen", and Bernardo Alberto Houssay "for his discovery of the part played by the hormone of the anterior pituitary lobe in the metabolism of sugar".
1947: American biochemist Marie Maynard Daly became the first African-American woman to complete a PhD in chemistry in the United States. She completed her dissertation, entitled "A Study of the Products Formed by the Action of Pancreatic Amylase on Corn Starch" at Columbia University.
1947: Berta Karlik, an Austrian physicist, was awarded the Haitinger Prize of the Austrian Academy of Sciences for her discovery of astatine.
1947: Susan Ofori-Atta became the first Ghanaian woman to earn a medical degree when she graduated from the University of Edinburgh.
1948: Canadian plant pathologist and mycologist Margaret Newton became the first woman to be awarded the Flavelle Medal from the Royal Society of Canada, in recognition of her extensive research in wheat rust fungal disease. Her experiments led to the development of rust-resistant strains of wheat.
1948: American limnologist Ruth Patrick of the Academy of Natural Sciences of Philadelphia led a multidisciplinary team of scientists on an extensive pollution survey of the Conestoga River watershed in Pennsylvania. Patrick would become a leading authority on the ecological effects of river pollution, receiving the Tyler Prize for Environmental Achievement in 1975.
1949: A botanist became the first Azerbaijani woman to receive a PhD in biological studies; she went on to write the first national Azerbaijani-language textbooks on botany and biology.
1949: Winifred Goldring (February 1, 1888 – January 30, 1971), an American paleontologist, became the first woman president of the Paleontological Society. Her work included a description of stromatolites as well as the study of Devonian crinoids, and she was the first woman in the US to be appointed as a State Paleontologist.
Late 20th century
1950s
1950s: Chinese-American medical scientist Tsai-Fan Yu co-founded a clinic at Mount Sinai Medical Center for the study and treatment of gout. Working with Alexander B. Gutman, Yu established that levels of uric acid were a factor in the pain experienced by gout patients, and subsequently developed multiple effective drugs for the treatment of gout.
1950: At the start of the new decade, Chinese-American particle physicist Chien-Shiung Wu published experimental evidence of quantum entanglement in photon pairs, countering Albert Einstein's EPR paradox. Around the same time she also carried out experiments confirming Fermi's theory of beta decay.
1950: Ghanaian Matilda J. Clerk became the first woman in Ghana and West Africa to attend graduate school, earning a postgraduate diploma at the London School of Hygiene & Tropical Medicine.
1950: Isabella Abbott became the first Native Hawaiian woman to receive a PhD in any science; hers was in botany.
1950: American microbiologist Esther Lederberg became the first to isolate lambda bacteriophage, a DNA virus, from Escherichia coli K-12.
1951: Ghana's Esther Afua Ocloo became the first person of African ancestry to obtain a cooking diploma from the Good Housekeeping Institute in London and to take the post-graduate Food Preservation Course at Long Ashton Research Station, Department of Horticulture, Bristol University.
1952: American computer scientist Grace Hopper completed what is considered to be the first compiler, a program that allows a computer user to use a human-readable high-level programming language instead of machine code. It was known as the A-0 compiler.
1952: Photograph 51, an X-ray diffraction image of crystallized DNA, was taken by Raymond Gosling in May 1952, working as a PhD student under the supervision of British chemist and biophysicist Rosalind Franklin; it was critical evidence in identifying the structure of DNA.
1952: Canadian agriculturalist Mary MacArthur became the first female Fellow of the Agricultural Institute of Canada for her contributions to the science of food dehydration and freezing.
1953: Canadian-British radiobiologist Alma Howard co-authored a paper proposing that cellular life transitions through four distinct periods. This became the first concept of the cell cycle.
1954: Lucy Cranwell was the first female recipient of the Hector Medal from the Royal Society of New Zealand. She was recognized for her pioneering work with pollen in the emerging field of palynology.
1955: Moira Dunbar became the first female glaciologist to study sea ice from a Canadian icebreaker ship.
1955: Japanese geochemist Katsuko Saruhashi published her research on measuring carbonic acid levels in seawater. The paper included "Saruhashi's Table", a tool of measurement she had developed that focused on using water temperature, pH level, and chlorinity to determine carbonic acid levels. Her work contributed to global understanding of climate change, and Saruhashi's Table was used by oceanographers for the next 30 years.
1955–1956: Soviet marine biologist Maria Klenova became the first woman scientist to work in the Antarctic, conducting research and assisting in the establishment of the Mirny Antarctic station.
1956: Canadian zoologist and feminist Anne Innis Dagg began pioneering behavioural research on wild giraffes in South Africa's Kruger National Park. She also researched and published on feminism and anti-nepotism laws at academic institutions in North America.
1956: Chinese-American physicist Chien-Shiung Wu conducted a nuclear physics experiment in collaboration with the Low Temperature Group of the US National Bureau of Standards. The experiment, which became known as the Wu experiment, showed that parity could be violated in the weak interaction; it provided an important foundation for the Standard Model of particle physics and a first step toward explaining the universe's existence through matter's predominance over antimatter. Soon after the discovery made headlines, the Nobel Prize was awarded only to her male colleagues.
1956: Dorothy Hill became the first Australian woman elected a Fellow of the Australian Academy of Science.
1956: English zoologist and geneticist Margaret Bastock published the first evidence that a single gene could change behavior.
1957–1958: Chinese scientist Lanying Lin produced China's first germanium and silicon mono-crystals, subsequently pioneering new techniques in semiconductor development.
1959: Chinese astronomer Ye Shuhua led the development of the Joint Chinese Universal Time System, which became the Chinese national standard for measuring universal time.
1959: Susan Ofori-Atta, the first Ghanaian woman physician, became a founding member of the Ghana Academy of Arts and Sciences.
1960s
1960: British primatologist Jane Goodall began studying chimpanzees in Tanzania; her study of them continued for over 50 years. Her observations challenged previous ideas that only humans made tools and that chimpanzees had a basically vegetarian diet.
Early 1960s: German-Canadian metallurgist Ursula Franklin studied levels of radioactive isotope strontium-90 that were appearing in the teeth of children as a side effect of nuclear weapons testing fallout. Her research influenced the Partial Nuclear Test Ban Treaty of 1963.
1960s: American mathematician Katherine Johnson calculated flight paths at NASA for manned space flights.
1961: Indian chemist Asima Chatterjee became the first female recipient of a Shanti Swarup Bhatnagar Prize. She was recognized in the Chemical Sciences category for her contributions to phytomedicine.
1962: American marine biologist, author, and conservationist Rachel Louise Carson published Silent Spring, which, together with her other writings, is credited with advancing the global environmental movement.
1962: South African botanist Margaret Levyns became the first woman president of the Royal Society of South Africa.
1962: French physicist Marguerite Perey became the first female Fellow elected to the Académie des Sciences.
1963: Elsa G. Vilmundardóttir became the first female Icelandic geologist, completing her studies at Stockholm University.
1963: Maria Goeppert Mayer became the first American woman to receive a Nobel Prize in Physics; she shared the prize with J. Hans D. Jensen "for their discoveries concerning nuclear shell structure" and Eugene Paul Wigner "for his contributions to the theory of the atomic nucleus and the elementary particles, particularly through the discovery and application of fundamental symmetry principles".
1964: American mathematician Irene Stegun completed the work which led to the publication of Handbook of Mathematical Functions, a widely used and widely cited reference work in applied mathematics.
1964: British chemist Dorothy Crowfoot Hodgkin received the Nobel Prize in Chemistry "for her determinations by X-ray techniques of the structures of important biochemical substances".
1964: Scottish virologist June Almeida made the first identification of a human coronavirus.
1965: Sister Mary Kenneth Keller became the first American woman to receive a Ph.D. in computer science. Her thesis was titled "Inductive Inference on Computer Generated Patterns".
1966: Japanese immunologist Teruko Ishizaka, working with Kimishige Ishizaka, discovered the antibody class Immunoglobulin E (IgE).
1967: British astrophysicist Jocelyn Bell Burnell co-discovered the first radio pulsars.
1967: Sue Arnold became the first woman from the British Geological Survey to go to sea on a research vessel.
1967: South African radiobiologist Tikvah Alper discovered that scrapie, an infectious brain disease affecting sheep, did not spread via DNA or RNA like a viral or bacterial disease. The discovery enabled scientists to better understand diseases caused by prions.
1967: Yvonne Brill, a Canadian-American rocket and jet propulsion engineer, invented the hydrazine resistojet propulsion system.
1969: Beris Cox became the first female paleontologist in the British Geological Survey.
1969: Ukrainian-born astronomer Svetlana Gerasimenko co-discovered the 67P/Churyumov–Gerasimenko comet.
1970s
1970: Dorothy Hill became the first female president of the Australian Academy of Science.
1970: Samira Islam became the first Saudi Arabian person to earn a PhD in pharmacology.
1970: Astronomer Vera Rubin published the first evidence for dark matter.
1970: Polish geologist Franciszka Szymakowska became widely known because of her unique and detailed geological drawings that are still used today.
1973: American physicist Anna Coble became the first African-American woman to receive a PhD in biophysics, completing her dissertation at University of Illinois.
1974: Dominican marine biologist Idelisa Bonnelly founded the Dominican Republic Academy of Science.
1975: Indian chemist Asima Chatterjee was elected the General President of the Indian Science Congress Association, becoming the first woman scientist ever elected to lead the congress.
1975: Indian geneticist Archana Sharma received the Shanti Swarup Bhatnagar Prize, the first female recipient in the Biological Sciences category.
1975: Female officers of the British Geological Survey no longer had to resign upon getting married.
1975: Chien-Shiung Wu became the first female president of the American Physical Society.
1976: Filipino-American microbiologist Roseli Ocampo-Friedmann traveled to the Antarctic with Imre Friedmann and discovered micro-organisms living within the porous rock of the Ross Desert. These organisms – cryptoendoliths – were observed surviving extremely low temperatures and humidity, assisting scientific research into the possibility of life on Mars.
1976: Margaret Burbidge was named the first female president of the American Astronomical Society.
1977: American medical physicist Rosalyn Yalow received the Nobel Prize in Physiology or Medicine "for the development of radioimmunoassays of peptide hormones" along with Roger Guillemin and Andrew V. Schally who received it "for their discoveries concerning the peptide hormone production of the brain".
1977: Friederike Victoria Joy Adamson (née Gessner, 20 January 1910 – 3 January 1980) was a naturalist, artist and author. Her book, Born Free, an international bestseller, describes her experiences raising a lion cub named Elsa. It was made into an Academy Award-winning movie of the same name. In 1977, she was awarded the Austrian Cross of Honour for Science and Art.
1977: The Association for Women Geoscientists was founded.
1977: Argentine-Canadian scientist Veronica Dahl became the first graduate at Université d'Aix-Marseille II (and one of the first women in the world) to earn a PhD in artificial intelligence.
1977: Canadian-American Elizabeth Stern published her research on the link between birth control pills – which contained high levels of estrogen at the time – and the increased risk of cervical cancer development in women. Her data helped pressure the pharmaceutical industry into providing safer contraceptive pills with lower hormone doses.
1978: Anna Jane Harrison became the first female president of the American Chemical Society.
1978: Mildred Cohn served as the first female president of the American Society for Biochemistry and Molecular Biology, then called the American Society of Biological Chemists.
1980s
1980: Japanese geochemist Katsuko Saruhashi became the first woman elected to the Science Council of Japan.
1980: Nigerian geophysicist Deborah Ajakaiye became the first woman in any West African country to be appointed a full professor of physics. Over the course of her scientific career, she became the first female Fellow elected to the Nigerian Academy of Science, and the first female dean of science in Nigeria.
1981: Vera Rubin became the second woman astronomer elected to the National Academy of Sciences. Beginning her academic career as the sole undergraduate in astronomy at Vassar College, Rubin went on to graduate studies at Cornell University and Georgetown University, where she observed deviations from Hubble flow in galaxies and provided evidence for the existence of galactic superclusters.
1982: Nephrologist Leah Lowenstein became the first woman dean of a co-educational medical school in the United States.
1982: Janet Vida Watson FRS FGS (1923–1985) was a British geologist and a professor of geology at Imperial College, London. A fellow of the Royal Society, she is well known for her contribution to the understanding of the Lewisian complex and as an author and co-author of several books. In 1982 she was elected President of the Geological Society of London, the first woman to occupy that position.
1983: American cytogeneticist Barbara McClintock received the Nobel Prize in Physiology or Medicine for her discovery of genetic transposition; she was the first woman to receive that prize without sharing it, and the first American woman to receive any unshared Nobel Prize.
1983: Brazilian agronomist Johanna Döbereiner became a founding Fellow of the World Academy of Sciences.
1983: Indian immunologist Indira Nath became the first woman scientist to receive the Shanti Swaroop Bhatnagar Award in the Medical Sciences category.
1983: Geologist Sudipta Sengupta and marine biologist Aditi Pant became the first Indian women to visit the Antarctic.
1985: After identifying HIV as the cause of AIDS, Chinese-American virologist Flossie Wong-Staal became the first scientist to clone and genetically map the HIV virus, enabling the development of the first HIV blood screening tests.
1986: Italian neurologist Rita Levi-Montalcini received the Nobel Prize in Physiology or Medicine, shared with Stanley Cohen, "for their discoveries of growth factors".
1988: American biochemist and pharmacologist Gertrude B. Elion received the Nobel Prize in Physiology or Medicine along with James W. Black and George H. Hitchings "for their discoveries of important principles for drug treatment".
1988: American scientist and inventor Patricia Bath (born 1942) became the first African-American to patent a medical device, namely the Laserphaco Probe for improving the use of lasers to remove cataracts.
1990s
1991: Doris Malkin Curtis became the first woman president of the Geological Society of America.
1991: Indian geologist Sudipta Sengupta became the first woman scientist to receive the Shanti Swaroop Bhatnagar Award in the Earth Sciences category.
1991: Helen Patricia Sharman, CMG, OBE, HonFRSC (born 30 May 1963) is a chemist who became the first British astronaut (and in particular, the first British cosmonaut) as well as the first woman to visit the Mir space station in May 1991.
1992: Mae Carol Jemison is an American engineer, physician, and former NASA astronaut. She became the first black woman to travel into space when she served as a mission specialist aboard the Space Shuttle Endeavour. Jemison joined NASA's astronaut corps in 1987 and was selected to serve for the STS-47 mission, during which she orbited the Earth for nearly eight days on September 12–20, 1992.
1992: Edith M. Flanigen became the first woman awarded the Perkin Medal (widely considered the highest honor in American industrial chemistry) for her outstanding achievements in applied chemistry. The medal especially recognized her syntheses of aluminophosphate and silicoaluminophosphate molecular sieves as new classes of materials.
1995: German biologist Christiane Nüsslein-Volhard received the Nobel Prize in Physiology or Medicine, shared with Edward B. Lewis and Eric F. Wieschaus, "for their discoveries concerning the genetic control of early embryonic development".
1995: British geomorphologist Marjorie Sweeting published the first comprehensive Western account of China's karst, entitled Karst in China: its Geomorphology and Environment.
1995: Israeli-Canadian mathematical biologist Leah Keshet became the first woman president of the international Society for Mathematical Biology.
1995: Jane Plant became the first female Deputy Director of the British Geological Survey.
1995: Inspectors from the United Nations Special Commission discovered that Iraqi microbiologist Rihab Taha, nicknamed "Dr. Germ", had been overseeing a secret 10-year biological warfare development program in Iraq.
1996: American planetary scientist Margaret G. Kivelson led a team that discovered the first subsurface, saltwater ocean on an alien world, on the Jovian moon Europa.
1997: Lithuanian-Canadian primatologist Birutė Galdikas received the Tyler Prize for Environmental Achievement for her research and rehabilitation work with orangutans. Her work with orangutans, eventually spanning over 30 years, was later recognized in 2014 as one of the longest continuous scientific studies of wild animals in history.
1997: Chilean astronomer María Teresa Ruiz discovered Kelu 1, one of the first observed brown dwarfs. In recognition of her discovery, she became the first woman to receive the Chilean National Prize for Exact Sciences.
1998: Nurse Fannie Gaston-Johansson became the first African-American woman tenured full professor at Johns Hopkins University.
Late 1990s: Ethiopian-American chemist Sossina M. Haile developed the first solid acid fuel cell.
21st century
2000s
2000: Venezuelan astrophysicist Kathy Vivas presented her discovery of approximately 100 "new and very distant" RR Lyrae stars, providing insight into the structure and history of the Milky Way galaxy.
2003: American geophysicist Claudia Alexander oversaw the final stages of Project Galileo, a space exploration mission that ended at the planet Jupiter.
2004: American biologist Linda B. Buck received the Nobel Prize in Physiology or Medicine along with Richard Axel "for their discoveries of odorant receptors and the organization of the olfactory system".
2006: Chilean biochemist Cecilia Hidalgo Tapia became the first woman to receive the Chilean National Prize for Natural Sciences.
2006: Chinese-American biochemist Yizhi Jane Tao led a team of researchers to become the first to map the atomic structure of Influenza A, contributing to antiviral research.
2006: Parasitologist Susan Lim became the first Malaysian scientist elected to the International Commission on Zoological Nomenclature.
2006: Merieme Chadid became the first Moroccan person and the first female astronomer to travel to Antarctica, leading an international team of scientists in the installation of a major observatory at the South Pole.
2006: American computer scientist Frances E. Allen won the Turing Award for "pioneering contributions to the theory and practice of optimizing compiler techniques that laid the foundation for modern optimizing compilers and automatic parallel execution". She was the first woman to win the award.
2006: Canadian-American computer scientist Maria Klawe became the first woman president of Harvey Mudd College.
2007: Using satellite imagery, Egyptian geomorphologist Eman Ghoneim discovered traces of an 11,000-year-old mega lake in the Sahara Desert. The discovery shed light on the origins of the largest modern groundwater reservoir in the world.
2007: Physicist Ibtesam Badhrees was the first Saudi Arabian woman to become a member of the European Organization for Nuclear Research (CERN).
2008: French virologist Françoise Barré-Sinoussi received the Nobel Prize in Physiology or Medicine, shared with Harald zur Hausen and Luc Montagnier, "for their discovery of HIV, human immunodeficiency virus".
2008: American-born Australian Penny Sackett became Australia's first female Chief Scientist.
2008: American computer scientist Barbara Liskov won the Turing Award for "contributions to practical and theoretical foundations of programming language and system design, especially related to data abstraction, fault tolerance, and distributed computing".
2009: American molecular biologist Carol W. Greider received the Nobel Prize in Physiology or Medicine along with Elizabeth H. Blackburn and Jack W. Szostak "for the discovery of how chromosomes are protected by telomeres and the enzyme telomerase".
2009: Israeli crystallographer Ada E. Yonath, along with Venkatraman Ramakrishnan and Thomas A. Steitz, received the Nobel Prize in Chemistry "for studies of the structure and function of the ribosome".
2009: Chinese geneticist Zeng Fanyi and her research team published their experiment results proving that induced pluripotent stem cells can be used to generate whole mammalian bodies – in this case, live mice.
2010s
2010: Marcia McNutt became the first female director of the United States Geological Survey.
2011: Kazakhstani neuroscience student and computer hacker Alexandra Elbakyan launched Sci-Hub, a website that provides users with pirated copies of scholarly scientific papers. Within five years, Sci-Hub grew to contain 60 million papers and recorded over 42 million annual downloads by users. Elbakyan was later sued by the major academic publisher Elsevier, and Sci-Hub was taken down, but it reappeared under different domain names.
2011: Taiwanese-American astrophysicist Chung-Pei Ma led a team of scientists in discovering two of the largest black holes ever observed.
2012: Computer scientist and cryptographer Shafi Goldwasser won the Turing Award for her contributions to cryptography and complexity theory.
2013: Canadian geneticist Turi King identified the 500-year-old skeletal remains of King Richard III.
2013: Kenyan ichthyologist Dorothy Wanja Nyingi published the first guide to freshwater fish species of Kenya.
2014: Norwegian psychologist and neuroscientist May-Britt Moser received the Nobel Prize in Physiology or Medicine, shared with Edvard Moser and John O'Keefe, "for their discoveries of cells that constitute a positioning system in the brain".
2014: American paleoclimatologist and marine geologist Maureen Raymo became the first woman to be awarded the Wollaston Medal, the highest award of the Geological Society of London.
2014: American theoretical physicist Shirley Ann Jackson was awarded the National Medal of Science. Jackson had been the first African-American woman to receive a PhD from the Massachusetts Institute of Technology (MIT) during the early 1970s, and the first woman to chair the U.S. Nuclear Regulatory Commission.
2014: Iranian mathematician Maryam Mirzakhani became the first woman to receive the Fields Medal, for her work in "the dynamics and geometry of Riemann surfaces and their moduli spaces".
2015: Chinese medical scientist Tu Youyou received the Nobel Prize in Physiology or Medicine, shared with William C. Campbell and Satoshi Ōmura; she received it "for her discoveries concerning a novel therapy against Malaria".
2015: Asha de Vos became the first Sri Lankan person to receive a PhD in marine mammal research, completing her thesis on "Factors influencing blue whale aggregations off southern Sri Lanka" at the University of Western Australia.
2016: Marcia McNutt became the first woman president of the American National Academy of Sciences.
2018: British astrophysicists Hiranya Peiris and Joanna Dunkley and Italian cosmologist Licia Verde were among 27 scientists awarded the Breakthrough Prize in Fundamental Physics for their contributions to "detailed maps of the early universe that greatly improved our knowledge of the evolution of the cosmos and the fluctuations that seeded the formation of galaxies".
2018: British astrophysicist Jocelyn Bell Burnell received the special Breakthrough Prize in Fundamental Physics, worth $3 million, for her scientific achievements and "inspiring leadership". She donated the entirety of the prize money towards the creation of scholarships to assist women, underrepresented minorities and refugees who are pursuing the study of physics.
2018: Canadian physicist Donna Strickland received the Nobel Prize in Physics "for groundbreaking inventions in the field of laser physics"; she shared it with Arthur Ashkin and Gérard Mourou.
2018: Frances Arnold received the Nobel Prize in Chemistry "for the directed evolution of enzymes"; she shared it with George Smith and Gregory Winter, who received it "for the phage display of peptides and antibodies". This made Arnold the first American woman to receive the Nobel Prize in Chemistry.
2018: For the first time in history, women received the Nobel Prize in Chemistry and the Nobel Prize in Physics in the same year.
2019: Mathematician Karen Uhlenbeck became the first woman to win the Abel Prize for "her pioneering achievements in geometric partial differential equations, gauge theory, and integrable systems, and for the fundamental impact of her work on analysis, geometry and mathematical physics".
2019: Imaging scientist Katie Bouman developed an algorithm that made the first visualization of a black hole possible using the Event Horizon Telescope. She was part of the team of over 200 people who implemented the project.
2020s
2020: The Nigerian Academy of Science elected epidemiologist/parasitologist Ekanem Braide as its first female president.
2020: Brazilian scientist and researcher Jaqueline Goes de Jesus sequenced the COVID-19 genome in 12 hours.
2020: Biochemists Jennifer Doudna (American) and Emmanuelle Charpentier (French) received the Nobel Prize in Chemistry for their work on the CRISPR genome-editing tool.
2020: Andrea M. Ghez received the Nobel Prize in Physics, shared with Reinhard Genzel, "for the discovery of a supermassive compact object at the centre of our galaxy".
2020: German-Turkish scientist Özlem Türeci is the co-founder and chief medical officer of BioNTech. Her team developed BNT162b2 (tozinameran), commonly known as the Pfizer–BioNTech COVID-19 vaccine.
2020: British vaccinologist Sarah Gilbert led the development and testing of the vaccine that became the Oxford–AstraZeneca COVID-19 vaccine.
See also
List of female scientists before the 20th century
Lists of women in science
Timeline of women in geology
Timeline of women in library science
Timeline of women in computing
Timeline of women in mathematics
Timeline of women in mathematics in the United States
Timeline of women in science in the United States
Women in physics
References
External links
Famous female scientists: A timeline of pioneering women in science from the website of Dr Helen Klus
science
Science timelines
Women scientists
81768 | https://en.wikipedia.org/wiki/Typhon | Typhon | Typhon, also Typhoeus, Typhaon or Typhos, was a monstrous serpentine giant and one of the deadliest creatures in Greek mythology. According to Hesiod, Typhon was the son of Gaia and Tartarus. However, one source has Typhon as the son of Hera alone, while another makes Typhon the offspring of Cronus. Typhon and his mate Echidna were the progenitors of many famous monsters.
Typhon attempted to overthrow Zeus for the supremacy of the cosmos. The two fought a cataclysmic battle, which Zeus finally won with the aid of his thunderbolts. Defeated, Typhon was cast into Tartarus, or buried underneath Mount Etna, or in later accounts, the island of Ischia.
Typhon mythology is part of the Greek succession myth, which explained how Zeus came to rule the gods. Typhon's story is also connected with that of Python (the serpent killed by Apollo), and both stories probably derived from several Near Eastern antecedents. Typhon was (from c. 500 BC) also identified with the Egyptian god of destruction Set. In later accounts, Typhon was often confused with the Giants.
Mythology
Birth
According to Hesiod's Theogony (c. 8th – 7th century BC), Typhon was the son of Gaia (Earth) and Tartarus: "when Zeus had driven the Titans from heaven, huge Earth bore her youngest child Typhoeus of the love of Tartarus, by the aid of golden Aphrodite". The mythographer Apollodorus (1st or 2nd century AD) adds that Gaia bore Typhon in anger at the gods for their destruction of her offspring the Giants.
Numerous other sources mention Typhon as being the offspring of Gaia, or simply "earth-born", with no mention of Tartarus. However, according to the Homeric Hymn to Apollo (6th century BC), Typhon was the child of Hera alone. Hera, angry at Zeus for having given birth to Athena by himself, prayed to Gaia, Uranus, and the Titans, to give her a son stronger than Zeus, then slapped the ground and became pregnant. Hera gave the infant Typhon to the serpent Python to raise, and Typhon grew up to become a great bane to mortals.
Several sources locate Typhon's birth and dwelling place in Cilicia, and in particular the region in the vicinity of the ancient Cilician coastal city of Corycus (modern Kızkalesi, Turkey). The poet Pindar (c. 470 BC) calls Typhon "Cilician", and says that Typhon was born in Cilicia and nurtured in "the famous Cilician cave", an apparent allusion to the Corycian cave in Turkey. In Aeschylus' Prometheus Bound, Typhon is called the "dweller of the Cilician caves", and both Apollodorus and the poet Nonnus (4th or 5th century AD) have Typhon born in Cilicia.
The b scholia to Iliad 2.783, preserving a possibly Orphic tradition, has Typhon born in Cilicia, as the offspring of Cronus. Gaia, angry at the destruction of the Giants, slanders Zeus to Hera. So Hera goes to Zeus' father Cronus (whom Zeus had overthrown) and Cronus gives Hera two eggs smeared with his own semen, telling her to bury them underground, and that from them would be born one who would overthrow Zeus. Hera, angry at Zeus, buries the eggs in Cilicia "under Arimon", but when Typhon is born, Hera, now reconciled with Zeus, informs him.
Descriptions
According to Hesiod, Typhon was "terrible, outrageous and lawless", immensely powerful, and on his shoulders were one hundred snake heads that emitted fire and every kind of noise:
Strength was with his hands in all that he did and the feet of the strong god were untiring. From his shoulders grew a hundred heads of a snake, a fearful dragon, with dark, flickering tongues, and from under the brows of his eyes in his marvelous heads flashed fire, and fire burned from his heads as he glared. And there were voices in all his dreadful heads which uttered every kind of sound unspeakable; for at one time they made sounds such that the gods understood, but at another, the noise of a bull bellowing aloud in proud ungovernable fury; and at another, the sound of a lion, relentless of heart; and at another, sounds like whelps, wonderful to hear; and again, at another, he would hiss, so that the high mountains re-echoed.
The Homeric Hymn to Apollo describes Typhon as "fell" and "cruel", and like neither gods nor men. Three of Pindar's poems have Typhon as hundred-headed (as in Hesiod), while apparently a fourth gives him only fifty heads, but a hundred heads for Typhon became standard. A Chalcidian hydria (c. 540–530 BC) depicts Typhon as a winged humanoid from the waist up, with two snake tails for legs below. Aeschylus calls Typhon "fire-breathing". For Nicander (2nd century BC), Typhon was a monster of enormous strength and strange appearance, with many heads, hands, and wings, and with huge snake coils coming from his thighs.
Apollodorus describes Typhon as a huge winged monster, whose head "brushed the stars", human in form above the waist, with snake coils below, and fire flashing from his eyes:
In size and strength he surpassed all the offspring of Earth. As far as the thighs he was of human shape and of such prodigious bulk that he out-topped all the mountains, and his head often brushed the stars. One of his hands reached out to the west and the other to the east, and from them projected a hundred dragons' heads. From the thighs downward he had huge coils of vipers, which when drawn out, reached to his very head and emitted a loud hissing. His body was all winged: unkempt hair streamed on the wind from his head and cheeks; and fire flashed from his eyes.
The most elaborate description of Typhon is found in Nonnus's Dionysiaca. Nonnus makes numerous references to Typhon's serpentine nature, giving him a "tangled army of snakes", snaky feet, and hair. According to Nonnus, Typhon was a "poison-spitting viper", whose "every hair belched viper-poison", and Typhon "spat out showers of poison from his throat; the mountain torrents were swollen, as the monster showered fountains from the viperish bristles of his high head", and "the water-snakes of the monster's viperish feet crawl into the caverns underground, spitting poison!".
Following Hesiod and others, Nonnus gives Typhon many heads (though untotaled), but in addition to snake heads, Nonnus also gives Typhon many other animal heads, including leopards, lions, bulls, boars, bears, cattle, wolves, and dogs, which combine to make 'the cries of all wild beasts together', and a "babel of screaming sounds". Nonnus also gives Typhon "legions of arms innumerable", and where Nicander had only said that Typhon had "many" hands, and Ovid had given Typhon a hundred hands, Nonnus gives Typhon two hundred.
Offspring
According to Hesiod's Theogony, Typhon "was joined in love" to Echidna, a monstrous half-woman and half-snake, who bore Typhon "fierce offspring". First, according to Hesiod, there was Orthrus, the two-headed dog who guarded the Cattle of Geryon, second Cerberus, the multiheaded dog who guarded the gates of Hades, and third the Lernaean Hydra, the many-headed serpent who, when one of its heads was cut off, grew two more. The Theogony next mentions an ambiguous "she", which might refer to Echidna, as the mother of the Chimera (a fire-breathing beast that was part lion, part goat, and had a snake-headed tail) with Typhon then being the father.
While mentioning Cerberus and "other monsters" as being the offspring of Echidna and Typhon, the mythographer Acusilaus (6th century BC) adds the Caucasian Eagle that ate the liver of Prometheus. The mythographer Pherecydes of Athens (5th century BC) also names Prometheus' eagle, and adds Ladon (though Pherecydes does not use this name), the dragon that guarded the golden apples in the Garden of the Hesperides (according to Hesiod, the offspring of Ceto and Phorcys). The lyric poet Lasus of Hermione (6th century BC) adds the Sphinx.
Later authors mostly retain these offspring of Typhon by Echidna, while adding others. Apollodorus, in addition to naming as their offspring Orthrus, the Chimera (citing Hesiod as his source), the Caucasian Eagle, Ladon, and the Sphinx, also adds the Nemean lion (no mother is given), and the Crommyonian Sow, killed by the hero Theseus (unmentioned by Hesiod).
Hyginus (1st century BC), in his list of offspring of Typhon (all by Echidna), retains from the above: Cerberus, the Chimera, the Sphinx, the Hydra and Ladon, and adds "Gorgon" (by which Hyginus means the mother of Medusa, whereas Hesiod's three Gorgons, of which Medusa was one, were the daughters of Ceto and Phorcys), the Colchian dragon that guarded the Golden Fleece and Scylla. The Harpies, in Hesiod the daughters of Thaumas and the Oceanid Electra, in one source, are said to be the daughters of Typhon.
The sea serpents which attacked the Trojan priest Laocoön, during the Trojan War, were perhaps supposed to be the progeny of Typhon and Echidna. According to Hesiod, the defeated Typhon is the father of destructive storm winds.
Battle with Zeus
Typhon challenged Zeus for rule of the cosmos. The earliest mention of Typhon, and his only occurrence in Homer, is a passing reference in the Iliad to Zeus striking the ground around where Typhon lies defeated. Hesiod's Theogony gives the first account of their battle. According to Hesiod, without the quick action of Zeus, Typhon would have "come to reign over mortals and immortals". In the Theogony Zeus and Typhon meet in cataclysmic conflict:
[Zeus] thundered hard and mightily: and the earth around resounded terribly and the wide heaven above, and the sea and Ocean's streams and the nether parts of the earth. Great Olympus reeled beneath the divine feet of the king as he arose and earth groaned thereat. And through the two of them heat took hold on the dark-blue sea, through the thunder and lightning, and through the fire from the monster, and the scorching winds and blazing thunderbolt. The whole earth seethed, and sky and sea: and the long waves raged along the beaches round and about at the rush of the deathless gods: and there arose an endless shaking. Hades trembled where he rules over the dead below, and the Titans under Tartarus who live with Cronos, because of the unending clamor and the fearful strife.
Zeus with his thunderbolt easily overcomes Typhon, who is thrown down to earth in a fiery crash:
So when Zeus had raised up his might and seized his arms, thunder and lightning and lurid thunderbolt, he leaped from Olympus and struck him, and burned all the marvellous heads of the monster about him. But when Zeus had conquered him and lashed him with strokes, Typhoeus was hurled down, a maimed wreck, so that the huge earth groaned. And flame shot forth from the thunderstricken lord in the dim rugged glens of the mount, when he was smitten. A great part of huge earth was scorched by the terrible vapor and melted as tin melts when heated by men's art in channelled crucibles; or as iron, which is hardest of all things, is shortened by glowing fire in mountain glens and melts in the divine earth through the strength of Hephaestus. Even so, then, the earth melted in the glow of the blazing fire.
Defeated, Typhon is cast into Tartarus by an angry Zeus.
Epimenides (7th or 6th century BC) seemingly knew a different version of the story, in which Typhon enters Zeus' palace while Zeus is asleep, but Zeus awakes and kills Typhon with a thunderbolt. Pindar apparently knew of a tradition which had the gods, in order to escape from Typhon, transform themselves into animals, and flee to Egypt. Pindar calls Typhon the "enemy of the gods", and says that he was defeated by Zeus' thunderbolt. In one poem Pindar has Typhon being held prisoner by Zeus under Etna, and in another says that Typhon "lies in dread Tartarus", stretched out underground between Mount Etna and Cumae. In Aeschylus' Prometheus Bound, a "hissing" Typhon, his eyes flashing, "withstood all the gods", but "the unsleeping bolt of Zeus" struck him, and "he was burnt to ashes and his strength blasted from him by the lightning bolt."
According to Pherecydes of Athens, during his battle with Zeus, Typhon first flees to the Caucasus, which begins to burn, then to the volcanic island of Pithecussae (modern Ischia), off the coast of Cumae, where he is buried under the island. Apollonius of Rhodes (3rd century BC), like Pherecydes, presents a multi-stage battle, with Typhon being struck by Zeus' thunderbolt on mount Caucasus, before fleeing to the mountains and plain of Nysa, and ending up (as already mentioned by the fifth-century BC Greek historian Herodotus) buried under Lake Serbonis in Egypt.
Like Pindar, Nicander has all the gods, but Zeus and Athena, transform into animal forms and flee to Egypt: Apollo became a hawk, Hermes an ibis, Ares a fish, Artemis a cat, Dionysus a goat, Heracles a fawn, Hephaestus an ox, and Leto a mouse.
The geographer Strabo (c. 20 AD) gives several locations which were associated with the battle. According to Strabo, Typhon was said to have cut the serpentine channel of the Orontes River, which flowed beneath the Syrian Mount Kasios (modern Jebel Aqra), while fleeing from Zeus, and some placed the battle at Catacecaumene ("Burnt Land"), a volcanic plain, on the upper Gediz River, between the ancient kingdoms of Lydia, Mysia and Phrygia, near Mount Tmolus (modern Bozdağ) and Sardis the ancient capital of Lydia.
In the versions of the battle given by Hesiod, Aeschylus and Pindar, Zeus' defeat of Typhon is straightforward; however, a more involved version of the battle is given by Apollodorus. No early source gives any reason for the conflict, but Apollodorus' account seemingly implies that Typhon had been produced by Gaia to avenge the destruction, by Zeus and the other gods, of the Giants, a previous generation of offspring of Gaia. According to Apollodorus, Typhon, "hurling kindled rocks", attacked the gods, "with hissings and shouts, spouting a great jet of fire from his mouth." Seeing this, the gods transformed into animals and fled to Egypt (as in Pindar and Nicander). However, "Zeus pelted Typhon at a distance with thunderbolts, and at close quarters struck him down with an adamantine sickle". Wounded, Typhon fled to the Syrian Mount Kasios, where Zeus "grappled" with him. But Typhon, twining his snaky coils around Zeus, was able to wrest away the sickle and cut the sinews from Zeus' hands and feet. Typhon carried the disabled Zeus across the sea to the Corycian cave in Cilicia, where he set the she-serpent Delphyne to guard over Zeus and his severed sinews, which Typhon had hidden in a bearskin. But Hermes and Aegipan (possibly another name for Pan) stole the sinews and gave them back to Zeus. His strength restored, Zeus chased Typhon to Mount Nysa, where the Moirai tricked Typhon into eating "ephemeral fruits" which weakened him. Typhon then fled to Thrace, where he threw mountains at Zeus, which were turned back on him by Zeus' thunderbolts, and the mountain where Typhon stood, being drenched with Typhon's blood, became known as Mount Haemus (Bloody Mountain). Typhon then fled to Sicily, where Zeus threw Mount Etna on top of Typhon, burying him, and so finally defeated him.
Oppian (2nd century AD) says that Pan helped Zeus in the battle by tricking Typhon to come out from his lair, and into the open, by the "promise of a banquet of fish", thus enabling Zeus to defeat Typhon with his thunderbolts.
Nonnus's Dionysiaca
The longest and most involved version of the battle appears in Nonnus's Dionysiaca (late 4th or early 5th century AD). Zeus hides his thunderbolts in a cave, so that he might seduce the maiden Plouto, and so produce Tantalus. But smoke rising from the thunderbolts enables Typhon, under the guidance of Gaia, to locate Zeus's weapons, steal them, and hide them in another cave. Immediately Typhon extends "his clambering hands into the upper air" and begins a long and concerted attack upon the heavens. Then "leaving the air" he turns his attack upon the seas. Finally Typhon attempts to wield Zeus' thunderbolts, but they "felt the hands of a novice, and all their manly blaze was unmanned."
Now Zeus' sinews had somehow fallen to the ground during their battle (Nonnus does not say how or when), and Typhon had taken them also. But Zeus devises a plan with Cadmus and Pan to beguile Typhon. Cadmus, disguised as a shepherd, enchants Typhon by playing the panpipes, and Typhon, entrusting the thunderbolts to Gaia, sets out to find the source of the music he hears. Finding Cadmus, he challenges him to a contest, offering Cadmus any goddess as wife, excepting Hera whom Typhon has reserved for himself. Cadmus then tells Typhon that, if he liked the "little tune" of his pipes, then he would love the music of his lyre – if only it could be strung with Zeus' sinews. So Typhon retrieves the sinews and gives them to Cadmus, who hides them in another cave, and again begins to play his bewitching pipes, so that "Typhoeus yielded his whole soul to Cadmos for the melody to charm".
With Typhon distracted, Zeus takes back his thunderbolts. Cadmus stops playing, and Typhon, released from his spell, rushes back to his cave to discover the thunderbolts gone. Incensed, Typhon unleashes devastation upon the world: animals are devoured (each of Typhon's many animal heads eats animals of its own kind), rivers are turned to dust, seas made dry land, and the land "laid waste".
The day ends with Typhon yet unchallenged, and while the other gods "moved about the cloudless Nile", Zeus waits through the night for the coming dawn. Victory "reproaches" Zeus, urging him to "stand up as champion of your own children!" Dawn comes and Typhon roars out a challenge to Zeus. And a cataclysmic battle for "the sceptre and throne of Zeus" is joined. Typhon piles up mountains as battlements and with his "legions of arms innumerable", showers volley after volley of trees and rocks at Zeus, but all are destroyed, or blown aside, or dodged, or thrown back at Typhon. Typhon throws torrents of water at Zeus' thunderbolts to quench them, but Zeus is able to cut off some of Typhon's hands with "frozen volleys of air as by a knife", and hurling thunderbolts is able to burn more of Typhon's "endless hands", and cut off some of his "countless heads". Typhon is attacked by the four winds, and "frozen volleys of jagged hailstones." Gaia tries to aid her burnt and frozen son. Finally Typhon falls, and Zeus shouts out a long stream of mocking taunts, telling Typhon that he is to be buried under Sicily's hills, with a cenotaph over him which will read "This is the barrow of Typhoeus, son of Earth, who once lashed the sky with stones, and the fire of heaven burnt him up".
Burial and cause of volcanic activity
Etna and Ischia
Most accounts have the defeated Typhon buried under either Mount Etna in Sicily, or the volcanic island of Ischia, the largest of the Phlegraean Islands off the coast of Naples, with Typhon being the cause of volcanic eruptions and earthquakes.
Though Hesiod has Typhon simply cast into Tartarus by Zeus, some have read a reference to Mount Etna in Hesiod's description of Typhon's fall:
And flame shot forth from the thunderstricken lord in the dim rugged glens of the mount when he was smitten. A great part of huge earth was scorched by the terrible vapor and melted as tin melts when heated by men's art in channelled crucibles; or as iron, which is hardest of all things, is shortened by glowing fire in mountain glens and melts in the divine earth through the strength of Hephaestus. Even so, then, the earth melted in the glow of the blazing fire.
The first certain references to Typhon buried under Etna, as well as being the cause of its eruptions, occur in Pindar:
Son of Cronus, you who hold Aetna, the wind-swept weight on terrible hundred-headed Typhon,
and:
among them is he who lies in dread Tartarus, that enemy of the gods, Typhon with his hundred heads. Once the famous Cilician cave nurtured him, but now the sea-girt cliffs above Cumae, and Sicily too, lie heavy on his shaggy chest. And the pillar of the sky holds him down, snow-covered Aetna, year-round nurse of bitter frost, from whose inmost caves belch forth the purest streams of unapproachable fire. In the daytime her rivers roll out a fiery flood of smoke, while in the darkness of night the crimson flame hurls rocks down to the deep plain of the sea with a crashing roar. That monster shoots up the most terrible jets of fire; it is a marvellous wonder to see, and a marvel even to hear about when men are present. Such a creature is bound beneath the dark and leafy heights of Aetna and beneath the plain, and his bed scratches and goads the whole length of his back stretched out against it.
Thus Pindar has Typhon in Tartarus, and buried under not just Etna, but under a vast volcanic region stretching from Sicily to Cumae (in the vicinity of modern Naples), a region which presumably also included Mount Vesuvius, as well as Ischia.
Many subsequent accounts mention either Etna or Ischia. In Prometheus Bound, Typhon is imprisoned underneath Etna, while above him Hephaestus "hammers the molten ore", and in his rage, the "charred" Typhon causes "rivers of fire" to pour forth. Ovid has Typhon buried under all of Sicily, with his left and right hands under Pelorus and Pachynus, his feet under Lilybaeus, and his head under Etna; where he "vomits flames from his ferocious mouth". And Valerius Flaccus has Typhon's head under Etna, and all of Sicily shaken when Typhon "struggles". Lycophron has both Typhon and Giants buried under the island of Ischia. Virgil, Silius Italicus and Claudian, all calling the island "Inarime", have Typhon buried there. Strabo, calling Ischia "Pithecussae", reports the "myth" that Typhon lay buried there, and that when he "turns his body the flames and the waters, and sometimes even small islands containing boiling water, spout forth."
In addition to Typhon, other mythological beings were also said to be buried under Mount Etna and the cause of its volcanic activity. Most notably the Giant Enceladus was said to be entombed under Etna, the volcano's eruptions being the breath of Enceladus, and its tremors caused by the Giant rolling over from side to side beneath the mountain. Also said to be buried under Etna were the Hundred-hander Briareus, and Asteropus who was perhaps one of the Cyclopes.
Boeotia
Typhon's final resting place was apparently also said to be in Boeotia. The Hesiodic Shield of Heracles names a mountain near Thebes Typhaonium, perhaps reflecting an early tradition which also had Typhon buried under a Boeotian mountain. And some apparently claimed that Typhon was buried beneath a mountain in Boeotia, from which came exhalations of fire.
"Couch of Typhoeus"
Homer describes a place he calls the "couch [or bed] of Typhoeus", which he locates in the land of the Arimoi (εἰν Ἀρίμοις), where Zeus lashes the land about Typhoeus with his thunderbolts. Presumably this is the same land where, according to Hesiod, Typhon's mate Echidna keeps guard "in Arima" (εἰν Ἀρίμοισιν).
But neither Homer nor Hesiod say anything more about where these Arimoi or this Arima might be. The question of whether an historical place was meant, and its possible location, has been, since ancient times, the subject of speculation and debate.
Strabo discusses the question in some detail. Several locales, Cilicia, Syria, Lydia, and the island of Ischia, all places associated with Typhon, are given by Strabo as possible locations for Homer's "Arimoi".
Pindar has his Cilician Typhon slain by Zeus "among the Arimoi", and the historian Callisthenes (4th century BC) located the Arimoi and the Arima mountains in Cilicia, near the Calycadnus river, the Corycian cave and the Sarpedon promontory. The b scholia to Iliad 2.783, mentioned above, says Typhon was born in Cilicia "under Arimon", and Nonnus mentions Typhon's "bloodstained cave of Arima" in Cilicia.
Just across the Gulf of Issus from Corycus, in ancient Syria, was Mount Kasios (modern Jebel Aqra) and the Orontes River, sites associated with Typhon's battle with Zeus, and according to Strabo, the historian Posidonius (c. 2nd century BC) identified the Arimoi with the Aramaeans of Syria.
Alternatively, according to Strabo, some placed the Arimoi at Catacecaumene, while Xanthus of Lydia (5th century BC) added that "a certain Arimus" ruled there. Strabo also tells us that for "some" Homer's "couch of Typhon" was located "in a wooded place, in the fertile land of Hyde", with Hyde being another name for Sardis (or its acropolis), and that Demetrius of Scepsis (2nd century BC) thought that the Arimoi were most plausibly located "in the Catacecaumene country in Mysia". The 3rd-century BC poet Lycophron placed the lair of Typhon's mate Echidna in this region.
Another place mentioned by Strabo as being associated with Arima is the island of Ischia, where according to Pherecydes of Athens, Typhon had fled, and in the area where Pindar and others had said Typhon was buried. The connection to Arima comes from the island's Greek name Pithecussae, which derives from the Greek word for monkey, and according to Strabo, residents of the island said that "arimoi" was also the Etruscan word for monkeys.
Name
Typhon's name has a number of variants. The earliest forms, Typhoeus and Typhaon, occur prior to the 5th century BC. Homer uses Typhoeus; Hesiod and the Homeric Hymn to Apollo use both Typhoeus and Typhaon. The later forms Typhos and Typhon occur from the 5th century BC onwards, with Typhon becoming the standard form by the end of that century.
Though several possible derivations of the name Typhon have been suggested, the derivation remains uncertain. Consistent with Hesiod's making storm winds Typhon's offspring, some have supposed that Typhon was originally a wind-god, and ancient sources associated him with the Greek words tuphon, tuphos meaning "whirlwind". Other theories include derivation from a Greek root meaning "smoke" (consistent with Typhon's identification with volcanoes), from an Indo-European root (*dhuH-) meaning "abyss" (making Typhon a "Serpent of the Deep"), and from Sapõn the Phoenician name for the Ugaritic god Baal's holy mountain Jebel Aqra (the classical Mount Kasios) associated with the epithet Baʿal Sapōn.
The name may have influenced the Persian word tūfān, which is a source of the meteorological term typhoon.
Comparative mythology
Succession myth
The Typhonomachy—Zeus' battle with, and defeat of, Typhon—is just one part of a larger "Succession Myth" given in Hesiod's Theogony. The Hesiodic succession myth describes how Uranus, the original ruler of the cosmos, hid his offspring away inside Gaia, but was overthrown by his Titan son Cronus, who castrated Uranus, and how in turn, Cronus, who swallowed his children as they were born, was himself overthrown by his son Zeus, whose mother had given Cronus a stone wrapped in swaddling clothes to swallow, in place of Zeus. However, Zeus is then confronted with one final adversary, Typhon, whom he quickly defeats. Now clearly the supreme power in the cosmos, Zeus is elected king of the gods. Zeus then establishes and secures his realm through the apportionment of various functions and responsibilities to the other gods, and by means of marriage. Finally, by swallowing his first wife Metis, who was destined to produce a son stronger than himself, Zeus is able to put an end to the cycle of succession.
Python
Typhon's story seems related to that of another monstrous offspring of Gaia: Python, the serpent killed by Apollo at Delphi, suggesting a possible common origin.
Besides the similarity of names, their shared parentage, and the fact that both were snaky monsters killed in single combat with an Olympian god, there are other connections between the stories surrounding Typhon, and those surrounding Python.
Although the Delphic monster killed by Apollo is usually said to be the male serpent Python, in the Homeric Hymn to Apollo, the earliest account of this story, the god kills a nameless she-serpent (drakaina), subsequently called Delphyne, who had been Typhon's foster-mother. Delphyne and Echidna, besides both being intimately connected to Typhon—one as mother, the other as mate—share other similarities. Both were half-maid and half-snake, a plague to men, and associated with the Corycian cave in Cilicia.
Python was also perhaps connected with a different Corycian Cave than the one in Cilicia, this one on the slopes of Parnassus above Delphi, and just as the Corycian cave in Cilicia was thought to be Typhon and Echidna's lair, and associated with Typhon's battle with Zeus, there is evidence to suggest that the Corycian cave above Delphi was supposed to be Python's (or Delphyne's) lair, and associated with his (or her) battle with Apollo.
Near Eastern influence
From at least as early as Pindar, and possibly as early as Homer and Hesiod (with their references to the Arimoi and Arima), Typhon's birthplace and battle with Zeus were associated with various Near East locales in Cilicia and Syria, including the Corycian Cave, Mount Kasios, and the Orontes River. Besides this coincidence of place, the Hesiodic succession myth, (including the Typhonomachy), as well as other Greek accounts of these myths, exhibit other parallels with several ancient Near Eastern antecedents, and it is generally held that the Greek accounts are intimately connected with, and influenced by, these Near Eastern counterparts. In particular, the Typhonomachy is generally thought to have been influenced by several Near Eastern monster-slaying myths.
Mesopotamia
Three related god-versus-monster combat myths from Mesopotamia date from at least the early second millennium BC. These are the battles of the god Ninurta with the monsters Asag and Anzu, and the god Marduk's battle with the monstrous Tiamat.
Ninurta vs. Asag
Lugal-e, a late-third-millennium BC Sumerian poem, tells the story of the battle between the Mesopotamian hero-god Ninurta and the terrible monster Asag. Like Typhon, Asag was a monstrous hissing offspring of Earth (Ki), who grew mighty and challenged the rule of Ninurta, who like Zeus, was a storm-god employing winds and floods as weapons. As in Hesiod's account of the Typhonomachy, during their battle, both Asag and Ninurta set fire to the landscape. And like Apollodorus' Typhon, Asag evidently won an initial victory, before being finally overcome by Ninurta.
Ninurta vs. Anzu
The early second millennium BC Akkadian epic Anzu tells the story of another combat of Ninurta with a monstrous challenger. This second foe is the winged monster Anzu, another offspring of Earth. Like Hesiod's Typhon, Anzu roared like a lion, and was the source of destructive storm winds. Ninurta destroys Anzu on a mountainside, and is portrayed as lashing the ground where Anzu lay with a rainstorm and floodwaters, just as Homer has Zeus lash the land about Typhon with his thunderbolts.
Marduk vs. Tiamat
The early second-millennium BC Babylonian-Akkadian creation epic Enûma Eliš tells the story of the battle of the Babylonian supreme god Marduk with Tiamat, the Sea personified. Like Zeus, Marduk was a storm-god, who employed wind and lightning as weapons, and who, before he can succeed to the kingship of the gods, must defeat a huge and fearsome enemy in single combat. This time the monster is female, and may be related to the Pythian dragoness Delphyne, or Typhon's mate Echidna, since like Echidna, Tiamat was the mother of a brood of monsters.
Mount Kasios
Like the Typhonomachy, several Near Eastern myths tell of battles between a storm-god and a snaky monster associated with Mount Kasios, the modern Jebel Aqra. These myths are usually considered to be the origins of the myth of Zeus's battle with Typhon.
Baal Sapon vs. Yamm
From the south side of the Jebel Aqra comes the tale of Baal Sapon and Yamm, the deified Sea (like Tiamat above). Fragmentary Ugaritic tablets, dated to the fourteenth or thirteenth century BC, tell the story of the Canaanite storm-god Baal Sapon's battle against the monstrous Yamm on Mount Sapuna, the Canaanite name for the later Greeks' Mount Kasios. Baal defeats Yamm with two throwing clubs (thunderbolts?) named 'Expeller' and 'Chaser', which fly like eagles from the storm-god's hands. Other tablets associate the defeat of the snaky Yamm with the slaying of a seven-headed serpent Ltn (Litan/Lotan), apparently corresponding to the biblical Leviathan.
Tarhunna vs. Illuyanka
From the north side of the Jebel Aqra, come Hittite myths, c. 1250 BC, which tell two versions of the storm-god Tarhunna’s (Tarhunta’s) battle against the serpent Illuyanka(s). In both of these versions, Tarhunna suffers an initial defeat against Illuyanka. In one version, Tarhunna seeks help from the goddess Inara, who lures Illuyanka from his lair with a banquet, thereby enabling Tarhunna to surprise and kill Illuyanka. In the other version Illuyanka steals the heart and eyes of the defeated god, but Tarhunna’s son marries a daughter of Illuyanka and is able to retrieve Tarhunna’s stolen body parts, whereupon Tarhunna kills Illuyanka.
These stories particularly resemble details found in the accounts of the Typhonomachy of Apollodorus, Oppian and Nonnus, which, though late accounts, possibly preserve much earlier ones: The storm-god’s initial defeat (Apollodorus, Nonnus), the loss of vital body parts (sinews: Apollodorus, Nonnus), the help of allies (Hermes and Aegipan: Apollodorus; Cadmos and Pan: Nonnus; Pan: Oppian), the luring of the serpentine opponent from his lair through the trickery of a banquet (Oppian, or by music: Nonnus).
Teshub vs. Hedammu and Ullikummi
Another c. 1250 BC Hittite text, derived from the Hurrians, tells of the Hurrian storm-god Teshub (with whom the Hittites' Tarhunna came to be identified), who lived on Mount Hazzi, the Hurrian name for the Jebel Aqra, and his battle with the sea-serpent Hedammu. Again the storm-god is aided by a goddess, Sauska (the equivalent of Inara), who this time seduces the monster with music (as in Nonnus), drink, and sex, successfully luring the serpent from his lair in the sea. Just as the Typhonomachy can be seen as a sequel to the Titanomachy, another Hittite text derived from the Hurrians, the Song of Ullikummi, is a kind of sequel to the Hittite "kingship in heaven" succession myths of which the story of Teshub and Hedammu formed a part. It tells of a second monster, this time made of stone, named Ullikummi, whom Teshub must defeat in order to secure his rule.
Set
From apparently as early as Hecataeus of Miletus (c. 550 BC – c. 476 BC), Typhon was identified with Set, the Egyptian god of chaos and storms. This syncretization with Egyptian mythology can also be seen in the story, apparently known as early as Pindar, of Typhon chasing the gods to Egypt, and the gods transforming themselves into animals. Such a story arose perhaps as a way for the Greeks to explain Egypt's animal-shaped gods. Herodotus also identified Typhon with Set, making him the second to last divine king of Egypt. Herodotus says that Typhon was deposed by Osiris' son Horus, whom Herodotus equates with Apollo (with Osiris being equated with Dionysus), and after his defeat by Horus, Typhon was "supposed to have been hidden" in the "Serbonian marsh" (identified with modern Lake Bardawil) in Egypt.
Confused with the Giants
Typhon bears a close resemblance to an older generation of descendants of Gaia, the Giants. They, like their younger brother Typhon after them, challenged Zeus for supremacy of the cosmos, were (in later representations) shown as snake-footed, and ended up buried under volcanoes.
While distinct in early accounts, in later accounts Typhon was often considered to be one of the Giants. The Roman mythographer Hyginus (64 BC – 17 AD) includes Typhon in his list of Giants, while the Roman poet Horace (65 – 8 BC) mentions Typhon, along with the Giants Mimas, Porphyrion, and Enceladus, as together battling Athena during the Gigantomachy. The Astronomica, attributed to the 1st-century AD Roman poet and astrologer Marcus Manilius, and the late 4th- or early 5th-century Greek poet Nonnus, also consider Typhon to be one of the Giants.
Notes
References
Aeschylus, Seven Against Thebes in Aeschylus, with an English translation by Herbert Weir Smyth, Ph. D. in two volumes. Vol 2. Cambridge, Massachusetts. Harvard University Press. 1926.
Aeschylus (?), Prometheus Bound in Aeschylus, with an English translation by Herbert Weir Smyth, Ph. D. in two volumes. Vol 2. Cambridge, Massachusetts. Harvard University Press. 1926. Online version at the Perseus Digital Library.
Anonymous, Homeric Hymn to Apollo, in The Homeric Hymns and Homerica with an English Translation by Hugh G. Evelyn-White, Cambridge, Massachusetts., Harvard University Press; London, William Heinemann Ltd. 1914. Online version at the Perseus Digital Library.
Antoninus Liberalis, Celoria, Francis, The Metamorphoses of Antoninus Liberalis: A Translation with Commentary, Psychology Press, 1992. .
Apollodorus, Apollodorus, The Library, with an English Translation by Sir James George Frazer, F.B.A., F.R.S. in 2 Volumes. Cambridge, Massachusetts, Harvard University Press; London, William Heinemann Ltd. 1921. Online version at the Perseus Digital Library.
Apollonius of Rhodes, Apollonius Rhodius: the Argonautica, translated by Robert Cooper Seaton, W. Heinemann, 1912. Internet Archive
Aristophanes, Clouds in The Comedies of Aristophanes, William James Hickie. London. Bohn. 1853?. Online version at the Perseus Digital Library.
Bacchylides, Odes, translated by Diane Arnson Svarlien. 1991. Online version at the Perseus Digital Library.
Burkert, "Oriental and Greek Mythology: The Meeting of Parallels" in Interpretations of Greek Mythology, Edited by Jan Bremmer, Routledge, 2014 (first published 1987). .
Campbell, David A., Greek Lyric III: Stesichorus, Ibycus, Simonides, and Others, Harvard University Press, 1991. .
Claudian, Claudian with an English translation by Maurice Platnauer, Volume II, Loeb Classical Library No. 136. Cambridge, Massachusetts: Harvard University Press; London: William Heinemann, Ltd.. 1922. . Internet Archive.
Clay, Jenny Strauss, Hesiod's Cosmos, Cambridge University Press, 2003. .
Diodorus Siculus, Diodorus Siculus: The Library of History. Translated by C. H. Oldfather. Twelve volumes. Loeb Classical Library. Cambridge, Massachusetts: Harvard University Press; London: William Heinemann, Ltd. 1989. Online version by Bill Thayer
Euripides, The Phoenician Women, translated by E. P. Coleridge in The Complete Greek Drama, edited by Whitney J. Oates and Eugene O'Neill, Jr. Volume 2. New York. Random House. 1938.
Euripides, Iphigenia in Tauris, translated by Robert Potter in The Complete Greek Drama, edited by Whitney J. Oates and Eugene O'Neill, Jr. Volume 2. New York. Random House. 1938. Online version at the Perseus Digital Library.
Fontenrose, Joseph Eddy, Python: A Study of Delphic Myth and Its Origins, University of California Press, 1959. .
Fowler, R. L. (2000), Early Greek Mythography: Volume 1: Text and Introduction, Oxford University Press, 2000. .
Fowler, R. L. (2013), Early Greek Mythography: Volume 2: Commentary, Oxford University Press, 2013. .
Frazer, J. G., Pausanias's Description of Greece. Translated with a Commentary by J. G. Frazer. Vol IV. Commentary on Books VI-VIII, Macmillan, 1898. Internet Archive.
Freeman, Kathleen, Ancilla to the Pre-Socratic Philosophers: A Complete Translation of the Fragments in Diels, Fragmente Der Vorsokratiker, Harvard University Press, 1983. .
Gantz, Timothy, Early Greek Myth: A Guide to Literary and Artistic Sources, Johns Hopkins University Press, 1996, Two volumes: (Vol. 1), (Vol. 2).
Graves, Robert, The Greek Myths, (1955) 1960, §36.1–3
Griffiths, J. Gwyn, "The Flight of the Gods Before Typhon: An Unrecognized Myth", Hermes, 88, 1960, pp. 374–376. JSTOR
Hanfmann, George M. A., "Giants" in The Oxford Classical Dictionary, second edition, Hammond, N.G.L. and Howard Hayes Scullard (editors), Oxford University Press, 1992.
Herodotus; Histories, A. D. Godley (translator), Cambridge: Harvard University Press, 1920; . Online version at the Perseus Digital Library.
Hesiod, Theogony, in The Homeric Hymns and Homerica with an English Translation by Hugh G. Evelyn-White, Cambridge, Massachusetts., Harvard University Press; London, William Heinemann Ltd. 1914. Online version at the Perseus Digital Library.
Horace, The Odes and Carmen Saeculare of Horace. John Conington. trans. London. George Bell and Sons. 1882. Online version at the Perseus Digital Library.
Hošek, Radislav, "Echidna" in Lexicon Iconographicum Mythologiae Classicae (LIMC) III.1. Artemis Verlag, Zürich and Munich, 1986. .
Hyginus, Gaius Julius, The Myths of Hyginus. Edited and translated by Mary A. Grant, Lawrence: University of Kansas Press, 1960.
External links
Mythological hybrids
Greek dragons
Greek giants
Children of Gaia
Children of Hera
Chaos gods
Monsters in Greek mythology
Characters in Greek mythology
Deeds of Zeus |
36462398 | https://en.wikipedia.org/wiki/Peter%20Warren%20%28journalist%29 | Peter Warren (journalist) | Peter Warren (born 1960) is an English technology and investigative journalist for various newspapers, most notably The Guardian and Sunday Times. He frequently appears on national TV and radio and has provided evidence and advice on request to the UK Government and the House of Lords. Warren specialises in technology, undercover investigations, and science issues. He is the former technology editor of Scotland on Sunday and the Sunday Express and a former associate producer for the BBC2 Sci Files series.
Career
In 1991, Warren reported on Kuwait and Iraq for the Guardian newspaper during Kuwait's liberation in the first Gulf War and has reported from places as diverse as Taiwan, Romania, Argentina, the Philippines and the Kalahari Desert.
A frequent reporter for the Sunday Times Insight team, Warren has also worked for the Sunday Times Magazine, most notably on the magazine cover story investigation into the illegal drug culture in Moss Side in Manchester in March 1993.
In 1996, Warren was runner-up in the UK Press Gazette Business Awards for Technology Scoop of the Year. A guest speaker on technology ethics at the European Union's Information Society Technologies conference in Helsinki in 1999, he is now a frequent speaker at conferences and events, and organises the annual Professor Donald Michie Conference on AI with the law firm Cooley and the Institution of Engineering and Technology. Warren, who lives in Suffolk, is an acknowledged expert on computer security issues and is also recognized within the technology industry for his foresight. He was one of the UK's first journalists to stress the issues raised by computer viruses and the need to address the threat of computer crime in the late 1980s, a topic on which he has been a noted campaigner ever since. His writing on the topic has won a number of prizes.
Warren is now the director of two technology websites, Cyber Security Research Institute and Future Intelligence.
Biography
Peter Warren was born in Harlow, Essex. After being educated at Newport Grammar School and Northumbria University, he went to work for Computer Talk, and subsequently wrote for the Sunday Times, The Guardian, Daily Express, Scotland on Sunday, the Sunday Herald, Mail on Sunday, Daily Mirror, Evening Standard, Sunday Business, Sunday Express and other specialist magazines. He has also appeared in documentaries with Channel 4, the BBC and Sky News, and is a regular commentator on cybercrime issues for Sky News.
Warren has won many awards in his area. In 2006, Warren won the BT IT Security News story of the year prize for his work exposing the practice of discarding computer hard drives containing sensitive business and personal data. Then in 2007, Warren won the IT Security News story of the year prize again for work done with Future Intelligence showing that Chinese hackers had broken into the UK Houses of Parliament. In 2008 Warren won the BT Enigma Award for services to technology security journalism.
A campaigning journalist, Warren also wrote the first articles highlighting the potential for the emerging internet to be abused by paedophiles in 1989 and as a result was asked to brief the first UK police force to respond to the danger, the Greater Manchester Police Obscene Publications Squad, on the issues the technology has produced.
In 2005, with Michael Streeter, Warren wrote the critically acclaimed book, Cyber Alert, which accurately predicted the computer security situation the world is now dealing with.
Since 2009, Warren has worked on the creation of the Cyber Security Research Institute, an organisation pulling together the UK’s top academic and business experts in the field of computer security with leading journalists in a bid to raise awareness of cybercrime.
From 2012, Warren has worked as the presenter of the groundbreaking PassW0rd radio show on London's ResonanceFM, an hour-long monthly programme on the ramifications of technology that he developed with the veteran radio producer Jane Whyatt.
In 2013, Warren once again collaborated with Michael Streeter to write Cyber crime and warfare, published by Hodder and Stoughton, another critically acclaimed book on cybercrime.
In 2014, Warren, Streeter and Jane Whyatt produced a report for the European think tank Netopia entitled Can We Make the Digital World Ethical? which was presented to the European Union and subsequently led to an invitation from the French Senate to give a speech to it.
Since 2014, Warren has made a number of highly successful films on the issues raised by technology, several of which have attracted great media interest. One, the Herod Clause, which was publicly endorsed by Europol, went viral on the internet and resulted in over a million downloads of virtual private network software from the cyber security company F-Secure.
Peter Warren was one of the first people to raise concerns about the issues that AI will present, and he was the lead writer of the 2014 Netopia report Can We Make the Digital World Ethical? mentioned above. As a result of the EU presentation, Warren was asked to give a presentation on technology ethics to the French Senate. The speech was very well received, with DeepMind collaborator Professor Murray Shanahan, Professor of Cognitive Robotics at Imperial College (who also gave evidence to the French Senate), calling for it to be published. DeepMind, now owned by Google, is one of the world's leading AI companies.
In 2017, Warren was asked to organise a series of conferences to discuss the ramifications of AI by Cooley LLP, one of the world's 50 largest law firms, which counts Google and Microsoft among its many clients. The conferences are held jointly between Future Intelligence, Cooley and the Institution of Engineering and Technology, and are named after Professor Donald Michie, one of the fathers of AI, who originated the concept with Alan Turing over games of chess while cracking the Enigma code during the Second World War.
Warren was asked to submit evidence to the House of Lords Select Committee on AI which re-opened its investigation into the ethics of AI technology after attending Warren’s first conference in 2018.
Warren has now been asked to provide advice to the House of Lords on technology ethics and is currently writing a book on the issues for Bloomsbury which is intended to lay down ethical ground rules and discuss ways of possibly regulating the technology.
Warren has supplied information on cyber security at the request of Graham Wright in Wright's former role as Deputy Director of the Office of Cyber Security in the Cabinet Office. Warren is currently working on a number of books and TV projects.
References
English male journalists
Living people
1960 births
English male non-fiction writers
People from Harlow
Alumni of Newcastle University
People educated at Newport Free Grammar School |
51844222 | https://en.wikipedia.org/wiki/Distributed%20collaboration | Distributed collaboration | Distributed collaboration is a form of collaboration in which participants, regardless of their location, work together to reach a certain goal. It usually relies on increasingly popular cyberinfrastructure, such as email, instant messaging and document sharing platforms, to overcome the physical barriers of geolocation and, to some extent, depending on the application used, to approximate the experience of working together in person. An example is conferencing software that brings all collaborators into a single call for easier dissemination of ideas.
Goals
One of the major goals of distributed collaboration is to facilitate the use of shared resources and communication. Remote participants usually lack the informal exchange of gestures and body-language cues available in person, so tools aim to enable some form of this interaction. The essence is to allow groups to collaborate over distances in a manner that emulates, as nearly as possible, the effectiveness of collaboration when the participants meet in person.
Measure of Effectiveness
The effectiveness of distributed collaboration is often measured against the concept of collocation: the idea is to bring team performance as close as possible to the scenario in which the participants are actually collocated. This in turn involves the concept of proximity. Studies have revealed that working on a familiar, previously worked-upon topic with people around you can increase attentiveness and work output, whereas the same arrangement, when applied to an unfamiliar topic of work, can often prove distracting and unproductive.
Meaning of Collocation
Decreasing proximity leads to asymptotic behavior in communication: beyond a certain finite distance between participants, usually once they are out of sight of each other, the proximity can be treated the same as if the participants were on different continents.
The usual type of collocation is the "Project Room", wherein the resources for work are stored in one place (e.g. a cloud storage platform) and the participants come and go depending upon their availability to work. Another common type is "Radical Collocation", in which all participants and resources are present in the place of work for the duration of the project.
Advantages of Collocation on Distributed Collaboration
Higher proximity (i.e. lesser distance between participants) usually increases the chances of collaboration. This means that people on the same work floor are more likely to collaborate on a project than people in the same building but different work floors. The realization of collocation in a distributed collaborative environment can thus lead to higher productivity.
Moreover, use of cyberinfrastructure, such as emails, instant messaging and document sharing help remove the requirement of synchronicity which would, erstwhile, have been imposed if the participants were to meet physically.
Infrastructures
The following (technical) infrastructures enable distributed collaboration beyond traditional collaboration:
Internet
The low cost and nearly instantaneous sharing of ideas, knowledge, and skills through the internet has made collaborative work dramatically easier. It allows files to be exchanged, drawings and images to be shared, and voice and video contact between team members. Not only can a group cheaply communicate and test, but the wide reach of the Internet allows such groups to easily form in the first place, even among niche interests. An example of this is the free software movement, which produced GNU and Linux from scratch.
Collaborative software
Computer-supported collaboration focuses on technologies that affect groups, organizations, communities, and societies. Collaborative software is designed to help people involved in a common task achieve their goals by supporting computer-supported cooperative work, which includes all contexts in which technology is used to mediate human activities such as communication, coordination, cooperation, competition, entertainment, games, art, and music. It addresses "how collaborative activities and their coordination can be supported by means of computer systems."
Base technologies and software such as netnews, email, chat and wikis could be described as social, collaborative, both or neither. Those who say "social" seem to focus on so-called virtual community, while those who say "collaborative" seem to be more concerned with content management and the actual output. While software may be designed to achieve closer social ties or specific deliverables, it is hard to support collaboration without also enabling relationships to form, and hard to support a social interaction without some kind of shared co-authored works.
Cloud and document collaboration
Document collaboration refers to systems that allow people to collaborate on documents across different locations using the Internet and a cloud-collaboration approach. In recent years, the market has seen a rapid development in document collaboration tools. Primitive document collaboration used email, whereby comments would be written in the email with the document attached. However, if the email is then forwarded or replied to, the comments can easily be lost, and it is hard to keep track of the most recent version of a document. Document-centric collaboration is the next step in the evolution of document collaboration. These systems put the document and its contents at the center of the process and allow users to tag the document and add content-specific comments, maintaining complete version control and records and storing all comments and activities associated with a document.
These new innovations are only possible because of the development of cloud computing, whereby software and applications are provisioned on the Internet. Cloud collaboration today is promoted as a tool for collaboration internally between different departments within a firm, but also externally as a means for sharing documents with end-clients as receiving feedback. This makes cloud computing a very versatile tool for firms with many different applications in a business environment.
Crowdsourcing
Crowdsourcing divides work between participants to achieve a cumulative result. In modern crowdsourcing, individuals or organizations use contributions from Internet users, which provides a particularly good venue for distributed collaboration, since individuals tend to be more open in web-based projects where they are not being physically judged or scrutinized, and thus can feel more comfortable sharing. In an online atmosphere, more attention can be given to the specific needs of a project, rather than spending as much time in communication with other individuals.
Crowdsourcing can take either an explicit or an implicit route. Explicit crowdsourcing lets users work together to evaluate, share, and build different specific tasks, while implicit crowdsourcing means that users solve a problem as a side effect of something else they are doing. Advantages of using crowdsourcing may include improved costs, speed, quality, flexibility, scalability, or diversity. Crowdsourcing in the form of idea competitions or innovation contests provides a way for organizations to learn beyond what the minds of their own employees provide.
Examples of distributed collaboration
Peer production
Peer production (also referred to as mass or social collaboration) is a way of producing goods and services that relies on self-organizing communities of individuals. In such communities, the labor of a large number of people is coordinated towards a shared outcome. Peer production takes advantage of new collaborative possibilities afforded by the internet and has become a widespread mode of labor. One of the earliest instances of networked peer production is Project Gutenberg, a project in which volunteers make out-of-copyright works available online. Free and open source software, such as Linux, and Wikipedia, an online encyclopedia, are other examples of peer production. Commons-based peer production is a subset of peer production in which the creative energy of large numbers of people is coordinated into large, meaningful projects, mostly without traditional hierarchical organization or financial compensation.
Collaborative Writing
Collaborative writing refers to projects where written works are created by multiple people together rather than individually; such projects may also be collaboratively edited. Using collaborative writing tools can provide substantial advantages, ranging from increased user commitment to easier, more effective and efficient work processes. Since this software makes it easy for users to contribute from anywhere in the world, projects can benefit from distributed collaboration.
Examples of these collaborative writing tools includes:
Collaborative programming including Web IDEs such as Cloud9 IDE, PythonAnywhere, and Eclipse Che for collaborative code-writing, and Mercurial and Git (used in GitHub, Bitbucket, GitLab and CodePlex) for collaborative revision control
Collaborative real-time editors such as ShareLaTeX, Etherpad, Hackpad, GoogleDocs, Microsoft Office, and Authorea
Online platforms mainly focused on collaborative fiction that allow other users to continue a story's narrative such as Protagonize and Ficly
Wikis like Wikipedia, Wikia, and Baidu Baike.
Mobile collaboration
Mobile collaboration is a technology-based process of communicating using electronic assets and accompanying software designed for use in remote locations. It utilizes wireless, cellular and broadband technologies, enabling effective distributed collaboration independent of location. Where traditional video conferencing has been limited to boardrooms, offices, and lecture theaters, recent technological advancements have extended the capabilities of video conferencing for use with discreet, hand-held mobile devices, permitting true mobile collaborative possibilities.
Distributed collaborative learning
Collaborative learning is based on the model that knowledge can be created within a population whose members actively interact by sharing experiences and taking on asymmetric roles. Put differently, collaborative learning refers to methodologies and environments in which learners engage in a common task where each individual depends on and is accountable to the others.
Technology is an important factor in distributed collaborative learning. The Internet has allowed for a shared space for groups to communicate, and virtual environments have been critical in allowing people to communicate over long distances while still feeling part of the group. Computer-supported collaborative learning is a relatively new educational paradigm within collaborative learning which uses technology in a learning environment to control and monitor interactions, to regulate tasks, rules, and roles, and to mediate the acquisition of new knowledge. An example of computer-supported collaborative learning is the MOOC, which supports collaboration and allows for a strong and engaging learning environment.
Collaborative translation
Collaborative translation is a translation technique, enabled by cloud collaboration technology, in which multiple translation participants with varying tasks participate simultaneously in a collaborative workspace with shared resources. They generally share a computer-assisted translation interface that includes tools for collaboration. The purpose of collaborative translation is to reduce the total time of the translation lifecycle, improve communications, particularly between translator and non-translator participants, and eliminate many management tasks.
An example of collaborative translation is Duolingo, where members of the public were invited to translate content and vote on translations. The content came from organizations that pay Duolingo to translate it. Documents could be added to Duolingo for translation with an upload account which had to be applied for.
Crowdfunding
Crowdfunding is a form of crowdsourcing: the practice of funding a project or venture by raising monetary contributions from a large number of people. Crowdfunding is now often performed via Internet-mediated registries. It has been used to fund a wide range of ventures, from for-profit entrepreneurial ventures to artistic and creative projects, medical expenses, travel, and community-oriented social entrepreneurship projects.
Arguably the best-known example of crowdfunding, and also of crowdsourcing, is Kickstarter, whose mission is to help bring creative projects to life. On Kickstarter, project creators choose a deadline and a minimum funding goal. As an assurance contract, if the goal is not met by the deadline, no funds are collected. The platform is open to backers from anywhere in the world and to creators from certain countries.
See also
Collaboration
Mass collaboration
Collaborative information seeking
Collaboration tool
Collaborative software
References
Collaboration |
26093657 | https://en.wikipedia.org/wiki/MVJ%20College%20of%20Engineering | MVJ College of Engineering | MVJ College of Engineering (MVJCE) is a private autonomous engineering college located in Bangalore, Karnataka, India. MVJCE is affiliated to Visvesvaraya Technological University (VTU). It was established in 1982 by the Venkatesha Education Society. It is situated on a 15-acre campus in Whitefield, Bangalore.
History
MVJ College of Engineering was established in 1982 under Bangalore University affiliation as a flagship institution of the Venkatesha Education Society by Dr. M. V. Jayaraman with a vision to develop MVJCE as an Institute of academic excellence with high standards.
Board of Governors
Dr. B. N. Suresh, Chairman, Governing Council, Chancellor & Founder Director, Indian Institute of Space Science and Technology, Thiruvananthapuram
Prof. B. N. Raghunandan, Professor and Dean (Retd.), Dept. of Aerospace Engineering, IISc., Bangalore
Prof. Chandrashekar, Visiting Chair Professor, National Institute of Advanced Studies, Bangalore
Dr. K. Ramachandra, Former Director – GTRE, Bangalore
Dr. Viraj Kumar, Visiting Professor, Divecha Centre for Climate Change, IISc., Bangalore
Mr. Vasantha Kumar Narayan, Cyclotis Software Solutions Pvt. Ltd., Bangalore
Mr. P. S. Krishnan, Distinguished Scientist (Retd.) & Ex. Director, Aeronautical Development Establishment, Bangalore.
Affiliation and accreditation
All the courses offered by MVJ College of Engineering are affiliated to Visvesvaraya Technological University (VTU), Belagavi & approved by All India Council for Technical Education (AICTE).
Aeronautical Engineering, Chemical Engineering, Computer Science and Engineering, Electronics and Communication, Electrical and Electronics, Mechanical Engineering and Information Science and Engineering programs offered by MVJ are accredited by National Board of Accreditation (NBA) and the College is accredited by the National Assessment and Accreditation Council (NAAC).
Like all higher education institutes in India, MVJCE is recognised by the University Grants Commission (UGC). It has been accredited by the National Assessment and Accreditation Council (NAAC) with a B++ grade.
Departments and courses
MVJ College of Engineering offers 11 undergraduate programmes and 5 post graduate programmes across various branches of Engineering.
Undergraduate
These departments offer four-year undergraduate courses in Engineering. The following programmes are offered: Aeronautical Engineering, Aerospace Engineering, Chemical Engineering, Civil Engineering, Computer Science and Engineering, Electronics and Communication Engineering, Electrical and Electronics Engineering, Information Science Engineering, Data Science, Artificial Intelligence & Machine Learning and Mechanical Engineering.
Postgraduate
These departments offer two-year postgraduate courses in engineering. The following programmes are offered: Aeronautical Engineering, Computer Science and Engineering, Structural Engineering, and Transportation Engineering. A Master's in Business Administration with dual specialisation in HR/Finance/Marketing is also offered by MVJCE.
Research Centres
The Departments of Computer Science & Engineering, Civil Engineering, Electronics and Communication Engineering, Electrical and Electronics Engineering, Mechanical Engineering and Chemistry are recognized as Research Centres by VTU for pursuing doctoral programmes.
Student activities
MVJCE hosts three annual festivals, a technical fest called VertechX, a cultural fest called SWAYAM and Innovation Day.
Rankings
61 among top private engineering colleges in India, The Outlook Magazine, 2020
28 among top private engineering colleges in India, India Today, 2020
37 among top private engineering colleges in India, The Week, 2020
Notable alumni
Vishnu Raj Menon, Mr India 2016 and Indian model
References
External links
Engineering colleges in Bangalore
Educational institutions established in 1982
1982 establishments in Karnataka
Affiliates of Visvesvaraya Technological University |
28462588 | https://en.wikipedia.org/wiki/Keyboard%20controller%20%28computing%29 | Keyboard controller (computing) | In computing, a keyboard controller is a device that interfaces a keyboard to a computer. Its main function is to inform the computer when a key is pressed or released. When data from the keyboard arrives, the controller raises an interrupt (a keyboard interrupt) to allow the CPU to handle the input.
If a keyboard is a separate peripheral system unit (such as in most modern desktop computers), the keyboard controller is not directly attached to the keys but receives scancodes from a microcontroller embedded in the keyboard via some kind of serial interface. In this case, the controller usually also controls the keyboard's LEDs by sending data back to the keyboard through the wire.
The IBM PC AT used an Intel 8042 chip to interface to the keyboard. This computer also controlled access to the A20 line in order to implement a workaround for a chip bug in the Intel 80286. The keyboard controller was also used to initiate a software CPU reset in order to allow the CPU to transition from protected mode to real mode, because the 286 did not allow the CPU to go from protected mode to real mode unless the CPU was reset. This was a problem because the BIOS and the operating system services could only be called by programs in real mode. A great deal of software came to expect these behaviors, so keyboard controllers have continued controlling the A20 line and performing software CPU resets even after the Intel 80386's ability to switch from protected mode to real mode without a CPU reset made the reset workaround unnecessary. The keyboard controller also handles PS/2 mouse input if a PS/2 mouse port is present. Today the keyboard controller is either a unit inside a Super I/O device or is missing entirely, with its keyboard and mouse functions handled by a USB controller and its role in controlling the A20 line handled by the chipset.
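The A20 manipulation described above is still visible in low-level system code. The following is a minimal sketch of the classic technique, assuming hypothetical outb/inb wrappers around the x86 OUT/IN instructions and the conventional 8042 port numbers (0x64 for command/status, 0x60 for data); it is an illustration, not code from any particular BIOS.

#include <stdint.h>

/* Hypothetical port-I/O wrappers around the x86 OUT/IN instructions. */
extern void    outb(uint16_t port, uint8_t value);
extern uint8_t inb(uint16_t port);

#define KBC_STATUS  0x64   /* read: status register             */
#define KBC_COMMAND 0x64   /* write: controller command         */
#define KBC_DATA    0x60   /* read/write: data (output buffer)  */

/* Wait until the controller can accept a command or data byte. */
static void kbc_wait_input_empty(void)
{
    while (inb(KBC_STATUS) & 0x02)    /* status bit 1 = input buffer full  */
        ;
}

/* Wait until the controller has placed a byte in its output buffer. */
static void kbc_wait_output_full(void)
{
    while (!(inb(KBC_STATUS) & 0x01)) /* status bit 0 = output buffer full */
        ;
}

/* Enable the A20 line through the keyboard controller's output port. */
void enable_a20_via_kbc(void)
{
    uint8_t port_value;

    kbc_wait_input_empty();
    outb(KBC_COMMAND, 0xD0);           /* D0h: read output port  */
    kbc_wait_output_full();
    port_value = inb(KBC_DATA);

    kbc_wait_input_empty();
    outb(KBC_COMMAND, 0xD1);           /* D1h: write output port */
    kbc_wait_input_empty();
    outb(KBC_DATA, port_value | 0x02); /* bit 1 of the output port gates A20 */
}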
IBM
IBM played a small role in the creation of the keyboard controller. In IBM-compatible computers, the keyboard controller, typically an Intel 8042, is found on the motherboard. The controller handles input received from the computer keyboard, the A20 line, reset, deciphering scan codes, as well as the PS/2 mouse. In later models the 8042 was replaced with the 8742 micro-controller, which had a microprocessor, RAM, and I/O ports.
Anyone trying to use the classic 8042-style keyboard controller (KBC) found in the IBM PC/AT and nearly all later PCs typically runs into a problem with a lack of accurate documentation. The 8042 (or 8742, or any number of compatible parts built into later Super I/O chips) is actually quite well documented. The catch is that the 8042 is a programmable micro-controller with its own control software in (usually) ROM. Until recently, no one outside a few companies (IBM, AMI, Phoenix) knew exactly what the control software did.
IBM documented a number of commands the host can send to the KBC. It should be understood that all those commands are a pure software construct, with nothing about the 8042 hardware dictating that the commands need to follow any specific format, function, or that they even need to be there at all. Therefore, understanding the 8042 ROM code is the only way towards understanding exactly what the commands are and what they do, with the caveat that different controllers may and do have somewhat different code in their ROM.
List of KBC Commands
The commands listed as “ignored” perform no function. A sketch of how the host issues one of these commands appears after the list.
00h-1Fh: Read KBC RAM indirect. Not documented.
20h-3Fh: Read KBC RAM at offset 20h-3Fh. Only command 20h is documented by IBM.
40h-5Fh: Write KBC RAM indirect. Not documented.
60h-7Fh: Write KBC RAM at offset 20h-3Fh. Only command 60h is documented by IBM. The byte at offset 20h is the command byte and is treated specially.
80h-A8h: Ignored.
AAh: Self test. This command is documented, but its side effects are not.
ABh: Interface test.
ACh: Diagnostic dump. Mentioned by third parties, but not documented by IBM.
ADh: Disable keyboard.
AEh: Enable keyboard.
AFh-BFh: Ignored.
C0h: Read input port.
C1h: Continuous input port poll, high nibble. Mentioned by third parties, but not documented by IBM.
C2h: Continuous input port poll, low nibble. Mentioned by third parties, but not documented by IBM.
C3h-CFh: Ignored.
D0h: Read output port.
D1h: Write output port.
D2h-DEh: Ignored.
E0h: Read test inputs.
E1h-EFh: Ignored.
F0h-FFh: Pulse output bits.
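As a concrete illustration of how the host drives these commands, the sketch below sends the self-test command (AAh) and checks for the 55h reply that the 8042 conventionally returns on success. It is a minimal example that reuses the hypothetical outb/inb helpers and wait loops from the A20 sketch earlier in this article; exact behaviour differs between controller implementations.

/* Issue the 8042 self-test command (AAh) and report whether the
 * controller answered with the conventional success code 55h.    */
int kbc_self_test(void)
{
    kbc_wait_input_empty();
    outb(KBC_COMMAND, 0xAA);      /* AAh: controller self test */
    kbc_wait_output_full();
    return inb(KBC_DATA) == 0x55; /* 55h signals success       */
}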
Conclusion
In some portable computing devices, the keyboard controller conveys keystroke inputs both to the main processor and, over a secondary bus that is also used to communicate with the battery module, to a secondary processor. The keyboard controller is programmed to support the IBM-compatible personal computer keyboard serial interface: it receives serial data from the keyboard, checks the parity of the data, translates the scan code, and presents the data to the system as a byte in its output buffer, interrupting the system when data is placed there. Bytes sent to the keyboard are transmitted serially with an odd parity bit automatically inserted; the keyboard is required to acknowledge all data transmissions, and no transmission should be sent to the keyboard until acknowledgment is received for the previous byte. Some keyboard controllers and BIOSes use a hardwired implementation instead of the software implementation of the traditional 8042 keyboard BIOS, which lets the controller respond immediately to all commands sent between the keyboard and the CPU BIOS and allows popular programs such as Microsoft Windows and Novell software to run faster.
See also
Keyboard buffer
AT keyboard
KVM extender
Embedded controller: The Intel 8042 and other keyboard controllers used in computers based on the IBM PC/AT design can be considered embedded controllers.
References
External links
keyboard controller - Computer Dictionary
KBD43W13 Keyboard and PS/2 Mouse Controller
Computer keyboards |
4913316 | https://en.wikipedia.org/wiki/Windows%20Live%20Mesh | Windows Live Mesh | Windows Live Mesh (formerly known as Windows Live FolderShare, Live Mesh, and Windows Live Sync) is a free-to-use Internet-based file synchronization application by Microsoft, designed to keep files and folders synchronized between two or more computers running Windows (Vista and later) or Mac OS X (v. 10.5 Leopard and later, Intel processors only), or on the web via SkyDrive. Windows Live Mesh also enabled remote desktop access via the Internet.
Windows Live Mesh was part of the Windows Live Essentials 2011 suite of software. However this application was replaced by SkyDrive for Windows application in Windows Essentials 2012 and later OneDrive in Windows 8/8.1/10. Microsoft announced on December 13, 2012 that Windows Live Mesh would be discontinued on February 13, 2013.
Features
Features of Windows Live Mesh include:
Ability to sync up to 200 folders with 100,000 files each (each file up to 40 GB) for PC-to-PC synchronization
Ability to sync up to 5 GB of files to "SkyDrive synced storage" in the cloud
Remote Desktop access via Windows Live Mesh and the Windows Live Devices web service
PC-to-PC synchronisation of application settings for applications such as:
Windows Internet Explorer - synchronisation of favorites and recently typed URLs between computers
Microsoft Office - synchronisation of dictionaries, Outlook email signatures, styles and templates between computers
History
FolderShare and Windows Live Sync
Microsoft bought FolderShare from ByteTaxi Inc. on November 3, 2005, and subsequently made it a part of their Windows Live range of services.
On March 10, 2008, Microsoft released its first user visible update to the then Windows Live FolderShare. This comprised a rewrite of the FolderShare website and an updated Windows Live FolderShare client. Support for discussion groups and Remote Desktop Search was also removed in the update. The new client had some user interface and branding updates and contained several bug fixes - including official support for Windows Vista and discontinued support for Windows 2000.
Since its rebrand as Windows Live FolderShare, the client and service had undergone extensive platform changes, switching from the original LAMP which it was originally built on when acquired, to the Windows Server platform. In the Windows Live Essentials "Wave 3" release, Windows Live FolderShare was again rebranded as Windows Live Sync. New UI improvements were also announced to be part of the "Wave 3" release, integrating it with other Windows Live services. New features of the then Windows Live Sync "Wave 3" compared to FolderShare included increased limit of sync folders, integration with Windows Live ID, integration with Recycle Bin, unicode support, support for Mac OS X, and integration with Windows Live Photo Gallery and Windows Live Toolbar to sync photo albums and favorites between PCs. Windows Live Sync Wave 3 was released on December 11, 2008, and an update of Windows Live Sync for Mac was released on November 2, 2009 to add support for Mac OS X 10.6.
Live Mesh Beta
Microsoft released the Live Mesh technology preview on April 23, 2008, a data synchronization system that allowed files, folders and other data to be shared and synchronized across multiple personal devices and up to 5 GB on the web. Live Mesh was based on FeedSync technologies to convey the changes made in each device so that the changes can be synchronized across all devices and the cloud. The information about devices and folders participating in a synchronization relationship was not stored locally but at the service-end.
The Live Mesh software, called Mesh Operating Environment (MOE), was available for Windows XP, Windows Vista, Windows 7, Mac OS X, as well as Windows Mobile 6. It could be used to create and manage the synchronization relationships between devices and data. Live Mesh also included a cloud storage component, called Live Desktop, which was an online storage service that allows synchronized folders to be accessible via a website. Live Mesh also provided a remote desktop software called Live Mesh Remote Desktop that could be used to remotely connect to and manage any of the devices in a synchronization relationship. Live Mesh Remote Desktop allowed users to control their devices from the Live Mesh application, as well as from any other internet connected PC.
Live Mesh also included a developer component, which consisted of a set of protocols and Application Programming Interfaces (API) known as Live Framework (which was also briefly known as MeshFX). It was a REST-based API for accessing the Live Mesh services over HTTP. Microsoft had also provided APIs for managed code (including .NET Framework and Microsoft Silverlight) as well as for Win32 and JavaScript via a developer Software Development Kit (SDK). Unlike the Mesh Operating Environment (MOE), which was limited to sharing folders, the Live Framework APIs could be used to share any data item between devices that recognize the data. The API encapsulated the data into a Mesh Object—the native synchronization unit of Live Mesh—which was then tracked for changes and synchronized. A Mesh Object consisted of a collection of Data Feeds, which could be represented in Atom, RSS, JSON or Plain Old XML formats. The data entries within these feeds were synchronized via the FeedSync protocol. The MOE software also created Mesh Objects for each Live Mesh folder in order for them to be synchronized. However, the Live Framework APIs were discontinued on September 8, 2010 with the aim of being integrated into Windows Live Messenger Connect in the "Wave 4" release. Live Mesh beta was officially discontinued on March 31, 2011.
Windows Live Mesh 2011
A beta version Windows Live Sync "Wave 4" was released on June 24, 2010. This new version, while initially branded Windows Live Sync, was the first version which was built using both FolderShare and Live Mesh technologies. Compared to the "Wave 3" version of Windows Live Sync, the new version featured increased limit of sync folders and files, ability to sync up to 2 GB of files to the cloud on Windows Live SkyDrive synced storage, addition of Live Mesh's remote desktop access via Windows Live Devices, and ability to sync application settings for Internet Explorer and Microsoft Office. This new version of Windows Live Sync was also designed to be completely separate from both the previous versions of Windows Live Sync and Live Mesh, and as such any previous synchronisation relationships were not retained when being upgraded from Windows Live Sync "Wave 3" and Live Mesh. The previous Windows Live Sync "Wave 3" website, and the Live Mesh Desktop, was also replaced by the new Windows Live Devices service in the "Wave 4" release.
The beta was subsequently updated on August 17, 2010, and on August 29, 2010, the service was officially rebranded as Windows Live Mesh, and its cloud-based SkyDrive synced storage was increased to 5 GB, as was the case for the previous Live Mesh service. The new version also allows users to sync hidden files, view a list of missing files that are awaiting to be synchronised, and various performance improvements. The final version of Windows Live Mesh 2011 (Wave 4) was released on September 30, 2010 as part of Windows Live Essentials 2011.
SkyDrive
Microsoft announced on February 20, 2012 that Windows Live Mesh is set to be superseded by a new SkyDrive desktop application, where the cloud storage portion for the application will utilize the full 7 GB SkyDrive storage (or more if the user has purchased additional storage), rather than the limited 5 GB "SkyDrive synced storage" in the current version of Windows Live Mesh. However, the new SkyDrive desktop application will not support direct PC-to-PC synchronization, and must utilize the SkyDrive cloud storage for synchronization between two or more devices. On August 7, 2012, Microsoft released Windows Essentials 2012, where it was announced that Windows Live Mesh would be removed and replaced by the SkyDrive for Windows desktop application if a user upgrades from Windows Live Essentials 2011.
The Remote Desktop feature from Live Mesh, which allowed users to access a remote computer from the web browser, was not carried over to SkyDrive. Users are directed to use Remote Desktop from a Windows computer instead.
Windows Live Mesh was discontinued on February 13, 2013 and some of the functionality is replaced by SkyDrive.
On January 27, 2014, Microsoft announced the rebranding of SkyDrive to "OneDrive".
References
External links
Official website (Archive)
Mesh
Data synchronization
Cloud storage
File hosting for macOS
File hosting for Windows
File sharing software |
29569369 | https://en.wikipedia.org/wiki/Tandon%20Corporation | Tandon Corporation | Tandon Corporation was an American disk drive and PC manufacturer founded in 1975 (incorporated in 1976 as Tandon Magnetics Corp.) by Sirjang Lal Tandon, a former mechanical engineer. The company originally produced magnetic recording read/write heads for the then-burgeoning floppy-drive market. Due to the labor-intensive nature of the product, production was carried out in low-wage India, which was the key to the company's competitiveness. In the late 1970s, Tandon developed direct equivalents to Shugart floppy drives, and is credited with the invention of DS/DD (double-sided, double-density) versions, which became its primary product in the early 1980s.
In 1979, Tandon introduced the TM-100 diskette drive, a 5.25" unit with 40 track support as opposed to the Shugart SA-400's 35 tracks. When Tandy introduced the TRS-80 Model III in 1980, they equipped the computer with TM-100s. The following year, Tandon obtained an even more lucrative contract when IBM released its Personal Computer. Until 1985, Tandon was the sole supplier of floppy drives for IBM PCs, initially the same single-sided unit used in the TRS-80, then the newer double-sided TM-100-2. Tandon would become the world's largest independent producer of disk drives for personal computers and word processors.
In the mid-1980s, Tandon introduced a line of hard disk drives, making several models of the same basic design with a P-shaped top cover and a pinion-rack stepper motor off to the side. It also introduced portable hard disk drives that could be easily removed from personal computers. A major decline in North American computer sales during 1984–85, as well as competition from Japanese and Taiwanese manufacturers, proved difficult for the company. In April 1987, Tandon purchased hard disk drive maker Atasi Corporation for $5 million in stock, in order to improve the capacity of its disk drive lines: Atasi offered up to 170 MB while Tandon offered only 50 MB. Tandon sold its original data-storage business to Western Digital for nearly $80 million in 1988.
The company then brought in former IBM and other computer industry executives in an attempt to remake the company as a leading producer of personal computers. By 1989, nearly all (90 percent) of its personal computer sales were in Europe, and its stock price had fallen from a 1983 peak of $34.25 to $0.50.
PCX
The PCX computer was made in 1986. It normally came with 256 KB of RAM, an 80-column monitor, two 360 KB 5.25-inch diskette drives, one 10 MB hard disk drive, MS-DOS and GW-BASIC.
References
American companies established in 1975
American companies disestablished in 1993
Computer companies established in 1975
Computer companies disestablished in 1993
Computer storage companies
Defunct computer companies of the United States
Defunct computer hardware companies |
27722518 | https://en.wikipedia.org/wiki/Jeff%20Kosseff | Jeff Kosseff | Jeff Kosseff is a cybersecurity law professor at the United States Naval Academy. He is a lawyer who previously practiced media, cybersecurity, and privacy law at Covington & Burling LLP and former journalist. Before becoming an attorney, he was a Washington, D.C. reporter for The Oregonian, a major newspaper based in Portland, Oregon, and was a finalist for the Pulitzer Prize and recipient of the George Polk Award.
Education
He graduated from the University of Michigan with bachelor's and master's degrees, and from Georgetown University Law Center with a doctorate of jurisprudence.
Personal
He lives with his wife and daughter in the Washington, D.C. area.
Career
Kosseff teaches, researches, and writes about cybersecurity law at the United States Naval Academy, where he is an assistant professor in the Cyber Science department. Previously, as a lawyer at Covington & Burling, he represented media and technology companies in a wide range of First Amendment and privacy issues. Among his representative matters, he advocated for federal shield law for journalists on behalf of a coalition of more than 70 media organizations. He frequently writes and speaks about the First Amendment and privacy law. The Information & Privacy Commissioner of Ontario has named Kosseff a Privacy by Design Ambassador. Kosseff is an adjunct professor of communications law at American University's School of Communications, and he serves on the board of directors of the Writer's Center in Bethesda and Advocates for Survivors of Torture and Trauma in Washington, D.C.
Before joining Covington, Kosseff clerked for Judge Milan Smith of the U.S. Court of Appeals for the Ninth Circuit and Judge Leonie Brinkema of the U.S. District Court for the Eastern District of Virginia.
As a journalist, Kosseff worked for The Oregonian in its Washington, D.C. bureau from 2004 through 2008. Previously, he had spent three years covering technology for The Oregonian.
Awards
2006 George Polk Award
2007 Pulitzer Prize for National Reporting finalist
Bibliography
References
External links
Covington & Burling LLP -- Jeff Kosseff
International Association of Privacy Professionals -- Jeff Kosseff
"Jeff Kosseff: The FishbowlDC Interview", April 19, 2007
American male journalists
American lawyers
First Amendment scholars
George Polk Award recipients
Journalists from Portland, Oregon
Living people
Year of birth missing (living people)
University of Michigan alumni
The Oregonian people
Georgetown University Law Center alumni
United States Naval Academy faculty |
216881 | https://en.wikipedia.org/wiki/Electronic%20design%20automation | Electronic design automation | Electronic design automation (EDA), also referred to as electronic computer-aided design (ECAD), is a category of software tools for designing electronic systems such as integrated circuits and printed circuit boards. The tools work together in a design flow that chip designers use to design and analyze entire semiconductor chips. Since a modern semiconductor chip can have billions of components, EDA tools are essential for their design; this article in particular describes EDA specifically with respect to integrated circuits (ICs).
History
Early days
Prior to the development of EDA, integrated circuits were designed by hand and manually laid out. Some advanced shops used geometric software to generate tapes for a Gerber photoplotter, responsible for generating a monochromatic exposure image, but even those copied digital recordings of mechanically drawn components. The process was fundamentally graphic, with the translation from electronics to graphics done manually; the best-known company from this era was Calma, whose GDSII format is still in use today. By the mid-1970s, developers started to automate circuit design in addition to drafting and the first placement and routing tools were developed; as this occurred, the proceedings of the Design Automation Conference catalogued the large majority of the developments of the time.
The next era began following the publication of "Introduction to VLSI Systems" by Carver Mead and Lynn Conway in 1980; this groundbreaking text advocated chip design with programming languages that compiled to silicon. The immediate result was a considerable increase in the complexity of the chips that could be designed, with improved access to design verification tools that used logic simulation. Often the chips were easier to lay out and more likely to function correctly, since their designs could be simulated more thoroughly prior to construction. Although the languages and tools have evolved, this general approach of specifying the desired behavior in a textual programming language and letting the tools derive the detailed physical design remains the basis of digital IC design today.
The earliest EDA tools were produced academically. One of the most famous was the "Berkeley VLSI Tools Tarball", a set of UNIX utilities used to design early VLSI systems. Still widely used are the Espresso heuristic logic minimizer, responsible for circuit complexity reductions and Magic, a computer-aided design platform. Another crucial development was the formation of MOSIS, a consortium of universities and fabricators that developed an inexpensive way to train student chip designers by producing real integrated circuits. The basic concept was to use reliable, low-cost, relatively low-technology IC processes and pack a large number of projects per wafer, with several copies of chips from each project remaining preserved. Cooperating fabricators either donated the processed wafers or sold them at cost, as they saw the program helpful to their own long-term growth.
Birth of commercial EDA
1981 marked the beginning of EDA as an industry. For many years, the larger electronic companies, such as Hewlett Packard, Tektronix and Intel, had pursued EDA internally, with managers and developers beginning to spin out of these companies to concentrate on EDA as a business. Daisy Systems, Mentor Graphics and Valid Logic Systems were all founded around this time and collectively referred to as DMV. In 1981, the U.S. Department of Defense additionally began funding of VHDL as a hardware description language. Within a few years, there were many companies specializing in EDA, each with a slightly different emphasis.
The first trade show for EDA was held at the Design Automation Conference in 1984 and in 1986, Verilog, another popular high-level design language, was first introduced as a hardware description language by Gateway Design Automation. Simulators quickly followed these introductions, permitting direct simulation of chip designs and executable specifications. Within several years, back-ends were developed to perform logic synthesis.
Current status
Current digital flows are extremely modular, with front ends producing standardized design descriptions that compile into invocations of units similar to cells without regard to their individual technology. Cells implement logic or other electronic functions via the utilisation of a particular integrated circuit technology. Fabricators generally provide libraries of components for their production processes, with simulation models that fit standard simulation tools.
Most analog circuits are still designed in a manual fashion, requiring specialist knowledge that is unique to analog design (such as matching concepts). Hence, analog EDA tools are far less modular, since many more functions are required, they interact more strongly and the components are, in general, less ideal.
EDA for electronics has rapidly increased in importance with the continuous scaling of semiconductor technology. Its users include foundry operators, who run the semiconductor fabrication facilities ("fabs"), and design-service companies, which use EDA software to evaluate an incoming design for manufacturing readiness. EDA tools are also used for programming design functionality into FPGAs, or field-programmable gate arrays, customisable integrated circuit designs.
Software focuses
Design
Design flow primarily remains characterised via several primary components; these include:
High-level synthesis (additionally known as behavioral synthesis or algorithmic synthesis) – The high-level design description (e.g. in C/C++) is converted into RTL, the register-transfer level, which represents circuitry as operations on and transfers between registers (a small input example is sketched after this list).
Logic synthesis – The translation of RTL design description (e.g. written in Verilog or VHDL) into a discrete netlist or representation of logic gates.
Schematic capture – For standard-cell digital, analog and RF designs, using tools such as Capture CIS in OrCAD by Cadence and ISIS in Proteus.
Layout – usually schematic-driven layout, like Layout in Orcad by Cadence, ARES in Proteus
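To make the high-level synthesis step above concrete, the fragment below is the kind of plain C function such a tool could accept as input; the synthesis tool, not the programmer, decides how to map the loop onto registers, adders, multipliers and control logic in the generated RTL. The function is an illustrative assumption, not taken from any particular tool's documentation.

/* A small fixed-length dot product written as ordinary C.
 * A high-level synthesis tool would unroll or pipeline the loop,
 * allocate registers for acc and i, and emit equivalent RTL.     */
int dot_product8(const int a[8], const int b[8])
{
    int acc = 0;
    for (int i = 0; i < 8; i++)
        acc += a[i] * b[i];
    return acc;
}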
Simulation
Transistor simulation – low-level transistor-simulation of a schematic/layout's behavior, accurate at device-level.
Logic simulation – digital simulation of an RTL or gate-netlist's digital (boolean 0/1) behavior, accurate at the boolean level (a toy gate-level evaluation is sketched after this list).
Behavioral simulation – high-level simulation of a design's architectural operation, accurate at cycle-level or interface-level.
Hardware emulation – Use of special purpose hardware to emulate the logic of a proposed design. Can sometimes be plugged into a system in place of a yet-to-be-built chip; this is called in-circuit emulation.
Technology CAD simulate and analyze the underlying process technology. Electrical properties of devices are derived directly from device physics.
Electromagnetic field solvers, or just field solvers, solve Maxwell's equations directly for cases of interest in IC and PCB design. They are known for being slower but more accurate than the layout extraction above.
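The logic-simulation entry above can be illustrated with a few lines of C that reduce a tiny gate-level netlist, a 1-bit full adder, to boolean evaluations. This is only a toy model of what an event-driven logic simulator does; real simulators also handle timing, unknown values and designs with millions of gates.

#include <stdio.h>

/* Boolean (0/1) evaluation of a 1-bit full adder netlist:
 *   sum  = a XOR b XOR cin
 *   cout = (a AND b) OR (cin AND (a XOR b))                */
static void full_adder(int a, int b, int cin, int *sum, int *cout)
{
    int axb = a ^ b;
    *sum  = axb ^ cin;
    *cout = (a & b) | (cin & axb);
}

int main(void)
{
    /* Exhaustively simulate every input combination, like a tiny testbench. */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int cin = 0; cin <= 1; cin++) {
                int sum, cout;
                full_adder(a, b, cin, &sum, &cout);
                printf("a=%d b=%d cin=%d -> sum=%d cout=%d\n",
                       a, b, cin, sum, cout);
            }
    return 0;
}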
Analysis and verification
Functional verification
Clock domain crossing verification (CDC check): similar to linting, but these checks/tools specialize in detecting and reporting potential issues like data loss, meta-stability due to use of multiple clock domains in the design.
Formal verification, also model checking: attempts to prove, by mathematical methods, that the system has certain desired properties, and that certain undesired effects (such as deadlock) cannot occur.
Equivalence checking: algorithmic comparison between a chip's RTL-description and synthesized gate-netlist, to ensure functional equivalence at the logical level.
Static timing analysis: analysis of the timing of a circuit in an input-independent manner, hence finding a worst case over all possible inputs (a simplified longest-path sketch follows this list).
Physical verification, PV: checking if a design is physically manufacturable, and that the resulting chips will not have any function-preventing physical defects, and will meet original specifications.
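At its core, the static timing analysis mentioned above is a longest-path computation over the gate graph. The sketch below assumes the gates are already listed in topological order and uses made-up field names; production tools additionally model interconnect delay, multiple timing corners, setup/hold checks and clock constraints.

#include <stddef.h>

/* A gate with a propagation delay and up to four driving gates (fan-in).
 * fanin[] holds indices into the same array; n_fanin is 0 for primary
 * inputs. Gates are assumed to be stored in topological order.          */
struct gate {
    double delay;     /* propagation delay, e.g. in nanoseconds      */
    int    fanin[4];  /* indices of driving gates                    */
    int    n_fanin;   /* number of valid entries in fanin[]          */
    double arrival;   /* computed worst-case arrival time (output)   */
};

/* Compute worst-case arrival times and return the largest one,
 * i.e. the critical-path delay of the whole netlist.           */
double worst_case_delay(struct gate *gates, size_t n_gates)
{
    double worst = 0.0;
    for (size_t i = 0; i < n_gates; i++) {
        double latest_input = 0.0;
        for (int k = 0; k < gates[i].n_fanin; k++) {
            double t = gates[gates[i].fanin[k]].arrival;
            if (t > latest_input)
                latest_input = t;
        }
        gates[i].arrival = latest_input + gates[i].delay;
        if (gates[i].arrival > worst)
            worst = gates[i].arrival;
    }
    return worst;
}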
Manufacturing preparation
Mask data preparation or MDP - The generation of actual lithography photomasks, utilised to physically manufacture the chip.
Chip finishing which includes custom designations and structures to improve manufacturability of the layout. Examples of the latter are a seal ring and filler structures.
Producing a reticle layout with test patterns and alignment marks.
Layout-to-mask preparation that enhances layout data with graphics operations, such as Resolution enhancement techniques or RET – methods for increasing the quality of the final photomask. This also includes Optical proximity correction or OPC – the up-front compensation for diffraction and interference effects occurring later when chip is manufactured using this mask.
Mask generation – The generation of flat mask image from hierarchical design.
Automatic test pattern generation or ATPG – The systematic generation of pattern data to exercise as many logic gates and other components as possible (see the sketch after this list).
Built-in self-test or BIST – The installation of self-contained test controllers to automatically test a logic or memory structure in the design.
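The idea behind ATPG, referenced above, can be sketched in Python by injecting a single stuck-at fault into a tiny invented gate network and keeping any input vector whose output differs from the fault-free circuit; real ATPG algorithms (the D-algorithm, PODEM and others) are far more sophisticated than this exhaustive toy:

from itertools import product

# Fault-free circuit: y = (a AND b) OR c
def good(a, b, c):
    n1 = a and b
    return n1 or c

# Same circuit with the internal net n1 stuck at 0
def faulty(a, b, c):
    n1 = False                  # injected stuck-at-0 fault
    return n1 or c

# A test pattern for this fault is any input vector that makes the outputs differ.
tests = [bits for bits in product([0, 1], repeat=3)
         if good(*bits) != faulty(*bits)]
print(tests)                    # [(1, 1, 0)]: good -> 1, faulty -> 0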
Functional safety
Functional safety analysis, systematic computation of failure in time (FIT) rates and diagnostic coverage metrics for designs in order to meet the compliance requirements for the desired safety integrity levels.
Functional safety synthesis, add reliability enhancements to structured elements (modules, RAMs, ROMs, register files, FIFOs) to improve fault detection / fault tolerance. These include (but are not limited to) the addition of error detection and / or correction codes (Hamming), redundant logic for fault detection and fault tolerance (duplicate / triplicate) and protocol checks (interface parity, address alignment, beat count)
Functional safety verification, running of a fault campaign, including insertion of faults into the design and verification that the safety mechanism reacts in an appropriate manner for the faults that are deemed covered.
Companies
Old companies
Market capitalization and company name:
$5.77 billion – Synopsys
$4.46 billion – Cadence
$2.33 billion – Mentor Graphics
$507 million – Magma Design Automation; Synopsys acquired Magma in February 2012
NT$6.44 billion – SpringSoft; Synopsys acquired SpringSoft in August 2012
¥11.95 billion – Zuken Inc.
Note: EEsof should likely be on this list, but it does not have a market cap as it is the EDA division of Keysight.
Acquisitions
Many EDA companies acquire small companies with software or other technology that can be adapted to their core business. Most of the market leaders are amalgamations of many smaller companies, and this trend is helped by the tendency of software companies to design tools as accessories that fit naturally into a larger vendor's suite of programs for digital circuitry; many new tools incorporate analog design and mixed systems. This is happening due to a trend to place entire electronic systems on a single chip.
See also
Computer-aided design (CAD)
Circuit design
EDA database
Signoff (electronic design automation)
Comparison of EDA software
Platform-based design
References
Notes
Computer Aids for VLSI Design, by Steven M. Rubin (http://www.staticfreesoft.com/documentsTextbook.html)
Fundamentals of Layout Design for Electronic Circuits, by Lienig and Scheible, Springer, 2020
VLSI Physical Design: From Graph Partitioning to Timing Closure, by Kahng, Lienig, Markov and Hu, 2011
Electronic Design Automation For Integrated Circuits Handbook, by Lavagno, Martin, and Scheffer, 2006
The Electronic Design Automation Handbook, by Dirk Jansen et al., Kluwer Academic Publishers, 2003, also available in German (2005)
Combinatorial Algorithms for Integrated Circuit Layout, by Thomas Lengauer, Teubner Verlag, 1997.
Electronic engineering |
40057171 | https://en.wikipedia.org/wiki/BeAnywhere | BeAnywhere | BeAnywhere is a cloud computing company, focused on the development of cloud-based "advanced remote solutions". It has two main products: BASE (BeAnywhere Support Express) and inSight. BeAnywhere's software provides remote access, support, management and monitoring to workstations, including technical support, desktop sharing, file transfer, and administration tools.
True to a software-as-a-service business model, all BeAnywhere updates are free to subscribers. A 14-day trial with full features is also available, as is a free professional version with a limited number of sessions. BeAnywhere is compatible with iOS, Android, OS X and Microsoft Windows, and it can also be used through a Java-based web console.
History
BeAnywhere was a Portuguese company owned by Ruben Dias with a special focus on the development of advanced remote solutions. Before BeAnywhere, Dias had founded Euro Carisma, a computer security company which, within five years, had become one of the first Country Partners of Panda Security, a leading anti-virus software vendor. This background in information technology motivated Dias to found BeAnywhere in 1996. Since then, the company expanded into the Brazilian and Portuguese markets and continued to show growth.
1996 - BeAnywhere was founded. It is the only remote access software producer based in Portugal
2002 - Started developing the protocol for the group's internal use
2008 - Started concept for a corporate version of BAPE (future BASE - BeAnywhere Support Express - product)
2012 - First steps to test international markets with BASE
2013 - Started BeAnywhere inSight development
2014 - BASE reaches 3 million sessions. Opens offices in Canada
2015 - BeAnywhere inSight commercial launch. BASE reaches 9 million sessions.
2015 - BeAnywhere was acquired by SolarWinds
Main features
The main features of BeAnywhere's software are:
Desktop sharing, enabling access to and control of any computer, regardless of geographic limitations. The only requirement is an internet connection. The software also allows unattended remote access.
File transfer facilities, enabling the transfer of files between remote and local machines.
An Admin Area, with options and configurations allowing managers to control technicians and support hours, view reports, create inquiries and perform other business administration tasks.
In-session chat and VoIP facilities, enabling VoIP calls with clients and collaborative tech sessions.
The software requires no installation and works regardless of firewalls and other hardware configurations.
Security
The whole session (all of its content, including but not limited to remote desktop, VoIP and file transfer) is protected by a proprietary protocol whose overall security rests on the Rijndael algorithm – the Advanced Encryption Standard (AES) – with a 256-bit cipher, both when the session is established (the subsequent key exchange is protected by SSL based on AES in cipher block chaining mode with Transport Layer Security v1.1) and throughout its duration. Additionally, all commands containing images, keyboard and mouse strokes, file transfers and clipboard information carry a digital signature. All encryption is negotiated end to end, preventing the transferred information from being intercepted or decoded at the gateway. The encryption keys are randomly generated in each session between the Viewer and the Applet or the BASE Agent. The client can also configure a master password or choose Windows account authentication, as well as require prior authorization from the machine's user before the session is launched.
References
External links
Software companies of Portugal
1996 establishments in Portugal |
5842202 | https://en.wikipedia.org/wiki/Hidden%20file%20and%20hidden%20directory | Hidden file and hidden directory | In computing, a hidden folder (sometimes hidden directory) or hidden file is a folder or file which filesystem utilities do not display by default when showing a directory listing. They are commonly used for storing user preferences or preserving the state of a utility and are frequently created implicitly by using various utilities. They are not a security mechanism because access is not restricted – usually the intent is simply to not "clutter" the display of the contents of a directory listing with files the user did not directly create.
Unix and Unix-like environments
In Unix-like operating systems, any file or folder that starts with a dot character, commonly called a dot file or dotfile, is treated as hidden – that is, the ls command does not display them unless the -a or -A flags (ls -a or ls -A) are used. In most command-line shells, wildcards will not match files whose names start with . unless the wildcard itself starts with an explicit . .
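A small Python sketch of the same convention – like ls without the -a flag, it simply skips any directory entry whose name begins with a dot; nothing in the filesystem itself marks such entries as hidden:

import os

def visible_entries(path="."):
    # Mimic default ls behaviour: drop names starting with a dot.
    return sorted(e for e in os.listdir(path) if not e.startswith("."))

print(visible_entries(os.path.expanduser("~")))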
A convention arose of using dotfiles in the user's home directory to store per-user configuration or informational text. Early uses of this were the well-known dotfiles .profile, .login, and .cshrc, which are configuration files for the Bourne shell and C shell and shells compatible with them, and .plan and .project, both used by the finger and name commands.
Many applications, from bash to desktop environments such as GNOME, now store their per-user configuration this way, but the Unix/Linux freedesktop.org XDG Base Directory Specification aims to migrate user config files from individual dotfiles in $HOME to non-hidden files in the hidden directory $HOME/.config.
Android
The Android operating system uses empty .nomedia files to tell smartphone apps not to display or include the contents of the folder. This prevents digital photos and digital music files from being shown in picture galleries or played in MP3 player apps. This is useful to prevent downloaded voicemail files from playing between the songs in a playlist, and to keep personal photos private while still allowing those in other folders to be shared in person with friends, family, and colleagues. The .nomedia file has no effect on the filesystem or even the operating system; instead, it depends entirely on each individual app to respect the presence of the file.
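Because the convention is enforced only by applications, a media app that honours it must check for the marker file itself. A hypothetical sketch in Python (the file extensions chosen are arbitrary examples):

import os

def find_media(root, extensions=(".mp3", ".jpg", ".png")):
    # Collect media files, skipping any directory tree that contains a .nomedia file.
    found = []
    for dirpath, dirnames, filenames in os.walk(root):
        if ".nomedia" in filenames:
            dirnames[:] = []    # do not descend further below this directory
            continue
        found += [os.path.join(dirpath, f)
                  for f in filenames if f.lower().endswith(extensions)]
    return found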
GNOME
In the GNOME desktop environment (as well as all programs written using GLib), filenames listed in a file named .hidden in each directory are also excluded from display. In GNOME's file manager, the keyboard shortcut Ctrl+H enables the display of both kinds of hidden files.
macOS
In addition to the "dotfile" behaviour, files with the "Invisible" attribute are hidden in Finder, although not in ls. The "Invisible" attribute can be set or cleared using the SetFile command; for example, invoking SetFile -a V jimbo will hide the file "jimbo". Starting in Mac OS X Snow Leopard, the chflags command can also be used; for example, chflags hidden jimbo will hide the file "jimbo".
DOS and MS Windows
In DOS systems, file directory entries include a Hidden file attribute which is manipulated using the attrib command. Using the command line command dir /ah displays the files with the Hidden attribute. In addition, there is a System file attribute that can be set on a file, which also causes the file to be hidden in directory listings. Use the command line command dir /as to display the files with the System attribute.
Under Windows Explorer, Hidden files and directories are, by default, not displayed - though they are still accessible by entering the full path into the explorer address bar. System files are displayed, unless they are also hidden. There are two options that enable the display of hidden files. The main 'Hidden files and folders' option can be used to turn on the display of hidden files but this won't, on its own, display hidden system files. A second option, 'Hide protected operating system files' additionally needs to be turned off in order for hidden system files to be shown. Hidden files are displayed with a slight transparency, so even when they are visible they are visually delineated from non-hidden files.
Under Windows Explorer, the content of a directory can also be hidden just by appending a pre-defined CLSID to the end of the folder name. The directory is still visible, but its content becomes one of the Windows Special Folders. However, the real content of this directory can still be seen using the CLI command dir.
References
External links
Bellevue Linux Users Group:
Computer Hope: Microsoft DOS command
.NOMEDIA file
Computer file systems
Metadata |
18394233 | https://en.wikipedia.org/wiki/Boxee | Boxee | Boxee was a cross-platform freeware HTPC (Home Theater PC) software application with a 10-foot user interface and social networking features designed for the living-room TV. It enabled its users to view, rate and recommend content to their friends through many social network services and interactive media related features.
Boxee was originally a fork of the free and open source XBMC (now Kodi) media center software which Boxee used as an application framework for its GUI and media player core platform, together with some custom and proprietary additions.
Marketed as the first ever "Social Media Center", the first public alpha of Boxee was made available on 16 June 2008. The UI of the alpha prototype was designed with design firm Method Incorporated, which also created Boxee's brand identity. The first public beta version was officially released for all previously supported platforms on 7 January 2010. Boxee gained the ability to watch live TV on the Boxee Box using a live TV stick in January 2012. By the end of 2012 the developers had discontinued all desktop versions and support.
Boxee co-developed a dedicated set-top box (hardware) called "Boxee Box by D-Link" in cooperation with D-Link which was the first "Powered by Boxee" branded device to be announced and launched, as well as a similar media player device called "Iomega TV with Boxee" (available in the UK & Europe) in cooperation with Iomega and a 46" high-definition television from ViewSonic with integrated Boxee software.
Boxee was owned and developed by a single for-profit startup company, Boxee, Inc., which began as a high tech stealth startup based in Israel and the United States with seed money from several angel investors, and was then known to be financially backed by venture capital firms such as General Catalyst Partners, Union Square Ventures, Softbank, Pitango, Spark Capital and Globis Capital Partners. The company's main offices were located at 122 West 26th Street, 8th Floor, New York, NY 10001.
On Wednesday, 3 July 2013 online media sources revealed Samsung would hire key employees and purchase Boxee's assets for around $30M. Samsung confirmed the acquisition with The New York Times, but did not disclose the amount.
Overview
Boxee supported a wide range of popularly used multimedia formats, and it included features such as playlists, audio visualizations, slideshows, weather forecasts reporting, and an array of third-party plugins. As a media center, Boxee could play most audio and video file containers, as well as display images from many sources, including CD/DVD-ROM drives, USB flash drives, the Internet, and local area network shares.
When run on modern PC hardware, Boxee was able to decode high-definition video up to 1080p. Boxee was able to use DXVA (DirectX Video Acceleration) on Windows Vista and newer Microsoft operating systems to utilize GPU-accelerated video decoding to assist with the decoding of high-definition videos.
With its Python-powered plugin system, the Boxee software incorporated features such as Apple movie trailer support and subtitle downloading, access to large on-demand video streaming services Netflix, Headweb and Vudu; a range of popular online internet content channels like audio services Pandora Radio, Last.fm, Jamendo, NPR, SHOUTcast radio streams; video services from ABC, BBC iPlayer, Blip.TV, CNET, CNN, CBS, Comedy Central, Funny or Die, Joost, Major League Baseball, NHL Hockey, MTV Music, MySpaceTV, Revision3, MUBI, OpenFilm, SnagFilms, IndieMoviesOnline, EZTakes, United Football League, Vevo, Vice Magazine, TED, The WB Television Network, YouTube and image services from Flickr and PicasaWeb picture viewing plugins. All were available as media sources available alongside the local library.
Some of the services were via specialized connections (e.g., YouTube), while the rest were a preselected list of podcast channels for streaming using generic RSS web feeds (e.g., BBC News). Boxee also supported NBC Universal's Hulu quite early on, but in February 2009 was asked by Hulu to remove the service at the request of Hulu's content partners. Boxee later reinstated the feature using Hulu's RSS feeds, but Hulu once again blocked access.
Even though both the Boxee App and the Boxee Box supported Netflix, the Boxee App supported only a limited instant queue, missing more recent TV shows and movies available through the web browser and iPhone apps.
In 2009, Boxee introduced a new plugin architecture based on the XUL (XULRunner) framework which technically allows any web-based application to be ported into an application for Boxee integration. Because of this Boxee could utilise Mozilla corebase architecture for most of its plugins – since this is the same core architecture used by Firefox, Hulu saw Boxee as "any other Mozilla browser so Hulu doesn't block the app." Hulu continued to thwart Boxee using strategies like JavaScript scrambling.
Boxee was able to play Adobe Flash content from sites such as YouTube and Hulu, and display HTML5 or Silverlight content from such web-based services as HBO Go and Netflix. Boxee shipped with a closed source, binary-only program called "bxflplayer", which was used to load the Adobe Flash Player and Microsoft Silverlight proprietary plugins. This program communicated with the main Boxee process via shared memory and rendered the video onto the screen. By using this approach, it was possible for Boxee not only to play Flash Video and Silverlight content that was protected by DRM (Digital Rights Management) but also to allow the user to control the player using a remote control and other input devices that were more suitable to laid-back watching. It was not clear whether this way of using "bxflplayer" as a closed source component with GPL licensed software qualified for the GPL linking exception or not.
Boxee source code was otherwise largely based on the XBMC (now Kodi) media center project's source code, which Boxee used as its software framework, and the Boxee developers contributed changes to that part back upstream to the XBMC project. Boxee was therefore partially open source, and those parts were distributed under the GNU General Public License; however, Boxee's social networking layer library, "libboxee", was closed source, as it dealt with proprietary methods of communication with Boxee's online back-end server, which handled the user account information and social network communications between the users in the Boxee userbase. It is not clear whether this way of using closed source libraries with GPL licensed software passed the GPL linking exception or not.
Features
Social Networking Layers
The social networking component of Boxee was the differentiator from other media center software.
Boxee required registered user accounts, which formed a social network of fellow Boxee users. Users could follow the activity of other Boxee users who were added as friends, and could publicly rate and recommend content. Users could also control what media appear in the activity feed in order to maintain privacy. If a user recommended something that was freely available from an internet content service then Boxee would let other users stream it directly. If a user recommended something that was not freely available then Boxee would try to show metadata, and movie trailers if it was a movie that the user recommended.
The user's friends' Boxee activity feeds were displayed on the user's home screen, as was the user's own recent activity. Internet content was accessed through a sub-menu of each of the video, audio, and photo menu items, such as Video -> "My videos" and Video -> "Internet videos".
In addition Boxee Beta and later had the option of monitoring Twitter and Facebook news feeds to automatically discover links to videos. Boxee would then add those videos to a watch queue in Boxee so they could be later viewed.
Boxee could also export a user's media activity feed to other social networking services such as FriendFeed, Twitter, and Tumblr. Through FriendFeed, Twitter, and Tumblr it was possible from those third-party social networking services for a user to choose to post the Boxee activity feed to social networking sites such as Facebook, (through FriendFeed, Twitter, and Tumblr apps for Facebook).
Boxee AppBox Add-on Store and plugin Apps (widgets/gadgets)
Boxee's "AppBox" was an app store – a digital distribution platform that served add-on apps and plug-ins providing online content to Boxee – which allowed users to download new apps and add-ons directly from Boxee's GUI. Many of these sources were in high definition and used streaming sites' native Flash and Silverlight players. Boxee had extensibility and integration with online sources for free and premium streaming content.
AppBox offered content including commercial video, educational programming, and media from individuals and small businesses.
Boxee also encouraged users to make and submit their own add-on apps and plug-ins to add additional content accessible from within Boxee.
Audio/video playback and handling
Boxee could play multimedia files from CD/DVD media using the system's DVD-ROM drive, local hard disk drive, or stream them over SMB/SAMBA/CIFS shares, or UPnP (Universal Plug and Play) shares.
Boxee was designed to take advantage of the system's network port if a broadband Internet connection was available, enabling the user to get information from such sites as IMDb, TV.com and AMG.
Boxee could stream Internet-video-streams, and play Internet-radio-stations (such as SHOUTcast). Boxee also included the option to submit music usage statistics to Last.fm and a weather-forecast (via weather.com). It also had music/video-playlist features, picture/image-slideshow functions, an MP3+CDG karaoke function (not available on the Boxee Box) and many audio-visualizations and screensavers.
Boxee could in addition upscale/upconvert all 480p/576p standard-resolution videos and output them to 720p, 1080i, or 1080p HDTV-resolutions.
Boxee could be used to play most common multimedia containers and formats from a local source (except those protected by DRM encryption). It could decode these in software, or optionally pass through AC3/DTS audio from movies directly to the S/PDIF output to an external audio amplifier or receiver for decoding on that device.
Video playback in detail
The Video Library, one of the Boxee metadata databases, was a key feature of Boxee. It allowed for the automatic organization of a users' video content by information associated with the video files (movies and recorded TV Shows) themselves.
The Library Mode view in Boxee allowed a user to browse video content by categories such as Genre, Title, Year, Actors and Director.
Boxee could parse and play, on the fly, DVD-Video movies stored in ISO and IMG DVD images, DVD-Video movies stored as DVD-Video (IFO/VOB/BUP) files on a hard drive or network share, and also ISO and IMG DVD images directly from RAR and ZIP archives. It also offered software upscaling/upconverting of all DVD-Video movies when outputting them to an HDTV in 720p, 1080i or 1080p.
Audio playback in detail
The Music Library was another key feature of Boxee. It automatically organized the user's music collection by information stored in the music files ID meta tags, such as title, artist, album, genre and popularity.
Boxee featured on-the-fly audio frequency resampling, gapless playback, crossfading, ReplayGain, cue sheet and Ogg Chapter support.
Digital picture/image display in detail
Boxee handled all common digital picture/image formats with the options of panning/zooming and slideshow with "Ken Burns effect", with the use of CxImage open source library code.
BitTorrent client, interface, and torrent trackers
Early builds of Boxee included a built-in BitTorrent client (not in the Windows version), with a frontend for it integrated into the Boxee interface, and links to legal BitTorrent tracker download sites were also incorporated by default. The built-in torrent client was later removed. Through Boxee's Python plugin system it was also possible for end users to make their own plugins, or add unofficial plugins made by third parties, for other BitTorrent trackers.
Mobile software associated with Boxee
The "boxee remote" was an application released by Boxee Inc. for Apple's iOS which allowed remote controlling of an installed and concurrently active Boxee session on another computer via the iOS touchscreen user interface. It was approved for the App Store on 16 March 2010.
Third-party developers also released Boxee remote control apps for Android and webOS.
This is a list of third-party companies who sold hardware bundled with Boxee media center software pre-installed, or sold uninstalled systems that specifically claimed to be Boxee-compatible ("Boxee Enabled") by the manufacturer. These third-party companies directly or indirectly helped submit bug fixes back upstream to Boxee, as well as to the XBMC project which Boxee in turn used as its framework.
Boxee Box by D-Link
Boxee Box by D-Link (officially "D-Link Boxee Box DSM-380") was a Linux-based set-top device and media extender that first began shipping in 33 countries worldwide on 10 November 2010. Designed to act as a hub, to bring internet television and other video to the television via Boxee's software, it came pre-installed with Boxee media center software and the hardware was based on Intel CE4110 system-on-a-chip platform (that has a 1.2 GHz Intel Atom CPU with a PowerVR SGX535 Integrated graphics processor), 1 GB of RAM, and 1 GB of NAND Flash Memory. The DSM-380 featured output ports for HDMI (version 1.3), optical digital audio (S/PDIF) connector, and RCA connector for analog stereo audio, two USB ports, an SD card slot, wired 100 Mbit/s (100BASE-T) ethernet, and built-in 802.11n WiFi.
The Boxee Box also shipped with a small two-sided RF remote control with 4-way D-pad navigation and a full QWERTY keypad as standard. This remote was also sold separately with a USB receiver as the "D-Link Boxee Box Remote DSM-22", which could be used with Boxee installed on a computer, so the remote could be used without owning D-Link's Boxee Box. The look of both the case and remote prototypes for the Boxee Box was designed by San Francisco-based Astro Studios, the same designer company that designed the look of the Xbox 360 and the Microsoft Zune.
Iomega TV with Boxee by Iomega
Iomega TV with Boxee by Iomega was announced by Boxee on 4 January 2011; this was the second Boxee device to be announced. It began shipping in Q1 of 2011.
The Iomega TV with Boxee was a Linux device which came pre-installed with Boxee media center software. The hardware was based on the Intel CE4110 system-on-a-chip platform (that has a 1.2 GHz Intel Atom CPU with a PowerVR SGX535 integrated graphics processor), 1 GB of RAM, and 1 GB of NAND flash memory. The Iomega TV with Boxee featured audio / video output ports for HDMI (version 1.3), optical and coaxial digital audio (S/PDIF) connectors, an RCA connector for analog stereo audio, two USB ports, wired 1 Gbit/s ethernet, and built-in 802.11n WiFi.
The Iomega TV with Boxee also shipped with a similar small two-sided RF remote control with 4-way D-pad navigation and a full QWERTY keypad as standard.
However, unlike D-Link's Boxee Box, the Iomega TV with Boxee device featured space to internally fit a 3.5-inch SATA hard drive. According to Boxee, the hard drive was not only for the Boxee software on the device but also usable as a NAS (Network Attached Storage) unit to share its media data over the network as a DLNA compliant UPnP AV media server.
Myka ION
Myka ION was an Nvidia Ion based set-top device designed to bring internet television and media stored on the home network to the living-room, it came pre-installed with Boxee, XBMC, and Hulu Desktop as applications that could be started from the main menu.
NUU Player
NUU Player by NUU Media (NUU Ltd.) was an Nvidia Ion-based set-top device designed to bring internet television and media stored on the home network to the living-room, it came pre-installed with Boxee, Hulu Desktop, and a WebKit web-browser as applications that could be started from the main menu with a remote control. It also had Skype app and Bluetooth support. Nuu has since discontinued NUU Player development and has removed any mention of it from their web site.
Programming and developing
As a partially open source application and freeware software program, Boxee was developed by a commercial start-up company with the goal of someday profiting from Boxee and their social networking service, working as a distribution application framework for both major pay-per-view and independent video on demand providers.
Boxee, like XBMC Media Center (which Boxee is based upon), was a cross-platform software programmed mostly in C++ and used the Simple DirectMedia Layer framework with OpenGL renderer for all versions of Boxee. Some of the libraries that Boxee depended on were also written in the C programming-language, but were used with a C++ wrapper and loaded via Boxee's own DLL loader when used inside Boxee.
Add-on apps (widgets/gadgets) and Python scripts as plugins
Boxee featured a Python Scripts Engine and WindowXML application framework (an XML-based widget toolkit for creating a GUI for widgets) in a similar fashion to Apple Mac OS X Dashboard Widgets and Microsoft Gadgets in Windows Sidebar. Python widget scripts allowed non-developers to create new add-on functionality for Boxee themselves (using the easier-to-learn Python programming language), without knowledge of the complex C/C++ programming language that the rest of the Boxee software was written in. Plugin script add-ons included functions like Internet TV and movie trailer browsers, cinema guides, Internet radio station browsers (for example SHOUTcast), and much more.
Boxee also introduced an additional plugin architecture based on the XUL (XULRunner) framework which enabled any web-based application to be integrated into Boxee as an app add-on. With this new plugin architecture Boxee used the Mozilla corebase architecture for those plugins. Since this was the same core architecture that Firefox uses, Hulu would see Boxee as any other Mozilla-based web browser.
Skins, skinning, and the skinning-engine
Boxee GUI source code was based on XBMC Media Center which was noted for having a very flexible GUI toolkit and robust framework for its GUI, using a standard XML base, making theme-skinning and personal customization very accessible. Users can create their own skin (or simply modify an existing skin) and share it with others via public websites dedicated for XBMC skins trading.
Reception
In October 2008, Boxee won Consumer Electronics Association's (CEA) i-Stage award, and with it $50,000 prize for the continued development of Boxee, as well as a free booth for the 2009 International CES (Consumer Electronics Show). Boxee donated half of the $50,000 prize money to the developers of XBMC.
On 9 January 2009, G4 announced Boxee as the winner of their "Best of the Best products of CES 2009" award (in the "Maximum Tech" category) amongst all the products displayed at CES (Consumer Electronics Show) 2009 trade show.
In January 2010, at the Consumer Electronics Show, Boxee garnered 5 awards: "LAPTOP's Best of CES 2010 – Best Home Entertainment (Boxee Box)", "Last Gadget Standing – CES 2010 Winner", "International CES Best of Innovations 2010 – Home Theater Accessories", "Popular Science – Best of CES 2010 (Products of the Future)".
In April 2011, it was made public that Boxee had violated the terms of the GPL in the way they used open source software. According to the GPLv3, which governed software in the firmware of the device, users need to be able to reinstall modifications to the device. Boxee admitted the software was included in each device, but stated that their financial agreements with other companies were at risk if they complied. Despite much user dismay there was no change in course by Boxee.
On 31 October 2012 Boxee posted a statement on their website saying they had to make a decision between releasing a device which was hackable, or one which was commercially viable with premium content.
As it stated, Boxee would have loved for the Boxee Box to be open to other software, but ultimately, they were bound by agreements with their content providers to ensure the security of the content. This started a spate of negative comments from Boxee Box users on the Boxee blog as prior Boxee promises had indicated otherwise. After less than a day, the entire Boxee page (along with the statement, the blog and its comments) was removed and replaced with a new Boxee TV website. However, the old Boxee blog was not deleted but moved.
See also
Comparison of media players
Interactive television
List of free television software
Smart TV
Web television
References
External links
(www.boxee.tv now redirects to a New York Times article on its acquisition by Samsung)
Free multimedia software
Free media players
Free video software
Free software programmed in C++
Free television software
MacOS media players
Software DVD players
Windows media players
Digital television
Streaming television
Software forks
Online companies of Israel
Software that uses XUL |
6980625 | https://en.wikipedia.org/wiki/IPod%20game | IPod game | An iPod click wheel game or iPod game is a video game playable on the various versions of the Apple portable media player, the iPod. The original iPod had the game Brick (originally invented by Apple co-founder Steve Wozniak) included as an easter egg hidden feature; later firmware versions added it as a menu option. Later revisions of the iPod added three more games in addition to Brick: Parachute, Solitaire, and Music Quiz. These games should not be confused with games for the iPod Touch, which require iOS and are only available on Apple's App Store on iTunes.
History
On 23 December 2005, CoolGorilla, a new start-up, launched a trivia game for the iPod. It was titled “Rock and Pop Quiz”.
In September 2006, the iTunes Store began to offer nine additional games for purchase with the launch of iTunes 7, compatible with the fifth-generation iPod with iPod software 1.2 or later. Those games were Bejeweled, Cubis 2, Mahjong, Mini Golf, Pac-Man, Tetris, Texas Hold 'Em, Vortex, and Zuma. These games were made available for purchase from the iTunes Store for US$4.99 each. In December 2006, two more games were released by EA Mobile at the same price: Royal Solitaire and Sudoku. In February 2007, Ms. Pac-Man was released, followed in April 2007 by iQuiz. Until this time, all the available games could be purchased in a package, with no discount.
In May 2007, Apple released Lost: The Video Game by Gameloft, based on the television show. In June 2007, "SAT Prep 2008" by Kaplan was introduced as 3 separate educational games based on the subjects of writing, reading, and mathematics. In December 2007, Apple released a classic Sega game, Sonic the Hedgehog, which was originally packaged with the Sega Genesis system in the early 1990s.
With third parties like Namco, Square Enix, EA, Sega, and Hudson Soft all making games for the iPod, Apple's dedicated MP3 player took great steps towards entering the video game handheld console market. Even video game magazines like GamePro and EGM have reviewed and rated most of their games.
The games are in the form of .ipg files ("iPod game"), which are actually .zip archives in disguise. When unzipped, they reveal executable files along with common audio and image files, leading to the possibility of third-party games, although this never eventuated (with the exception of superficial user-made tweaks). Apple never made a software development kit (SDK) available to the public for iPod-specific development. The iOS SDK covers only iOS on the iPhone and iPod Touch, not traditional iPods.
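Because the format was essentially a renamed ZIP archive, the contents of such a package could be listed with ordinary tools; a minimal Python sketch (the filename used here is a made-up example):

import zipfile

# An iPod click wheel game package is a ZIP archive with a different extension.
with zipfile.ZipFile("Vortex.ipg") as game:
    for name in game.namelist():
        print(name)             # executables plus audio and image resources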
In October 2011, Apple removed all the click wheel–operated games from its store.
Games
This is a list of games that were made available for the newest iPods, excluding the iPod Touch. Each game (other than Reversi and Chinese Checkers) cost US$4.99 to buy prior to their discontinuation in 2011.
The list contains games that are known to exist.
Default games
These are the games that originally came with an iPod.
Criticism
iTunes had come under much criticism due to the UK price of iPod games, GB£3.99 (about US$7.40). Many people from the UK had given the games 1-star ratings, stating that Apple was "ripping off" Britain.
A similar situation occurred in Australia, where the price was A$7.49, even though the Australian dollar was (at the time) worth more than the US dollar (A$7.49 = US$7.76).
Developers had criticized Apple for not creating a software development kit (SDK) for software developers to create new iPod games; this was likely to keep the digital rights management of iPod games closed. Despite this, it did not prevent users from running an alternative OS on the iPod such as Linux, whereby, for example, there are ports of Doom that will run on fifth-generation iPods. Running Linux on an iPod retains the music-playing functionality of the device while also adding features such as the ability to create voice memos through the headphones.
When the iPod Classic and iPod Nano third generation were released, games which had previously been purchased could not be synced to the new iPods. Understandably, this made many consumers angry due to losing their investment.
It is also notable that after a game had been downloaded, it could not be downloaded again unless a separate purchase was made for the same item. This is different behavior than applications downloaded on the App Store, which can be downloaded an unlimited number of times. These issues were later fixed, however, making it possible to install any single game on any number of iPods registered under the same account.
Unofficial games
Some older iPod units are capable of using replacement firmware such as iPod Linux and Rockbox. These firmware projects can play many other games, including the aforementioned native port of Doom; and, via a native port of the Game Boy emulator Gnuboy, many other games could be played, including Super Mario Bros., Tomb Raider, Mega Man, Kirby, Metroid, The Legend of Zelda, Street Fighter, and hundreds more.
Games using the ″Notes″ feature
With the release of the third-generation iPod in 2003, Apple introduced a ″Notes″ feature to the iPod's firmware. This functionality provided the first opportunity for third-party developers to create simple text and audio games which could be installed and run on an iPod without users needing to replace the official firmware.
With a limit of 1,000 individual .txt files, each with a maximum file size of 4 KB, the Notes feature made use of a limited set of HTML tags. Hyperlinks could also be used to link to other .txt files or folders and play audio files stored on the device. The limitation of available HTML tags meant that developers were restricted to Choose Your Own Adventure–style text-based games or multiple choice–style quizzes with narrated audio. Subsequently, very few developers used the Notes feature as a way of publishing games.
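A hypothetical sketch of how such a game could be generated, in Python, staying within the 4 KB-per-file limit described above; the anchor-tag link syntax is shown only as an illustration of linking one note to another, not as a definitive reference for the Notes format:

pages = {
    "start.txt": 'You wake up in a cave. <a href="left.txt">Go left</a>',
    "left.txt": "A dead end. The story is over.",
}

for name, body in pages.items():
    data = body.encode("utf-8")
    assert len(data) <= 4 * 1024, name + " exceeds the 4 KB Notes limit"
    with open(name, "wb") as f:
        f.write(data)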
References
External links
iPod game page at Apple.com
ITunes |
861440 | https://en.wikipedia.org/wiki/PowerPC%20Reference%20Platform | PowerPC Reference Platform | PowerPC Reference Platform (PReP) was a standard system architecture for PowerPC-based computer systems (as well as a reference implementation) developed at the same time as the PowerPC processor architecture. Published by IBM in 1994, it allowed hardware vendors to build a machine that could run various operating systems, including Windows NT, OS/2, Solaris, Taligent and AIX.
One of the stated goals of the PReP specification was to leverage standard PC hardware. Apple, wishing to seamlessly transition its Macintosh computers to PowerPC, found this to be particularly problematic. As it appeared no one was particularly happy with PReP, a new standard, the Common Hardware Reference Platform (CHRP), was developed and published in late 1995, incorporating the elements of both PReP and the Power Macintosh architecture. Key to CHRP was the requirement for Open Firmware (also required in PReP-compliant systems delivered after June 1, 1995), which gave vendors greatly improved support during the boot process, allowing the hardware to be far more varied.
PReP systems were never popular. Finding current, readily available operating systems for old PReP hardware can be difficult. Debian and NetBSD still maintain their respective ports to this architecture, although developer and user activity is extremely low. The RTEMS real-time operating system provides a board support package for PReP which can be run utilizing the QEMU PReP emulator. This provides a convenient development environment for PowerPC-based real-time, embedded systems.
Power.org has a Power Architecture Platform Reference (PAPR) that provides the foundation for development of Power ISA-based computers running the Linux operating system. PAPR was released in the fourth quarter of 2006.
See also
PowerOpen Environment
IBM ThinkPad Power Series
References
External links
PReP Specification Version 1.1 and related documents
The PowerPC (TM) Hardware Reference Platform, an overview of CHRP
QEMU PReP emulation for RTEMS operating system
PowerPC mainboards
IBM computer hardware |
16811623 | https://en.wikipedia.org/wiki/List%20of%20acquisitions%20by%20Hewlett-Packard | List of acquisitions by Hewlett-Packard | Hewlett-Packard, commonly referred to as HP, was an electronics technology company that shed its roots in 1999 by spinning off the first businesses of Test & Measurement, Medical, Analytical, Semiconductor as Agilent Technologies. It is now best known now as an information technology corporation, based in Palo Alto, California, which was split into two companies: Hewlett Packard Enterprise and HP Inc. The company was founded by Bill Hewlett and Dave Packard in a small garage on January 1, 1939. As of 2012, HP is the largest technology company in the world in terms of revenue, ranking 10th in the Fortune Global 500.
As of 2012, Hewlett-Packard has made a total of 129 acquisitions since 1986. Its first acquisition was the F.L. Moseley Company in 1958. This move enabled HP to enter the plotter business, which was the predecessor to its printer business of today. In 1989, HP purchased Apollo Computer for US$476 million, enabling HP to become the largest supplier of computer workstations. In 1995, the company bought another computer manufacturer, Convex Computer, for US$150 million. In 2000, HP spun off its measurement, chemical and medical businesses into an independent company named Agilent Technologies. The company's largest acquisition came in 2002, when it merged with Compaq, a personal computer manufacturer, for US$25 billion. The combined company overtook Dell for the largest share of the personal computer market worldwide in the second quarter.
Within the IT networking hardware and storage market segments, HP has made acquisitions worth over US$15 billion, the largest ones being the 3PAR and 3Com acquisitions made in 2010, totaling over $5 billion. The most recent acquisition in the enterprise networking segment is Aruba Networks in March 2015 for US$3 billion.
On the IT services and consulting side, the largest acquisition made so far is Electronic Data Systems, in 2008 for US$13.9 billion.
In the software products market segment, a stream of acquisitions has helped strengthen HP's position. The largest software company purchased prior to 2011 was Mercury Interactive for US$4.5 billion. This acquisition doubled the size of HP's software business to more than US$2 billion in annual revenue.
Between 2012 and 2013, HP had no acquisitions in any of its business segments as the firm was recouping an $8.8 billion write-off suffered as a result of acquisition of British software company Autonomy Corporation for $11 billion in 2011.
After this two-year gap, in 2014, HP returned to the acquisition market by acquiring Computer Networking Software company Shunra. The majority of companies acquired by HP are based in the United States.
At the end of 2014, HP announced that it will split into two companies, Hewlett Packard Enterprise and HP Inc. The former focuses on enterprise infrastructure and software solutions, whilst the latter focuses on consumer markets with PCs and printers. On November 1, 2015, they became separate companies.
Acquisitions
Each acquisition was for the respective company in its entirety, unless otherwise specified. The agreement date listed is the date of the agreement between HP and the subject of the acquisition, while the acquisition date listed is the exact date in which the acquisition completes. The value of each acquisition is usually the one listed at the time of the announcement. If the value of an acquisition is not listed, then it is undisclosed.
Notes
This figure by The Alacra Store includes acquisitions by companies that are eventually acquired by HP. The actual number of acquisitions included in this list is 96.
The acquisitions are ordered by acquisition dates. If the acquisition date is not available, then the acquisition is ordered by agreement dates.
Footnotes
References
Hewlett-Packard |
2854616 | https://en.wikipedia.org/wiki/Electronic%20dictionary | Electronic dictionary | An electronic dictionary is a dictionary whose data exists in digital form and can be accessed through a number of different media. Electronic dictionaries can be found in several forms, including software installed on tablet or desktop computers, mobile apps, web applications, and as a built-in function of E-readers. They may be free or require payment.
Information
Most of the early electronic dictionaries were, in effect, print dictionaries made available in digital form: the content was identical, but the electronic editions provided users with more powerful search functions. But soon the opportunities offered by digital media began to be exploited. Two obvious advantages are that limitations of space (and the need to optimize its use) become less pressing, so additional content can be provided; and the possibility arises of including multimedia content, such as audio pronunciations and video clips.
Electronic dictionary databases, especially those included with software dictionaries are often extensive and can contain up to 500,000 headwords and definitions, verb conjugation tables, and a grammar reference section. Bilingual electronic dictionaries and monolingual dictionaries of inflected languages often include an interactive verb conjugator, and are capable of word stemming and lemmatization.
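Lemmatization in this context means reducing an inflected form to its dictionary headword before look-up. A toy Python sketch of the idea (real dictionary software relies on full morphological analysers rather than a few hard-coded rules):

# Tiny illustrative lemmatizer: exception table first, then naive suffix stripping.
EXCEPTIONS = {"mice": "mouse", "went": "go", "better": "good"}
SUFFIXES = [("ies", "y"), ("es", ""), ("s", ""), ("ing", ""), ("ed", "")]

def lemma(word):
    w = word.lower()
    if w in EXCEPTIONS:
        return EXCEPTIONS[w]
    for suffix, replacement in SUFFIXES:
        if w.endswith(suffix) and len(w) > len(suffix) + 2:
            return w[:-len(suffix)] + replacement
    return w

print([lemma(w) for w in ["Mice", "dictionaries", "looked", "running"]])
# ['mouse', 'dictionary', 'look', 'runn'] – the last result shows the limits of naive rules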
Publishers and developers of electronic dictionaries may offer native content from their own lexicographers, licensed data from print publications, or both, as in the case of Babylon offering premium content from Merriam Webster, and Ultralingua offering additional premium content from Collins, Masson, and Simon & Schuster, and Paragon Software offering original content from Duden, Britannica, Harrap, Merriam-Webster and Oxford.
Writing systems
As well as Latin script, electronic dictionaries are also available in logographic and right-to-left scripts, including Arabic, Persian, Chinese, Devanagari, Greek, Hebrew, Japanese, Korean, Cyrillic, and Thai.
Dictionary software
Dictionary software generally far exceeds the scope of the hand held dictionaries. Many publishers of traditional printed dictionaries such as Langenscheidt, Collins-Reverso, Oxford University Press, Duden, American Heritage, and Hachette, offer their resources for use on desktop and laptop computers. These programs can either be downloaded or purchased on CD-ROM and installed. Other dictionary software is available from specialised electronic dictionary publishers such as iFinger, ABBYY Lingvo, Collins-Ultralingua, Mobile Systems and Paragon Software. Some electronic dictionaries provide an online discussion forum moderated by the software developers and lexicographers.
In East Asia
Well-known brands, such as Instant-Dict (快譯通), Besta (無敵), and Golden Global View (文曲星), include basic functions like dictionaries, TTS, calculator, calendar etc. They also have functions other than just dictionaries, for example an MP3 player, video player, web browser (WiFi), and simple games. Some also support Adobe Flash (SWF files). Most of them usually have a touch screen, QWERTY keyboard, a speaker, SD card slot, and sometimes a microphone and camera as well, for example the MD8500 from Instant-Dict. Their functions can even be comparable to smartphones, with the exception of phone capabilities, since they do not have radios to make or receive phone calls.
Main functions
Dictionaries: This is one of the most basic functions, mostly using Oxford and Longman dictionaries
TTS: Includes Text-to-Speech and Speech-to-Text
Data transport: Uses RS-232 in the earlier ones; Mini USB in recent ones
Learning: Programs that can help users study vocabulary
Note: Notepads, phone books, calendars, world clock, etc.
Calculators: simple calculators, scientific calculators, unit converters
Games: Play Flash games
Handheld dictionaries or PEDs
Handheld electronic dictionaries, also known as "pocket electronic dictionaries" or PEDs, resemble miniature clamshell laptop computers, complete with full keyboards and LCD screens. Because they are intended to be fully portable, the dictionaries are battery-powered and made with durable casing material. Although produced all over the world, handheld dictionaries are especially popular in Japan, Korea, Taiwan, China, and neighbouring countries, where they are the dictionary of choice for many users learning English as a second language. Some features of handheld dictionaries include stroke order animations, voice output, handwriting recognition, language-learning programs, a calculator, PDA-like organizer functions, time zone and currency converters, and crossword puzzle solvers. Dictionaries that contain data for several languages may have a "jump" or "skip-search" feature that allows users to move between the dictionaries when looking up words, and a reverse translation action that allows further look-ups of words displayed in the results. Many manufacturers produce handheld dictionaries that use licensed dictionary content that use a database such as the Merriam Webster Dictionary and Thesaurus while others may use a proprietary database from their own lexicographers. Users can also add content to their handheld dictionaries with memory cards (both expandable and dedicated), CD-ROM data, and Internet downloads. Manufacturers include AlfaLink, Atree, Besta, Casio, Canon, Instant Dict, Ectaco, Franklin, Iriver, Lingo, Maliang Cyber Technology, Compagnia Lingua Ltd., Nurian, Seiko, and Sharp.
In Japan
The market size as of 2014 was about 24.2 billion yen (about US$227.1 million at May 2016 exchange rates), although the market has been shrinking gradually since 2007 because of smartphones and tablet computers. The targeted customer base has been shifting from business users to students. Student models of Japanese handheld dictionaries also include digital versions of textbooks and other study materials. Sony and Seiko have withdrawn from the market. As of 2016, Casio had 59.3% of the market share, followed by Sharp with 21.5% and Canon with 19.2%.
In 2016, Seiko announced the launch of its mobile device apps for iOS on the iPad.
Dictionaries on mobile devices
Dictionaries of all types are available as apps for smartphones and for tablet computers such as Apple's iPad, the BlackBerry PlayBook and the Motorola Xoom. The needs of translators and language learners are especially well catered for, with apps for bilingual dictionaries for numerous language pairs, and for most of the well-known monolingual learner's dictionaries such as the Longman Dictionary of Contemporary English and the Macmillan English Dictionary.
Online dictionaries
There are several types of online dictionary, including:
Aggregator sites, which give access to data licensed from various reference publishers. They typically offer monolingual and bilingual dictionaries, one or more thesauruses, and technical or specialized dictionaries. (e.g. TheFreeDictionary.com, Dictionary.com, Kotobank, and Goo Dictionary)
'Premium' dictionaries available on subscription (e.g. Oxford English Dictionary, van Dale, and Kenkyusha Online Dictionary)
Dictionaries from a single publisher, free to the user and supported by advertising (e.g. Collins Online Dictionary, Duden Online, Larousse bilingual dictionaries, the Macmillan English Dictionary, and the Merriam-Webster Learner's Dictionary)
Dictionaries available free from non-commercial publishers (often institutions with government funding), such as the Algemeen Nederlands Woordenboek (ANW).
Online dictionaries are regularly updated, keeping abreast of language change. Many have additional content, such as blogs and features on new words. Some are collaborative projects, most notably Wiktionary and the Collins Online Dictionary. And some, like the Urban Dictionary, consist of entries (sometimes self-contradictory) supplied by users.
Many dictionaries for special purposes, especially for professional and trade terminology, and regional dialects and language variations, are published on the websites of organizations and individual authors. Although they may often be presented in list form without a search function, because of the way in which the information is stored and transmitted, they are nevertheless electronic dictionaries.
Evaluation
There are differences in quality of hardware (hand held devices), software (presentation and performance), and dictionary content. Some hand helds are more robustly constructed than others, and the keyboards or touch screen input systems should be physically compared before purchase. The information on the GUI of computer based dictionary software ranges from complex and cluttered, to clear and easy-to-use with user definable preferences including font size and colour.
A major consideration is the quality of the lexical database. Dictionaries intended for collegiate and professional use generally include most or all of the lexical information to be expected in a quality printed dictionary. The content of electronic dictionaries developed in association with leading publishers of printed dictionaries is more reliable than that of those aimed at the traveler or casual user, while bilingual dictionaries that have not been authored by teams of native-speaker lexicographers for each language will not be suitable for academic work.
Some developers opt to have their products evaluated by an independent academic body such as CALICO.
Another major consideration is that the devices themselves and the dictionaries in them are generally designed for a particular market. As an example, almost all handheld Japanese-English electronic dictionaries are designed for people with native fluency in Japanese who are learning and using English; thus, Japanese words do not generally include furigana pronunciation glosses, since it is assumed that the reader is literate in Japanese (headwords of entries do have pronunciation, however). Further, the primary manner to look up words is by pronunciation, which makes looking up a word with unknown pronunciation difficult (for example, one would need to know that 網羅 "comprehensive" is pronounced もうら, moura to look it up directly). However, Japanese electronic dictionaries (primarily on recent models) include character recognition, so users (native speakers of Japanese or not) can look up words by writing the kanji.
Similar limitations exist in most two or multi-language dictionaries and can be especially crippling when the languages are not written in the same script or alphabet; it's important to find a dictionary optimized for the user's native language.
Integrated technology
Several developers of the systems that drive electronic dictionary software offer API and SDK (software development kit) tools for adding various language-based functions (dictionary, translation, definitions, synonyms, and spell checking and grammar correction) to programs, and web services such as the AJAX API used by Google. These applications manipulate language in various ways, providing dictionary/translation features, and sophisticated solutions for semantic search. They are often available as a C++ API, an XML-RPC server, a .NET API, or as a Python API for many operating systems (Mac, Windows, Linux, etc.) and development environments, and can also be used for indexing other kinds of data.
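As a hypothetical illustration of calling such a dictionary service over XML-RPC from Python's standard library client – the endpoint URL and the lookup method signature below are invented for the example, not a real vendor API:

import xmlrpc.client

# Invented dictionary server exposing a definition-lookup method.
server = xmlrpc.client.ServerProxy("http://dictionary.example.com/rpc")

try:
    entry = server.lookup("serendipity", "en")   # hypothetical method and arguments
    print(entry)
except xmlrpc.client.Fault as err:
    print("lookup failed:", err.faultString)
except OSError as err:
    print("could not reach the dictionary service:", err)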
See also
Sony Data Discman
Notes
Dictionaries by type
Dedicated application electronic devices |
5268091 | https://en.wikipedia.org/wiki/Moriah%20van%20Norman | Moriah van Norman | Moriah van Norman (born May 30, 1984) is an American water polo player who has played for the University of Southern California and the National team, who won the Peter J. Cutino Award in 2004, recognized as the best female collegiate player in the nation. Her position is two-meter offense (center forward).
High school and USC
van Norman was born in San Diego, California. She earned four-time high school All-American honors at University of San Diego High School in San Diego. She was named California Interscholastic Federation player of the year and league most valuable player in her senior season.
van Norman earned All-America honors in her 2003 freshman season after leading her USC Trojans team in scoring with 65 goals. She scored three or more goals in five matches including five against UC Berkeley and three against UCLA. As a 2004 sophomore, van Norman finished second on the team in scoring with 58 goals, leading her team to win the NCAA Women's Water Polo Championship. She became the third player in USC women's water polo history to win the Peter J. Cutino Award as the nation's top collegiate women's player and the last person to receive the award from legendary former Cal coach Pete Cutino, who died in September 2004. In 2005, van Norman was third on the team in scoring with 40 goals in her junior season. In her final season, USC, with a season record of 27-3, was top-seeded at the NCAA championships, but van Norman's six goals in the 3 tournament matches were not enough. She picked up her third ejection with 5:15 left in the final game, and sat out the remainder of the game on the bench as UCLA won, 9-8. van Norman racked up 215 goals in her four years with USC, third all-time in Trojan history.
Career
van Norman was a member of the U.S. Junior National Team, winning gold at the 2005 FINA Junior World Championships and silver at the 2003 FINA Junior World Championships. She also played with the 2002 Pan-American Games championship team. van Norman is a member of the U.S. national team, which won silver at the 2005 FINA World Water Polo Championships.
At the 2008 Summer Olympics in Beijing, she and the American team lost the championship game 8–9 to the Netherlands and took home the silver medal.
In June 2009, van Norman was named to the USA water polo women's senior national team for the 2009 FINA World Championships.
See also
List of Olympic medalists in water polo (women)
List of world champions in women's water polo
List of World Aquatics Championships medalists in water polo
References
External links
1984 births
American female water polo players
Living people
Sportspeople from San Diego
American people of Dutch descent
Olympic silver medalists for the United States in water polo
Water polo players at the 2008 Summer Olympics
USC Trojans women's water polo players
Medalists at the 2008 Summer Olympics
World Aquatics Championships medalists in water polo |
332744 | https://en.wikipedia.org/wiki/Erichthonius%20of%20Dardania | Erichthonius of Dardania | Erichthonius (Ancient Greek: Ἐριχθόνιος) of Dardania was a mythical king of Dardania in Greek mythology, the son of Dardanus and Batea (in some other legends his mother is said to be Olizone, daughter of Phineus). He was the brother of Ilus and Zacynthus. Erichthonius was said to have enjoyed a peaceful and prosperous reign.
Etymology
Erichthonius is of uncertain etymology, possibly related to a pre-Greek form *Erektyeu-. The connection of Ἐριχθόνιος with ἐρέχθω, "shake" is a late folk-etymology; other folk-etymologies include ἔριον, erion, "wool" or eris, "strife" + χθών chthôn or chthonos, "earth".
Mythology
Fundamentally, all that is known of this Erichthonius comes from Homer, who says (Samuel Butler's translation of Iliad 20.215-234):
In the beginning Dardanos was the son of Zeus, and founded Dardania, for Ilion was not yet established on the plain for men to dwell in, and her people still abode on the spurs of many-fountained Ida. Dardanos had a son, king Erichthonios, who was wealthiest of all men living; he had three thousand mares that fed by the water-meadows, they and their foals with them. Boreas was enamored of them as they were feeding, and covered them in the semblance of a dark-maned stallion. Twelve filly foals did they conceive and bear him, and these, as they sped over the fertile plain, would go bounding on over the ripe ears of wheat and not break them; or again when they would disport themselves on the broad back of Ocean they could gallop on the crest of a breaker. Erichthonios begat Tros, king of the Trojans, and Tros had three noble sons, Ilos, Assarakos, and Ganymede who was comeliest of mortal men; wherefore the gods carried him off to be Zeus' cupbearer, for his beauty's sake, that he might dwell among the immortals.
John Tzetzes and one of the scholia to Lycophron call his wife Astyoche, daughter of Simoeis. The Bibliotheca also adds Erichthonius' older brother Ilus, who died young and childless; presumably a doublet of the other Ilus, grandson of Erichthonius, eponym of Troy.
Strabo records, but discounts, the claim by "some more recent writers" that Teucer came from the deme of Xypeteones in Attica, supposedly called Troes (meaning Trojans) in mythical times. These writers mentioned that Erichthonius appears as founder both in Attica and the Troad, and may be identifying the two.
Erichthonius reigned for forty-six or, according to others, sixty-five years and was succeeded by his son Tros.
Family tree
Notes
References
Apollodorus, The Library with an English Translation by Sir James George Frazer, F.B.A., F.R.S. in 2 Volumes, Cambridge, MA, Harvard University Press; London, William Heinemann Ltd. 1921. ISBN 0-674-99135-4. Online version at the Perseus Digital Library. Greek text available from the same website.
Beekes, Robert S. P., Etymological Dictionary of Greek, 2 vols. Leiden: Brill, 2009.
Dictys Cretensis, from The Trojan War. The Chronicles of Dictys of Crete and Dares the Phrygian translated by Richard McIlwaine Frazer, Jr. (1931-). Indiana University Press. 1966. Online version at the Topos Text Project.
Dionysius of Halicarnassus, Roman Antiquities. English translation by Earnest Cary in the Loeb Classical Library, 7 volumes. Harvard University Press, 1937-1950. Online version at Bill Thayer's Web Site
Dionysius of Halicarnassus, Antiquitatum Romanarum quae supersunt, Vol I-IV. . Karl Jacoby. In Aedibus B.G. Teubneri. Leipzig. 1885. Greek text available at the Perseus Digital Library.
Graves, Robert; The Greek Myths, Moyer Bell Ltd; Unabridged edition (December 1988), .
Greek Mythology Link 2003-10-01
March, J., Cassell's Dictionary Of Classical Mythology, London, 1999.
Notes to the Bibliotheca 3.12.2; ed. by. Sir James George Frazer.
Perseus Encyclopedia, Erichthonius.
Mythological kings of Troy
Kings in Greek mythology
Trojans
Princes in Greek mythology |
4098234 | https://en.wikipedia.org/wiki/Sort%20%28Unix%29 | Sort (Unix) | In computing, sort is a standard command line program of Unix and Unix-like operating systems that prints the lines of its input, or the concatenation of all files listed in its argument list, in sorted order. Sorting is done based on one or more sort keys extracted from each line of input. By default, the entire input is taken as the sort key. Blank space is the default field separator. The command supports a number of command-line options that can vary by implementation. For instance, the "-r" flag will reverse the sort order.
History
A command that invokes a general sort facility was first implemented within Multics. Later, it appeared in Version 1 Unix. This version was originally written by Ken Thompson at AT&T Bell Laboratories. By Version 4 Thompson had modified it to use pipes, but sort retained an option to name the output file because it was used to sort a file in place. In Version 5, Thompson invented "-" to represent standard input.
The version of sort bundled in GNU coreutils was written by Mike Haertel and Paul Eggert. This implementation employs the merge sort algorithm.
Similar commands are available on many other operating systems; for example, a sort command is part of ASCII's MSX-DOS2 Tools for MSX-DOS version 2.
The command has also been ported to the IBM i operating system.
Syntax
sort [OPTION]... [FILE]...
With no FILE, or when FILE is -, the command reads from standard input.
Parameters
Examples
Sort a file in alphabetical order
$ cat phonebook
Smith, Brett 555-4321
Doe, John 555-1234
Doe, Jane 555-3214
Avery, Cory 555-4132
Fogarty, Suzie 555-2314
$ sort phonebook
Avery, Cory 555-4132
Doe, Jane 555-3214
Doe, John 555-1234
Fogarty, Suzie 555-2314
Smith, Brett 555-4321
Sort by number
The -n option makes the program sort according to numerical value. The du command produces output that starts with a number, the file size, so its output can be piped to sort to produce a list of files sorted by (ascending) file size:
$ du /bin/* | sort -n
4 /bin/domainname
24 /bin/ls
102 /bin/sh
304 /bin/csh
The find command with the -ls option prints file sizes in the 7th field, so a list of the files sorted by file size is produced by:
$ find . -name "*.tex" -ls | sort -k 7n
Columns or fields
Use the -k option to sort on a certain column. For example, use "-k 2" to sort on the second column. In old versions of sort, the +1 option made the program sort on the second column of data (+2 for the third, etc.). This usage is deprecated.
$ cat zipcode
Adam 12345
Bob 34567
Joe 56789
Sam 45678
Wendy 23456
$ sort -k 2n zipcode
Adam 12345
Wendy 23456
Bob 34567
Sam 45678
Joe 56789
Sort on multiple fields
The -k m,n option lets you sort on a key that is potentially composed of multiple fields (start at column m, end at column n):
$ cat quota
fred 2000
bob 1000
an 1000
chad 1000
don 1500
eric 500
$ sort -k2,2n -k1,1 quota
eric 500
an 1000
bob 1000
chad 1000
don 1500
fred 2000
Here the first sort is done using column 2. -k2,2n specifies sorting on the key starting and ending with column 2, and sorting numerically. If -k2 is used instead, the sort key would begin at column 2 and extend to the end of the line, spanning all the fields in between. -k1,1 dictates breaking ties using the value in column 1, sorting alphabetically by default. Note that an, bob, and chad have the same quota and are sorted alphabetically in the final output.
Sorting a pipe delimited file
$ sort -k2,2n -k1,1 -t'|' zipcode
Adam|12345
Wendy|23456
Bob|34567
Sam|45678
Joe|56789
Sorting a tab delimited file
Sorting a file with tab separated values requires a tab character to be specified as the column delimiter. This illustration uses the shell's dollar-quote notation to specify the tab as a C escape sequence.
$ sort -k2,2 -t $'\t' phonebook
Doe, John 555-1234
Fogarty, Suzie 555-2314
Doe, Jane 555-3214
Avery, Cory 555-4132
Smith, Brett 555-4321
Sort in reverse
The -r option just reverses the order of the sort:
$ sort -rk 2n zipcode
Joe 56789
Sam 45678
Bob 34567
Wendy 23456
Adam 12345
Sort in random
The GNU implementation has a -R --random-sort option based on hashing; this is not a full random shuffle because it will sort identical lines together. A true random sort is provided by the Unix utility shuf.
Sort by version
The GNU implementation has a -V --version-sort option which is a natural sort of (version) numbers within text. Two text strings that are to be compared are split into blocks of letters and blocks of digits. Blocks of letters are compared alpha-numerically, and blocks of digits are compared numerically (i.e., skipping leading zeros, more digits means larger, otherwise the leftmost digits that differ determine the result). Blocks are compared left-to-right and the first non-equal block in that loop decides which text is larger. This happens to work for IP addresses, Debian package version strings and similar tasks where numbers of variable length are embedded in strings.
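For example (a brief illustration using GNU sort; the file names are arbitrary), version sort orders embedded numbers by value rather than character by character:

$ printf 'file-1.10\nfile-1.2\nfile-2.1\n' | sort -V
file-1.2
file-1.10
file-2.1

A plain lexicographic sort would instead place file-1.10 before file-1.2, because the character "1" compares less than "2".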
See also
Collation
List of Unix commands
uniq
shuf
References
Further reading
External links
Original Sort manpage The original BSD Unix program's manpage
Further details about sort at Softpanorama
Computing commands
Sorting algorithms
Unix text processing utilities
Unix SUS2008 utilities
Plan 9 commands
Inferno (operating system) commands |
52513310 | https://en.wikipedia.org/wiki/Apache%20MXNet | Apache MXNet | Apache MXNet is an open-source deep learning software framework, used to train and deploy deep neural networks. It is scalable, allowing for fast model training, and supports a flexible programming model and multiple programming languages (including C++, Python, Java, Julia, MATLAB, JavaScript, Go, R, Scala, Perl, and Wolfram Language). The MXNet library is portable, and can scale to multiple GPUs as well as multiple machines. It was co-developed by Carlos Guestrin at the University of Washington (along with GraphLab).
Features
Apache MXNet is a scalable deep learning framework that supports deep learning models such as convolutional neural networks (CNNs) and long short-term memory networks (LSTMs).
Scalable
MXNet can be distributed on dynamic cloud infrastructure using a distributed parameter server (based on research at Carnegie Mellon University, Baidu, and Google). With multiple GPUs or CPUs, the framework approaches linear scaling.
Flexible
MXNet supports both imperative and symbolic programming. The framework allows developers to track, debug, save checkpoints, modify hyperparameters, and perform early stopping.
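As a brief illustration of the imperative style (a minimal sketch assuming the MXNet Python package is installed; the array values and variable names are purely illustrative), the NDArray and autograd interfaces can be used to compute a gradient directly:

from mxnet import nd, autograd

x = nd.array([[1.0, 2.0], [3.0, 4.0]])  # create an NDArray on the CPU
x.attach_grad()                          # allocate space for the gradient of x

with autograd.record():                  # record operations for differentiation
    y = (x * x).sum()                    # y is the sum of squares of the entries of x

y.backward()                             # compute dy/dx imperatively
print(x.grad)                            # the gradient equals 2 * x

The same computation can instead be declared as a symbolic graph (for example with the mxnet.symbol module) when ahead-of-time optimization is preferred.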
Multiple languages
MXNet supports Python, R, Scala, Clojure, Julia, Perl, MATLAB and JavaScript for front end development, and C++ for back end optimization.
Portable
MXNet supports efficient deployment of a trained model to low-end devices for inference, such as mobile devices (using Amalgamation), Internet of Things devices (using AWS Greengrass), serverless computing (using AWS Lambda) or containers. These low-end environments may have only a weaker CPU or limited memory (RAM), and should be able to use models that were trained in a higher-level environment (a GPU-based cluster, for example).
Cloud Support
MXNet is supported by public cloud providers including Amazon Web Services (AWS) and Microsoft Azure. Amazon chose MXNet as the deep learning framework of choice at AWS. Currently, MXNet is supported by Intel, Baidu, Microsoft, Wolfram Research, and research institutions such as Carnegie Mellon, MIT, the University of Washington, and the Hong Kong University of Science and Technology.
See also
Comparison of deep learning software
Differentiable programming
References
Data mining and machine learning software
Deep learning
Free statistical software
MXNet
Cross-platform free software
Software using the Apache license |
32018456 | https://en.wikipedia.org/wiki/Intent%20%28military%29 | Intent (military) | In military strategy, intent is the desired outcome of a military operation. It is a key concept in 21st-century military operations and a vital element in facilitating subordinates' initiative and in fostering collaboration and cooperation amongst team members in joint operations.
Intent content
In the reviewed open military doctrine literature, intent is a critical component of command and control. The many existing definitions of intent are mostly similar, but the actual intent content differs between them and is often unclear. Intent content can mainly be found as concept descriptions in doctrinal handbooks relating to the development or use of intent.
The following examples represent United Kingdom, Sweden, Canada, United States, and NATO doctrinal view of intent.
British Army Doctrine defines it as "Intent is similar to purpose. A clear intent initiates a force’s purposeful activity. It represents what the commander wants to achieve and why; and binds the force together; it is the principal result of decision-making. It is normally expressed using effects, objectives and desired outcomes....The best intents are clear to subordinates with minimal amplifying detail."
Swedish Armed Forces – Integrated Dynamic Command and Control (IDC2) (Josefsson 2007) defines intent as: "Intent is a concise formulation of the overall goals and purpose. The focus is to describe operations, restrictions and resource allocation."
Canadian Forces Joint Publication 5.0 (Chief of the Defence Staff 2008, p. 5E-2) "Commander's Intent. This summary should provide the Commander's overall intent and establish the purpose of the plan. It is an important focusing statement for subordinate commanders. (1) Military Objectives. (2) Desired Military End-State. (3) Transition Conditions".
US Field Manual 5.0 (U.S. Army 2010, para. 2-90) constitutes the US Army's view on planning, preparing, executing, and assessing operations. "The commander’s intent is a clear, concise statement of what the force must do and the conditions the force must establish with respect to the enemy, terrain, and civil considerations that represent the desired end state (FM 3–0). The commander's intent succinctly describes what constitutes success for the operation. It includes the operation’s purpose and the conditions that define the end state. It links the mission, concept of operations, and tasks to subordinate units."
NATO Allied Joint Publication 1 (AJP-01) (NATO 2010, para. 0538) provides the keystone doctrine for the planning, execution and support of Allied joint operations. "The intent defines the end-state in relation to the factors of mission; adversary, operating environment, terrain, forces, time and preparation for future operations. As such, it addresses what results are expected from the operation, how these results might enable transition to future operations, and how, in broad terms, the Commander expects the force to achieve those results. Its focus is on the force as a whole. Additional information on how the force will achieve the desired results is provided only to clarify the Commander's intentions."
US Joint Publication 3.0 (US Joint Chiefs of Staff 2010, p. IV-25) provides the doctrinal foundation and fundamental principles that guide the Armed Forces of the United States in the conduct of joint operations across the range of military operations. "Commander's intent is a clear and concise expression of the purpose of the operation and the military end state." and continues with "It also includes where the commander will accept risk during the operation. The initial intent statement normally contains the purpose and military end state as the initial inputs for the planning process."
Other doctrinal works used in this survey are US Field Manual 6.0 (U.S Army 2003, para. 1–68), which describes doctrine on C2 for tactical Army echelons (corps and below); US Field Manual 3–0 (U.S. Army 2008, para. 5–55), which presents overarching doctrinal guidance and direction for conducting operations and is one of the two capstone doctrine handbooks for the US Army; SwAF – Regulations for ground operations (Regler för markoperationer) (SwAF 2009, p. 143); and the UK Glossary of Joint and Multinational Terms and Definitions (The DCDC 2006, p. C-16).
In military doctrinal handbooks, the identified intent artefacts generally express the initial state and situation, the desired end state and outcome, and how to get to the desired end state. Artefacts describing the initial situation are: own and other forces, adversaries, operating environment, terrain, time, preparation for future operations. Artefacts describing the outcome are: purpose, goals, mission, effects and end state. Artefacts describing how to reach the outcome are: concept of operations, tasks to subordinate units, willingness to accept risk, how results might enable transition to future operations, objectives, transition conditions, restrictions in conducting operations, allocation of resources, and expectations of force usage.
Another way to identify intent artefacts is from how people actually communicate intent. Klein (1998, pp. 225–29) presents the information types identified in intent communication. The seven information types of intent, according to Klein (1998, p. 226), are: (1) purpose of task, which describes why the task is performed; (2) objective of task, which presents a picture of the desired outcome; (3) sequence of steps in the plan, which Klein identifies as a potential source of problems, since too-detailed descriptions may limit the subordinates' initiative; (4) rationale for the plan, which includes all the information that was present when making the decision; (5) key decisions that may have to be made, i.e. if there is a choice to be made the commander can provide the intent on how he wants it to be conducted; (6) antigoals, which describe unwanted outcomes; and (7) constraints and other considerations, which describe weather, rules of engagement, etc.
Intent content definition
The doctrinal artefacts are mapped onto the structure provided by Klein, and the resulting seven facets are grouped into: (a) Initial Situation, which describes the initial situation and state and consists of Klein's rationale for the plan; (b) Outcome, which describes outcomes and consists of Klein's purpose of task, objective of task, and antigoals; and (c) Execution, which describes how to reach the outcome and consists of Klein's sequence of steps, key decisions, and constraints and other considerations.
Intent is then defined to consist of the following eight facets.
Mission/Goal – The purpose of the task (the higher-level goals). It provides the rationale for why the missions and tasks are to be executed. It is a high-level definition of the overall intent and is regularly stated in one sentence following the form of Who, What, When, Where and Why. A Mission/Goal can be described by using the Actions, Effects or a compound State, i.e. situation. Mission/Goal belongs to the intent group outcome.
End-State – The objective of the task in the form of a representation of the desired outcome. The desired outcome is described as a state, e.g. completion of a task, the effects from tasks, or even the execution of tasks over time. The purpose is to provide a picture of the End-State. According to Klein (1998, p. 226), in his investigation the end-state objective was missing only once in the thirty-five examined commander's intent statements. Effect is in this work defined as the physical and/or behavioural state of a system that results from an action, a set of actions, or another effect. The End State is a more detailed description of the Mission/Goal statement and can be described in several sentences. An End State can be described by using the Actions, Effects or a compound State, i.e. situation. End State belongs to the intent group outcome.
Sequence – The sequence of steps in the plan. This is the plan that describes what to do in general terms, such as Courses of Action and Courses of Effects. Sequence belongs to the intent group execution, describing how to reach the desired outcome, and includes concept of operations, tasks to subordinate units, how results might enable the transition to future operations, allocation of resources, and expectations of force usage.
Initial State – The situation that builds the rationale for the plan, what information was available, who was making the decisions, under what time pressure and similar circumstances. Initial State encompass own and other forces, adversaries, operating environment, terrain, time, preparation for future operations. Initial State belongs to the intent group initial situation.
Key Decisions – The key decisions that may have to be made and that guide the commander in how to make choices. Key Decisions belongs to the intent group execution, describes how to reach the desired outcome, and encompasses how results might enable the transition to future operations, transition conditions and expectations of force usage.
Antigoals – Antigoals describe undesired outcomes. Antigoals are meaningful when clarifying alternative action plans and what the resulting outcome might be. Antigoals are described in the same way as End State, with the difference that antigoals describe the unwanted outcome. In the surveyed doctrinal descriptions of intent, antigoals were not mentioned. Antigoals belong to the intent group outcome.
Constraints – Constraints and other considerations that should be taken into account, which can vary from policy to weather and terrain. Constraints belong to the intent group execution and encompass willingness to accept risk, restrictions in conducting operations, allocation of resources, and expectations of force usage.
An eighth facet, Expressives, is proposed in Gustavsson et al. (2008d; 2011) to capture the style of organizations and commanders: experience, risk taking, use of power and force, diplomacy, ethics, norms, morale, creativity, and unorthodox behaviour. The use of Expressives ranges from situations where participants (e.g. commanders) express their style to other participants (e.g. subordinates), to staff developing models of the participants (commanders and subordinates) to be used in course-of-action development and war gaming. In both cases Expressives support a better understanding of the collaborating participants' capabilities and conduct of operations.
Intent in command and control
Commander's intent
Commander's intent is an intent describing military-focused operations. It is a publicly stated description of the end-state as it relates to forces (entities, people) and terrain, the purpose of the operation, and key tasks to accomplish. It is developed by a small group, e.g. staff, and a commander.
Commander's intent (CSI) plays a central role in military decision making and planning. CSI acts as a basis for staffs and subordinates to develop their own plans and orders to transform thought into action, while maintaining the overall intention of their commander.
The commander's intent links the mission and concept of operations. It describes the end state and key tasks that, along with the mission, are the basis for subordinates’ initiative. Commanders may also use the commander's intent to explain a broader purpose beyond that of the mission statement. The mission and the commander's intent must be understood two echelons down. (U.S Army 2003, para. 4-27)
Pigeau and McCann (2006) state that intent is more than an aim or a purpose; they state that Intent contains the aim and purpose together with all implications. Hence, CSI is not only to describe the End-State but also a concise expression of the purpose of the operation. CSI may also include the commander's assessment of the adversary commander's intent and an assessment of where and how much risk is acceptable during the operation. This view is supported by Klein (1998, p. 225)
CSI originates from one commander's mind and is disseminated to the echelons below. CSI is rarely reviewed and updated. For a short-duration mission, such as a deliberate attack, the original statement may remain valid throughout planning, but for longer phases, in order to remain agile, the CSI may need to change as the situation unfolds. Commanders must develop their intent within the bounds of a whole hierarchy of guiding principles that limit the types of solutions that they can entertain (Pigeau & McCann 2006).
Common intent
Common intent is an intent that is shared and understood by all participants, i.e. there is no discrepancy between the intent of the participating humans. Common intent is an idealized view of intent.
In today's operational environment teams need to work together towards a desired end-state. Pigeau and McCann (2000) define command and control as: "The establishment of common intent to achieve coordinated action". Common intent is the combination of explicit intent and implicit intent. Pigeau and McCann (2006) put forward that for a realizable common intent there needs to be a single shared objective, together with a clear understanding of how that objective can be attained. They continue that common intent is an idealized concept where maximum overlap, with minimum scattering, exists between the intent of the commander and the intents of his subordinates. The commander and his subordinates need to share knowledge, guiding principles and reasoning ability at all levels, and to express similar levels of commitment. Intent is then not only something for a commander to disseminate, but something to exchange while learning the team members' intent.
Farrell and Lichacz (2004) state that common intent describes a socio-psychological phenomenon that seems to be evident amongst a team that achieves a common objective. The CSI is essentially one person's view, but since every individual has an intent of their own, it may not be enough simply to disseminate intent amongst staff members and subordinates. A conclusion of the work of Farrell (2006) is that in teams with different military and civilian cultures, all team members need to establish a common understanding of the mission objectives and the commander's intent with respect to their own competencies, authorities, and responsibilities.
Pigeau and McCann (2006, pp. 85–108) state that diverse team members need to have a high degree of common intent to perform effectively. In such teams the commander needs to ensure that the intent is perceived and understood by all team members (Pigeau & McCann 2000; Farrell & Lichacz 2004; Pigeau & McCann 2006; Farrell 2006).
Alberts and Hayes (2007) say that in order to allow subordinates' initiative, the operations order should focus on describing the CSI/CI so that flexibility in coordination and collaboration in a dynamic environment is enabled. The commander needs to connect the subordinates' human potential (reasoning, opinions, questions, and information-seeking regarding the mission) so that it aligns with and supports the commander's own intent (Pigeau & McCann 2006, p. 102).
Command intent
Command intent (CI) is a practical view of common intent, meaning that it is not plausible to expect that all individuals during a whole mission in all situations will share the same intent. CI will be developed for specific parts of missions and shared amongst the participants. To establish CI the involvement of all participants is necessary, e.g. compare with football or soccer teams where the overall intent is formalized by coaches in collaboration with the players and where each player knows what the other players’ intentions are.
Pigeau and McCann (2000) state that "In reality, it is presumed true that it is impossible to have common intent". For a specific mission bounded in time, an overlap of intent ought to be achievable, e.g. as illustrated by the players in a soccer team, who all have individual goals with their lives and families, but on the soccer field have the common intent to win the game. This means that during the game, and in training and exercises prior to and after the game, their common intent is to perform well according to the mission declared by the coach (Farrell & Lichacz 2004; Farrell 2006; Gustavsson et al. 2008d).
A workable version of common intent is command intent, which is directed at a specific situation, bounded by participating organization, space and time, i.e. for the operation at hand the intent is common, but other intents and goals of the participating humans may differ. Much of the coordination can be done locally, i.e. on a lower level, without explicit orders. Brehmer (2009) proposes that the higher levels of command will then have the time to consider other aspects of the problems facing them. Brehmer continues that there is a loss of combat power inherent in top-down command-directed synchronization. In NATO Network Enabled Capabilities (NNEC), as presented by Alberts, Garstka & Stein (1999, pp. 87–93), this is overcome by a high-speed continuum. Brehmer (2009), however, says that the main responsibility of the commander and his staff is to articulate intent and craft rules of engagement. For armed forces that have mission command as their principal doctrine this is not a new concept, but Brehmer further envisions that with articulated intent larger units will be able to co-ordinate with other units and conduct the mission without any explicit directions from higher headquarters. Command intent is then an outline of a plan, objectives to be achieved, responsibilities, linkages and schemas of manoeuvre, and constraints. Establishing command intent also involves more than one person. Traditional commander's intent is then replaced by an intent that arises from dialogue between commanders and key staff at more than one level.
The purpose of command intent is to allow self-synchronization and to provide understanding of the complex causes and effects. To enable self-synchronization the subordinates must be given the mandate to operate on their own initiative, within the boundary of the mission. In "Rethinking Command and Control" by Curts and Campbell (2006), the authors address this fine line between delegating authority and maintaining and controlling hierarchy. The commander delegating authority must refrain from directing the actions of subordinates, yet must also maintain some command structure. The subordinates must have the ability to work independently or within a team to achieve the mission goals. To create this empowerment the commander's information should be shared with everyone. Autonomy is created by setting boundaries, and hierarchy can then be replaced by self-directed teams. CI acts as a basis for staffs and subordinates to develop their own plans and orders that transform thought into action, while maintaining the overall intention of their commanders (Gustavsson et al. 2008d; 2009).
Explicit and implicit intent
Explicit intent is an intent that is publicly stated and made available for the participants. Implicit intent is an intent that is not publicly stated. Implicit intent can be made explicit by the mechanism described below.
Shattuck and Woods (2000) examined the role of communicating intent. In the study, company commanders received a battalion order including the battalion Commander's intent. Then changes to the situation were introduced and the actions performed by the company commanders were compared with the intent of the battalion commander. The result was that the company commanders matched their battalion commander's intent in only 34% of the cases.
Pigeau and McCann (2000) introduced that intent consists of an explicit part and an implicit part.
Explicit intent is the one that is publicly stated for all the headquarters (HQ) staff and subordinates to perceive, think about, and act upon. The explicit intent is either vocalized (i.e. made publicly) in doctrine, orders, statements or can be derived from questions and answers. Theoretically, each staff and subordinate member should be able to reiterate the commander's intent at any point during the process.
Implicit intent consists of all the un-vocalized expectations that the commander and all team members have. The implicit intent is developed over a longer time, prior to the mission, and develops from the style in which the commander conducts operations with respect to experience, willingness to accept risk, use of power and force, diplomacy, ethics, social values, morale, norms, creativity and unorthodox behaviour, and from the concepts, policies, laws and doctrine agreed to by military and civil organizations, agencies, nations and coalitions.
Farrell and Lichacz (2004) proposed that implicit intent is an internal expectation of commander's intent. The example used by Farrell & Lichacz (2004) is that with an explicit intent "to capture the hill", the implicit intent might be "to capture the hill with minimal battle damage" or "to capture the hill with Air Force assets only." The implicit expectations depend on how the members interpret commander's intent from personal expectations based on their style and experience (Pigeau & McCann 2000), and on their staff position (e.g., planner, operator, commander, etc.) (cf. Farrell & Lichacz 2004). Farrell and Lichacz (2004) provided a way of finding implicit intent by asking questions of the form "from perspective x, how do you interpret Commander's intent?"
Implicit intent can be made explicit by transforming it into explicit statements. The commander can vocalize personal, military or cultural implicit intent. The commander can be monitored, e.g. by his subordinates and his team members, who then draw conclusions about the commander's implicit intent. In the same way a commander can draw conclusions regarding his subordinates. Pigeau and McCann (2000) presented some mechanisms by which originally implicit intent is made explicit. They are: (1) externalization, when a commander or subordinate explicitly declares internal intents; (2) internalization, a version of tacit learning, when a commander presents the intent and the mouth says one thing while the body language signals something different, or adds context and meaning that are put into the mental model and affect the implicit intent; (3) socialization, i.e. meeting, talking and performing exercises together (teaming) to find the implicit intent, motives, etc.; and (4) dialogue, the explicitly stated, publicly vocalized and available description of an individual's intent.
Computational models of intent
The Joint Consultation Command and Control Information Exchange Data Model (JC3IEDM) conveys intent as free text.
Coalition Battle Management Language (C-BML) does not convey intent explicitly; however, its structure could be used to express intent.
Command and Control Lexical Grammar (C2LG) conveys intent by the rule CI → (KeyTasks) EndState (ExpandedPurpose), where CI is command intent, derived from US Field Manual 5 (U.S. Army 2005). C2LG does not convey intent in the format described in the intent content section above.
Operations Intent and Effects Grammar (OIEG) conveys intent as described in the intent content section above, by the rule Intent → [Goal] {End State} [Sequence] [Initial State] [{Key Decision}]* [Antigoal]* [Constraint]* [{Intent ID}].
See also
Mission-type tactics
Command and control
Truppenführung
Models of intent
Joint Consultation, Command and Control Information Exchange Data Model
Battle management language
References
Based on the PhD Thesis by Gustavsson Per M. (2011) "Modelling, Formalising, and Implementing Intent in Command and Control Systems", De Montfort University Leicester, UK
External links
Command and Control Research Program
C4I Center, George Mason University
Military science
Military doctrines |
35525748 | https://en.wikipedia.org/wiki/Software%20industry%20in%20Chennai | Software industry in Chennai | Chennai is the second largest software exporter in India, next only to Bangalore. India's largest IT park is housed in Chennai. Software exports from Tamil Nadu during 2017–2018 rose 8.6 per cent to touch Rs. 1,11,179 crore, involving a workforce of 780,000, and the city is a hub for deep tech startup companies. Many software and software services companies have development centres in Chennai, which contributed 14 per cent of India's total software exports of Rs. 14,42,140 lakh during 2006–07, making it the second largest software-exporting Indian city after Bangalore; the city is home to 7 of the top 15 rated IT companies in India. The Tidel Park in Chennai was billed as Asia's largest IT park when it was built. Major software companies have their offices set up here, with some of them making Chennai their largest base. Chennai is the largest hub for e-publishing, with 67 e-publishing units registered with the STPI, and data centres and digital hubs worth about Rs. 8,300 crore are under development. A major reason for the growth of the software industry is the top engineering colleges in Tamil Nadu, many of them in Chennai, which have been a major recruiting ground for IT firms. According to estimates, about 50 per cent of the human resource requirements of the IT and ITES industry is consistently sourced from these engineering colleges and universities in the state, particularly from Chennai.
Since the late 1990s, software development and business process outsourcing, and more recently electronics manufacturing, have emerged as major drivers of the city's economic growth. Chennai has been rated as the most attractive Indian city for offshoring services according to A T Kearney's Indian City Services Attractiveness Index 2005. Chennai ranks second, after Bangalore, in the absorption of office space by software companies.
Major software companies in Chennai
Major software and software services companies including Altran, Accenture, Cognizant, Capgemini, DXC Technology, SAP SE, Oracle Corporation, Cisco Systems, HCL Technologies, Hewlett Packard Enterprise, IBM, CGI Inc., Infosys, Sopra Steria, Symantec, Tata Consultancy Services, Verizon, Wipro, Virtusa, UST Global, Atos, Dassault Systèmes, Fujitsu, NTT DATA, LTI, Honeywell, VMware, Intel, Amazon.com, Inc., Tech Mahindra, Fiserv, Adobe Systems, AT&T, Philips, AstraZeneca, Wolters Kluwer, TransUnion, Ernst & Young, L&T Technology Services, Mindtree, Shell Business Operations, Athenahealth, Ford Global Technology & Business Center, Ramco Systems, Deloitte, Microsoft, Temenos, Synechron, KPMG, PayPal have development centres in the city. The city is now the second largest exporter of IT and IT enabled Services in the country behind Bangalore.
The IT Corridor, on Old Mahabalipuram Road in the southeast of the city houses several technology parks, and, when completed, will provide employment to close to 300,000 people. Besides the existing Tidel Park, two more Tidel Parks are on the anvil in the IT corridor. One is under construction at the Siruseri IT Special Economic Zone ("SEZ") and the other one is being planned at the current location of MGR Film City which is just before the existing Tidel Park, in Taramani on the IT Corridor. A number of SEZ have emerged in and around Chennai. The Mahindra World City, New Chennai, a Special Economic Zone (SEZ) with one of the world's largest high technology business zones, is currently under construction in the outskirts of Chennai. It also includes the World's largest IT Park by Infosys.
Special economic zones
A number of Special Economic Zones (SEZ) projects has emerged along the Grand Southern Trunk Road (NH 45), making it the SEZ corridor of Chennai. It includes MEPZ SEZ established in 1984, Mahindra World City, New Chennai, Shriram Properties's Gateway SEZ, Estancia SEZ and ETL Infrastructure.
It is also emerging as a major IT SEZ region, with a number of large investments by Infosys. Infosys has set up its largest development centre in the Mahindra SEZ, while India Land Tech Park is developing a massive SEZ estimated to have office space for both IT and electronics use. Shriram The Gateway SEZ is an integrated township with an IT/ITeS SEZ, residential space and a mall, and is home to IT majors such as Accenture and ReDIM Information Systems; the IT park was later expanded to 4.6 million sq ft in association with the Xander Group. EISL, an IT/ITES SEZ by ETL Infrastructure at Chengalpattu, is under development.
Software backend services
Chennai houses the permanent back office of the World Bank, which is one of the largest buildings owned by the bank outside its headquarters in Washington, DC. The Chennai office provides administrative and IT services for the bank, including software-based analytical work on the bank's bond valuation, which is estimated at US$100 billion.
Software as a service
Chennai has emerged as the "SaaS Capital of India" (SaaS being shorthand for "software as a service"). The SaaS sector in and around Chennai generated US$1 billion in revenue and employed about 10,000 people in 2018.
See also
DLF Cybercity Chennai
TIDEL Park
SIPCOT IT Park
International Tech Park, Chennai
Olympia Tech Park
One Indiabulls Park
Economy of Chennai
Information technology in India
List of Indian IT companies
References
Information technology industry of Chennai
Software industry in India
Software companies of India |
34903877 | https://en.wikipedia.org/wiki/Nokia%20Asha%20302 | Nokia Asha 302 | The Nokia Asha 302 is a QWERTY messenger feature phone powered by Nokia's Series 40 operating system. It was announced at Mobile World Congress 2012 in Barcelona (on 27 February) along with other Asha phones - the Nokia Asha 202 and 203. The 302 is considered to be among the flagship of the Asha family. Its main features are the QWERTY keyboard, the pentaband 3G radio, SIP VoIP over 3G and Wi-Fi. Its design looks a lot like the older Nokia E6 with chrome slidings, giving it a somewhat premium look. A software update adds Mail for Exchange support.
Hardware
Processors
The Nokia Asha 302 is powered by the same 1 GHz ARM11 processor found in some Symbian^3 phones such as the Nokia 500, 600 and 700, but lacks the dedicated Broadcom GPU, which is not supported by the Nokia Series 40 operating system. The system also has 128 MB of low-power single-channel RAM (Mobile DDR).
Screen and input
The Nokia Asha 302 has a 2.4-inch transmissive LCD screen with a resolution of 320 × 240 pixels. In contrast with the Nokia Asha 303, the screen of the Asha 302 is wider than it is tall. According to Nokia it is capable of displaying up to 262 thousand colors. The device also has a backlit 4-row keyboard with regional variants available (QWERTY, AZERTY, etc.).
The back camera has an extended depth of field (EDoF) feature (no mechanical zoom), no flash, and a 4× digital zoom for both video and stills. The back camera has a 3.2-megapixel sensor (2048 × 1536 px), an f/2.8 aperture and a 50 cm-to-infinity focus range. It is capable of video recording at up to 640 × 480 px at 15 fps with mono sound.
Audio and output
The Nokia Asha 302 has one microphone and a loudspeaker, which is situated on the back of the device. On the top, there is a 3.5 mm AV connector which simultaneously provides stereo audio output and microphone input. Between the 3.5 mm AV connector and the 2 mm charging connector, there is a High-Speed USB 2.0 USB Micro AB connector provided for data synchronization, battery charging and supports for USB On-The-Go 1.3 (the ability to act as a USB host) using a Nokia Adapter Cable for USB OTG CA-157 (not included upon purchase).
The built-in Bluetooth v2.1 +EDR (Enhanced Data Rate) supports stereo audio output with the A2DP profile. Built-in car hands-free kits are also supported with the HFP profile. File transfer is supported (FTP) along with the OPP profile for sending/receiving objects. It is possible to remote control the device with the AVRCP profile. It supports wireless earpieces and headphones through the HSP profile. The DUN profile which permits access to the Internet from a laptop by dialing up on a mobile phone wirelessly (tethering) and PAN profile for networking using Bluetooth are also supported. The device also functions as an FM receiver, allowing one to listen to the FM radio by using headphones connected to the 3.5 jack as antenna.
Battery and SIM
The battery life of the BL-5J (1430 mAh) as claimed by Nokia is from 6 to 9 hours of talk time, from 29 to 34 days of standby and 50 hours of music playback depending on actual usage.
The SIM card is located under the battery which can be accessed by removing the back panel of the device. The microSDHC card socket is also located under the back cover (but not under the battery). No tool is necessary to remove the back panel.
Storage
The phone has 150 MB of available non-removable storage. Additional storage is available via a hot swappable microSDHC card socket, which is certified to support up to 32 GB of additional storage.
Software
The Nokia Asha 302 is powered by Nokia Series 40 operating system with service pack 1 and comes with a variety of applications:
Web: Nokia (proxy) Browser for Series 40, Nokia Xpress
Conversations: Nokia Messaging Service 3.2 (instant messaging and e-mail) and SMS, MMS
Social: Facebook, Twitter, Flickr and Orkut
Media: Camera, Photos, Music player, Nokia Music Store (on selected market), Flash Lite 3.0 (for YouTube video), Video player
Personal Information Management: Calendar, Detailed contact information
Utilities: VoIP, Notes, Calculator, To-do list, Alarm clock, Voice recorder, Stopwatch
Games: "Bounce Tales", "Brick Breaker Revolution" (trial version), "Tower Bloxx New York" (trial version)
The Home screen is customizable and allow the user to add, amongst others, favorite contacts, Twitter/Facebook feeds, applications shortcuts, IM/e-mail notifications and calendar alert.
Document Viewer is not available. The phone also does not include HERE Maps.
See also
List of Nokia products
Comparison of smartphones
References
External links
http://www.nokia.com/nokia-asha-smarter-mobile-phones
http://europe.nokia.com/find-products/devices/nokia-asha-302/specifications
http://www.developer.nokia.com/Devices/Device_specifications/302
https://www.webcitation.org/6B7hfLoMa?url=http://www.developer.nokia.com/Community/Wiki/VoIP_support_in_Nokia_devices#Support_in_Series_40_devices
Nokia Asha 302 Review
Smartphones
Asha 302
Mobile phones introduced in 2012 |
28134260 | https://en.wikipedia.org/wiki/Screaming%20Bloody%20Murder | Screaming Bloody Murder | Screaming Bloody Murder is the fifth studio album by Canadian rock band Sum 41, released on March 29, 2011, after many delays. It is the band's second album produced by frontman Deryck Whibley. It is the band's last album released on Island Records before they fulfilled their contract with the major label in 2016, and their first album not to be released on Aquarius Records, which they left in 2010. The album received mixed reviews.
It is the last album released with longtime original drummer Steve Jocz, who announced his departure from the band in April 2013. It is also the first album to feature guitarist Tom Thacker, best known as the vocalist for fellow Canadian punk group Gob. Even though Thacker was already a part of the band and co-wrote the title track of the album, all guitars were still recorded by singer Deryck Whibley. Thacker was also uncredited in the album's liner notes, though he was seen in photos with the band in the album's booklet.
Background
The band initially entered the studio in late 2008 with plans to record an EP for release in April 2009, though as more and more material was written, they decided to keep writing and make the recording a full-length album, with Deryck Whibley commenting that "it's safe to say the album will be released in 2009". Jason McCaslin and Steve Jocz, however, soon announced not to expect the album any sooner than summer 2010.
In November 2009, it was announced that the band hired legendary British producer Gil Norton to produce the album, and that they would begin pre-production in December, and would begin recording the album in January 2010. The recording did begin in January 2010, but Gil Norton was dismissed one week into the recording, with Deryck Whibley deciding to produce the album himself, just as he did on the band's last effort Underclass Hero.
Recording of instruments began on January 26, 2010, and finished on March 17, 2010, after which only vocals were left to record by Whibley himself in his home studio. Drums were recorded at Capitol Studios and Perfect Sound Studios, after which the band rented a house at the Hollywood Hills which served as their recording studio. Vocals were recorded until late March, when the band relocated to EastWest Studios, on April 7, 2010, to record additional songs for the album. On June 12, 2010, Deryck reported in a video update that the album was "99% done". Recording was finalized on June 24, 2010, a day before the band went on to play on the 2010 Vans Warped Tour. While playing the Warped Tour, the album went into the mixing stage by Tom Lord-Alge in Miami, Florida.
It was announced that a new song entitled "Skumfuk" was set to appear on the Warped Tour sampler CD, though it was ultimately not ready in time. On July 6, 2010, the track was leaked online in a non-final form. The song began rising in popularity online, which led Sum 41 to start playing it live on their European tour in October. It was later announced that the band would release a 12-minute section from the album (later entitled 'A Dark Road Out of Hell', consisting of tracks 7–9 of the final album) for free on their website before the official release; however, this release was later denied by the band's label. The back of the album eventually confirmed 'A Dark Road Out of Hell' as the suite comprising tracks 7–9: "Holy Image of Lies", "Sick of Everyone", and "Happiness Machine".
In December 2010, Jason McCaslin confirmed that the album would finally go into mastering, and that though the album had essentially been ready for months, Island Records decided to postpone its release until after Christmas.
Promotion
On January 8, 2011, it was announced that the band would release the radio single "Screaming Bloody Murder" on February 7, 2011, in the United States. The song had its worldwide premiere on January 14, 2011, on the Windsor radio station 89X. Universal Japan confirmed on the official Japanese Sum 41 website that Screaming Bloody Murder would be released in Japan on March 23, 2011, after which it was confirmed on the band's official website that the album would be released on March 29, 2011, in the US. On February 28, 2011, a stream of "Blood in My Eyes", another new song from the album, was released for free listening on Alternative Press. Universal Music Japan then announced that the album's Japanese release was postponed until April 6, 2011, because of the 2011 Tōhoku earthquake and tsunami. On March 24, 2011, Island Records started streaming the record in its entirety on the band's official website.
"Baby You Don't Wanna Know" was the album's second single in the UK and Europe only. On June 22, 2011, during the band's performance in Angers, France, the band debuted the song live for the first time. Sum 41 shot a music video for "Baby You Don't Wanna Know" in June. In July 2011, Matt Whibley confirmed that the music video for the first single "Screaming Bloody Murder" would be left unreleased due to its content and difficulties with the label. On August 3, 2011, the band premiered the music video for "Baby You Don't Wanna Know" exclusively on German website Myvideo.de.
On February 29, 2012, the band shot a music video for "Blood in My Eyes", with director Michael Maxxis in Los Angeles. The video was shot in the desert around the Los Angeles area.
Singles
"Screaming Bloody Murder", the band's first single in 3 years, was released on February 8, 2011 in the United States. It was released a day earlier, on February 7, in Europe. The song was released as a digital download only on iTunes, Amazon.com and other music retailers. The song had its worldwide premiere, a month before the official release, on January 13, 2011, on the Detroit area radio station 89X. It then premiered on AOL Radio, a few hours later the same day.
The band performed the song live for the first time on February 4, 2011, in Paris, France, the first date of their first official European leg of the Screaming Bloody Murder Tour. It was performed at every show of the album's tour.
The songs "Screaming Bloody Murder" and "Skumfuk" were performed on Jimmy Kimmel Live! on March 31, 2011. "Screaming Bloody Murder" was also performed on Lopez Tonight on April 14, 2011.
"Baby You Don't Wanna Know", along with "Time for You to Go", was one of the two songs that were written and recorded by the band at the last minute, on April 7, 2010, at EastWest Studios in Hollywood, California. The song, co-written by Matt Squire, was added to the album at the last minute and its recording was funded by Deryck Whibley himself, as the label refused to pay for any more songs for the album.
As said by Todd Morse on the album's making-of documentary, Don't Try This at Home, the song's style was more in the vein of classic rock and "straight-up-rock and roll", taking influence from the Rolling Stones and the Beatles, as opposed to all the other songs that were written during 2008–2009, that resulted in a more "dark" alternative rock style.
In an interview with the band during their European tour in July 2011, the band has commented that they considered releasing either "Blood in My Eyes" or "Back Where I Belong" as the second single, but opted to release "Baby You Don't Wanna Know" instead, as it was more radio friendly. It is also on the soundtrack of the 2011 film Green Lantern.
The band performed the song live for the first time on June 22, 2011, in Angers, France, during the band's summer European leg of the Screaming Bloody Murder Tour. The song was since then performed at various other concerts, and although being an official single, the band does not perform it on every date. It was performed on and off until August 2011, and was not played again when the band resumed the tour in 2012.
Critical reception
The album has received generally mixed reviews since its release, with its darker tone and songwriting drawing varied reactions. On Metacritic, the album received a score of 47/100 based on 8 reviews.
Jonah Bayer of Alternative Press wrote that while Sum 41 have the potential to succeed without the power chords, only a handful of musical ideas are fully developed, making the album a frustrating listen. Rock Sound remarked that, although no longer a phenomenon, Sum 41 have continued to mature as a pretty good band.
On the other hand, Sputnikmusic called the album "Revolting, messy, lazy, and undeniably Sum 41."
Grace Duffy of Under The Gun said:
Accolades
Sum 41 were nominated for the Grammy Award for Best Hard Rock/Metal Performance at the 2012 ceremony for the song "Blood in My Eyes", but lost to Foo Fighters.
Commercial performance
The album debuted at number 31 on the US Billboard 200, with first-week sales of 15,000 copies, and went on to sell 52,000 copies.
Track listing
All songs written and composed by Deryck Whibley, except where noted.
Personnel
Sum 41
Deryck Whibley – lead vocals, guitars, keyboards, piano, production, mixing (tracks 4, 10, 12 & 14)
Jason McCaslin – bass guitar, backing vocals
Steve Jocz – drums, percussion
Additional musicians
James Levine – piano on "Crash"
Dan Chase – percussion on "Holy Image of Lies", "Sick of Everyone" and "Happiness Machine"
Roger Joseph Manning, Jr. – additional keyboards & piano (uncredited)
Production
Tom Lord-Alge – mixing
Chris Lord-Alge – mixing (tracks 6 & 13)
Femio Hernández – mixing assistant
Nik Karpen – mixing assistant
Ted Jensen – mastering
Gil Norton – additional drums production (uncredited)
Ryan Hewitt – engineering
Jason Donaghy – engineering
Travis Huff – additional engineering
Joe Hirst – additional engineering
Robbes Steiglitz – assistant engineering
Ben O'Neil – assistant engineering
Ken Sluiter – assistant engineering
Brad Townsend – mixdown engineering
Andrew Schubert – mixdown engineering
Other
Evan Lipschultz – A&R
Javon Greene – A&R
Kristen Yiengst – artwork, photo coordination
James Minchin – photography
Paul Resta – marketing
Marjan Malakpour – stylist
Charts
Weekly charts
Release history
References
External links
Screaming Bloody Murder at YouTube (streamed copy where licensed)
Sum 41 on Myspace
2011 albums
Sum 41 albums
Island Records albums
Albums recorded at Capitol Studios
Albums recorded at EastWest Studios |
3203442 | https://en.wikipedia.org/wiki/Dynamation%20%28software%29 | Dynamation (software) | Dynamation was a 3D computer graphics particle generator program sold by Wavefront to run on SGI's IRIX operating system as part of The Advanced Visualizer. The core software was originally developed by Jim Hourihan while at Santa Barbara Studios, a visual effects company owned by effects pioneer John Grower. The software was licensed to Wavefront Technologies in 1992, and passed through to the merged company Alias/Wavefront. It was introduced as a product at SIGGRAPH in 1993. In 1996, Jim Hourihan received a Scientific and Engineering Award for the primary design and development of Dynamation.
Dynamation could create behavioral particle systems that responded to gravity, air resistance, and other real world physics. It gave users an interactive environment to create and modify dynamic events such as water, clouds, rain, fire and dust. The interactive aspect of this software was revolutionary at the time. Users were able to change parameters and the particle system updated in real time.
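As an illustration of the kind of behavioural particle simulation described above, the sketch below advances a set of particles under gravity and a simple linear air-resistance term once per frame. It is a generic example written for this article, not Dynamation code; all names and constants are invented.

```python
# Minimal behavioural particle sketch (illustrative only, not Dynamation code).
import random

GRAVITY = -9.8        # m/s^2, applied to the y axis
DRAG = 0.1            # linear air-resistance coefficient (invented value)
DT = 1.0 / 30.0       # one frame at 30 frames per second

def spawn_particle():
    """Emit a particle with a small random sideways and upward velocity."""
    return {
        "pos": [0.0, 0.0, 0.0],
        "vel": [random.uniform(-1, 1), random.uniform(4, 6), random.uniform(-1, 1)],
        "age": 0.0,
    }

def step(particles):
    """Advance every particle by one frame using explicit Euler integration."""
    for p in particles:
        # Acceleration = gravity plus air resistance opposing the velocity.
        acc = [-DRAG * v for v in p["vel"]]
        acc[1] += GRAVITY
        for i in range(3):
            p["vel"][i] += acc[i] * DT
            p["pos"][i] += p["vel"][i] * DT
        p["age"] += DT

particles = [spawn_particle() for _ in range(100)]
for _ in range(90):   # simulate three seconds of motion
    step(particles)
```

In an interactive tool of the sort described above, values such as the drag coefficient would be exposed as editable parameters so the running simulation could be tuned in real time.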
The software was used to create visual effects in movies such as Twister, Last Action Hero, Balto, Crimson Tide, Heaven's Prisoners, Michael, Moses, Anaconda, Godzilla, Stuart Little, and Starship Troopers. It was also used in the opening credits of Star Trek: Voyager to depict the starship's intricate interaction with cosmic dust as it travels past.
Dynamation's interactive particle engine has been integrated into the 3D computer graphics package Maya and is no longer sold as a separate product.
References
3D graphics software
Animation software |
240468 | https://en.wikipedia.org/wiki/National%20security | National security | National security, or national defence, is the security and defence of a sovereign state, including its citizens, economy, and institutions, which is regarded as a duty of government. Originally conceived as protection against military attack, national security is widely understood to include also non-military dimensions, including the security from terrorism, minimization of crime, economic security, energy security, environmental security, food security, and cyber-security. Similarly, national security risks include, in addition to the actions of other nation states, action by violent non-state actors, by narcotic cartels, and by multinational corporations, and also the effects of natural disasters.
Governments rely on a range of measures, including political, economic, and military power, as well as diplomacy, to safeguard the security of a nation state. They may also act to build the conditions of security regionally and internationally by reducing transnational causes of insecurity, such as climate change, economic inequality, political exclusion, and nuclear proliferation.
Definitions
The concept of national security remains ambiguous, having evolved from simpler definitions which emphasised freedom from military threat and from political coercion. Among the many definitions proposed to date are the following, which show how the concept has evolved to encompass non-military concerns:
"A nation has security when it does not have to sacrifice its legitimate ínterests to avoid war, and is able, if challenged, to maintain them by war." (Walter Lippmann, 1943).
"The distinctive meaning of national security means freedom from foreign dictation." (Harold Lasswell, 1950)
"National security objectively means the absence of threats to acquired values and subjectively, the absence of fear that such values will be attacked." (Arnold Wolfers, 1960)
"National security then is the ability to preserve the nation's physical integrity and territory; to maintain its economic relations with the rest of the world on reasonable terms; to preserve its nature, institution, and governance from disruption from outside; and to control its borders." (Harold Brown, U.S. Secretary of Defense, 1977–1981)
"National security... is best described as a capacity to control those domestic and foreign conditions that the public opinion of a given community believes necessary to enjoy its own self-determination or autonomy, prosperity, and wellbeing." (Charles Maier, 1990)
"National security is an appropriate and aggressive blend of political resilience and maturity, human resources, economic structure and capacity, technological competence, industrial base and availability of natural resources and finally the military might." (National Defence College of India, 1996)
"[National security is the] measurable state of the capability of a nation to overcome the multi-dimensional threats to the apparent well-being of its people and its survival as a nation-state at any given time, by balancing all instruments of state policy through governance... and is extendable to global security by variables external to it." (Prabhakaran Paleri, 2008)
"[National and international security] may be understood as shared freedom from fear and want, and the freedom to live in dignity. It implies social and ecological health rather than the absence of risk... [and is] a common right." (Ammerdown Group, 2016)
Dimensions of national security
Potential causes of national insecurity include actions by other states (e.g. military or cyber attack), violent non-state actors (e.g. terrorist attack), organised criminal groups such as narcotic cartels, and also the effects of natural disasters (e.g. flooding, earthquakes). Systemic drivers of insecurity, which may be transnational, include climate change, economic inequality and marginalisation, political exclusion, and militarisation.
In view of the wide range of risks, the security of a nation state has several dimensions, including economic security, energy security, physical security, environmental security, food security, border security, and cyber security. These dimensions correlate closely with elements of national power.
Increasingly, governments organise their security policies into a national security strategy (NSS); as of 2017, Spain, Sweden, the United Kingdom, and the United States are among the states to have done so. Some states also appoint a National Security Council and/or a National Security Advisor, an executive agency or official that advises the head of state or government on matters of national security and strategic interest and prepares long-term, short-term, and contingency national security plans. India, for example, has had such a system in place since 19 November 1998.
Although states differ in their approach, with some beginning to prioritise non-military action to tackle systemic drivers of insecurity, various forms of coercive power predominate, particularly military capabilities. The scope of these capabilities has developed. Traditionally, military capabilities were mainly land- or sea-based, and in smaller countries, they still are. Elsewhere, the domains of potential warfare now include the air, space, cyberspace, and psychological operations. Military capabilities designed for these domains may be used for national security, or equally for offensive purposes, for example to conquer and annex territory and resources.
Physical security
In practice, national security is associated primarily with managing physical threats and with the military capabilities used for doing so. That is, national security is often understood as the capacity of a nation to mobilise military forces to guarantee its borders and to deter or successfully defend against physical threats including military aggression and attacks by non-state actors, such as terrorism. Most states, such as South Africa and Sweden, configure their military forces mainly for territorial defence; others, such as France, Russia, the UK and the US, invest in higher-cost expeditionary capabilities, which allow their armed forces to project power and sustain military operations abroad.
Infrastructure security
Infrastructure security is the security provided to protect infrastructure, especially critical infrastructure, such as airports, highways, rail transport, hospitals, bridges, transport hubs, network communications, media, the electricity grid, dams, power plants, seaports, oil refineries, and water systems. Infrastructure security seeks to limit vulnerability of these structures and systems to sabotage, terrorism, and contamination.
Many countries have established government agencies to directly manage the security of critical infrastructure, usually through the Ministry of the Interior or Home Affairs, along with dedicated security agencies to protect facilities, such as the United States Federal Protective Service, and dedicated transport police such as the British Transport Police. There are also commercial transportation security units such as the Amtrak Police in the United States. Critical infrastructure is vital for the essential functioning of a country. Incidental or deliberate damage can have a serious impact on the economy and essential services. Some of the threats to infrastructure include:
Terrorism: person or groups deliberately targeting critical infrastructure for political gain. In the November 2008 Mumbai attacks, the Mumbai central station and hospital were deliberately targeted.
Sabotage: persons or groups such as ex-employees, anti-government groups, or environmental groups; the 2008 seizure of Bangkok's international airports by protestors is an example.
Information warfare: private person hacking for private gain or countries initiating attacks to glean information and damage a country's cyberinfrastructure. Cyberattacks on Estonia and cyberattacks during the 2008 South Ossetia war are examples.
Natural disaster: hurricane or other natural events that damage critical infrastructures such as oil pipelines, water, and power grids. See Hurricane Ike and Economic effects of Hurricane Katrina for examples.
Computer security
Computer security, also known as cybersecurity or IT security, refers to the security of computing devices such as computers and smartphones, as well as computer networks such as private and public networks, and the Internet. It concerns the protection of hardware, software, data, people, and also the procedures by which systems are accessed, and the field has growing importance due to the increasing reliance on computer systems in most societies. Since unauthorized access to critical civil and military infrastructure is now considered a major threat, cyberspace is recognised as a domain of warfare. One example is the use of Stuxnet by the USA and Israel against the Iranian nuclear programme.
Political security
Barry Buzan, Ole Wæver, Jaap de Wilde and others have argued that national security depends on political security: the stability of the social order. Others, such as Paul Rogers, have added that the equitability of the international order is equally vital. Hence, political security depends on the rule of international law (including the laws of war), the effectiveness of international political institutions, as well as diplomacy and negotiation between nations and other security actors. It also depends on, among other factors, effective political inclusion of disaffected groups and the human security of the citizenry.
Economic security
Economic security, in the context of international relations, is the ability of a nation state to maintain and develop the national economy, without which other dimensions of national security cannot be managed. Economic capability largely determines the defence capability of a nation, so sound economic security directly strengthens national security; countries with strong economies, such as the United States, China, and India, tend to maintain correspondingly strong security establishments. In larger countries, strategies for economic security expect to access resources and markets in other countries and to protect their own markets at home. Developing countries may be less secure than economically advanced states due to high rates of unemployment and underpaid work.
Ecological security
Ecological security, also known as environmental security, refers to the integrity of ecosystems and the biosphere, particularly in relation to their capacity to sustain a diversity of life-forms (including human life). The security of ecosystems has attracted greater attention as the impact of ecological damage by humans has grown. The degradation of ecosystems, including topsoil erosion, deforestation, biodiversity loss, and climate change, affects economic security and can precipitate mass migration, leading to increased pressure on resources elsewhere. Ecological security is also important because many countries are developing and dependent on agriculture, and agriculture is heavily affected by climate change; this in turn affects the national economy and, with it, national security.
The scope and nature of environmental threats to national security and strategies to engage them are a subject of debate. Romm (1993) classifies the major impacts of ecological changes on national security as:
Transnational environmental problems. These include global environmental problems such as climate change due to global warming, deforestation, and loss of biodiversity.
Local environmental or resource pressures. These include resource scarcities leading to local conflict, such as disputes over water scarcity in the Middle East; migration into the United States caused by the failure of agriculture in Mexico; and the impact on the conflict in Syria of erosion of productive land. Environmental insecurity in Rwanda following a rise in population and dwindling availability of farmland may also have contributed to the genocide there.
Environmentally threatening outcomes of warfare. These include acts of war that degrade or destroy ecosystems. Examples are the Roman destruction of agriculture in Carthage; Saddam Hussein's burning of oil wells in the Gulf War; the use of Agent Orange by the UK in the Malayan Emergency and the USA in the Vietnam War for defoliating forests; and the high greenhouse gas emissions of military forces.
Security of energy and natural resources
Resources include water, sources of energy, land, and minerals. Availability of adequate natural resources is important for a nation to develop its industry and economic power. For example, in the Persian Gulf War of 1991, Iraq captured Kuwait partly in order to secure access to its oil wells, and one reason for the US counter-invasion was the value of the same wells to its own economy. Water resources are subject to disputes between many nations, including India and Pakistan, and in the Middle East.
The interrelations between security, energy, natural resources, and their sustainability is increasingly acknowledged in national security strategies and resource security is now included among the UN Sustainable Development Goals. In the US, for example, the military has installed solar photovoltaic microgrids on their bases in case of power outage.
Issues in national security
Consistency of approach
The dimensions of national security outlined above are frequently in tension with one another. For example:
The high cost of maintaining large military forces can place a burden on the economic security of a nation, and annual defence spending as a percentage of GDP varies significantly by country. Conversely, economic constraints can limit the scale of expenditure on military capabilities.
Unilateral security action by states can undermine political security at an international level if it erodes the rule of law and undermines the authority of international institutions. The invasion of Iraq in 2003 and the annexation of Crimea in 2014 have been cited as examples.
The pursuit of economic security in competition with other nation states can undermine the ecological security of all when the impact includes widespread topsoil erosion, biodiversity loss, and climate change. Conversely, expenditure on mitigating or adapting to ecological change places a burden on the national economy.
If tensions such as these are not managed effectively, national security policies and actions may be ineffective or counterproductive.
National versus transnational security
Increasingly, national security strategies have begun to recognise that nations cannot provide for their own security without also developing the security of their regional and international context. For example, Sweden's national security strategy of 2017 declared: "Wider security measures must also now encompass protection against epidemics and infectious diseases, combating terrorism and organised crime, ensuring safe transport and reliable food supplies, protecting against energy supply interruptions, countering devastating climate change, initiatives for peace and global development, and much more."
The extent to which this matters, and how it should be done, is the subject of debate. Some argue that the principal beneficiary of national security policy should be the nation state itself, which should centre its strategy on protective and coercive capabilities in order to safeguard itself in a hostile environment (and potentially to project that power into its environment, and dominate it to the point of strategic supremacy). Others argue that security depends principally on building the conditions in which equitable relationships between nations can develop, partly by reducing antagonism between actors, ensuring that fundamental needs can be met, and also that differences of interest can be negotiated effectively. In the UK, for example, Malcolm Chalmers argued in 2015 that the heart of the UK's approach should be support for the Western strategic military alliance led through NATO by the United States, as "the key anchor around which international order is maintained". The Ammerdown Group argued in 2016 that the UK should shift its primary focus to building international cooperation to tackle the systemic drivers of insecurity, including climate change, economic inequality, militarisation and the political exclusion of the world's poorest people.
Impact on civil liberties and human rights
Approaches to national security can have a complex impact on human rights and civil liberties. For example, the rights and liberties of citizens are affected by the use of military personnel and militarised police forces to control public behaviour; the use of surveillance, including mass surveillance in cyberspace, which has implications for privacy; military recruitment and conscription practices; and the effects of warfare on civilians and civil infrastructure. This has led to a dialectical struggle, particularly in liberal democracies, between government authority and the rights and freedoms of the general public.
Even where the exercise of national security is subject to good governance, and the rule of law, a risk remains that the term national security may become a pretext for suppressing unfavorable political and social views. In the US, for example, the controversial USA Patriot Act of 2001, and the revelation by Edward Snowden in 2013 that the National Security Agency harvests the personal data of the general public, brought these issues to wide public attention. Among the questions raised are whether and how national security considerations at times of war should lead to the suppression of individual rights and freedoms, and whether such restrictions are necessary when a state is not at war.
Perspectives
Africa
Conceptualizing and understanding the national security choices and challenges of African states is a difficult task, because analysis is often not rooted in an understanding of their (mostly disrupted) state formation and their often imported process of state-building.
Although post-Cold War conceptualisations of security have broadened, the policies and practices of many African states still treat national security as synonymous with state security and, even more narrowly, regime security.
The problem is that a number of African states have been unable to govern their security in meaningful ways, often failing to claim a monopoly of force in their territories. A hybridity of security 'governance' or 'providers' thus exists. States that have not captured this reality in official national security strategies and policies often find their claim to the monopoly of force, and thus to sovereignty, challenged. This often leads to the weakening of the state. Examples of such states are South Sudan and Somalia.
Argentina and Brazil
National Security ideology, as taught by the US Army School of the Americas to military personnel, was instrumental in the military coup of 1964 in Brazil and that of 1976 in Argentina. The military dictatorships were installed on the military's claim that leftists posed an existential threat to national interests.
China
China's Armed Forces are known as the People's Liberation Army (PLA). The military is the largest in the world, with 2.3 million active troops in 2005.
The Ministry of State Security was established in 1983 to ensure "the security of the state through effective measures against enemy agents, spies, and counterrevolutionary activities designed to sabotage or overthrow China's socialist system."
India
The state of the Republic of India's national security is determined by its internal stability and geopolitical interests. While the secessionist Islamist insurgency in the Indian state of Jammu and Kashmir and far-left terrorism in India's red corridor remain key internal security issues, terrorism from Pakistan-based militant groups has emerged as a major concern for New Delhi.
The National Security Advisor of India heads the National Security Council of India, receives all kinds of intelligence reports, and is chief advisor to the Prime Minister of India over national and international security policy. The National Security Council has India's defence, foreign, home, finance ministers and deputy chairman of NITI Aayog as its members and is responsible for shaping strategies for India's security in all aspects.
Illegal immigrants in India, most of whom are Muslims from Bangladesh and Myanmar (Rohingya Muslims), are considered a national security risk. There is an organised influx of nearly 40,000 illegal Bangladeshi and Rohingya Muslim immigrants in Delhi who pose a national security risk, threaten national integration, and alter the demographics. The lawyer Ashwini Upadhyay filed a public interest litigation (PIL) in the Supreme Court of India (SC) seeking to identify and deport them. Responding to this PIL, Delhi Police told the SC in July 2019 that nearly 500 illegal Bangladeshi immigrants had been deported in the preceding 28 months. There are an estimated 600,000 to 700,000 illegal Bangladeshi and Rohingya immigrants in the National Capital Region (NCR), especially in the districts of Gurugram, Faridabad, and Nuh (Mewat region), as well as in interior villages of Bhiwani and Hisar. Most of them are Muslims who have acquired fake Hindu identities and, under questioning, claim to be from West Bengal. In September 2019, the Chief Minister of Haryana, Manohar Lal Khattar, announced the implementation of the NRC for Haryana by setting up a legal framework under Justice HS Bhalla, a former judge of the Punjab and Haryana High Court, to update the NRC and help weed out these illegal immigrants.
Russia
In the years 1997 and 2000, Russia adopted documents titled "National Security Concept" that described Russia's global position, the country's interests, listed threats to national security, and described the means to counter those threats. In 2009, these documents were superseded by the "National Security Strategy to 2020". The key body responsible for coordinating policies related to Russia's national security is the Security Council of Russia.
According to provision 6 of the National Security Strategy to 2020, national security is "the situation in which the individual, the society and the state enjoy protection from foreign and domestic threats to the degree that ensures constitutional rights and freedoms, decent quality of life for citizens, as well as sovereignty, territorial integrity and stable development of the Russian Federation, the defence and security of the state."
Singapore
Total Defence is Singapore’s whole-of-society national defence concept based on the premise that the strongest defence of a nation is collective defence – when every aspect of society stays united for the defence of the country. Adopted from the national defence strategies of Sweden and Switzerland, Total Defence was introduced in Singapore in 1984. Then, it was recognised that military threats to a nation can affect the psyche and social fabric of its people. Therefore, the defence and progress of Singapore were dependent on all its citizens and their resolve, not just the government or the armed forces. Total Defence has since evolved to take into consideration threats and challenges outside of the conventional military domain.
Ukraine
National security of Ukraine is defined in Ukrainian law as "a set of legislative and organisational measures aimed at permanent protection of vital interests of man and citizen, society and the state, which ensure sustainable development of society, timely detection, prevention and neutralisation of real and potential threats to national interests in areas of law enforcement, fight against corruption, border activities and defence, migration policy, health care, education and science, technology and innovation policy, cultural development of the population, freedom of speech and information security, social policy and pension provision, housing and communal services, financial services market, protection of property rights, stock markets and circulation of securities, fiscal and customs policy, trade and business, banking services, investment policy, auditing, monetary and exchange rate policy, information security, licensing, industry and agriculture, transport and communications, information technology, energy and energy saving, functioning of natural monopolies, use of subsoil, land and water resources, minerals, protection of ecology and environment and other areas of public administration, in the event of emergence of negative trends towards the creation of potential or real threats to national interests.".
The primary body responsible for coordinating national security policy in Ukraine is the National Security and Defense Council of Ukraine.
It is an advisory state agency to the President of Ukraine, tasked with developing a policy of national security on domestic and international matters. All sessions of the council take place in the Presidential Administration Building. The council was created by the provision of Supreme Council of Ukraine #1658-12 on October 11, 1991. It was defined as the highest state body of collegiate governing on matters of defence and security of Ukraine with the following goals:
Protecting sovereignty
Constitutional order
Territorial integrity and inviolability of the republic
Developing strategies and continuous improvement of policy in the sphere of defence and state security
Comprehensive scientific assessment of the military threat nature
Determining position toward modern warfare
Effective control over the execution of the tasks of the state and its institutions keeping defence capabilities of Ukraine at the level of defence sufficiency
United Kingdom
The primary body responsible for coordinating national security policy in the UK is the National Security Council, which helps produce and enact the UK's National Security Strategy. It was created in May 2010 by the new coalition government of the Conservative Party and the Liberal Democrats. The National Security Council is a committee of the Cabinet of the United Kingdom and was created as part of a wider reform of the national security apparatus. This reform also included the creation of a National Security Adviser and a National Security Secretariat to support the National Security Council.
United States
National Security Act of 1947
The concept of national security became an official guiding principle of foreign policy in the United States when the National Security Act of 1947 was signed on July 26, 1947, by U.S. President Harry S. Truman. As amended in 1949, this Act:
created important components of American national security, such as the precursor to the Department of Defense;
subordinated the military branches to the new cabinet-level position of Secretary of Defense;
established the National Security Council and the Central Intelligence Agency;
Notably, the Act did not define national security, which was conceivably advantageous, as its ambiguity made it a powerful phrase to invoke against diverse threats to interests of the state, such as domestic concerns.
The notion that national security encompasses more than just military security was present, though understated, from the beginning. The Act established the National Security Council so as to "advise the President on the integration of domestic, military and foreign policies relating to national security".
While not defining the "interests" of national security, the Act does establish, within the National Security Council, the "Committee on Foreign Intelligence", whose duty is to conduct an annual review "identifying the intelligence required to address the national security interests of the United States as specified by the President" (emphasis added).
In Gen. Maxwell Taylor's 1974 essay "The Legitimate Claims of National Security", Taylor states:
National security state
To reflect on the institutionalisation of new bureaucratic infrastructures and governmental practices in the post-World War II period in the U.S., when a culture of semi-permanent military mobilisation brought around the National Security Council, the CIA, the Department of Defense, and the Joint Chiefs of Staff, national-security researchers apply a notion of a national security state:
Obama administration
The U.S. Joint Chiefs of Staff defines the national security of the United States in the following manner:
In 2010, the White House included an all-encompassing world-view in a national security strategy which identified "security" as one of the country's "four enduring national interests" that were "inexorably intertwined":
Empowerment of women
U.S. Secretary of State Hillary Clinton has said that, "The countries that threaten regional and global peace are the very places where women and girls are deprived of dignity and opportunity". She has noted that countries, where women are oppressed, are places where the "rule of law and democracy are struggling to take root", and that, when women's rights as equals in society are upheld, the society as a whole changes and improves, which in turn enhances stability in that society, which in turn contributes to global society.
Cyber
In January 2008, the Bush Administration initiated the Comprehensive National Cybersecurity Initiative (CNCI). It introduced a differentiated approach: identifying existing and emerging cybersecurity threats, finding and plugging existing cyber vulnerabilities, and apprehending actors trying to gain access to secure federal information systems. President Obama declared that the "cyber threat is one of the most serious economic and national security challenges we face as a nation" and that "America's economic prosperity in the 21st century will depend on cybersecurity."
See also
Deep state
Fourth branch of government
Homeland security
Human security
International security
Military–industrial complex
Security
National interest
National economic security
References
Further reading
Bhadauria, Sanjeev. National Security. Allahabad: Dept. of Defence and Strategic Studies, University of Allahabad, 2002.
Brzezinski, Zbigniew. Power and Principle: Memoirs of the National Security Adviser, 1977–1981. New York: Farrar, Straus, Giroux, 1983.
Chen, Hsinchun. National Security. Amsterdam: Elsevier, 2007.
Cordesman, Anthony H. Saudi Arabia: National Security in a Troubled Region. Santa Barbara, Calif: Praeger Security International, 2009.
Devanny, Joe, and Josh Harris, The National Security Council: national security at the centre of government. London: Institute for Government/King's College London, 2014.
Jordan, Amos A., William J. Taylor, Michael J. Mazarr, and Suzanne C. Nielsen. American National Security. Baltimore, Md: Johns Hopkins University Press, 1999.
MccGwire, Michael. Perestroika and Soviet National Security. Washington D.C.: Brookings Institution Press, 1991.
Mueller, Karl P. Striking First: Preemptive and Preventive Attack in U.S. National Security Policy. Santa Monica, CA: RAND Project Air Force, 2006.
National Research Council (U.S.). Beyond "Fortress America": National Security Controls on Science and Technology in a Globalized World. Washington, D.C.: National Academies Press, 2009.
Neal, Andrew. Security in a Small Nation: Scotland, Democracy, Politics. Open Book Publishers, 2017.
Rothkopf, David J. Running the World: The Inside Story of the National Security Council and the Architects of American Power. New York: PublicAffairs, 2005.
Ripsman, Norrin M., and T. V. Paul. Globalization and the National Security State. Oxford: Oxford University Press, 2010.
Tal, Israel. National Security: The Israeli Experience. Westport, Conn: Praeger, 2000.
Tan, Andrew. Malaysia's Security Perspectives. Canberra: Strategic and Defence Studies Centre, Australian National University, 2002.
Scherer, Lauri S. National Security. Detroit: Greenhaven Press, 2010.
External links
National Security Internet Archive (NSIA) at the Internet Archive
Political terminology |
16477144 | https://en.wikipedia.org/wiki/4348%20Poulydamas | 4348 Poulydamas | 4348 Poulydamas is a large Jupiter Trojan from the Trojan camp, approximately in diameter. It was discovered on 11 September 1988, by American astronomer Carolyn Shoemaker at the Palomar Observatory in California. The assumed C-type asteroid belongs to the 40 largest Jupiter trojans and has a rotation period of 9.9 hours. It was named after Poulydamas from Greek mythology.
Orbit and classification
Poulydamas is a dark Jovian asteroid orbiting in the trailing Trojan camp at Jupiter's Lagrangian point (L5), 60° behind the planet in its orbit in a 1:1 resonance (see Trojans in astronomy). It is also a non-family asteroid of the Jovian background population.
It orbits the Sun at a distance of 4.7–5.7 AU once every 12 years (4,383 days; semi-major axis of 5.24 AU). Its orbit has an eccentricity of 0.10 and an inclination of 8° with respect to the ecliptic. The body's observation arc begins with a precovery taken at Palomar in October 1953, nearly 35 years prior to its official discovery observation.
Physical characteristics
Poulydamas is an assumed, carbonaceous C-type asteroid.
Rotation period
In December 1990, a first rotational lightcurve of Poulydamas was obtained by Stefano Mottola and Mario Di Martino using the 1.52-meter Loiano Telescope at the Bologna Observatory in Italy. Lightcurve analysis gave a well-defined rotation period of hours with an amplitude of magnitude.
In October 2013, astronomers at the Palomar Transient Factory measured a period of 9.9214 hours and a brightness variation of 0.23 magnitude in the R-band.
Between 2015 and 2018, photometric observations by Robert Stephens at the Center for Solar System Studies, California, rendered four similar rotational periods of 9.88, 9.922, 9.937 and 9.941 hours with four corresponding amplitudes of 0.19, 0.34, 0.27 and 0.29 magnitude.
Diameter and albedo
According to the surveys carried out by the Japanese Akari satellite and the NEOWISE mission of NASA's Wide-field Infrared Survey Explorer, Poulydamas measures 82.03 and 87.51 kilometers in diameter and its surface has an albedo of 0.033 and 0.048, respectively. The Collaborative Asteroid Lightcurve Link assumes a standard albedo for a carbonaceous asteroid of 0.057, and calculates a diameter of 70.08 kilometers based on an absolute magnitude of 9.5.
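The diameter quoted by the Collaborative Asteroid Lightcurve Link follows from the standard relation between an asteroid's diameter, geometric albedo and absolute magnitude; the formula below is that conventional relation with the albedo and magnitude values given above substituted in:

```latex
D = \frac{1329\,\mathrm{km}}{\sqrt{p_V}}\,10^{-H/5}
  = \frac{1329\,\mathrm{km}}{\sqrt{0.057}}\,10^{-9.5/5}
  \approx 70\,\mathrm{km}
```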
Naming
This minor planet was named by the discoverer from Greek mythology after Poulydamas, the closest counselor and strategist of the Trojan prince Hector, after whom the minor planet 624 Hektor is named. Hector and Poulydamas were born on the same night. While the gods gave Hector the ability to perfectly master his arms, Poulydamas was given the gift of better judgment. It was Poulydamas who urged that the gates of Troy be locked against Achilles, but Hector left the city and confronted him nonetheless, which led to his doom and to the city's eventual downfall during the Trojan War. The official naming citation was published on 28 April 1991.
Notes
References
External links
Asteroid Lightcurve Database (LCDB), query form (info )
Dictionary of Minor Planet Names, Google books
Discovery Circumstances: Numbered Minor Planets (1)-(5000) – Minor Planet Center
004348
Discoveries by Carolyn S. Shoemaker
Minor planets named from Greek mythology
Named minor planets
19880911 |
208875 | https://en.wikipedia.org/wiki/Underclocking | Underclocking | Underclocking, also known as downclocking, is modifying a computer or electronic circuit's timing settings to run at a lower clock rate than is specified. Underclocking is used to reduce a computer's power consumption, increase battery life, reduce heat emission, and it may also increase the system's stability, lifespan/reliability and compatibility. Underclocking may be implemented by the factory, but many computers and components may be underclocked by the end user.
Types
CPU underclocking
For microprocessors, the purpose is generally to decrease the need for heat dissipation devices or decrease the electrical power consumption. This can provide increased system stability in high-heat environments, or can allow a system to run with a lower airflow (and therefore quieter) cooling fan or without one at all. For example, a Pentium 4 processor normally clocked at 3.4 GHz can be "underclocked" to 2 GHz and can then be safely run with reduced fan speeds. This invariably comes at the expense of some system performance. However, the proportional performance reduction is usually less than the proportional reduction in clock speed because performance is often limited by other bottlenecks: the hard disk, GPU, disk controller, Internet, network, etc. Underclocking refers to alterations of the timing of a synchronous circuit in order to lower a device's energy needs. Deliberate underclocking involves limiting a processor's speed, which may affect the speed of operations, but may or may not make a device noticeably less able, depending on other hardware and desired use.
Many computers and other devices allow for underclocking. Manufacturers add underclocking options for many reasons. Underclocking can help with excessive heat buildup, because lower performance will not generate as much heat inside the device. It can also lower the amount of energy needed to run the device. Laptop computers and other battery-operated devices often have underclocking settings, so that batteries can last longer without being charged.
In addition to providing underclocking features, manufacturers can choose to limit the capability of a machine in order to make it more efficient. Reduced instruction set computer (RISC) models can help makers build devices that work on less power.
Graphics cards
Underclocking can also be performed on graphics card processor's GPUs, usually with the aim of reducing heat output. For instance, it is possible to set a GPU to run at lower clock rates when performing everyday tasks (e.g. internet browsing and word processing), thus allowing the card to operate at lower temperature and thus lower, quieter fan speeds. The GPU can then be overclocked for more graphically intense applications, such as games. Underclocking a GPU will reduce performance, but this decrease will probably not be noticeable except in graphically intensive applications.
Memory underclocking
Newer and faster RAM may be underclocked to match older systems as an inexpensive way to replace rare or discontinued memory. This might also be necessary if stability problems are encountered at higher settings, especially in a PC with several memory modules of different clock speeds. If a PC processor is underclocked without changing the clock factor or multiplier (the ratio between the processor and the memory clock speed), the memory will also be underclocked, as the sketch below illustrates.
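The following sketch uses assumed round-number settings for an older front-side-bus system; the figures are invented for the example and do not describe any particular motherboard.

```python
# Hypothetical figures for an older FSB-based system: the memory clock is
# derived from the base (bus) clock, so lowering the bus clock without
# changing the CPU multiplier underclocks both the CPU and the memory.
def derived_clocks(base_clock_mhz, cpu_multiplier, memory_divider=1.0):
    cpu_clock_mhz = base_clock_mhz * cpu_multiplier
    memory_clock_mhz = base_clock_mhz / memory_divider
    return cpu_clock_mhz, memory_clock_mhz

print(derived_clocks(200, 11))  # (2200, 200.0) -- stock settings
print(derived_clocks(166, 11))  # (1826, 166.0) -- lower bus clock also underclocks memory
```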
When used
Dynamic frequency scaling (automatic underclocking) is very common on laptop computers and has become common on desktop computers as well. In laptops, the processor is usually underclocked automatically whenever the computer is operating on batteries. Most modern notebook and desktop processors (utilizing power-saving schemes like AMD's Cool'n'Quiet and PowerNow!) will underclock themselves automatically under a light processing load, when the machine BIOS and the operating system support it. Intel has also used this method on numerous processors through a feature called SpeedStep, which appeared on chips such as the Core 2 Duo and selected Pentium models and later became standard in mid- to high-end Core i3, i5, and i7 models.
Some processors underclock automatically as a defensive measure, to prevent overheating which could cause permanent damage. When such a processor reaches a temperature level deemed too high for safe operation, the thermal control circuit activates, automatically decreasing the clock and CPU core voltage until the temperature has returned to a safe level. In a properly cooled environment, this mechanism should trigger rarely (if ever).
There are several different underclocking competitions similar in format to overclocking competitions, except the goal is to have the lowest clocked computer, as opposed to the highest.
Advantages
Reduced electrical power consumption, especially when combined with undervolting (i.e., reducing the component's voltage below the nominal). For instance, by underclocking an Athlon XP 1700+ processor from 1466 to 1000 MHz and reducing the core voltage from 1.75 to 1.15V, a computer user reduced the power consumption from 64.0 to 21.6W, i.e., 66% power reduction, with only 26% less performance. The same is true for newer processors: When a single-core Intel CPU was 20% underclocked, the PC's performance was down only 13% with a 49% power reduction.
In general, the power consumed by a CPU with a capacitance C, running at frequency f and voltage V, is approximately P = CV²f (a worked example applying this relationship follows this list).
Reduced heat generation, which is exactly proportional to the power consumption.
Less noise because the cooling fans may be slowed down, or even eliminated. A cooling fan's efficiency is proportional to its rotation speed, but as it increases, so does the noise.
Longer hardware lifespan.
Increased stability.
Increased battery life.
Better compatibility with old applications.
Proper performance of very old computer games that were dependent on CPU timing.
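As a worked example of the P = CV²f relationship, using the Athlon XP figures quoted in the first item of the list above, the predicted drop in dynamic power comes out close to the measured drop in total power; the gap is expected, since the formula ignores static (leakage) power, which does not scale with frequency:

```latex
\frac{P_\text{new}}{P_\text{old}}
  = \frac{f_\text{new}}{f_\text{old}}\left(\frac{V_\text{new}}{V_\text{old}}\right)^{2}
  = \frac{1000}{1466}\left(\frac{1.15}{1.75}\right)^{2}
  \approx 0.29
```

That is a predicted reduction of roughly 71% in dynamic power, against the 66% reduction in total power that was actually measured.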
In practice
Linux
The Linux kernel supports CPU frequency scaling. On supported processors, the cpufreq subsystem gives the system administrator a variable level of control over the CPU's clock rate. The kernel includes five governors by default: Conservative, Ondemand, Performance, Powersave, and Userspace. The Conservative and Ondemand governors adjust the clock rate depending on the CPU load, but each with a different algorithm: the Ondemand governor jumps to the maximum frequency on CPU load and decreases the frequency step by step when the CPU is idle, whereas the Conservative governor increases the frequency step by step on CPU load and jumps to the lowest frequency when the CPU is idle. The Performance, Powersave and Userspace governors set the clock rate statically: Performance to the highest available, Powersave to the lowest available, and Userspace to a frequency determined and controlled by the user.
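As a minimal sketch of how this interface is commonly exercised, the script below reads and sets the governor through the cpufreq sysfs files. The exact paths, governors and frequency ranges available depend on the kernel version and cpufreq driver in use, and writing these files requires root privileges.

```python
# Query and set the cpufreq governor for CPU 0 via sysfs (root needed to write).
from pathlib import Path

CPUFREQ = Path("/sys/devices/system/cpu/cpu0/cpufreq")

def available_governors():
    return (CPUFREQ / "scaling_available_governors").read_text().split()

def current_governor():
    return (CPUFREQ / "scaling_governor").read_text().strip()

def set_governor(name):
    if name not in available_governors():
        raise ValueError(f"governor {name!r} is not offered by this driver")
    (CPUFREQ / "scaling_governor").write_text(name)

if __name__ == "__main__":
    print("current governor:", current_governor())
    print("available governors:", ", ".join(available_governors()))
    # set_governor("powersave")  # uncomment to pin the CPU to its lowest frequency
```

The same directory also exposes scaling_min_freq and scaling_max_freq, which bound the frequencies any governor may select.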
Windows
Underclocking can be done manually in the BIOS or with Windows applications, or dynamically using features such as Intel's SpeedStep or AMD's Cool'n'Quiet. In Windows 7 and 10, underclocking can be set within the "advanced" settings of a power management plan.
Asus Eee PC
Earlier models of the Asus Eee PC used a 900 MHz Intel Celeron M processor underclocked to 630 MHz.
Mac OS X
Underclocking can be performed in the EFI.
Smartphones and PDAs
Most smartphones and PDAs, such as the Motorola Droid, Palm Pre, and Apple iPhone, underclock a more powerful processor rather than fully clocking a less powerful one, in order to maximize battery life. Designers of such mobile devices often find that a less powerful processor running at full speed gives worse battery life than a more powerful processor running at a lower clock rate.
They select a processor on the basis of the performance per watt of the processor.
Performance
The performance of an underclocked machine will often be better than might be expected. Under normal desktop use, the full power of the CPU is rarely needed. Even when the system is busy, a large amount of time is usually spent waiting for data from memory, disk, or other devices. Such devices communicate with the CPU through a bus which operates at a much lower bandwidth. Generally, the lower the CPU multiplier (and thus clockrate of a CPU), the closer its performance will be to that of the bus, and the less time it will spend waiting.
See also
big.LITTLE
References
External links
CPU "Undervolting" & "Underclocking" A Primer From SilentPCReview.com
"Underclocking" a Game Boy classic tutorial
Clock signal
Computer hardware tuning
Computer hardware cooling |
37500786 | https://en.wikipedia.org/wiki/2012%E2%80%9313%20Troy%20Trojans%20men%27s%20basketball%20team | 2012–13 Troy Trojans men's basketball team | The 2012–13 Troy Trojans men's basketball team represented Troy University during the 2012–13 NCAA Division I men's basketball season. The Trojans, led by 31st year head coach Don Maestri, played their home games at Trojan Arena and were members of the East Division of the Sun Belt Conference. They finished the season 12–21, 6–14 in Sun Belt play to finish in last place in the East Division. They lost in the quarterfinals of the Sun Belt Tournament to Arkansas State.
Roster
Schedule
The schedule comprised exhibition games, regular-season games, and the 2013 Sun Belt Tournament.
References
Troy Trojans men's basketball seasons
Troy
2012 in sports in Alabama
2013 in sports in Alabama |
63572622 | https://en.wikipedia.org/wiki/The%20MICCAI%20Society | The MICCAI Society | The MICCAI Society is a professional organization for scientists in the areas of Medical Image Computing and Computer Assisted Interventions. Due to the multidisciplinary nature of these fields, the society brings together researchers from several scientific disciplines. including computer science, robotics, physics, and medicine. The society is best known for its annual flagship event, The MICCAI Conference, which facilitates the publication and presentation of original research on MICCAI-related topics. However, the society provides endorsements and sponsorships for several scientific events each year.
History
In 1998, three international conferences, Visualization in Biomedical Computing (VBC), Computer Vision and Virtual Reality in Robotics and Medicine (CVRMed), and Medical Robotics and Computer Assisted Surgery (MRCAS), merged into a single conference entitled "The International Conference on Medical Image Computing and Computer Assisted Interventions" (abbreviated MICCAI), with its first edition held in Boston. The MICCAI Society was founded in 2004 by several active members of this research community and former chairs of the MICCAI conference. In 2009, the society introduced the "MICCAI Fellow" award to recognize senior members who had made substantial contributions to the MICCAI community. Twelve fellows were elected in 2009, and three additional fellows are elected each year; new MICCAI Fellows are announced at the annual MICCAI conference. Since 2012, the society has been involved in several events each year outside of the annual conference through endorsements and/or sponsorships. These include a number of smaller international conferences, MICCAI-focused workshop sessions at related conferences, and educational programs such as "summer schools".
Research focus
Medical Image Computing
Medical Image Computing (the "MIC" in MICCAI) is the field of study involving the application of image processing and computer vision to medical imaging. The goals of medical image computing tasks are diverse, but some common examples are computer-aided diagnosis, image segmentation of anatomical structures and/or abnormalities, and the registration or "alignment" of medical images acquired through different means or at different points in time.
Computer Assisted Interventions
Computer Assisted Interventions (the "CAI" in MICCAI) is the field of study concerned with the use of computational tools in medical interventions. Prominent examples of computer aided interventions currently in widespread use include image guided biopsy and robot-assisted surgery. Integral to this research area is effective human-computer interaction and user interface design.
Subgroups
Within the MICCAI community, a number of organizations have emerged to represent and advocate for certain populations of MICCAI researchers. Among these are the MICCAI Student Board and the Women in MICCAI Committee.
MICCAI Student Board
The MICCAI Student Board began in 2010 when MICCAI initiated its social media presence by creating a Facebook group. This effort was championed by student researchers who used the group to organize events specifically for students at the 2011 and 2012 annual conferences. After the 2012 event, the MICCAI board of directors formally recognized the MICCAI Student Board as a part of the society and began providing support for the student board's annual events.
Women in MICCAI Committee
The Women in MICCAI Committee began as a series of networking sessions for female researchers within the medical image analysis research community during the 2015 MICCAI conference and the 2016 IEEE International Symposium on Biomedical Imaging. In October 2016, the MICCAI board of directors approved a measure to create the "Women in MICCAI Committee" with the goal of strengthening the representation of female scientists in this research area.
Since its inception, the Women in MICCAI Committee has continued to organize networking sessions in conjunction with MICCAI events. It also developed and maintains several online platforms for discussion on social media. The committee is the primary interface between the MICCAI board of directors and the community of women researchers in MICCAI.
Annual MICCAI conference
Conference format
MICCAI conferences are typically scheduled for five days, of which the first and last days are set aside for satellite events consisting of tutorials, workshops, and challenges. These include the Brain Lesion workshop (BrainLes), the Workshop on Interpretability of Machine Intelligence in Medical Image Computing (iMIMIC), Domain Adaptation and Representation Transfer (DART), and others. The main conference includes invited presentations, panel discussions, and podium and poster presentations of original research papers, which are published by Springer Nature as conference proceedings.
Past MICCAI conferences
Upcoming MICCAI conferences
Publications
The MICCAI conference proceedings consist of full-length papers which undergo comprehensive peer review. Since even before the merger of the CVRMed, MRCAS, and VBC conferences (see History), the proceedings of the annual conference have been published by Springer Nature as part of the Lecture Notes in Computer Science (LNCS) series.
In addition to the proceedings of the annual conference, MICCAI officially partners with two peer reviewed scientific journals: "Medical Image Analysis" published by Elsevier and "The International Journal of Computer Assisted Radiology and Surgery" (IJCARS) published by Springer Nature. These journals loosely correspond to the "MIC" and "CAI" focuses of the MICCAI Society respectively, but they have substantial overlap in subject matter.
The MICCAI Society also partners with Elsevier to develop a series of books on MICCAI-related research, written by scientists in the MICCAI research community. Nine books have so far been published in this series.
See also
Robot-assisted surgery
Computer vision
Conference on Computer Vision and Pattern Recognition
International Conference on Computer Vision
European Conference on Computer Vision
Institute of Electrical and Electronics Engineers
International Society for Computer Aided Surgery
References
External links
The MICCAI Society Website
Computer science-related professional associations
Computer science organizations
Medical technology
Health informatics |
376475 | https://en.wikipedia.org/wiki/The%20Matrix%20Online | The Matrix Online | The Matrix Online (abbreviated as MxO) was a massively multiplayer online role-playing game (MMORPG) initially developed by Monolith Productions and later, a few months after launch, by Sony Online Entertainment. It was advertised as a continuation of the storyline of The Matrix films, as The Wachowskis, the franchise's creators, gave their blessing to the notion of gamers "inherit[ing] the storyline". The game began closed beta-testing in June 2004 which was then opened for people who pre-ordered the game in November 2004. Warner Bros. and Sega released MxO on March 22, 2005 in the United States. It was released in Europe on April 15, 2005. In June, Warner Bros. sold the rights to the game to Sony Online Entertainment, and the game's development and operation was transferred to the latter on August 15, 2005. Sony Online Entertainment shut down operation of the game on July 31, 2009.
Ubisoft backed out of an agreement to co-publish the game, not long after canceling plans for another MMORPG. Ubisoft and Warner Bros. stated that this did not have a negative impact on their relationship. At the time, doubts about the game circled within the industry, based on the lackluster reception of the second and third The Matrix films and an overcrowded MMORPG market.
Gameplay
In The Matrix Online, the player assumes the role of a redpill, a human who was formerly trapped inside the Matrix and has since been freed and shown the truth of humanity's imprisonment. When creating a new character, the player is given the choice of taking a blue pill that will return them to their former life (quit the game) or a red pill, which will free their mind from the Matrix and allow them to take the body of a physical human and experience reality. Characters who are unaware of the fact that they are in the simulation are often referred to as "bluepills" because they have either taken the blue pill or have not been given the choice yet. People who are aware of the simulation (players) are referred to as "redpills" because they have taken the red pill (or, in very rare cases, have self-substantiated out of the Matrix on their own). Following the choice between the two pills, the player is then taken through a basic tutorial of the game's mechanics, including mission interaction and the combat system. After the tutorial, they are then free to roam the Mega City (the large metropolis that the entire Matrix story is set in).
Combat
Combat in the game is divided into two separate parts: Free-fire and Interlock. Free-fire mode allows for large gun battles to take place, while Interlock is often broken down into bullet-time-affected martial arts moves and close-quarters gunfire.
There are three main classes in The Matrix Online: Coder, Hacker, and Operative. Coders create a special "simulacrum" that fights for them. Hackers manipulate the code of the Matrix to affect friends and enemies from a distance, either damaging them, downgrading their combat abilities, or healing them and upgrading their powers. Operatives are the common soldiers seen from the movies - Martial Artists, Gunmen, and the new Spy class, which revolves around stealth fighting and knife throwing. Magazines never seem to run out of bullets and knife throwers also have an unlimited supply.
In free-fire mode, operatives exchange damage with each other. Gunmen and Hackers are well-equipped for this, with their ranged attacks and abilities. Martial Artists must get close to their targets to be effective, and although a Spy's most dangerous abilities are initiated out of Interlock, they also pull their opponents into Interlock. Each attack or ability is used at timed intervals, based on a damage-per-second (D.P.S.) system. For example, the strongest rifle in the Matrix deals 15 damage points per second and has a fire rate of 3.5 seconds, which in free-fire gives the rifle a base damage of 52.5 per shot, further altered by the player's own stats. By contrast, a Hacker's stronger attack abilities, such as Logic Barrage 4.0, deal 63 D.P.S. but, owing to a short casting timer, have a base damage of only 120-180.
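The figures above follow a simple pattern: an attack's base damage is its listed D.P.S. multiplied by the interval between uses. A minimal sketch of that arithmetic (the formula is inferred from the numbers quoted above, not taken from the game's actual code):

    def free_fire_base_damage(dps: float, interval_seconds: float) -> float:
        """Base damage per use in free-fire: listed D.P.S. times the attack's firing or casting interval."""
        return dps * interval_seconds

    # The strongest rifle: 15 D.P.S. fired once every 3.5 seconds -> 52.5 base damage per shot,
    # before the player's own stats are applied.
    print(free_fire_base_damage(15, 3.5))  # 52.5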
In Interlock or Close Combat, two players exchange damage in rounds. Each round lasts exactly four seconds. For each round, the two players' accuracies are pitted against each other's defenses, which are slightly affected by a random "luck" roll. There are three different outcomes to a round: hit-hit, hit-miss, or miss-miss. In hit-miss, one of the players will hit the other while dodging or blocking their attack. In miss-miss, both players will parry each other without doing damage. In hit-hit, one player will damage the other, only to be damaged themselves in a counterattack. When special abilities are used, however, there can be no hit-hit round, although the miss-miss round can still apply.
When taking or dealing damage, one player's damage influences are pitted against another player's resistance influences of the same damage type (i.e. a gunman's ballistic damage versus an opponent's ballistic resistance). Higher resistance versus lower damage means that the defending player will not take as much damage.
When attacking or defending against attacks, one player's accuracy influences are pitted against another's defense influences of the same attack type.
There is no turn-based combat in the Matrix Online. All combat takes place in "real-time", and large-scale battles are often decided by the sheer numbers of forces of one side versus others. Amassing a large number of players to control the battlefield is affectionately dubbed "zerging".
Items that characters drop in the game world can be picked up, granting powers to the player that lugs them around. These are called "luggables".
Classes
The Matrix Online has a unique class system. Players can load abilities they have either purchased or produced (by the Coder class, known in-game as coding) at Hardlines, provided they have enough memory and the abilities that precede the loading one. These abilities can then be switched out at a Hardline at a moment's notice. This leads to a very flexible class system, without players being stuck in one class.
The three main archetypes are Hacker, Coder, and Operative. They are similar to the classes Mage, Crafter, and Fighter in other MMORPGs. These classes then branch out into sub-classes; the Coder, for example, is divided into Programmer (an out-of-battle item and ability maker) and Code Shaper (who creates simulacrums to fight with, similar to a necromancer or summoner in other MMOGs). The game has a total of 21 end-game classes with an additional two stubs.
Missions and organizations
After an initial set of introductory missions, players can join one of three organizations working in the Matrix, each with a different set of goals, beliefs, and methods: Zion, the Machines, and the Merovingian.
In order to receive increasingly critical and sensitive missions, players are expected to run missions for their chosen organization, which will increase their standing with their chosen organization but will also lower it with the other two.
Zion/Nirvana:
Zion is the last remaining human city on Earth, hidden deep underground and is concerned chiefly with protecting its citizens from the Machines who see those who have "awakened" as a threat to those still connected to the Matrix. Those who choose to work for Zion usually enlist in the Zion Military and see this as the best way to protect the ideals of freedom.
Machines:
The main motivation for choosing to side with the Machines is that this organization is seen as the most conducive towards maintaining the status-quo of the Matrix and protecting the lives of those still connected to it, i.e., bluepills. However, there are also those who feel that the only way to improve relations between man and machine is to work with them as closely as possible and see joining this organization as the best way to do so.
Merovingian:
Those who work for the Merovingian are in a unique position in that they need not concern themselves with the traditional hostilities between Zion and the Machines, preferring instead to act only when the situation would prove advantageous for themselves or the organization as a whole. However, this organization has also been chosen by some players as it is the only one out of the three that fights to protect the Exiles who reside within the Matrix.
Sub-organizations:
Players cannot run missions for these organizations although in storyline terms they are now quite separate from their original "parent" organization, even receiving their own Live Events:
EPN - E Pluribus Neo (Zion as parent organization):
Members of EPN are devoted to what they deem "Neo's legacy". This mostly involves giving all human beings the opportunity to question the true nature of their "reality", the Matrix, and to have the choice of the red or blue pill. Very much against the Machines, and Cypherites in particular, there are some more fundamentalist schools of thought within this organization who believe that the only solution to humanity's problems is to free the entire human population from the Matrix. They are led by The Kid with his old friend, Shimada - who also acts as their mission controller.
Cypherites (Machines as parent organization):
Usually seen as the more extreme elements of the Machine organization, Cypherites follow in the footsteps of Cypher, wanting to be reinserted into the Matrix as bluepills so that they may be blissfully unaware of the true nature of the Matrix as a computer program. The name of their hovercraft, Blue Dreamer, reflects this philosophy. They are currently led by Cryptos and his second-in-command, the Zion traitor, Veil. During the time that Cryptos was revealed to be a Machine Program inhabiting a redpill's body, Veil assumed control of the organization.
As of Chapter 11.3, the Cypherites and EPN have effectively been withdrawn as a playable organization within the game. Existing factions that have been granted their respective "EPN/CYPH" tags in their faction name will continue to hold said tags unless they disband or reform, but no new splinter org tags will be granted. In addition, no Live Events will occur for these organizations in the future.
Continuing story
Another of The Matrix Online's defining and differentiating aspects was its inclusion of, and emphasis on, what was called "The Continuing Story". That is, the game itself is the official continuation of the universe, story, and characters established in The Matrix series of fictional works, including the film trilogy, The Animatrix short films, the Enter the Matrix video game, and a series of officially written and produced Matrix comic books.
This continuation was written by comic book writer Paul Chadwick. It was also confirmed as having seen verification and input from Matrix creators The Wachowskis through the end of Chapter 9.
Progression of the storyline
The story progressed in real time, with a planned schedule in effect that included the following:
Nine new critical missions (three for each of the game's three main organizations) every six weeks, released weekly as part of the game's patch cycle.
A new hand-drawn cinematic every six weeks to coincide with the start of a new sub-chapter.
Daily live events.
Large-scale organizational meetings (one each month).
Chapter organization
The Matrix Online used a system of organization akin to that of software versioning to keep track of its chronological progression. Each "Critical" mission and development is given its own unique tag within this system.
For example: Chapter 1, sub-chapter 2, week 3 would be represented as 1.2.3
MxO developer Rarebit has stated that this numbering system was meant purely for chronological measurement and game design (for the various rewards associated with completing past critical missions in a system called The Mission Archive). The chapters and sub-chapters are not intended as self-contained units. Rather, they are each equally relevant to the unfolding of the story as a whole.
LESIG program
The LESIG (Live Event Special Interest Group) was originally devised under Monolith's operation of the game to give developers insight into the player sentiment around live events, with the creation of a live events program, including the development of in-game event tools and server-specific event teams made of community members, as a long term goal.
However, when The Matrix Online moved to Sony Online Entertainment, the program underwent a radical change in direction as part of similar changes to the other story telling devices, most importantly, the scale and frequency of live events following the departure of a dedicated Live Events Team.
The group was given the new task of playing minor supporting roles (known as organization liaison officers), or even more permanent characters, during future live events to enhance interaction between players, essentially replacing the paid staff of the LET with volunteer players.
Closing
In June 2009, Sony Online Entertainment announced that it would stop serving The Matrix Online due to low subscription numbers. The service was officially shut down at 00:00 on August 1, 2009. At the time, it had fewer than 500 active players.
The days leading up to the closing, as well as the end of the servers themselves, were chronicled on the gaming website Giant Bomb in a video series titled "Not Like This", a reference to a line in the first The Matrix film.
After the closing, the website remained operational for a limited period of time. Visitors were greeted with an invitation to peruse the official memory book, which had been posted as a parting gift to the fans. The book included a summary of the storyline and various nostalgic items.
Reception
The game received "mixed or average" reviews according to video game review aggregator Metacritic.
See also
Simulated reality
References
External links
Matrix Online Storybook - A browsable/downloadable version of the memory book which contains elements of the ongoing story, as held by the Internet Archive
2005 video games
Products and services discontinued in 2009
Massively multiplayer online role-playing games
Inactive massively multiplayer online games
LithTech games
Monolith Productions games
Sega video games
Sony Interactive Entertainment games
Online
Video games developed in the United States
Video games with time manipulation
Warner Bros. Interactive Entertainment games
Windows games
Windows-only games
Video games directed by The Wachowskis |
5758975 | https://en.wikipedia.org/wiki/Virtuoso%20Universal%20Server | Virtuoso Universal Server | Virtuoso Universal Server is a middleware and database engine hybrid that combines the functionality of a traditional relational database management system (RDBMS), object–relational database (ORDBMS), virtual database, RDF, XML, free-text, web application server and file server in a single system. Rather than having dedicated servers for each of the aforementioned functionality realms, Virtuoso is a "universal server": a single multithreaded server process that implements multiple protocols. The free and open source edition of Virtuoso Universal Server is also known as OpenLink Virtuoso. The software has been developed by OpenLink Software, with Kingsley Uyi Idehen and Orri Erling as the chief software architects.
Database structure
Core database engine
Virtuoso provides an extended object–relational model, which combines the flexibility of relational access with inheritance, run-time data typing, late binding, and identity-based access. The Virtuoso Universal Server database includes physical file and in-memory storage, along with operating system processes that interact with that storage. There is one main process, which has listeners on a specified port for HTTP, SOAP, and other protocols.
Architecture
Virtuoso is designed to take advantage of operating system threading support and multiple CPUs. It consists of a single process with an adjustable pool of threads shared between clients. Multiple threads may work on a single index tree with minimal interference with each other. One cache of database pages is shared among all threads and old dirty pages are written back to disk as a background process.
The database has at all times a clean checkpoint state and a delta of committed or uncommitted changes to this checkpointed state. This makes it possible to do a clean backup of the checkpoint state while transactions proceed on the commit state.
A transaction log file records all transactions since the last checkpoint. Transaction log files may be preserved and archived for an indefinite time, providing a full, recoverable history of the database.
A single set of files is used for storing all tables. A separate set of files is used for all temporary data. The maximum size of a file set is 32 terabytes, for 4G × 8K pages.
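The stated limit is simply the page count multiplied by the page size; a one-line check (assuming binary units, i.e. 4G = 2^32 pages of 8 KiB each):

    # 4G pages of 8 KiB each: 2**32 * 8 * 2**10 bytes = 2**45 bytes = 32 binary terabytes.
    print((2**32 * 8 * 2**10) == 32 * 2**40)  # True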
Locking
Virtuoso provides dynamic locking, starting with row level locks and escalating to page level locks when a cursor holds a large percentage of a page's rows or when it has a history of locking entire pages. Lock escalation only happens when no other transactions hold locks on the same page, hence it never deadlocks. Virtuoso SQL provides means for exclusive read and for setting transaction isolation.
Transactions
All four levels of isolation are supported: dirty read, read committed, repeatable read, and serializable. The level of isolation may be specified operation by operation within a single transaction. Virtuoso can also act as a resource manager and/or transaction coordinator under Microsoft's Distributed Transaction Coordinator (MS DTC) or the XA standard.
Data integrity
The Virtuoso ORDBMS database supports entity integrity and referential integrity, ensuring that relationships between records in related tables remain valid. Integrity constraints include the following (a short sketch combining all four appears after this list):
NOT NULL – Within the definition of a table, Virtuoso allows data to contain a NULL value. This NULL value is not really a value at all and is considered an absence of value. The constraint of NOT NULL forces a value to be given to a column.
Unique key – Uniqueness for a column or set of columns means that the values in that column or set of columns must be different in every row of the table. A unique key may contain NULL values, since NULL is by definition the absence of a value rather than a value that can be duplicated.
Primary key – Primary keys are much like unique keys, except that they are designed to uniquely identify a row in a table. They can consist of a single column or multiple columns. A primary key cannot contain a NULL value.
CHECK constraint – Virtuoso provides column-level integrity constraints that require certain conditions to be met before data is inserted or modified. If the checks are not satisfied, the transaction cannot be completed.
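A minimal sketch of a table that exercises all four constraint kinds, submitted over ODBC from Python with pyodbc. The DSN name and the dba/dba credentials are assumptions about a locally configured Virtuoso instance, and the exact DDL accepted may vary by version:

    import pyodbc

    # Connect through an ODBC data source pointing at a local Virtuoso server (DSN and credentials assumed).
    cnxn = pyodbc.connect("DSN=LocalVirtuoso;UID=dba;PWD=dba", autocommit=True)
    cur = cnxn.cursor()

    # One table combining the NOT NULL, unique key, primary key and CHECK constraints described above.
    cur.execute("""
        CREATE TABLE demo_orders (
            order_id   INTEGER     NOT NULL PRIMARY KEY,  -- primary key: unique, non-NULL row identifier
            order_ref  VARCHAR(20) UNIQUE,                -- unique key: non-NULL values must not repeat
            customer   VARCHAR(80) NOT NULL,              -- NOT NULL: a value must always be supplied
            quantity   INTEGER     CHECK (quantity > 0)   -- CHECK: condition enforced on insert/update
        )
    """)
    cnxn.close()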
Data dictionary
Virtuoso stores all its information about all user objects in the database in the system catalog tables designated by db.dba*.
Components and files
Components
Virtuoso is made up of client and server components that typically communicate with a local or remote Virtuoso server. These include:
Virtuoso Drivers for ODBC, JDBC, ADO.NET and OLE DB
Conductor, a web-based database administration user interface
ISQL (Interactive SQL) and ISQO Utilities
Documentation and Tutorials
Samples
Installations come with two databases: a default and a demo database.
History
The Virtuoso project was born in 1998 from a merger of the OpenLink data access middleware and Kubl RDBMS.
Kubl RDBMS
The Kubl ORDBMS was one of several relational database systems with roots in Finland. This list also includes MySQL, InnoDB, and Solid RDBMS/Solid Technologies.
As is the case with most technology products, key personnel behind OpenLink Virtuoso, InnoDB, and Solid share periods of professional overlap that provide noteworthy insight into the history of database technology development in Finland. Heikki Tuuri (creator of InnoDB), Ora Lassila (W3C and Nokia Research, a technology lead and visionary in the areas of RDF and the Semantic Web alongside Tim Berners-Lee), and Orri Erling (Virtuoso Program Manager at OpenLink Software) all worked together at a startup company called Entity Systems in Finland, where they developed Common Lisp and Prolog development environments for the early generation of PCs, circa 1986–88.
Later, Orri Erling worked with VIA International, the developer of VIA/DRE in designing a LISP-based object-oriented data access layer atop the company's DBMS product. The core development team of VIA, following the company's demise in 1992, went on to found Solid Technologies under the direction of Artturi Tarjanne.
Heikki Tuuri worked at Solid for a while before starting his own database development project which became InnoDB (acquired by Oracle in 2005).
Orri Erling started his own DBMS development work in 1994, which was to become Kubl. Development of Kubl was initially financed by Infosto Group, publisher of Finland's largest free ads paper, as part of their in-house software development project for their on-line services. The on-line version of Keltainen Pörssi was at one time said to be Finland's most popular web site with 500,000 registered users. The Kubl database was prominently displayed in a "Powered by Kubl" logo on the search results.
A free trial version of Kubl was made available for download on November 7, 1996.
Kubl was marketed as a high performance lightweight database for embedded use; the development aim was to achieve top scores in Transactions Per Second tests. Pricing of the product was especially favorable to Linux users with a Linux license priced at $199.
Kubl became the cornerstone of OpenLink Virtuoso, after the technology paths of Kingsley Uyi Idehen and Orri Erling crossed in 1998, leading to the acquisition of Kubl by OpenLink Software.
Functionality realms
Virtuoso's functionality covers a broad range of traditionally distinct realms in a single product offering. These functional realms include:
Object–relational database engine for SQL, XML, RDF, and plain text
Web services computing platform
Web application server
Web content management system (WCMS)
NNTP-based discussion management
Replication of homogeneous and heterogeneous data
Mail Storage Sink and (POP3) service proxy
DataPortability
Protocols implemented
Virtuoso supports a broad range of industry standard Web & Internet protocols that includes:
HTTP, WebDAV, CalDAV, CardDAV, SOAP, UDDI, WSDL, WS-Policy, WS-Security, WS-ReliableMessaging, WS-Routing, WS-Referral, WS-Attachment, WS-BPEL, SyncML, GData, SPARQL, SPARUL, NNTP
API support
For the database application developer and systems integrator, Virtuoso implements a variety of industry standard data access APIs (client and server) that includes: ODBC, JDBC, OLE DB, ADO.NET, ADO.NET Entity Framework, XMLA.
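As an illustration of the data access side, a short pyodbc session against the bundled demo database; the DSN, credentials, and the Demo.demo.Customers table name are assumptions about a default local installation:

    import pyodbc

    cnxn = pyodbc.connect("DSN=LocalVirtuoso;UID=dba;PWD=dba")
    cur = cnxn.cursor()

    # Parameterized query against the sample Customers table assumed to ship with the demo database.
    cur.execute("SELECT CustomerID, CompanyName FROM Demo.demo.Customers WHERE Country = ?", "Germany")
    for customer_id, company in cur.fetchall():
        print(customer_id, company)

    cnxn.close()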
Content syndication and interchange format support
For the Web application developer and content syndicate(s) publishers, and consumers, Virtuoso supports standards such as: Atom, RSS 2.0, RSS 1.0, OPML, XBEL, FOAF, SIOC.
Query language support
SQL, SPARQL (with numerous extensions), XQuery (implementation of Core functions library is seriously incomplete), XPath (1.0 only), XSLT (1.0 only)
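On the SPARQL side, Virtuoso typically exposes an HTTP SPARQL endpoint that can be queried with the Python SPARQLWrapper library; the endpoint URL below (default HTTP port 8890 on a local install) is an assumption and should be adjusted to the actual installation:

    from SPARQLWrapper import SPARQLWrapper, JSON

    # Query a local Virtuoso SPARQL endpoint (URL assumed; adjust host and port as needed).
    endpoint = SPARQLWrapper("http://localhost:8890/sparql")
    endpoint.setQuery("SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 5")
    endpoint.setReturnFormat(JSON)

    results = endpoint.query().convert()
    for binding in results["results"]["bindings"]:
        print(binding["s"]["value"], binding["p"]["value"], binding["o"]["value"])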
Schema definition language support
SQL's Data Definition Language, XML Schema
Usage scenarios
Virtuoso is a solution for the following system integration challenges:
Enterprise Information Integration (EII)
Programming Language Independent Web application deployment
Monolithic application decomposition that leverages the principles of service-oriented architecture
Web service based enterprise application integration via a significant amount of WS-* protocols support
Business process management via BPEL
Semantic Web Data Spaces Generation
Deployment platform for injecting RDF-based Linked Data into the Semantic Data Web
Related technology areas
Data management
Relational database management system
List of relational database management systems
Comparison of object–relational database management systems
Comparison of relational database management systems
Enterprise application, information, and data integration
Web 2.0
Enterprise service bus
Service-oriented architecture
Enterprise application integration
Data integration
Open Semantic Framework
Web service
Semantic Web
Business Integration Servers Comparison Matrix
Related products and tools
In addition to Virtuoso, OpenLink Software produces several related tools and applications:
OpenLink Data Spaces – a Virtuoso-based platform for cost-effective creation and management of a Semantic Web / Linked Data Web presence. It provides a data junction box for integrating data across third-party social networking, blog, file sharing, shared and social bookmarking, wiki, e-mail, photo sharing, RSS 2.0, Atom, and RSS 1.1 content aggregation services. In addition to its third-party integration functionality, it also includes its own rich collection of Linked Data-compliant distributed collaborative applications across each of the aforementioned Web application realms.
Universal Data Access Drivers – High-performance data access drivers for ODBC, JDBC, ADO.NET, and OLE DB that provide transparent access to enterprise databases across multiple platforms and databases.
Platforms
Virtuoso is supported on a number of 32- and 64-bit platforms, including Windows, UNIX (HP, AIX, Sun, DEC, BSD, SCO), Linux (Red Hat, SUSE), and macOS.
Licensing
In April 2006, a free software version of Virtuoso was made available under the GNU General Public License version 2.
The software is now available in Commercial and Open Source license variants.
References
External links
Atom (Web standard)
Big data products
Client-server database management systems
Column-oriented DBMS software for Linux
Cross-platform free software
Cross-platform software
Database engines
Distributed computing architecture
Document-oriented databases
Enterprise application integration
Free database management systems
Free file sharing software
Free software programmed in C
Free web server software
FTP server software
Message-oriented middleware
Metadata
Middleware
NewSQL
NoSQL
Online databases
ORDBMS software for Linux
MacOS database-related software
Products introduced in 1998
Relational database management systems
RSS
Semantic Web
SQL data access
Structured storage
Triplestores
Unix Internet software
Unix network-related software
Web services
Windows database-related software
XML software
XSLT processors |
18037258 | https://en.wikipedia.org/wiki/Test%20Drive%20%281987%20video%20game%29 | Test Drive (1987 video game) | Test Drive is a racing video game developed by Distinctive Software and published by Accolade, released in 1987 for the Amiga, Atari ST, Commodore 64, and DOS, in 1988 for the Apple II, and later ported for the PC-98 in 1989. It is the first game in the Test Drive video game series.
Gameplay
The player chooses one of five supercars (Lamborghini Countach, Lotus Esprit Turbo, Chevrolet Corvette C4, Porsche 911 Turbo (930), or Ferrari Testarossa) to drive on a winding cliffside two-lane road while avoiding traffic and outrunning police speed traps. The course's five stages are separated by gas station pit stops.
Release
In 1987, Accolade published Test Drive as a computer game worldwide, and Electronic Arts imported it to the United Kingdom. The Amiga, Atari ST, Commodore 64, and DOS ports differed from one another in quality. The Amiga version's detailed visuals and audio realistically depicted the game's racing theme, while its Atari ST counterpart used simplified graphics and sound effects. The Commodore 64 and DOS ports were of similar quality to the Amiga version. The gameplay was kept intact across all platforms.
Reception and legacy
Test Drive was a commercial hit. In late 1989, Video Games & Computer Entertainment reported that the game's sales had surpassed 400,000 units and were well on their way to the half-million mark.
It received generally positive reviews from video game critics. Computer Gaming World stated in 1987 that Test Drive "offers outstanding graphics and the potential to 'hook' every Pole Position fan". Compute! praised the excellent graphics and sound, but noted that the game only had one course. The game was reviewed in 1988 in Dragon #132 by Hartley, Patricia, and Kirk Lesser in "The Role of Computers" column. The reviewers gave the game 4 out of 5 stars. David M. Wilson reviewed the game for Computer Gaming World, and stated that "there may be more competitive racing games on the market, but this game combines the enjoyment of driving five of the most exotic sportscars in the world with outrunning "Smokies" on mountain highways. What more could a race car junkie (or arcade fan) ask for?!"
Test Drive spawned several sequels and spin-offs. Distinctive Software developed its 1989 sequel, The Duel: Test Drive II, using several software libraries. Distinctive (as Unlimited Software, Inc.) used the aforementioned software libraries for a MS-DOS port of Outrun, resulting in the Accolade v. Distinctive lawsuit. Distinctive Software won, so the rights to make the Test Drive games without the source code transferred to Accolade. The court also found that Accolade had failed to demonstrate that the balance of hardships was in its favor. Another sequel, Test Drive III: The Passion, was developed and published by Accolade in 1990.
In 1997, Accolade distributed Test Drive: Off-Road, an off-road truck racing spinoff, and Test Drive 4, the first video game developed by Pitbull Syndicate. In 1998, Pitbull Syndicate developed two further Test Drive titles, Test Drive 4X4 (also known as Test Drive Off-Road 2), a sequel to the Test Drive: Off-Road spinoff, and Test Drive 5; both games were the two last entries in the series to be published by Accolade. In April 1999, Accolade was acquired by French video game company Infogrames Entertainment for a combined sum of , of which in cash and in growth capital, and was renamed Infogrames North America, Inc. The company chief executive officer, Jim Barnett, was named head of Infogrames Entertainment's American distribution subsidiary. As a result, Test Drive 6 was the first game in the series to be published by Infogrames in 1999. TD Overdrive: The Brotherhood of Speed (also known as Test Drive) was the last entry in the series to be developed by Pitbull Syndicate, and as a result, the next game in the series, Test Drive: Eve of Destruction, was developed by Monster Games in 2003.
References
External links
1987 video games
Accolade (company) games
Amiga games
Apple II games
Atari ST games
Commodore 64 games
DOS games
NEC PC-9801 games
Video games developed in Canada |
58907592 | https://en.wikipedia.org/wiki/Thomas%20Studer | Thomas Studer | Thomas Studer, born April 4, 1972, is a professor at the Computer Science Institute at the University of Bern. He is a specialist in logic and theoretical computer science.
He has a degree in mathematics, computer science, and philosophy from the University of Bern; he earned his PhD in 2011. He was the senior software engineer at Crosspoint Informatik before joining the faculty at the university.
He is elected presidium member of the Platform Mathematics, Astronomy and Physics of the Swiss Academy of Science.
Since 2014, he has been president of the Swiss Society for Logic and Philosophy of Science.
Bibliography
Relationale Datenbanken - Von den theoretischen Grundlagen zu Anwendungen mit PostgreSQL (2016, Springer Vieweg)
Kahle, Reinhard, Strahm, Thomas, Studer, Thomas (eds.): Advances in Proof Theory (2016, Birkhäuser)
Guram Bezhanishvili, Giovanna D'Agostino, George Metcalfe and Thomas Studer (eds.): Advances in Modal Logic - Volume 12 (2018, College Publications)
References
Living people
Computer science writers
1972 births |
31607666 | https://en.wikipedia.org/wiki/2011%20PlayStation%20Network%20outage | 2011 PlayStation Network outage | The 2011 PlayStation Network outage (sometimes referred to as the PSN Hack) was the result of an "external intrusion" on Sony's PlayStation Network and Qriocity services, in which personal details from approximately 77 million accounts were compromised and prevented users of PlayStation 3 and PlayStation Portable consoles from accessing the service. The attack occurred between April 17 and April 19, 2011, forcing Sony to turn off the PlayStation Network on April 20. On May 4, Sony confirmed that personally identifiable information from each of the 77 million accounts had been exposed. The outage lasted 23 days.
At the time of the outage, with a count of 77 million registered PlayStation Network accounts, it was one of the largest data security breaches in history. It surpassed the 2007 TJX hack which affected 45 million customers. Government officials in various countries voiced concern over the theft and Sony's one-week delay before warning its users.
Sony stated on April 26 that it was attempting to get online services running "within a week." On May 14, Sony released PlayStation 3 firmware version 3.61 as a security patch. The firmware required users to change their account's password upon signing in. At the time the firmware was released, the network was still offline. Regional restoration was announced by Kazuo Hirai in a video from Sony. A map of regional restoration and the network within the United States was shared as the service was coming back online.
Timeline of the outage
On April 20, 2011, Sony acknowledged on the official PlayStation Blog that it was "aware certain functions of the PlayStation Network" were down. Upon attempting to sign in via the PlayStation 3, users received a message indicating that the network was "undergoing maintenance". The following day, Sony asked its customers for patience while the cause of outage was investigated and stated that it may take "a full day or two" to get the service fully functional again.
The company later announced an "external intrusion" had affected the PlayStation Network and Qriocity services. This intrusion occurred between April 17 and April 19. On April 20, Sony suspended all PlayStation Network and Qriocity services worldwide. Sony expressed their regrets for the downtime and called the task of repairing the system "time-consuming" but would lead to a stronger network infrastructure and additional security. On April 25, Sony spokesman Patrick Seybold reiterated on the PlayStation Blog that fixing and enhancing the network was a "time intensive" process with no estimated time of completion. However, the next day Sony stated that there was a "clear path to have PlayStation Network and Qriocity systems back online", with some services expected to be restored within a week. Furthermore, Sony acknowledged the "compromise of personal information as a result of an illegal intrusion on our systems."
On May 1 Sony announced a "Welcome Back" program for customers affected by the outage. The company also confirmed that some PSN and Qriocity services would be available during the first week of May. The list of services expected to become available included:
On May 2 Sony issued a press release, according to which the Sony Online Entertainment (SOE) services had been taken offline for maintenance due to potentially related activities during the initial criminal hack. Over 12,000 credit card numbers, albeit in encrypted form, from non-U.S. cardholders and additional information from 24.7 million SOE accounts may have been accessed.
During the week, Sony sent a letter to the US House of Representatives, answering questions and concerns about the event. In the letter Sony announced that they would be providing Identity Theft insurance policies in the amount of US$1 million per user of the PlayStation Network and Qriocity services, despite no reports of credit card fraud being indicated. This was later confirmed on the PlayStation Blog, where it was announced that the service, AllClear ID Plus powered by Debix, would be available to users in the United States free for 12 months, and would include Internet surveillance, complete identity repair in the event of theft and a $1 million identity theft insurance policy for each user.
On May 6 Sony stated they had begun "final stages of internal testing" for the PlayStation Network, which had been rebuilt. However, the following day Sony reported that they would not be able to bring services back online within the one-week timeframe given on May 1, because "the extent of the attack on Sony Online Entertainment servers" had not been known at the time. SOE confirmed on their Twitter account that their games would not be available until some time after the weekend.
Reuters began reporting the event as "the biggest Internet security break-in ever". A Sony spokesperson said:
Sony had removed the personal details of 2,500 people stolen by hackers and posted on a website
The data included names and some addresses, which were in a database created in 2001
No date had been fixed for the restart
On May 14 various services began coming back online on a country-by-country basis, starting with North America. These services included: sign-in for PSN and Qriocity services (including password resetting), online game-play on PS3 and PSP, playback of rental video content, Music Unlimited service (PS3 and PC), access to third party services (such as Netflix, Hulu, Vudu and MLB.tv), friends list, chat functionality and PlayStation Home. The actions came with a firmware update for the PS3, version 3.61. As of May 15 service in Japan and East Asia had not yet been approved.
On May 18 SOE shut down the password reset page on their site following the discovery of another exploit that allowed users to reset other users' passwords, using the other user's email address and date of birth. Sign-in using PSN details to various other Sony websites was also disabled, but console sign-ins were not affected.
On May 23 Sony stated that the outage costs were $171 million.
Sony response
US House of Representatives
Sony reported on May 4 to the PlayStation Blog that:
Sony relayed via the letter that:
Explanation of delays
On April 26, 2011 Sony explained on the PlayStation Blog why it took so long to inform PSN users of the data theft:
Sony investigation
Possible data theft led Sony to provide an update in regards to a criminal investigation in a blog posted on April 27: "We are currently working with law enforcement on this matter as well as a recognized technology security firm to conduct a complete investigation. This malicious attack against our system and against our customers is a criminal act and we are proceeding aggressively to find those responsible."
On May 3 Sony Computer Entertainment CEO Kazuo Hirai reiterated this and said the "external intrusion" which had caused them to shut down the PlayStation Network constituted a "criminal cyber attack". Hirai expanded further, claiming that Sony systems had been under attack prior to the outage "for the past month and half", suggesting a concerted attempt to target Sony.
On May 4, Sony announced that it was adding Data Forte to the investigation team of Guidance Software and Protiviti in analysing the attacks. Legal aspects of the case were handled by Baker & McKenzie. Sony stated its belief that Anonymous, a decentralized and loosely affiliated group of hackers and activists, may have performed the attack. No members of Anonymous claimed any involvement.
Upon learning that a breach had occurred, Sony launched an internal investigation. Sony reported, in its letter to the United States Congress:
Inability to use PlayStation 3 content
While most games remained playable in their offline modes, the PlayStation 3 was unable to play certain Capcom titles in any form. Streaming video providers throughout different regions such as Hulu, Vudu, Netflix and LoveFilm displayed the same maintenance message. Some users claimed to be able to use Netflix's streaming service but others were unable.
Criticism of Sony
Delayed warning of possible data theft
On April 26 nearly a week after the outage, Sony confirmed that it "cannot rule out the possibility" that personally identifiable information such as PlayStation Network account username, password, home address, and email address had been compromised. Sony also mentioned the possibility that credit card data was taken—after claiming that encryption had been placed on the databases, which would partially satisfy PCI Compliance for storing credit card information on a server.
Subsequent to the announcement on both the official blog and by e-mail, users were asked to safeguard credit card transactions by checking bank statements. This warning came nearly a week after the initial "external intrusion" and while the Network was turned off.
Some disputed this explanation, arguing that if Sony deemed the situation severe enough to turn off the network, it should have warned users of possible data theft sooner than April 26. Concerns were also raised over violations of PCI compliance and the failure to notify users immediately. US Senator Richard Blumenthal wrote to Sony Computer Entertainment America CEO Jack Tretton questioning the delay.
Sony replied in a letter to the subcommittee:
Unencrypted personal details
Credit card data was encrypted, but Sony admitted that other user information was not encrypted at the time of the intrusion. The Daily Telegraph reported that "If the provider stores passwords unencrypted, then it's very easy for somebody else – not just an external attacker, but members of staff or contractors working on Sony's site – to get access and discover those passwords, potentially using them for nefarious means."
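The practice critics were pointing to is the difference between storing passwords in a recoverable form and storing only a salted one-way hash, so that a database compromise does not directly reveal the passwords themselves. A generic sketch of the latter approach in Python (an illustration of the technique, not a description of Sony's actual storage scheme):

    import hashlib
    import hmac
    import os

    ITERATIONS = 100_000  # illustrative work factor

    def hash_password(password: str) -> tuple[bytes, bytes]:
        """Return (salt, derived_key); only these values would be stored, never the password itself."""
        salt = os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
        return salt, key

    def verify_password(password: str, salt: bytes, key: bytes) -> bool:
        """Re-derive the key from the supplied password and compare in constant time."""
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
        return hmac.compare_digest(candidate, key)

    salt, key = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, key))  # True
    print(verify_password("wrong guess", salt, key))                   # False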
On May 2, Sony clarified the "unencrypted" status of users' passwords, stating that:
British Information Commissioners Office
Following a formal investigation of Sony for breaches of the UK's Data Protection Act 1998, the Information Commissioners' Office issued a statement highly critical of the security Sony had in place:
Sony was fined £250,000 ($395k) for security measures so poor they did not comply with British law.
Sony Online Entertainment outage
On May 3 Sony stated in a press release that there may be a correlation between the attack that had occurred on April 16 towards the PlayStation Network and one that compromised Sony Online Entertainment on May 2. This portion of the attack resulted in the theft of information on 24.6 million Sony Online Entertainment account holders. The database contained 12,700 credit card numbers, particularly those of non-U.S. residents, and had not been in use since 2007 as much of the data applied to expired cards and deleted accounts. Sony updated this information the following day by stating that only 900 cards on the database were still valid. The attack resulted in the suspension of SOE servers and Facebook games. SOE granted 30 days of free time, plus one day for each day the server was down, to users of Clone Wars Adventures, DC Universe Online, EverQuest, EverQuest II, EverQuest Online Adventures, Free Realms, Pirates of the Burning Sea, PlanetSide, Poxnora, Star Wars Galaxies and Vanguard: Saga of Heroes, as well as other forms of compensation for all other Sony Online games.
Security experts Eugene Lapidous of AnchorFree, Chester Wisniewski of Sophos Canada and Avner Levin of Ryerson University criticized Sony, questioning its methods of securing user data. Lapidous called the breach "difficult to excuse" and Wisniewski called it "an act of hubris or simply gross incompetence".
Reaction
Compensation to users
Sony hosted special events after the PlayStation Network returned to service. Sony stated that they had plans for PS3 versions of DC Universe Online and Free Realms to help alleviate some of their losses. In a press conference in Tokyo on May 1, Sony announced a "Welcome Back" program. As well as "selected PlayStation entertainment content" the program promised to include 30 days free membership of PlayStation Plus for all PSN members, while existing PlayStation Plus members received an additional 30 days on their subscription. Qriocity subscribers received 30 days. Sony promised other content and services over the coming weeks. Sony offered one year free identity theft protection to all users with details forthcoming.
Hulu compensated PlayStation 3 users for the inability to use their service during the outage by offering one week of free service to Hulu Plus members.
On May 16, 2011, Sony announced that two PlayStation 3 games and two PSP games would be offered for free from lists of five and four, respectively. The games available varied by region and were only available in countries which had access to the PlayStation Store prior to the outage. On May 27, 2011, Sony announced the "welcome back" package for Japan and the Asia region (Hong Kong, Singapore, Malaysia, Thailand and Indonesia). In the Asia region, a theme - Dokodemo Issyo Spring Theme - was offered for free in addition to the games available in the "welcome back" package.
Five PSP games were offered in the Japanese market.
The version of Killzone: Liberation offered did not include online gameplay functionality.
Government reaction
The data theft concerned authorities around the world. Graham Cluley, senior technology consultant at Sophos, said the breach "certainly ranks as one of the biggest data losses ever to affect individuals".
The British Information Commissioner's Office stated that Sony would be questioned, and that an investigation would take place to discover whether Sony had taken adequate precautions to protect customer details. Under the UK's Data Protection Act, Sony was fined £250,000 for the breach.
Privacy Commissioner of Canada Jennifer Stoddart confirmed that the Canadian authorities would investigate. The Commissioner's office conveyed their concern as to why the authorities in Canada weren't informed of a security breach earlier.
US Senator Richard Blumenthal of Connecticut demanded answers from Sony about the data breach by emailing SCEA CEO Jack Tretton arguing about the delay in informing its customers and insisting that Sony do more for its customers than just offer free credit reporting services. Blumenthal later called for an investigation by the US Department of Justice to find the person or persons responsible and to determine if Sony was liable for the way that it handled the situation.
Congresswoman Mary Bono Mack and Congressman G. K. Butterfield sent a letter to Sony, demanding information on when the breach was discovered and how the crisis would be handled.
Sony had been asked to testify before a congressional hearing on security and to answer questions about the breach of security on May 2, but sent a written response instead.
Legal action against Sony
A lawsuit was filed on April 27 by Kristopher Johns from Birmingham, Alabama, on behalf of all PlayStation users, alleging Sony "failed to encrypt data and establish adequate firewalls to handle a server intrusion contingency, failed to provide prompt and adequate warnings of security breaches, and unreasonably delayed in bringing the PSN service back online." According to the complaint, Sony failed to notify members of a possible security breach, and its storage of members' credit card information violated PCI compliance, the digital security standard for the payment card industry.
A Canadian lawsuit against Sony USA, Sony Canada and Sony Japan claimed damages up to C$1 billion including free credit monitoring and identity theft insurance. The plaintiff was quoted as saying, "If you can't trust a huge multi-national corporation like Sony to protect your private information, who can you trust? It appears to me that Sony focuses more on protecting its games than its PlayStation users".
In October 2012 a California judge dismissed a lawsuit against Sony over the PSN security breach, ruling that Sony had not violated California's consumer-protection laws, citing "there is no such thing as perfect security".
In 2013, the United Kingdom's Information Commissioner's Office imposed a £250,000 penalty on Sony for putting a large amount of PSN customers' personal and financial data at risk.
Credit card fraud
At the time, there were no verifiable reports of credit card fraud related to the outage. There were reports on the Internet that some PlayStation users had experienced credit card fraud; however, they were yet to be linked to the incident. Users who registered a credit card for use only with Sony also reported credit card fraud. Sony said that the CSC codes requested by their services were not stored, but hackers may have been able to decrypt or record credit card details while inside Sony's network.
Sony stated in their letter to the subcommittee:
On May 5, a letter from Sony Corporation of America CEO and President Sir Howard Stringer emphasized that there had been no evidence of credit card fraud and that a $1 million identity theft insurance policy would be available to PSN and Qriocity users:
Change to terms and conditions
It has been suggested that a change to the PSN terms and conditions announced on September 15, 2011, was motivated by the large damages being claimed by class action suits against Sony, in an effort to minimise the company's losses. The new agreement required users to agree to give up their right (to join together as a group in a class action) to sue Sony over any future security breach, without first trying to resolve legal issues with an arbitrator. This included any ongoing class action suits initiated prior to August 20, 2011.
Another clause, which removed a user's right to trial by jury should the user opt out of the clause (by sending a letter to Sony), says:
Sony guaranteed that a court of law in the respective country, in this case the US, would hold jurisdiction in regards to any rules or changes in the Sony PSN ToS:
References
2011 crimes
PlayStation
Network
PlayStation Network
Sony Interactive Entertainment
2010s internet outages
14766445 | https://en.wikipedia.org/wiki/Institute%20for%20Applied%20Information%20Processing%20and%20Communications | Institute for Applied Information Processing and Communications | The Institute for Applied Information Processing and Communications (IAIK) is part of the Faculty of Computer Science and Biomedical Engineering at the Graz University of Technology (TU Graz). IAIK is concerned with aspects of computer security and information security. Current focal points are set on design of new cryptographic algorithms, implementation of cryptographic algorithms and protocols in hardware as well as in software, network security, e-Government, and trusted computing.
IAIK conducts applied research into these areas, fostering a holistic view of the aspects of computer and information security. Teaching activities closely follow the latest developments in IAIK’s research fields. The activities of IAIK are led by Stefan Mangard after Reinhard Posch abdicated in 2019.
IAIK is specialized on following specific aspects of computer and information security:
VLSI design and security
Implementation attacks
RFID hardware and security
Software security
E-Government
Trusted computing
Design and analysis of hash and block cipher primitives
Network security
Formal methods in verification and design
References
External links
IAIK
IAIK at the Faculty of Computer Science of TU Graz
Research institutes in Austria
Graz University of Technology |
239450 | https://en.wikipedia.org/wiki/Strategic%20management | Strategic management | In the field of management, strategic management involves the formulation and implementation of the major goals and initiatives taken by an organization's managers on behalf of stakeholders, based on consideration of resources and an assessment of the internal and external environments in which the organization operates. Strategic management provides overall direction to an enterprise and involves specifying the organization's objectives, developing policies and plans to achieve those objectives, and then allocating resources to implement the plans. Academics and practicing managers have developed numerous models and frameworks to assist in strategic decision-making in the context of complex environments and competitive dynamics. Strategic management is not static in nature; the models often include a feedback loop to monitor execution and to inform the next round of planning.
Michael Porter identifies three principles underlying strategy:
creating a "unique and valuable [market] position"
making trade-offs by choosing "what not to do"
creating "fit" by aligning company activities with one another to support the chosen strategy
Corporate strategy involves answering a key question from a portfolio perspective: "What business should we be in?" Business strategy involves answering the question: "How shall we compete in this business?"
Management theory and practice often make a distinction between strategic management and operational management, with operational management concerned primarily with improving efficiency and controlling costs within the boundaries set by the organization's strategy.
Application
Strategy is defined as "the determination of the basic long-term goals of an enterprise, and the adoption of courses of action and the allocation of resources necessary for carrying out these goals." Strategies are established to set direction, focus effort, define or clarify the organization, and provide consistency or guidance in response to the environment.
Strategic management involves the related concepts of strategic planning and strategic thinking. Strategic planning is analytical in nature and refers to formalized procedures to produce the data and analyses used as inputs for strategic thinking, which synthesizes the data resulting in the strategy. Strategic planning may also refer to control mechanisms used to implement the strategy once it is determined. In other words, strategic planning happens around the strategic thinking or strategy making activity.
Strategic management is often described as involving two major processes: formulation and implementation of strategy. While described sequentially below, in practice the two processes are iterative and each provides input for the other.
Formulation
Formulation of strategy involves analyzing the environment in which the organization operates, then making a series of strategic decisions about how the organization will compete. Formulation ends with a series of goals or objectives and measures for the organization to pursue.
Environmental analysis includes the:
Remote external environment, including the political, economic, social, technological, legal and environmental landscape (PESTLE);
Industry environment, such as the competitive behavior of rival organizations, the bargaining power of buyers/customers and suppliers, threats from new entrants to the industry, and the ability of buyers to substitute products (Porter's 5 forces); and
Internal environment, regarding the strengths and weaknesses of the organization's resources (i.e., its people, processes and IT systems).
Strategic decisions are based on insight from the environmental assessment and are responses to strategic questions about how the organization will compete, such as:
What is the organization's business?
Who is the target customer for the organization's products and services?
Where are the customers and how do they buy? What is considered "value" to the customer?
Which businesses, products and services should be included or excluded from the portfolio of offerings?
What is the geographic scope of the business?
What differentiates the company from its competitors in the eyes of customers and other stakeholders?
Which skills and capabilities should be developed within the firm?
What are the important opportunities and risks for the organization?
How can the firm grow, through both its base business and new business?
How can the firm generate more value for investors?
The answers to these and many other strategic questions result in the organization's strategy and a series of specific short-term and long-term goals or objectives and related measures.
Implementation
The second major process of strategic management is implementation, which involves decisions regarding how the organization's resources (i.e., people, process and IT systems) will be aligned and mobilized towards the objectives. Implementation results in how the organization's resources are structured (such as by product or service or geography), leadership arrangements, communication, incentives, and monitoring mechanisms to track progress towards objectives, among others.
Running the day-to-day operations of the business is often referred to as "operations management" or specific terms for key departments or functions, such as "logistics management" or "marketing management," which take over once strategic management decisions are implemented.
Definitions
In 1988, Henry Mintzberg described the many different definitions and perspectives on strategy reflected in both academic research and in practice. He examined the strategic process and concluded it was much more fluid and unpredictable than people had thought. Because of this, he could not point to one process that could be called strategic planning. Instead Mintzberg concludes that there are five types of strategies:
Strategy as plan – a directed course of action to achieve an intended set of goals; similar to the strategic planning concept;
Strategy as pattern – a consistent pattern of past behavior, with a strategy realized over time rather than planned or intended. Where the realized pattern was different from the intent, he referred to the strategy as emergent;
Strategy as position – locating brands, products, or companies within the market, based on the conceptual framework of consumers or other stakeholders; a strategy determined primarily by factors outside the firm;
Strategy as ploy – a specific maneuver intended to outwit a competitor; and
Strategy as perspective – executing strategy based on a "theory of the business" or natural extension of the mindset or ideological perspective of the organization.
In 1998, Mintzberg developed these five types of management strategy into 10 "schools of thought" and grouped them into three categories. The first group is normative. It consists of the schools of informal design and conception, formal planning, and analytical positioning. The second group, consisting of six schools, is more concerned with how strategic management is actually done, rather than prescribing optimal plans or positions. The six schools are entrepreneurial/visionary, cognitive, learning/adaptive/emergent, negotiation, corporate culture, and business environment. The third and final group consists of one school, the configuration or transformation school, a hybrid of the other schools organized into stages, organizational life cycles, or "episodes".
Michael Porter defined strategy in 1980 as the "...broad formula for how a business is going to compete, what its goals should be, and what policies will be needed to carry out those goals" and the "...combination of the ends (goals) for which the firm is striving and the means (policies) by which it is seeking to get there." He continued that: "The essence of formulating competitive strategy is relating a company to its environment."
Some complexity theorists define strategy as the unfolding of the internal and external aspects of the organization that results in actions in a socio-economic context.
Historical development
Origins
The strategic management discipline originated in the 1950s and 1960s. Among the numerous early contributors, the most influential were Peter Drucker, Philip Selznick, Alfred Chandler, Igor Ansoff, and Bruce Henderson. The discipline draws from earlier thinking and texts on 'strategy' dating back thousands of years. Prior to 1960, the term "strategy" was primarily used regarding war and politics, not business. Many companies built strategic planning functions to develop and execute the formulation and implementation processes during the 1960s.
Peter Drucker was a prolific management theorist and author of dozens of management books, with a career spanning five decades. He addressed fundamental strategic questions in his 1954 book The Practice of Management, writing: "... the first responsibility of top management is to ask the question 'what is our business?' and to make sure it is carefully studied and correctly answered." He wrote that the answer was determined by the customer. He recommended eight areas where objectives should be set: market standing, innovation, productivity, physical and financial resources, worker performance and attitude, profitability, manager performance and development, and public responsibility.
In 1957, Philip Selznick initially used the term "distinctive competence" in referring to how the Navy was attempting to differentiate itself from the other services. He also formalized the idea of matching the organization's internal factors with external environmental circumstances. This core idea was developed further by Kenneth R. Andrews in 1963 into what we now call SWOT analysis, in which the strengths and weaknesses of the firm are assessed in light of the opportunities and threats in the business environment.
Alfred Chandler recognized the importance of coordinating management activity under an all-encompassing strategy. Interactions between functions were typically handled by managers who relayed information back and forth between departments. Chandler stressed the importance of taking a long-term perspective when looking to the future. In his groundbreaking 1962 work Strategy and Structure, Chandler showed that a long-term coordinated strategy was necessary to give a company structure, direction and focus. He stated it concisely: "structure follows strategy." Chandler wrote that: "Strategy is the determination of the basic long-term goals of an enterprise, and the adoption of courses of action and the allocation of resources necessary for carrying out these goals."
Igor Ansoff built on Chandler's work by adding concepts and inventing a vocabulary. He developed a grid that compared strategies for market penetration, product development, market development and horizontal and vertical integration and diversification. He felt that management could use the grid to systematically prepare for the future. In his 1965 classic Corporate Strategy, he developed gap analysis to clarify the gap between the current reality and the goals and to develop what he called "gap reducing actions". Ansoff wrote that strategic management had three parts: strategic planning; the skill of a firm in converting its plans into reality; and the skill of a firm in managing its own internal resistance to change.
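A minimal sketch of the grid's logic follows; it is an illustration rather than Ansoff's own notation, and the example call is hypothetical. The four growth options can be read off two yes/no questions about the product and the market:

def ansoff_strategy(new_product: bool, new_market: bool) -> str:
    """Return the growth strategy implied by Ansoff's grid for the given combination."""
    if not new_product and not new_market:
        return "Market penetration"    # more of the current product to current customers
    if new_product and not new_market:
        return "Product development"   # a new product for the existing market
    if not new_product and new_market:
        return "Market development"    # the existing product taken to a new market
    return "Diversification"           # new product and new market

print(ansoff_strategy(new_product=False, new_market=True))   # Market development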
Bruce Henderson, founder of the Boston Consulting Group, wrote about the concept of the experience curve in 1968, following initial work begun in 1965. The experience curve refers to a hypothesis that unit production costs decline by 20–30% every time cumulative production doubles. This supported the argument for achieving higher market share and economies of scale.
Porter wrote in 1980 that companies have to make choices about their scope and the type of competitive advantage they seek to achieve, whether lower cost or differentiation. The idea of strategy targeting particular industries and customers (i.e., competitive positions) with a differentiated offering was a departure from the experience-curve influenced strategy paradigm, which was focused on larger scale and lower cost. Porter revised the strategy paradigm again in 1985, writing that superior performance of the processes and activities performed by organizations as part of their value chain is the foundation of competitive advantage, thereby outlining a process view of strategy.
Change in focus from production to marketing
The direction of strategic research also paralleled a major paradigm shift in how companies competed, specifically a shift from the production focus to market focus. The prevailing concept in strategy up to the 1950s was to create a product of high technical quality. If you created a product that worked well and was durable, it was assumed you would have no difficulty profiting. This was called the production orientation. Henry Ford famously said of the Model T car: "Any customer can have a car painted any color that he wants, so long as it is black."
Management theorist Peter F. Drucker wrote in 1954 that it was the customer who defined what business the organization was in. In 1960 Theodore Levitt argued that instead of producing products and then trying to sell them to the customer, businesses should start with the customer, find out what they wanted, and then produce it for them. The fallacy of the production orientation was also referred to as marketing myopia in an article of the same name by Levitt.
Over time, the customer became the driving force behind all strategic business decisions. This marketing concept, in the decades since its introduction, has been reformulated and repackaged under names including market orientation, customer orientation, customer intimacy, customer focus, customer-driven and market focus.
Nature of strategy
In 1985, Professor Ellen Earle-Chaffee summarized what she thought were the main elements of strategic management theory where consensus generally existed as of the 1970s, writing that strategic management:
Involves adapting the organization to its business environment;
Is fluid and complex. Change creates novel combinations of circumstances requiring unstructured non-repetitive responses;
Affects the entire organization by providing direction;
Involves both strategy formulation processes and also implementation of the content of the strategy;
May be planned (intended) and unplanned (emergent);
Is done at several levels: overall corporate strategy, and individual business strategies; and
Involves both conceptual and analytical thought processes.
Chaffee further wrote that research up to that point covered three models of strategy, which were not mutually exclusive:
Linear strategy: A planned determination of goals, initiatives, and allocation of resources, along the lines of the Chandler definition above. This is most consistent with strategic planning approaches and may have a long planning horizon. The strategist "deals with" the environment but it is not the central concern.
Adaptive strategy: In this model, the organization's goals and activities are primarily concerned with adaptation to the environment, analogous to a biological organism. The need for continuous adaptation reduces or eliminates the planning window. There is more focus on means (resource mobilization to address the environment) than on ends (goals). Strategy is less centralized than in the linear model.
Interpretive strategy: A more recent and less developed model than the linear and adaptive models, interpretive strategy is concerned with "orienting metaphors constructed for the purpose of conceptualizing and guiding individual attitudes or organizational participants." The aim of interpretive strategy is legitimacy or credibility in the mind of stakeholders. It places emphasis on symbols and language to influence the minds of customers, rather than the physical product of the organization.
Concepts and frameworks
The progress of strategy since 1960 can be charted by a variety of frameworks and concepts introduced by management consultants and academics. These reflect an increased focus on cost, competition and customers. These "3 Cs" were illuminated by much more robust empirical analysis at ever-more granular levels of detail, as industries and organizations were disaggregated into business units, activities, processes, and individuals in a search for sources of competitive advantage.
SWOT analysis
By the 1960s, the capstone business policy course at the Harvard Business School included the concept of matching the distinctive competence of a company (its internal strengths and weaknesses) with its environment (external opportunities and threats) in the context of its objectives. This framework came to be known by the acronym SWOT and was "a major step forward in bringing explicitly competitive thinking to bear on questions of strategy". Kenneth R. Andrews helped popularize the framework via a 1963 conference and it remains commonly used in practice.
Experience curve
The experience curve was developed by the Boston Consulting Group in 1966. It is a hypothesis that total per unit costs decline systematically by as much as 15–25% every time cumulative production (i.e., "experience") doubles. It has been empirically confirmed by some firms at various points in their history. Costs decline due to a variety of factors, such as the learning curve, substitution of labor for capital (automation), and technological sophistication. Author Walter Kiechel wrote that it reflected several insights, including:
A company can always improve its cost structure;
Competitors have varying cost positions based on their experience;
Firms could achieve lower costs through higher market share, attaining a competitive advantage; and
An increased focus on empirical analysis of costs and processes, a concept which author Kiechel refers to as "Greater Taylorism".
Kiechel wrote in 2010: "The experience curve was, simply, the most important concept in launching the strategy revolution...with the experience curve, the strategy revolution began to insinuate an acute awareness of competition into the corporate consciousness." Prior to the 1960s, the word competition rarely appeared in the most prominent management literature; U.S. companies then faced considerably less competition and did not focus on performance relative to peers. Further, the experience curve provided a basis for the retail sale of business ideas, helping drive the management consulting industry.
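The arithmetic behind the experience curve is easy to illustrate. The following is a minimal Python sketch assuming a hypothetical 80% curve (unit costs fall to 80% of their previous level with each doubling of cumulative output) and an illustrative first-unit cost; the figures are examples, not data from the studies cited above:

import math

def unit_cost(first_unit_cost, cumulative_units, learning_rate=0.80):
    """Experience-curve estimate: cost of the nth unit = C1 * n ** b,
    where b = log2(learning_rate), so cost falls to `learning_rate`
    of its previous level each time cumulative production doubles."""
    b = math.log2(learning_rate)   # about -0.322 for an 80% curve
    return first_unit_cost * cumulative_units ** b

# Illustrative figures only: a 100-dollar first unit on an 80% curve
for n in (1, 2, 4, 8, 16):
    print(n, round(unit_cost(100, n), 2))   # 100.0, 80.0, 64.0, 51.2, 40.96

On this logic, the competitor with the greatest cumulative volume should, other things being equal, have the lowest unit cost, which is the link drawn between experience and market share.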
Corporate strategy and portfolio theory
The concept of the corporation as a portfolio of business units, with each plotted graphically based on its market share (a measure of its competitive position relative to its peers) and industry growth rate (a measure of industry attractiveness), was summarized in the growth–share matrix developed by the Boston Consulting Group around 1970. By 1979, one study estimated that 45% of the Fortune 500 companies were using some variation of the matrix in their strategic planning. This framework helped companies decide where to invest their resources (i.e., in their high market share, high growth businesses) and which businesses to divest (i.e., low market share, low growth businesses). The growth–share matrix was followed by the G.E. multi-factor model, developed by General Electric.
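As an illustration of the matrix logic rather than BCG's own tooling, a short Python sketch can classify hypothetical business units by relative market share and market growth rate; the 1.0x share and 10% growth cut-offs are conventional but assumed here, as are the portfolio figures:

def bcg_quadrant(relative_share, market_growth, share_cutoff=1.0, growth_cutoff=0.10):
    """Place a business unit in one of the four growth-share quadrants.
    relative_share: the unit's market share divided by that of its largest rival.
    market_growth: annual market growth rate, e.g. 0.12 for 12%."""
    high_share = relative_share >= share_cutoff
    high_growth = market_growth >= growth_cutoff
    if high_share and high_growth:
        return "Star"            # invest to hold the leading position
    if high_share:
        return "Cash cow"        # harvest cash to fund other units
    if high_growth:
        return "Question mark"   # invest selectively or exit
    return "Dog"                 # candidate for divestment

# Hypothetical portfolio of business units: (relative share, market growth)
portfolio = {"Unit A": (2.0, 0.15), "Unit B": (1.5, 0.03),
             "Unit C": (0.4, 0.20), "Unit D": (0.3, 0.02)}
for name, (share, growth) in portfolio.items():
    print(name, bcg_quadrant(share, growth))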
Companies continued to diversify as conglomerates until the 1980s, when deregulation and a less restrictive antitrust environment led to the view that a portfolio of operating divisions in different industries was worth more as a set of independent companies than as a conglomerate, leading to the breakup of many conglomerates. While the popularity of portfolio theory has waxed and waned, the key dimensions considered (industry attractiveness and competitive position) remain central to strategy.
In response to the evident problems of "over diversification", C. K. Prahalad and Gary Hamel suggested that companies should build portfolios of businesses around shared technical or operating competencies, and should develop structures and processes to enhance their core competencies.
Michael Porter also addressed the issue of the appropriate level of diversification. In 1987, he argued that corporate strategy involves two questions: 1) What business should the corporation be in? and 2) How should the corporate office manage its business units? He described four concepts of corporate strategy, each of which suggests a certain type of portfolio and a certain role for the corporate office; the latter three can be used together:
Portfolio theory: A strategy based primarily on diversification through acquisition. The corporation shifts resources among the units and monitors the performance of each business unit and its leaders. Each unit generally runs autonomously, with limited interference from the corporate center provided goals are met.
Restructuring: The corporate office acquires then actively intervenes in a business where it detects potential, often by replacing management and implementing a new business strategy.
Transferring skills: Important managerial skills and organizational capability are essentially spread to multiple businesses. The skills must be necessary to competitive advantage.
Sharing activities: Ability of the combined corporation to leverage centralized functions, such as sales, finance, etc. thereby reducing costs.
Building on Porter's ideas, Michael Goold, Andrew Campbell and Marcus Alexander developed the concept of "parenting advantage" to be applied at the corporate level, as a parallel to the concept of "competitive advantage" applied at the business level. Parent companies, they argued, should aim to "add more value" to their portfolio of businesses than rivals. If they succeed, they have a parenting advantage. The right level of diversification depends, therefore, on the ability of the parent company to add value in comparison to others. Different parent companies with different skills should expect to have different portfolios. See Corporate-Level Strategy (1995) and Strategy for the Corporate Level (2014).
Competitive advantage
In 1980, Porter defined the two types of competitive advantage an organization can achieve relative to its rivals: lower cost or differentiation. This advantage derives from attribute(s) that allow an organization to outperform its competition, such as superior market position, skills, or resources. In Porter's view, strategic management should be concerned with building and sustaining competitive advantage.
Industry structure and profitability
Porter developed a framework for analyzing the profitability of industries and how those profits are divided among the participants in 1980. In five forces analysis he identified the forces that shape the industry structure or environment. The framework involves the bargaining power of buyers and suppliers, the threat of new entrants, the availability of substitute products, and the competitive rivalry of firms in the industry. These forces affect the organization's ability to raise its prices as well as the costs of inputs (such as raw materials) for its processes.
The five forces framework helps describe how a firm can use these forces to obtain a sustainable competitive advantage, either lower cost or differentiation. Companies can maximize their profitability by competing in industries with favorable structure. Competitors can take steps to grow the overall profitability of the industry, or to take profit away from other parts of the industry structure. Porter modified Chandler's dictum about structure following strategy by introducing a second level of structure: while organizational structure follows strategy, it in turn follows industry structure.
Generic competitive strategies
Porter wrote in 1980 that strategies target either cost leadership, differentiation, or focus. These are known as Porter's three generic strategies and can be applied to any size or form of business. Porter claimed that a company must choose only one of the three or risk wasting precious resources. Porter's generic strategies detail the interaction between cost minimization strategies, product differentiation strategies, and market focus strategies.
Porter described an industry as having multiple segments that can be targeted by a firm. The breadth of its targeting refers to the competitive scope of the business. Porter defined two types of competitive advantage: lower cost or differentiation relative to its rivals. Achieving competitive advantage results from a firm's ability to cope with the five forces better than its rivals. Porter wrote: "[A]chieving competitive advantage requires a firm to make a choice...about the type of competitive advantage it seeks to attain and the scope within which it will attain it." He also wrote: "The two basic types of competitive advantage [differentiation and lower cost] combined with the scope of activities for which a firm seeks to achieve them lead to three generic strategies for achieving above average performance in an industry: cost leadership, differentiation and focus. The focus strategy has two variants, cost focus and differentiation focus."
The concept of choice was a different perspective on strategy, as the 1970s paradigm was the pursuit of market share (size and scale) influenced by the experience curve. Companies that pursued the highest market share position to achieve cost advantages fit under Porter's cost leadership generic strategy, but the concept of choice regarding differentiation and focus represented a new perspective.
Value chain
Porter's 1985 description of the value chain refers to the chain of activities (processes or collections of processes) that an organization performs in order to deliver a valuable product or service for the market. These include functions such as inbound logistics, operations, outbound logistics, marketing and sales, and service, supported by systems and technology infrastructure. By aligning the various activities in its value chain with the organization's strategy in a coherent way, a firm can achieve a competitive advantage. Porter also wrote that strategy is an internally consistent configuration of activities that differentiates a firm from its rivals. A robust competitive position cumulates from many activities which should fit coherently together.
Porter wrote in 1985: "Competitive advantage cannot be understood by looking at a firm as a whole. It stems from the many discrete activities a firm performs in designing, producing, marketing, delivering and supporting its product. Each of these activities can contribute to a firm's relative cost position and create a basis for differentiation...the value chain disaggregates a firm into its strategically relevant activities in order to understand the behavior of costs and the existing and potential sources of differentiation."
Interorganizational relationships
Interorganizational relationships allow independent organizations to get access to resources or to enter new markets. Interorganizational relationships represent a critical lever of competitive advantage.
The field of strategic management has paid much attention to the different forms of relationships between organizations ranging from strategic alliances to buyer-supplier relationships, joint ventures, networks, R&D consortia, licensing, and franchising.
On the one hand, scholars drawing on organizational economics (e.g., transaction cost theory) have argued that firms use interorganizational relationships when they are the most efficient form compared with other forms of organization, such as operating on their own or using the market. On the other hand, scholars drawing on organizational theory (e.g., resource dependence theory) suggest that firms tend to partner with others when such relationships allow them to improve their status, power, reputation, or legitimacy.
A key component of the strategic management of inter-organizational relationships relates to the choice of governance mechanisms. While early research focused on the choice between equity and non-equity forms, recent scholarship studies the nature of the contractual and relational arrangements between organizations.
Researchers have also noted, although to a lesser extent, the dark side of interorganizational relationships, such as conflict, disputes, opportunism and unethical behaviors. Relational or collaborative risk can be defined as the uncertainty about whether potentially significant and/or disappointing outcomes of collaborative activities will be realized. Companies can assess, monitor and manage collaborative risks. Empirical studies show that managers' assessments of such risks depend on their relationships with external partners, are higher if they are satisfied with their own performance, and are lower when their business environment is turbulent.
Core competence
Gary Hamel and C. K. Prahalad described the idea of core competency in 1990, the idea that each organization has some capability in which it excels and that the business should focus on opportunities in that area, letting others go or outsourcing them. Further, core competency is difficult to duplicate, as it involves the skills and coordination of people across a variety of functional areas or processes used to deliver value to customers. By outsourcing, companies expanded the concept of the value chain, with some elements within the entity and others without. Core competency is part of a branch of strategy called the resource-based view of the firm, which postulates that if activities are strategic as indicated by the value chain, then the organization's capabilities and ability to learn or adapt are also strategic.
Theory of the business
Peter Drucker wrote in 1994 about the "Theory of the Business," which represents the key assumptions underlying a firm's strategy. These assumptions are in three categories: a) the external environment, including society, market, customer, and technology; b) the mission of the organization; and c) the core competencies needed to accomplish the mission. He continued that a valid theory of the business has four specifications: 1) assumptions about the environment, mission, and core competencies must fit reality; 2) the assumptions in all three areas have to fit one another; 3) the theory of the business must be known and understood throughout the organization; and 4) the theory of the business has to be tested constantly.
He wrote that organizations get into trouble when the assumptions representing the theory of the business no longer fit reality. He used an example of retail department stores, where their theory of the business assumed that people who could afford to shop in department stores would do so. However, many shoppers abandoned department stores in favor of specialty retailers (often located outside of malls) when time became the primary factor in the shopping destination rather than income.
Drucker described the theory of the business as a "hypothesis" and a "discipline." He advocated building in systematic diagnostics, monitoring and testing of the assumptions comprising the theory of the business to maintain competitiveness.
Strategic thinking
Strategic thinking involves the generation and application of unique business insights to opportunities intended to create competitive advantage for a firm or organization. It involves challenging the assumptions underlying the organization's strategy and value proposition. Mintzberg wrote in 1994 that it is more about synthesis (i.e., "connecting the dots") than analysis (i.e., "finding the dots"). It is about "capturing what the manager learns from all sources (both the soft insights from his or her personal experiences and the experiences of others throughout the organization and the hard data from market research and the like) and then synthesizing that learning into a vision of the direction that the business should pursue." Mintzberg argued that strategic thinking is the critical part of formulating strategy, more so than strategic planning exercises.
General Andre Beaufre wrote in 1963 that strategic thinking "is a mental process, at once abstract and rational, which must be capable of synthesizing both psychological and material data. The strategist must have a great capacity for both analysis and synthesis; analysis is necessary to assemble the data on which he makes his diagnosis, synthesis in order to produce from these data the diagnosis itself--and the diagnosis in fact amounts to a choice between alternative courses of action."
Will Mulcaster argued that while much research and creative thought has been devoted to generating alternative strategies, too little work has been done on what influences the quality of strategic decision making and the effectiveness with which strategies are implemented. For instance, in retrospect it can be seen that the financial crisis of 2008–9 could have been avoided if the banks had paid more attention to the risks associated with their investments, but how should banks change the way they make decisions to improve the quality of their decisions in the future? Mulcaster's Managing Forces framework addresses this issue by identifying 11 forces that should be incorporated into the processes of decision making and strategic implementation. The 11 forces are: Time; Opposing forces; Politics; Perception; Holistic effects; Adding value; Incentives; Learning capabilities; Opportunity cost; Risk and Style.
Strategic planning
Strategic planning is a means of administering the formulation and implementation of strategy. Strategic planning is analytical in nature and refers to formalized procedures to produce the data and analyses used as inputs for strategic thinking, which synthesizes the data resulting in the strategy. Strategic planning may also refer to control mechanisms used to implement the strategy once it is determined. In other words, strategic planning happens around the strategy formation process.
Environmental analysis
Porter wrote in 1980 that formulation of competitive strategy includes consideration of four key elements:
Company strengths and weaknesses;
Personal values of the key implementers (i.e., management and the board);
Industry opportunities and threats; and
Broader societal expectations.
The first two elements relate to factors internal to the company (i.e., the internal environment), while the latter two relate to factors external to the company (i.e., the external environment).
There are many analytical frameworks which attempt to organize the strategic planning process. Examples of frameworks that address the four elements described above include:
External environment: PEST analysis or STEEP analysis is a framework used to examine the remote external environmental factors that can affect the organization, such as political, economic, social/demographic, and technological. Common variations include SLEPT, PESTLE, STEEPLE, and STEER analysis, each of which incorporates slightly different emphases.
Industry environment: The Porter Five Forces Analysis framework helps to determine the competitive rivalry and therefore attractiveness of a market. It is used to help determine the portfolio of offerings the organization will provide and in which markets.
Relationship of internal and external environment: SWOT analysis is one of the most basic and widely used frameworks, which examines both internal elements of the organization—Strengths and Weaknesses—and external elements—Opportunities and Threats. It helps examine the organization's resources in the context of its environment.
Scenario planning
A number of strategists use scenario planning techniques to deal with change. As Peter Schwartz put it in 1991, strategic outcomes cannot be known in advance, so the sources of competitive advantage cannot be predetermined. The fast-changing business environment is too uncertain for us to find sustainable value in formulas of excellence or competitive advantage. Instead, scenario planning is a technique in which multiple outcomes can be developed, their implications assessed, and their likelihood of occurrence evaluated. According to Pierre Wack, scenario planning is about insight, complexity, and subtlety, not about formal analysis and numbers. In the intuitive logics tradition, a structured process is used to classify a phenomenon as a scenario.
Some business planners are starting to use a complexity theory approach to strategy. Complexity can be thought of as chaos with a dash of order. Chaos theory deals with turbulent systems that rapidly become disordered. Complexity is not quite so unpredictable. It involves multiple agents interacting in such a way that a glimpse of structure may appear.
Measuring and controlling implementation
Once the strategy is determined, various goals and measures may be established to chart a course for the organization, measure performance and control implementation of the strategy. Tools such as the balanced scorecard and strategy maps help crystallize the strategy, by relating key measures of success and performance to the strategy. These tools measure financial, marketing, production, organizational development, and innovation measures to achieve a 'balanced' perspective. Advances in information technology and data availability enable the gathering of more information about performance, allowing managers to take a much more analytical view of their business than before.
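A minimal sketch, with entirely hypothetical objectives and figures, of how such goals and measures might be tracked programmatically; it illustrates the monitoring idea rather than the balanced scorecard methodology itself:

# Hypothetical strategy scorecard: each objective is linked to a measure, a target
# and the latest actual value, so that progress toward the strategy can be monitored.
scorecard = [
    {"objective": "Grow revenue in core markets", "measure": "revenue growth (%)",
     "target": 8.0, "actual": 6.5, "higher_is_better": True},
    {"objective": "Delight customers", "measure": "net promoter score",
     "target": 50, "actual": 54, "higher_is_better": True},
    {"objective": "Improve production efficiency", "measure": "unit cost ($)",
     "target": 12.0, "actual": 12.9, "higher_is_better": False},
    {"objective": "Build innovation capability", "measure": "revenue from new products (%)",
     "target": 20, "actual": 14, "higher_is_better": True},
]

def review(entries):
    """Flag objectives that are off target so monitoring focuses management attention."""
    for e in entries:
        on_track = (e["actual"] >= e["target"]) if e["higher_is_better"] else (e["actual"] <= e["target"])
        print(f'{e["objective"]}: {e["measure"]} = {e["actual"]} '
              f'(target {e["target"]}) -> {"on track" if on_track else "off track"}')

review(scorecard)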
Strategy may also be organized as a series of "initiatives" or "programs", each of which comprises one or more projects. Various monitoring and feedback mechanisms may also be established, such as regular meetings between divisional and corporate management to control implementation.
Evaluation
A key component of strategic management which is often overlooked when planning is evaluation. There are many ways to evaluate whether or not strategic priorities and plans have been achieved; one such method is Robert Stake's Responsive Evaluation. Responsive evaluation provides a naturalistic and humanistic approach to program evaluation. In expanding beyond the goal-oriented or pre-ordinate evaluation design, responsive evaluation takes into consideration the program's background (history), conditions, and transactions among stakeholders. It is largely emergent; the design unfolds as contact is made with stakeholders.
Limitations
While strategies are established to set direction, focus effort, define or clarify the organization, and provide consistency or guidance in response to the environment, these very elements also mean that certain signals are excluded from consideration or de-emphasized. Mintzberg wrote in 1987: "Strategy is a categorizing scheme by which incoming stimuli can be ordered and dispatched." Since a strategy orients the organization in a particular manner or direction, that direction may not effectively match the environment, initially (if a bad strategy) or over time as circumstances change. As such, Mintzberg continued, "Strategy [once established] is a force that resists change, not encourages it."
Therefore, a critique of strategic management is that it can overly constrain managerial discretion in a dynamic environment. "How can individuals, organizations and societies cope as well as possible with ... issues too complex to be fully understood, given the fact that actions initiated on the basis of inadequate understanding may lead to significant regret?" Some theorists insist on an iterative approach, considering in turn objectives, implementation and resources. I.e., a "...repetitive learning cycle [rather than] a linear progression towards a clearly defined final destination." Strategies must be able to adjust during implementation because "humans rarely can proceed satisfactorily except by learning from experience; and modest probes, serially modified on the basis of feedback, usually are the best method for such learning."
In 2000, Gary Hamel coined the term strategic convergence to explain the limited scope of the strategies being used by rivals in greatly differing circumstances. He lamented that successful strategies are imitated by firms that do not understand that for a strategy to work, it must account for the specifics of each situation.
Woodhouse and Collingridge claim that the essence of being "strategic" lies in a capacity for "intelligent trial-and-error" rather than strict adherence to finely honed strategic plans. Strategy should be seen as laying out the general path rather than precise steps. Means are as likely to determine ends as ends are to determine means. The objectives that an organization might wish to pursue are limited by the range of feasible approaches to implementation. (There will usually be only a small number of approaches that will not only be technically and administratively possible, but also satisfactory to the full range of organizational stakeholders.) In turn, the range of feasible implementation approaches is determined by the availability of resources.
Strategic themes
Various strategic approaches used across industries (themes) have arisen over the years. These include the shift from product-driven demand to customer- or marketing-driven demand (described above), the increased use of self-service approaches to lower cost, changes in the value chain or corporate structure due to globalization (e.g., off-shoring of production and assembly), and the internet.
Self-service
One theme in strategic competition has been the trend towards self-service, often enabled by technology, where the customer takes on a role previously performed by a worker to lower costs for the firm and perhaps prices. Examples include:
Automated teller machines (ATMs) to obtain cash rather than via a bank teller;
Self-service at the gas pump rather than with help from an attendant;
Retail internet orders input by the customer rather than a retail clerk, such as online book sales;
Mass-produced ready-to-assemble furniture transported by the customer;
Self-checkout at the grocery store; and
Online banking and bill payment.
Globalization and the virtual firm
One definition of globalization refers to the integration of economies due to technology and supply chain process innovation. Companies are no longer required to be vertically integrated (i.e., designing, producing, assembling, and selling their products). In other words, the value chain for a company's product may no longer be entirely within one firm; several entities comprising a virtual firm may exist to fulfill the customer requirement. For example, some companies have chosen to outsource production to third parties, retaining only design and sales functions inside their organization.
Internet and information availability
The internet has dramatically empowered consumers and enabled buyers and sellers to come together with drastically reduced transaction and intermediary costs, creating much more robust marketplaces for the purchase and sale of goods and services. Examples include online auction sites, internet dating services, and internet book sellers. In many industries, the internet has dramatically altered the competitive landscape. Services that used to be provided within one entity (e.g., a car dealership providing financing and pricing information) are now provided by third parties. Further, compared to traditional media like television, the internet has caused a major shift in viewing habits through on demand content which has led to an increasingly fragmented audience.
Author Phillip Evans said in 2013 that networks are challenging traditional hierarchies. Value chains may also be breaking up ("deconstructing") where information aspects can be separated from functional activity. Data that is readily available for free or very low cost makes it harder for information-based, vertically integrated businesses to remain intact. Evans said: "The basic story here is that what used to be vertically integrated, oligopolistic competition among essentially similar kinds of competitors is evolving, by one means or another, from a vertical structure to a horizontal one. Why is that happening? It's happening because transaction costs are plummeting and because scale is polarizing. The plummeting of transaction costs weakens the glue that holds value chains together, and allows them to separate." He used Wikipedia as an example of a network that has challenged the traditional encyclopedia business model. Evans predicts the emergence of a new form of industrial organization called a "stack", analogous to a technology stack, in which competitors rely on a common platform of inputs (services or information), essentially layering the remaining competing parts of their value chains on top of this common platform.
Sustainability
In the past decade, sustainability (the ability to successfully sustain a company in a context of rapidly changing environmental, social, health, and economic circumstances) has emerged as a crucial aspect of strategy development. Research focusing on corporations and leaders who have integrated sustainability into commercial strategy has led to the emergence of the concept of "embedded sustainability", defined by its authors Chris Laszlo and Nadya Zhexembayeva as "incorporation of environmental, health, and social value into the core business with no trade-off in price or quality—in other words, with no social or green premium." Their research showed that embedded sustainability offers at least seven distinct opportunities for business value and competitive advantage creation: a) better risk management, b) increased efficiency through reduced waste and resource use, c) better product differentiation, d) new market entrances, e) enhanced brand and reputation, f) greater opportunity to influence industry standards, and g) greater opportunity for radical innovation. Research further suggested that innovation driven by resource depletion can result in fundamental competitive advantages for a company's products and services, as well as for its strategy as a whole, when the right principles of innovation are applied. Asset managers who committed to integrating embedded sustainability factors into their capital allocation decisions created a stronger return on investment than managers who did not strategically integrate sustainability into their otherwise similar business models.
Strategy as learning
In 1990, Peter Senge, who had collaborated with Arie de Geus at Dutch Shell, popularized de Geus' notion of the "learning organization". The theory is that gathering and analyzing information is a necessary requirement for business success in the information age. To do this, Senge claimed that an organization would need to be structured such that:
People can continuously expand their capacity to learn and be productive.
New patterns of thinking are nurtured.
Collective aspirations are encouraged.
People are encouraged to see the "whole picture" together.
Senge identified five disciplines of a learning organization. They are:
Personal responsibility, self-reliance, and mastery – We accept that we are the masters of our own destiny. We make decisions and live with the consequences of them. When a problem needs to be fixed, or an opportunity exploited, we take the initiative to learn the required skills to get it done.
Mental models – We need to explore our personal mental models to understand the subtle effect they have on our behaviour.
Shared vision – The vision of where we want to be in the future is discussed and communicated to all. It provides guidance and energy for the journey ahead.
Team learning – We learn together in teams. This involves a shift from "a spirit of advocacy to a spirit of enquiry".
Systems thinking – We look at the whole rather than the parts. This is what Senge calls the "Fifth discipline". It is the glue that integrates the other four into a coherent strategy. For an alternative approach to the "learning organization", see Garratt, B. (1987).
Geoffrey Moore (1991) and R. Frank and P. Cook also detected a shift in the nature of competition. Markets driven by technical standards or by "network effects" can give the dominant firm a near-monopoly. The same is true of networked industries in which interoperability requires compatibility between users. Examples include Internet Explorer's and Amazon's early dominance of their respective industries. IE's later decline shows that such dominance may be only temporary.
Moore showed how firms could attain this enviable position by using E.M. Rogers' five-stage adoption process and focusing on one group of customers at a time, using each group as a base for reaching the next group. The most difficult step is making the transition between introduction and mass acceptance (see Crossing the Chasm). If successful, a firm can create a bandwagon effect in which the momentum builds and its product becomes a de facto standard.
Strategy as adapting to change
In 1969, Peter Drucker coined the phrase Age of Discontinuity to describe the way change disrupts lives. In an age of continuity attempts to predict the future by extrapolating from the past can be accurate. But according to Drucker, we are now in an age of discontinuity and extrapolating is ineffective. He identifies four sources of discontinuity: new technologies, globalization, cultural pluralism and knowledge capital.
In 1970, Alvin Toffler in Future Shock described a trend towards accelerating rates of change. He illustrated how social and technical phenomena had shorter lifespans with each generation, and he questioned society's ability to cope with the resulting turmoil and accompanying anxiety. In past eras periods of change were always punctuated with times of stability. This allowed society to assimilate the change before the next change arrived. But these periods of stability had all but disappeared by the late 20th century. In 1980 in The Third Wave, Toffler characterized this shift to relentless change as the defining feature of the third phase of civilization (the first two phases being the agricultural and industrial waves).
In 1978, Derek F. Abell (Abell, D. 1978) described "strategic windows" and stressed the importance of the timing (both entrance and exit) of any given strategy. This led some strategic planners to build planned obsolescence into their strategies.
In 1983, Noel Tichy wrote that because we are all beings of habit we tend to repeat what we are comfortable with. He wrote that this is a trap that constrains our creativity, prevents us from exploring new ideas, and hampers our dealing with the full complexity of new issues. He developed a systematic method of dealing with change that involved looking at any new issue from three angles: technical and production, political and resource allocation, and corporate culture.
In 1989, Charles Handy identified two types of change. "Strategic drift" is a gradual change that occurs so subtly that it is not noticed until it is too late. By contrast, "transformational change" is sudden and radical. It is typically caused by discontinuities (or exogenous shocks) in the business environment. The point where a new trend is initiated is called a "strategic inflection point" by Andy Grove. Inflection points can be subtle or radical.
In 1990, Richard Pascale wrote that relentless change requires that businesses continuously reinvent themselves. His famous maxim is "Nothing fails like success", by which he means that what was a strength yesterday becomes the root of weakness today. We tend to depend on what worked yesterday and refuse to let go of what worked so well for us in the past. Prevailing strategies become self-confirming. To avoid this trap, businesses must stimulate a spirit of inquiry and healthy debate. They must encourage a creative process of self-renewal based on constructive conflict.
In 1996, Adrian Slywotzky showed how changes in the business environment are reflected in value migrations between industries, between companies, and within companies. He claimed that recognizing the patterns behind these value migrations is necessary if we wish to understand the world of chaotic change. In "Profit Patterns" (1999) he described businesses as being in a state of strategic anticipation as they try to spot emerging patterns. Slywotzky and his team identified 30 patterns that have transformed industry after industry.
In 1997, Clayton Christensen (1997) took the position that great companies can fail precisely because they do everything right since the capabilities of the organization also define its disabilities. Christensen's thesis is that outstanding companies lose their market leadership when confronted with disruptive technology. He called the approach to discovering the emerging markets for disruptive technologies agnostic marketing, i.e., marketing under the implicit assumption that no one – not the company, not the customers – can know how or in what quantities a disruptive product can or will be used without the experience of using it.
In 1999, Constantinos Markides reexamined the nature of strategic planning. He described strategy formation and implementation as an ongoing, never-ending, integrated process requiring continuous reassessment and reformation. Strategic management is planned and emergent, dynamic and interactive.
J. Moncrieff (1999) stressed strategy dynamics. He claimed that strategy is partially deliberate and partially unplanned. The unplanned element comes from emergent strategies that result from the emergence of opportunities and threats in the environment and from "strategies in action" (ad hoc actions across the organization).
David Teece pioneered research on resource-based strategic management and the dynamic capabilities perspective, defined as "the ability to integrate, build, and reconfigure internal and external competencies to address rapidly changing environments". His 1997 paper (with Gary Pisano and Amy Shuen) "Dynamic Capabilities and Strategic Management" was the most cited paper in economics and business for the period from 1995 to 2005.
In 2000, Gary Hamel discussed strategic decay, the notion that the value of every strategy, no matter how brilliant, decays over time.
Strategy as operational excellence
Quality
A large group of theorists felt the area where western business was most lacking was product quality. W. Edwards Deming, Joseph M. Juran, A. Kearney, Philip Crosby and Armand Feigenbaum suggested quality improvement techniques such as total quality management (TQM), continuous improvement (kaizen), lean manufacturing, Six Sigma, and return on quality (ROQ).
In contrast, James Heskett (1988), Earl Sasser (1995), William Davidow, Len Schlesinger, A. Parasuraman (1988), Len Berry, Jane Kingman-Brundage, Christopher Hart, and Christopher Lovelock (1994) felt that poor customer service was the problem. They gave us fishbone diagramming, service charting, Total Customer Service (TCS), the service profit chain, service gaps analysis, the service encounter, the strategic service vision, service mapping, and service teams. Their underlying assumption was that there is no better source of competitive advantage than a continuous stream of delighted customers.
Process management uses some of the techniques from product quality management and some of the techniques from customer service management. It looks at an activity as a sequential process. The objective is to find inefficiencies and make the process more effective. Although the procedures have a long history, dating back to Taylorism, the scope of their applicability has been greatly widened, leaving no aspect of the firm free from potential process improvements. Because of the broad applicability of process management techniques, they can be used as a basis for competitive advantage.
Carl Sewell, Frederick F. Reichheld, C. Gronroos, and Earl Sasser observed that businesses were spending more on customer acquisition than on retention. They showed how a competitive advantage could be found in ensuring that customers returned again and again. Reichheld broadened the concept to include loyalty from employees, suppliers, distributors and shareholders. They developed techniques for estimating customer lifetime value (CLV) for assessing long-term relationships. The concepts begat attempts to recast selling and marketing into a long-term endeavor that created a sustained relationship (called relationship selling, relationship marketing, and customer relationship management). Customer relationship management (CRM) software became integral to many firms.
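One common textbook way to frame such an estimate (a simplification, not necessarily the formulation these authors used) treats a retained customer as a stream of contribution margins discounted over time; the figures below are illustrative only:

def customer_lifetime_value(annual_margin, retention_rate, discount_rate, acquisition_cost=0.0):
    """Infinite-horizon CLV approximation:
    CLV = m * r / (1 + d - r) - acquisition cost,
    where m is the annual contribution margin per customer,
    r the yearly retention probability and d the discount rate."""
    clv = annual_margin * retention_rate / (1 + discount_rate - retention_rate)
    return clv - acquisition_cost

# Hypothetical numbers: 200-dollar margin, 80% retention, 10% discount rate, 100 dollars to acquire
print(round(customer_lifetime_value(200, 0.80, 0.10, 100), 2))   # 433.33

In this toy example, raising retention from 80% to 90% lifts the pre-acquisition value from about 533 to 900 dollars, which is the arithmetic behind the emphasis on loyalty and long-term relationships.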
Reengineering
Michael Hammer and James Champy felt that these resources needed to be restructured. In a process that they labeled reengineering, firms reorganized their assets around whole processes rather than tasks. In this way a team of people saw a project through, from inception to completion. This avoided functional silos where isolated departments seldom talked to each other. It also eliminated waste due to functional overlap and the overhead of interdepartmental communication.
In 1989 Richard Lester and the researchers at the MIT Industrial Performance Center identified seven best practices and concluded that firms must accelerate the shift away from the mass production of low cost standardized products. The seven areas of best practice were:
Simultaneous continuous improvement in cost, quality, service, and product innovation
Breaking down organizational barriers between departments
Eliminating layers of management, creating flatter organizational hierarchies
Closer relationships with customers and suppliers
Intelligent use of new technology
Global focus
Improving human resource skills
The search for best practices is also called benchmarking. This involves determining where you need to improve, finding an organization that is exceptional in this area, then studying the company and applying its best practices in your firm.
Other perspectives on strategy
Strategy as problem solving
Professor Richard P. Rumelt described strategy as a type of problem solving in 2011. He wrote that good strategy has an underlying structure called a kernel. The kernel has three parts: 1) A diagnosis that defines or explains the nature of the challenge; 2) A guiding policy for dealing with the challenge; and 3) Coherent actions designed to carry out the guiding policy.
President Kennedy outlined these three elements of strategy in his Cuban Missile Crisis Address to the Nation of 22 October 1962:
Diagnosis: "This Government, as promised, has maintained the closest surveillance of the Soviet military buildup on the island of Cuba. Within the past week, unmistakable evidence has established the fact that a series of offensive missile sites is now in preparation on that imprisoned island. The purpose of these bases can be none other than to provide a nuclear strike capability against the Western Hemisphere."
Guiding Policy: "Our unswerving objective, therefore, must be to prevent the use of these missiles against this or any other country, and to secure their withdrawal or elimination from the Western Hemisphere."
Action Plans: First among seven numbered steps was the following: "To halt this offensive buildup a strict quarantine on all offensive military equipment under shipment to Cuba is being initiated. All ships of any kind bound for Cuba from whatever nation or port will, if found to contain cargoes of offensive weapons, be turned back."
Active strategic management required active information gathering and active problem solving. In the early days of Hewlett-Packard (HP), Dave Packard and Bill Hewlett devised an active management style that they called management by walking around (MBWA). Senior HP managers were seldom at their desks. They spent most of their days visiting employees, customers, and suppliers. This direct contact with key people provided them with a solid grounding from which viable strategies could be crafted. Management consultants Tom Peters and Robert H. Waterman had used the term in their 1982 book In Search of Excellence: Lessons From America's Best-Run Companies. Some Japanese managers employ a similar system, which originated at Honda, and is sometimes called the 3 G's (Genba, Genbutsu, and Genjitsu, which translate into "actual place", "actual thing", and "actual situation").
Creative vs analytic approaches
In 2010, IBM released a study of 1,500 CEOs around the world, which summarized three conclusions: 1) complexity is escalating, 2) enterprises are not equipped to cope with this complexity, and 3) creativity is now the single most important leadership competency. IBM said that creativity is needed in all aspects of leadership, including strategic thinking and planning.
Similarly, McKeown argued that over-reliance on any particular approach to strategy is dangerous and that multiple methods can be used to combine the creativity and analytics to create an "approach to shaping the future", that is difficult to copy.
Non-strategic management
A 1938 treatise by Chester Barnard, based on his own experience as a business executive, described the process as informal, intuitive, non-routinized and involving primarily oral, two-way communications. Barnard wrote: "The process is the sensing of the organization as a whole and the total situation relevant to it. It transcends the capacity of merely intellectual methods, and the techniques of discriminating the factors of the situation. The terms pertinent to it are 'feeling', 'judgement', 'sense', 'proportion', 'balance', 'appropriateness'. It is a matter of art rather than science."
In 1973, Mintzberg found that senior managers typically deal with unpredictable situations so they strategize in ad hoc, flexible, dynamic, and implicit ways. He wrote, "The job breeds adaptive information-manipulators who prefer the live concrete situation. The manager works in an environment of stimulus-response, and he develops in his work a clear preference for live action."
In 1982, John Kotter studied the daily activities of 15 executives and concluded that they spent most of their time developing and working a network of relationships that provided general insights and specific details for strategic decisions. They tended to use "mental road maps" rather than systematic planning techniques.
Daniel Isenberg's 1984 study of senior managers found that their decisions were highly intuitive. Executives often sensed what they were going to do before they could explain why. He claimed in 1986 that one of the reasons for this is the complexity of strategic decisions and the resultant information uncertainty.
Zuboff claimed that information technology was widening the divide between senior managers (who typically make strategic decisions) and operational-level managers (who typically make routine decisions). She alleged that prior to the widespread use of computer systems, managers, even at the most senior level, engaged in both strategic decisions and routine administration, but as computers facilitated routine processes (work she described as "deskilled"), these activities were moved further down the hierarchy, leaving senior management free for strategic decision making.
In 1977, Abraham Zaleznik distinguished leaders from managers. He described leaders as visionaries who inspire, while managers care about process. He claimed that the rise of managers was the main cause of the decline of American business in the 1970s and 1980s. Lack of leadership is most damaging at the level of strategic management where it can paralyze an entire organization.
According to Corner, Kinicki, and Keats, strategic decision making in organizations occurs at two levels: individual and aggregate. They developed a model of parallel strategic decision making. The model identifies two parallel processes that involve getting attention, encoding information, storage and retrieval of information, strategic choice, strategic outcome and feedback. The individual and organizational processes interact at each stage. For instance, competition-oriented objectives are based on knowledge of competing firms, such as their market share.
Strategy as marketing
The 1980s also saw the widespread acceptance of positioning theory. Although the theory originated with Jack Trout in 1969, it did not gain wide acceptance until Al Ries and Jack Trout wrote their classic book Positioning: The Battle For Your Mind (1979). The basic premise is that a strategy should not be judged by internal company factors but by the way customers see it relative to the competition. Crafting and implementing a strategy involves creating a position in the mind of the collective consumer. Several techniques enabled the practical use of positioning theory. Perceptual mapping, for example, creates visual displays of the relationships between positions. Multidimensional scaling, discriminant analysis, factor analysis and conjoint analysis are mathematical techniques used to determine the most relevant characteristics (called dimensions or factors) upon which positions should be based. Preference regression can be used to determine vectors of ideal positions, and cluster analysis can identify clusters of positions.
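As a hedged, modern illustration (the library and the survey data are assumptions, not part of the positioning literature), multidimensional scaling can turn a matrix of perceived dissimilarities between brands into a two-dimensional perceptual map:

import numpy as np
from sklearn.manifold import MDS

# Hypothetical survey data: average dissimilarity ratings between four brands
brands = ["Brand A", "Brand B", "Brand C", "Brand D"]
dissimilarity = np.array([
    [0.0, 2.0, 6.0, 7.0],
    [2.0, 0.0, 5.0, 6.0],
    [6.0, 5.0, 0.0, 1.5],
    [7.0, 6.0, 1.5, 0.0],
])

# Metric MDS projects the brands into two dimensions that best preserve the ratings
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coordinates = mds.fit_transform(dissimilarity)

for name, (x, y) in zip(brands, coordinates):
    print(f"{name}: ({x:.2f}, {y:.2f})")

Brands that respondents rate as similar land close together on the resulting map, making gaps in the competitive space, and hence candidate positions, easier to see.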
In 1992 Jay Barney saw strategy as assembling the optimum mix of resources, including human, technology and suppliers, and then configuring them in unique and sustainable ways.
James Gilmore and Joseph Pine found competitive advantage in mass customization. Flexible manufacturing techniques allowed businesses to individualize products for each customer without losing economies of scale. This effectively turned the product into a service. They also realized that if a service is mass-customized by creating a "performance" for each individual client, that service would be transformed into an "experience". Their book, The Experience Economy, along with the work of Bernd Schmitt convinced many to see service provision as a form of theatre. This school of thought is sometimes referred to as customer experience management (CEM).
Information- and technology-driven strategy
Many industries with a high information component are being transformed. For example, Encarta demolished Encyclopædia Britannica (whose sales have plummeted 80% since their peak of $650 million in 1990) before it was, in turn, eclipsed by collaborative encyclopedias like Wikipedia. The music industry was similarly disrupted. The technology sector has provided some strategies directly. For example, from the software development industry agile software development provides a model for shared development processes.
Peter Drucker conceived of the "knowledge worker" in the 1950s. He described how fewer workers would do physical labor, and more would apply their minds. In 1984, John Naisbitt theorized that the future would be driven largely by information: companies that managed information well could obtain an advantage; however, the profitability of what he called "information float" (information that the company had and others desired) would disappear as inexpensive computers made information more accessible.
Daniel Bell (1985) examined the sociological consequences of information technology, while Gloria Schuck and Shoshana Zuboff looked at psychological factors. Zuboff distinguished between "automating technologies" and "informating technologies". She studied the effect that both had on workers, managers and organizational structures. She largely confirmed Drucker's predictions about the importance of flexible decentralized structure, work teams, knowledge sharing and the knowledge worker's central role. Zuboff also detected a new basis for managerial authority, based on knowledge (also predicted by Drucker) which she called "participative management".
Maturity of planning process
McKinsey & Company developed a capability maturity model in the 1970s to describe the sophistication of planning processes, with strategic management ranked the highest. The four stages include:
Financial planning, which is primarily about annual budgets and a functional focus, with limited regard for the environment;
Forecast-based planning, which includes multi-year budgets and more robust capital allocation across business units;
Externally oriented planning, where a thorough situation analysis and competitive assessment is performed;
Strategic management, where widespread strategic thinking occurs and a well-defined strategic framework is used.
PIMS study
The long-term PIMS study, started in the 1960s and lasting for 19 years, attempted to understand the Profit Impact of Marketing Strategies (PIMS), particularly the effect of market share. The initial conclusion of the study was unambiguous: the greater a company's market share, the greater its rate of profit. Market share provides economies of scale. It also provides experience curve advantages. The combined effect is increased profits.
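The experience-curve argument can be made concrete with a small sketch. In the classic formulation, unit cost falls by a fixed percentage each time cumulative output doubles; the first-unit cost of 100 and the 80% learning rate below are illustrative assumptions, not figures from the PIMS data.

# A minimal sketch of the experience-curve relationship often invoked alongside
# the PIMS findings: unit cost falls by a fixed percentage each time cumulative
# output doubles. The first-unit cost and the 80% learning rate are assumptions.
import math

def unit_cost(cumulative_units: float, first_unit_cost: float = 100.0,
              learning_rate: float = 0.80) -> float:
    """Cost of the nth unit under a classic experience curve."""
    b = math.log(learning_rate, 2)  # exponent; negative for rates below 1
    return first_unit_cost * cumulative_units ** b

for n in (1, 2, 4, 8, 16):
    print(f"unit {n:>2}: cost = {unit_cost(n):6.2f}")
# Each doubling of cumulative volume cuts unit cost to 80% of its prior level,
# which is why higher market share (more cumulative volume) can mean lower costs.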
The benefits of high market share naturally led to an interest in growth strategies. The relative advantages of horizontal integration, vertical integration, diversification, franchises, mergers and acquisitions, joint ventures and organic growth were discussed. Other research indicated that a low market share strategy could still be very profitable. Schumacher (1973), Woo and Cooper (1982), Levenson (1984), and later Traverso (2002) showed how smaller niche players obtained very high returns.
Other influences on business strategy
Military strategy
In the 1980s business strategists realized that there was a vast knowledge base stretching back thousands of years that they had barely examined. They turned to military strategy for guidance. Military strategy books such as The Art of War by Sun Tzu, On War by von Clausewitz, and The Red Book by Mao Zedong became business classics. From Sun Tzu, they learned the tactical side of military strategy and specific tactical prescriptions. From von Clausewitz, they learned the dynamic and unpredictable nature of military action. From Mao, they learned the principles of guerrilla warfare. Important marketing warfare books include Business War Games by Barrie James, Marketing Warfare by Al Ries and Jack Trout and Leadership Secrets of Attila the Hun by Wess Roberts.
The four types of business warfare theories are:
Offensive marketing warfare strategies
Defensive marketing warfare strategies
Flanking marketing warfare strategies
Guerrilla marketing warfare strategies
The marketing warfare literature also examined leadership and motivation, intelligence gathering, types of marketing weapons, logistics and communications.
By the twenty-first century, marketing warfare strategies had fallen out of favor as non-confrontational approaches gained ground. In 1989, Dudley Lynch and Paul L. Kordis published Strategy of the Dolphin: Scoring a Win in a Chaotic World. "The Strategy of the Dolphin" was developed to give guidance as to when to use aggressive strategies and when to use passive strategies. A variety of aggressiveness strategies were developed.
In 1993, J. Moore used a similar metaphor. Instead of using military terms, he created an ecological theory of predators and prey (see ecological model of competition), a sort of Darwinian management strategy in which market interactions mimic long-term ecological stability.
Author Phillip Evans said in 2014 that "Henderson's central idea was what you might call the Napoleonic idea of concentrating mass against weakness, of overwhelming the enemy. What Henderson recognized was that, in the business world, there are many phenomena which are characterized by what economists would call increasing returns—scale, experience. The more you do of something, disproportionately the better you get. And therefore he found a logic for investing in such kinds of overwhelming mass in order to achieve competitive advantage. And that was the first introduction of essentially a military concept of strategy into the business world. ... It was on those two ideas, Henderson's idea of increasing returns to scale and experience, and Porter's idea of the value chain, encompassing heterogenous elements, that the whole edifice of business strategy was subsequently erected."
Traits of successful companies
Like Peters and Waterman a decade earlier, James Collins and Jerry Porras spent years conducting empirical research on what makes great companies. Six years of research uncovered a key underlying principle behind the 19 successful companies that they studied: They all encourage and preserve a core ideology that nurtures the company. Even though strategy and tactics change daily, the companies, nevertheless, were able to maintain a core set of values. These core values encourage employees to build an organization that lasts. In Built To Last (1994) they claim that short term profit goals, cost cutting, and restructuring will not stimulate dedicated employees to build a great company that will endure. In 2000 Collins coined the term "built to flip" to describe the prevailing business attitudes in Silicon Valley. It describes a business culture where technological change inhibits a long term focus. He also popularized the concept of the BHAG (Big Hairy Audacious Goal).
Arie de Geus (1997) undertook a similar study and obtained similar results. He identified four key traits of companies that had prospered for 50 years or more. They are:
Sensitivity to the business environment – the ability to learn and adjust
Cohesion and identity – the ability to build a community with personality, vision, and purpose
Tolerance and decentralization – the ability to build relationships
Conservative financing
A company with these key characteristics he called a living company because it is able to perpetuate itself. If a company emphasizes knowledge rather than finance, and sees itself as an ongoing community of human beings, it has the potential to become great and endure for decades. Such an organization is an organic entity capable of learning (he called it a "learning organization") and capable of creating its own processes, goals, and persona.
Will Mulcaster suggests that firms engage in a dialogue that centres around these questions:
"Will the proposed competitive advantage create Perceived Differential Value?"
"Will the proposed competitive advantage create something that is different from the competition?"
"Will the difference add value in the eyes of potential customers?" – This question will entail a discussion of the combined effects of price, product features and consumer perceptions.
"Will the product add value for the firm?" – Answering this question will require an examination of cost effectiveness and the pricing strategy.
See also
Balanced scorecard
Business analysis
Business model
Business plan
Concept-driven strategy
Cost overrun
Dynamic capabilities
Integrated business planning
Marketing
Marketing plan
Marketing strategies
Management
Management consulting
Military strategy
Morphological analysis
Overall equipment effectiveness
Real options valuation
Results-based management
Revenue shortfall
Strategy (game theory)
Strategy dynamics
Strategic lenses
Strategic planning
Strategic Management Society
Strategy map
Strategy Markup Language
Strategy visualization
Value migration
Six Forces Model
Adversarial purchasing
References
Further reading
Cameron, Bobby Thomas (2014). Using responsive evaluation in Strategic Management. Strategic Leadership Review 4 (2), 22–27.
David Besanko, David Dranove, Scott Schaefer, and Mark Shanley (2012). Economics of Strategy, John Wiley & Sons.
Edwards, Janice et al. Mastering Strategic Management- 1st Canadian Edition. BC Open Textbooks, 2014.
Kemp, Roger L. "Strategic Planning for Local Government: A Handbook for Officials and Citizens," McFarland and Co., Inc., Jefferson, NC, USA, and London, England, UK, 2008.
Kvint, Vladimir (2009) The Global Emerging Market: Strategic Management and Economics Excerpt from Google Books
Pankaj Ghemawat - Harvard Strategy Professor: Competition and Business Strategy in Historical Perspective, Social Science History Network, Spring 2002
External links
Institute for Strategy and Competitiveness at Harvard Business School - recent publications
The Journal of Business Strategies - online library
Systems thinking
Business terms
Management by type
Strategy |
48488163 | https://en.wikipedia.org/wiki/Relational%20Semantics%2C%20Inc. | Relational Semantics, Inc. | Relational Semantics, Inc. (RSI) is an American software company that specializes in case management systems for state courts and agencies. Founded in 1983, RSI is based in Boston, Massachusetts.
The Massachusetts Appellate Courts system and other court systems in New England use RSI's Forecourt judicial case management system to automate case workflows and to provide web-based public access to case information.
History
RSI was founded in 1983. Bob Gorman currently serves as RSI's president and lead software architect.
An early version of RSI's case management technology was adopted by the Vermont Judicial Bureau in 1990 to computerize workflows associated with traffic court cases.
RSI's Forecourt software was adopted by the Massachusetts Appellate Courts (comprising the Massachusetts Appeals Court and the Massachusetts Supreme Judicial Court) as a pilot program in the late 1990s and then implemented state-wide in 2001. Starting in 2003 the Appellate Courts used RSI technology to allow web-based public access to non-confidential case information in the Forecourt system. An updated version of the system and its web portal remain in use as of November 2015. In 2014 RSI updated the system to ingest electronically filed case documents from attorneys and litigants, such as briefs and motions.
Forecourt was adopted by the Connecticut Housing Court in 2002. The system remains in use as of November 2015.
Products
RSI's Forecourt judicial case management system coordinates and automates workflows throughout the case lifecycle from case initiation, docketing, and calendaring, through to final disposition and reporting. RSI's extensions to the core Forecourt system support web publishing, role-sensitive mobile access, and integration with external systems including e-filing portals.
RSI also makes Paragon software which supports state agencies in managing various types of case work. The Vermont Division of Fire Safety uses Paragon to manage inspection and safety information for structures statewide.
References
External links
Official website
Software companies based in Massachusetts
Software companies of the United States |
47361983 | https://en.wikipedia.org/wiki/Usama%20Siala | Usama Siala | Usama Siala is a Libyan politician who served as the Minister of Communications and Information Technology from January 2013 to August 2014. The cabinet was selected by Prime Minister Ali Zeidan on 30 October 2012 and was approved by the General National Congress on 31 October 2012. Siala's term as Minister of Communications and Information Technology ended when the cabinet resigned on 29 August 2014. He was then reinstated as President of the General Telecommunications and Information Authority on 22 September 2014.
Education
Siala graduated from Tripoli University with a B.Sc. in telecommunications in 1999.
Career
October 2014 – present: President of General Telecommunications and Information Authority
November 2012 – present: Head of General Assembly of LPTIC
October 2014 – present: Board member of the Libyan Investment Authority board of directors
March 2015 – present: Board member of the Libyan African Investment Portfolio board of directors
November 2012 – October 2014: Minister of Communications and Informatics, Libyan government
Views
Usama Siala is a strong supporter of privatization of Libya's telecommunications sector and would like to see the government decrease its hold on the sector and get the private sector more involved.
References
External links
Ministry of Communications and Information Technology
Libyan Telecommunication Holding company
Government ministers of Libya
Living people
Members of the Interim Government of Libya
1970 births
People of the First Libyan Civil War
Libyan engineers |
13375705 | https://en.wikipedia.org/wiki/Qshell | Qshell | Qshell is an optional command-line interpreter (shell) for the IBM i operating system. Qshell is based on POSIX and X/Open standards. It is a Bourne-like shell that also includes features of KornShell. The utilities (or commands) are external programs that provide additional functions. The development team of Qshell had to deal with platform-specific issues such as translating between ASCII and EBCDIC. The shell supports interactive mode as well as batch processing and can run shell scripts from Unix-like operating systems with few or no modifications.
Commands
The following is a list of commands that are supported by the Qshell command-line interpreter on IBM i 7.4.
Differences from other Unix shells
Qshell does not support the redirection operator or provide a command history. It also has no job control support, as the IBM i operating system does not have the concept of a foreground or background process group; the related POSIX job-control built-in commands are therefore not available.
Compared to PASE for i
According to IBM, QSHELL is a “Unix-like” interface built over IBM i. The commands issued by the user point to programs in a “Qshell” library. It began as a port from the ash shell, which was a Bourne-like shell created by Berkeley Software Design.
See also
Control Language
Comparison of command shells
References
Further reading
External links
Exploring iSeries QSHELL
Command shells
Interpreters (computing)
IBM operating systems |
28257292 | https://en.wikipedia.org/wiki/Illumos | Illumos | Illumos (stylized as illumos) is a partly free and open-source Unix operating system. It is based on OpenSolaris, which was based on System V Release 4 (SVR4) and the Berkeley Software Distribution (BSD). Illumos comprises a kernel, device drivers, system libraries, and utility software for system administration. This core is now the base for many different open-sourced Illumos distributions, in a similar way in which the Linux kernel is used in different Linux distributions.
The maintainers write illumos in lowercase since some computer fonts do not clearly distinguish a lowercase L from an uppercase i: Il (see homoglyph). The project name is a combination of the word illuminare, Latin for "to light", and OS for "operating system".
Overview
Illumos was announced via webinar on Thursday, 3 August 2010, as a community effort of some core Solaris engineers to create a truly open source Solaris by swapping closed source bits of OpenSolaris with open implementations.
The original plan explicitly stated that Illumos would not be a distribution or a fork. However, after Oracle announced discontinuing OpenSolaris, plans were made to fork the final version of the Solaris ON kernel allowing Illumos to evolve into a kernel of its own.
Initially, efforts focused on libc, the NFS lock manager, the crypto module, and many device drivers to create a Solaris-like OS with no closed, proprietary code. Subsequently, development emphasis included transitioning from the historical compiler, Studio, to GCC. The "userland" software is now built with GNU make and contains many GNU utilities such as GNU tar.
Illumos is lightly led by founder Garrett D'Amore and other community members/developers such as Bryan Cantrill and Adam Leventhal, via a Developers' Council.
The Illumos Foundation has been incorporated in the State of California as a 501(c)6 trade association, with founding board members Jason Hoffman (formerly at Joyent), Evan Powell (Nexenta), and Garrett D'Amore. As of August 2012, the foundation was in the process of formalizing its by-laws and organizational development.
At OpenStorage Summit 2010, the new logo for Illumos was revealed, with official type and branding to follow.
Development
Its primary development project, illumos-gate, derives from OS/Net (aka ON), which is a Solaris kernel with the bulk of the drivers, core libraries, and basic utilities, similar to what is delivered by a BSD "src" tree. It was originally dependent on OpenSolaris OS/Net, but a fork was made after Oracle silently decided to close the development of Solaris and unofficially killed the OpenSolaris project.
Features
ZFS, a combined file system and logical volume manager providing a high level of data integrity for very large storage capacities.
Solaris Containers (or Zones), a low overhead implementation of operating-system-level virtualization technology for x86 and SPARC systems.
DTrace, a comprehensive dynamic tracing framework for troubleshooting kernel and application problems on production systems in real time.
Kernel-based Virtual Machine (KVM), a virtualization infrastructure. KVM supports native virtualization on processors with hardware virtualization extensions.
OpenSolaris Network Virtualization and Resource Control (or Crossbow), a set of features that provides an internal network virtualization and quality of service including: virtual NIC (VNIC) pseudo-network interface technology, exclusive ip zones, bandwidth management, and flow control on a per interface and per VNIC basis.
Relatives
Solaris (operating system)
Current distributions
Distributions, at illumos.org
DilOS, with Debian package manager (dpkg + apt) and virtualization support, available for x86-64 and SPARC.
NexentaStor, distribution optimized for virtualization, storage area networks, network-attached storage, and iSCSI or Fibre Channel applications employing the ZFS file system.
OmniOS Community Edition, takes a minimalist approach suitable for server use.
OpenIndiana, a distribution that is a continuation and fork in the spirit of the OpenSolaris operating system.
SmartOS, a distribution for cloud computing with Kernel-based Virtual Machine integration.
Tribblix, retro style distribution with modern components, available for x86-64 and SPARC.
v9os, a server-only, IPS-based minimal SPARC distribution.
XStreamOS, a distribution for infrastructure, cloud, and web development.
Discontinued:
Dyson, derived from Debian, using the illumos libc and the SMF init system.
OpenSXCE, a distribution for developers and system administrators for IA-32/x86-64 platforms and SPARC.
See also
napp-it, ZFS web interface for Illumos-based NAS or SAN appliances.
References
Free software
OpenSolaris
Software forks
Solaris software
62001706 | https://en.wikipedia.org/wiki/A%20Story%20About%20My%20Uncle | A Story About My Uncle | A Story About My Uncle is an adventure game by independent developer Gone North Games and published by Coffee Stain Studios in 2014. It was initially developed by students of Södertörn University in 2012, with a full release in May 2014 for Microsoft Windows, and three years later for macOS and Linux. The game was re-developed professionally after collaboration with Coffee Stain Studios.
A Story About My Uncle is played from a first-person perspective, and the player travels by using floating rocks. According to review aggregator Metacritic, the game received mixed reviews. It was nominated for the Game of the Year award at the 2012 Swedish Game Awards.
Gameplay and plot
A Story About My Uncle is an adventure game. Played from a first-person perspective, it uses platform game elements set in a world of drifting rocks. The player is searching for Uncle Fred.
The game focuses on the narrator's uncle Fred, as the narrator tells a bedtime story to a small child. The uncle Fred is described as "a brilliant scientist – a whimsical and even-tempered version of Uncle Quentin from the Famous Five books". A board in the narrator's abandoned house tells the player that Fred built a waste disposal system, conceivably controlled by starlight. After this backstory, the game begins with the narrator as a child entering the "waste disposal dimension" to search for his uncle. The player follows the uncle character through the game environment, with the player character having a suit that is equipped with a "magical grappling hook and shock absorbers" that stop the player character from taking damage when landing. Later on in the game, the player will find and be able to use jet-propelling boots, which enable them to travel farther in one jump.
Development
The initial ideas for developing the game were "basically playing with gravity in the world – you would turn it upside down" according to Sebastian Eriksson, co-founder of Gone North Games and one of the company's programmers. Eriksson "had been playing around with a mechanic to propel yourself within this gravity-bending game"; this ended up being the main game mechanic, and the game was built around it. The students who developed A Story About My Uncle had "to teach themselves how to use game development software" Unreal Engine because they did not have much experience in the video game field.
The game was developed over three months in 2012, in the Unreal Engine Unreal Development Kit, by a small group of students at Södertörn University. It was created for a competition in which university students were tasked with building a "non-violent first-person game in the Unreal Engine". The Södertörn students went on to form their own video game studio called Gone North Games. According to Sebastian Zethraeus, they learned how to use the engine in ten weeks and built a prototype in that time; he said that they "were proud of it at the time, but our eyes bleed now when we look at it".
A Story About My Uncle was first released on 30 July 2012 as a free demo; the demo was nominated for Game of the Year at Swedish Game Awards. Developers Gone North Games then partnered with Coffee Stain Studios to publish the game on 28 May 2014, on Steam, professionally redeveloping it in collaboration with Coffee Stain Studios. The game was released on 28 May 2014 for Microsoft Windows, and on 12 May 2017 for macOS and Linux.
Gone North Games' student developers, together with Coffee Stain Studios, polished the game by "remaking everything from the meshes to the voice acting to the code itself". Sebastian Eriksson said that being nominated for Game of the Year, and getting feedback from early players, led the group to finish the game.
Reception
Metacritic, which uses a weighted average, assigned the game a score of 73 out of 100, based on 24 critics, indicating "mixed or average reviews". As of December 2019, the game had a "Very Positive" rating on Steam, based on over 10,150 user reviews.
Stephen Dunne of GodisaGeek wrote that "Some [players] may find A Story About My Uncle too easy, some people may find it infuriating", but that "most, however, will be engrossed in its ever-glowing and charismatic fantasy world". Ben Griffin of PC Gamer wrote that the game "is fast, fluid and fun". Cassidee Moser of CGMagazine praised the game for being "well-crafted and paced", adding that the "environments are varied and beautifully done" and the "story itself is a light-hearted, innocent magical romp".
Kyle Hilliard of Game Informer also found the game to be light-hearted, but suggested that this is "at odds with the difficulty late in-game", making it unclear who the game is aimed at.
In 2012, A Story About My Uncle received a nomination for Game of the Year at the Swedish Game Awards.
References
External links
Official website
Adventure games
2012 video games
Windows games
Linux games
MacOS games
Single-player video games
First-person adventure games
Video games developed in Sweden
Indie video games
Coffee Stain Studios games |
1740975 | https://en.wikipedia.org/wiki/Kolab | Kolab | Kolab is a free and open source groupware suite. It consists of the Kolab server and a wide variety of Kolab clients, including KDE PIM-Suite Kontact, Roundcube web frontend, Mozilla Thunderbird and Mozilla Lightning with SyncKolab extension and Microsoft Outlook with proprietary Kolab-Connector PlugIns.
Basic Concepts
Kolab uses IMAP as an underlying protocol for email, contact, and calendar entries. These entries are saved in IMAP folders in Kolab XML format, and the IMAP server controls storage and access rights. Configuration and maintenance of Kolab are done through LDAP.
Kolab Clients and the Kolab server use well established protocols and formats for their work (i.e. IMAP as mentioned above, vCard, iCal, XML and LDAP). This allows the Kolab Format specification framework, or even portions of it, to be utilized as an open set of specifications for groupware clients and servers to communicate with each other. Third party implementations began almost immediately; for example, the Citadel groupware server began supporting version 1 of the Kolab Format specification in March 2004.
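As a minimal sketch of what this IMAP-based storage model means in practice, the following Python snippet uses the standard imaplib module to list a mailbox's folders and fetch the messages in a calendar folder. The host, credentials, and the "Calendar" folder name are placeholder assumptions; the exact folder layout varies between Kolab installations.

# A minimal sketch of reading Kolab groupware data over plain IMAP using
# Python's standard imaplib. Host, credentials, and folder name are placeholders.
import imaplib

HOST = "imap.example.org"     # placeholder
USER = "user@example.org"     # placeholder
PASSWORD = "secret"           # placeholder

with imaplib.IMAP4_SSL(HOST) as imap:
    imap.login(USER, PASSWORD)

    # Groupware folders are ordinary IMAP folders, so they show up in LIST.
    status, folders = imap.list()
    for raw in folders or []:
        print(raw.decode())

    # Calendar entries, contacts, etc. are stored as MIME messages whose
    # attachments carry the Kolab XML payload.
    imap.select("Calendar", readonly=True)
    status, data = imap.search(None, "ALL")
    for num in data[0].split():
        msgid = num.decode()
        status, msg_data = imap.fetch(msgid, "(RFC822)")
        print(f"message {msgid}: {len(msg_data[0][1])} bytes")

Because the data lives in standard IMAP folders, access rights and shared folders are handled by the IMAP server itself, which is why any sufficiently capable IMAP client can act as a Kolab client.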
The concepts on which Kolab relies are laid out in the Kolab Format Specification and Architecture Paper for Kolab 2, and for Kolab 1 in the Kroupware Contract, Architecture Paper and Technical Description.
Main features
Full seamless support of mixed clients environments (Outlook, KDE, Web etc.)
Full server side support for ActiveSync and CalDAV, CardDAV and WebDAV
Support for Email, Calendar, Address Books, Tasks and File-Cloud
Support for KDE with Kontact
Support for Microsoft Outlook with proprietary connector PlugIns
A web administration interface
Configuration data is kept in a LDAP directory
A global LDAP addressbook for contacts
IMAP4rev1 as well as POP3 access to mail
Full support for client-side PGP and S/MIME email encryption (officially Sphinx-interoperable)
Full support for shared calendars with IMAP ACLs
Full support for shared contacts with IMAP ACLs
Fully offline capable using KDE Kontact or Microsoft Outlook
Support for server side resource management (e.g. rooms, cars)
Full support for freebusy handling
Kolab 3.x clients
Roundcube is the default web client delivered with Kolab 3.0;
KDE Kontact, starting with version 4.10, and Kontact-Touch (K Desktop Environment, Free Software);
SyncKolab, starting with version 3.0.0, is a Mozilla Thunderbird/SeaMonkey and Lightning extension (Free Software);
Kolab Desktop Client is a stabilized and professionally supported version of KDE Kontact.
Kolab 2.x clients
KDE Kontact and Kontact-Touch (K Desktop Environment, Free Software)
Horde (integrated in Kolab since v2.2.1), a web-frontend for utilising web-browsers as Kolab-clients (Free Software)
SyncKolab, a Mozilla Thunderbird / SeaMonkey and Lightning extension (Free Software)
evolution-kolab (integrated in GNOME / Evolution since v3.4) extends GNOME's Evolution and EDS (Evolution-Data-Server) to be a full-featured Kolab-client (Free Software)
Kolab's integrated Horde also provides a SyncML interface, over which SyncML-capable mobile phones can synchronise PIM-data on a Kolab-server (Free Software)
Z-Push (integrated in Kolab since v2.3.0) enables ActiveSync-capable clients to access their server mailboxes and to use Kolab-provided PIM-functionality (Free Software)
Kolab-WS extends Kolab to provide Kolab-functionality as a web-service; Kolab-WS was originally part of Syncphony, which utilises Kolab-WS since their split (Free Software)
Syncphony (initially "kolab-sync") connects Kolab-WS with a Funambol sync-server, thus enabling devices supported by Funambol to synchronise their PIM-Data with a Kolab server (Free Software)
kolab-android synchronises Android's addressbook and calendar to IMAP-folders in the Kolab2 format (Free Software)
Toltec Connector (Microsoft Outlook Connector, Proprietary Software)
KONSEC Konnektor (Microsoft Outlook MAPI Storage Provider, Proprietary Software)
Bynari Outlook Connector (Microsoft Outlook MAPI Provider, Proprietary Software)
Aethera, a client solely for the Kolab 1 format, available for Windows, Linux and Mac OS X (Free Software)
Roundcube is the default Webclient in the Kolab 2.4 release
History
2013:
Kolab 3.0 was released, featuring the improved Kolab 3 format based on xCard and xCal, the Roundcube web-client, a new ActiveSync component called Syncroton and other new properties.
Kolab Now hosted solution was launched. It lets customers create individual accounts or managed domain accounts, with the possibility to move data to a self-hosted Kolab 3 installation any time. The service is operated by Kolab Systems in Switzerland.
2012:
evolution-kolab had its first release as part of the GNOME initiative in Evolution 3.4.
SyncKolab was massively overhauled and released as version 2.0 for Thunderbird / SeaMonkey (optionally with Lightning for calendaring and tasks).
evolution-kolab was vastly enhanced and had its second release as part of the GNOME initiative in Evolution 3.6.
SyncKolab 3.0 was largely rewritten, yielding speed and feature improvements, and now supports the Kolab 3 format.
2011:
Kolab 2.3.0 was released, containing many updated core components and other improvements, the new Z-push synchronization for clients using ActiveSync (in addition to the already existing SyncML support) and an overhauled web-based administration front-end.
Kolab 2.3.1 was released shortly thereafter, as a bugfix release.
Kolab 2.3.2 was released as a regular maintenance release of the Kolab 2.3 branch, with updated Postfix and Z-push components.
Kontact 2 was released next to KDE SC 4.6.4, with many improvements related to Kolab.
evolution-kolab was released, which extends GNOME's Evolution and EDS (Evolution Data Server) to become a full-featured Kolab client.
Kolab 2.3.3 was released, providing a lot of bugfixes for the Horde components and many updated base components (Apache, Cyrus IMAP, OpenLDAP, OpenSSL and PHP).
Kolab 2.3.4 was released as a bugfix release.
Kolab-WS was split out of Syncphony as a standalone web service, providing Kolab functionality. Hence Syncphony solely becomes a Funambol connector.
2010:
KolabiPhone, a Kolab sync connector for the iPhone, had its first pre-alpha release.
Two new Free Software Kolab sync connectors for Android and Outlook were announced and their first alpha releases published.
Syncphony was released, which extends Kolab with a Funambol connector and offers Kolab functionality as a web service.
SyncKolab 1.5 for Thunderbird 3 / SeaMonkey 2 (optionally with Lightning 1.0 beta for calendaring and tasks) was released.
Kolab 2.2.4 was released as a maintenance release of the Kolab 2.2 branch.
Kontact Touch was released, offering full Kolab functionality on mobile devices, such as smartphones and tablets.
2009:
Kolab 2.2.1 was released, as an enhancement and maintenance release, integrating an updated web client (Horde) and preliminary SyncML support.
Kolab 2.2.2 was released, as a maintenance release of the Kolab 2.2 branch.
Kolab 2.2.3 was released, further enhancing functionality, stability and scalability of the Kolab 2.2 branch.
2008:
SyncKolab 1.0 for Thunderbird 1.5 and 2.0 as well as SeaMonkey 1.0 and 1.1 (optionally with Lightning 0.9 for calendaring and tasks) was released.
Kolab 2.2 was released, with full support of multiple mail domains, integrated Horde web front-end, updated base packages (OpenPKG, OpenLDAP, Cyrus IMAP, Postfix, Perl, Apache, PHP, etc.), easier integration in operating system distributions, and many other new features.
2007:
Kolab 2.1 was released.
A third Outlook connector was released.
2006:
Kolab 2.1 was designed with many significant enhancements over 2.0.
2005:
KolabSyncML ("Sync4j Kolab Connector / SyncSource"), a Kolab Java interface and Funambol connector, had its first alpha release.
Kolab 2.0 was released.
A second Outlook connector appeared on the market.
The SyncKolab project started developing a Mozilla Thunderbird / SeaMonkey and Lightning connector.
2004:
Aethera became a Kolab 1 groupware client.
Citadel/UX learned how to mimic a Kolab 1 groupware server.
Kolab 2 was designed as a general overhaul and implemented utilizing the versatile and extensible Kolab Open Format to store groupware data.
The Kroupware Client matured to KDE Kontact.
2003:
KMail and the KDE PIM software were enhanced, creating the Kroupware Client.
Kolab 1.0 was released.
The first Outlook connector was developed.
2002:
Kolab 1 / Kroupware was designed utilizing iCal and vCard formats to store calendar entries, contacts, notes, tasks etc. in Kolab's IMAP directories.
References
External links
Kolab Project home page
Kolab Systems AG home page (successor of the Kolab Konsortium)
Free email software
Free groupware
Free software programmed in C++
Free software programmed in Python
KDE software |
34584857 | https://en.wikipedia.org/wiki/Gina%20Raimondo | Gina Raimondo | Gina Marie Raimondo (; born May 17, 1971) is an American politician and venture capitalist serving since 2021 as the 40th and current United States Secretary of Commerce. A member of the Democratic Party, she previously served as the 75th and first female Governor of Rhode Island from 2015 to 2021.
She served as General Treasurer of Rhode Island from 2011 to 2015. She was selected as the Democratic candidate for Rhode Island's governorship in the 2014 election. Raimondo won the election on November 4, 2014, with 41% of the vote, in a three-way race, against the mayor of Cranston, Republican Allan Fung, and businessman Robert J. Healey. She won re-election on November 6, 2018. She resigned as Governor in March 2021 after being confirmed by the U.S. Senate to serve as the United States Secretary of Commerce. Raimondo was confirmed by a vote of 84 to 15.
Early life and education
Gina Marie Raimondo was born in 1971 in Smithfield, Rhode Island, where she later grew up. Of Italian descent, she is the youngest of Josephine (Piro) and Joseph Raimondo's three children. Her father, Joseph (1926–2014), made his career at the Bulova watch factory in Providence, Rhode Island. He became unemployed at 56 when the Bulova company decamped operations to China, shuttering the factory in Providence. Raimondo was a childhood friend of U.S. Senator Jack Reed. Raimondo graduated from LaSalle Academy, in Providence, as one of the first girls allowed to attend the Catholic school, where she was valedictorian.
Raimondo graduated with a Bachelor of Arts degree magna cum laude in economics from Harvard College in 1993, where she served on the staff of The Harvard Crimson. While at Harvard, she resided in Quincy House. She attended New College, Oxford, as a Rhodes Scholar, where she received a Master of Arts (MA) degree and, in 2002, a Doctor of Philosophy in sociology. Her thesis, on single motherhood, was supervised by Stephen Nickell and Anne H. Gauthier. Raimondo received her Juris Doctor degree from Yale Law School in 1998.
Early career
Following her graduation from Yale Law School, Raimondo served as a law clerk to federal judge Kimba Wood of the United States District Court for the Southern District of New York. Later, Raimondo acted as senior vice president for fund development at the Manhattan offices of Village Ventures, a venture capital firm based in Williamstown, Massachusetts, and backed by Bain Capital and Highland Capital Groups. Raimondo returned to Rhode Island in 2000 to co-found the state's first venture capital firm, Point Judith Capital. Point Judith subsequently relocated its offices to Boston, Massachusetts. At Point Judith, Raimondo served as a general partner covering health care investments; she retains some executive duties with the firm.
General Treasurer of Rhode Island (2011–2015)
On November 2, 2010, Raimondo was elected as general treasurer of Rhode Island by a margin of 62% to 38%.
During her first year as general treasurer, she prioritized reforming Rhode Island's public employee pension system, which was 48% funded in 2010. In April 2011, Raimondo led the state retirement board to reduce the state's assumed rate of return on pension investments from 8.25 percent to 7.5 percent. In May 2011, Raimondo released "Truth in Numbers", a report that advocated for benefit cuts as the solution to Rhode Island's pension problems, and she helped lead the effort to cut pensions, along with then-Speaker of the House Gordon Fox. The Rhode Island Retirement Security Act (RIRSA) was enacted by the General Assembly on November 17, 2012, with bipartisan support in both chambers. The next day, Lincoln Chafee signed RIRSA into law. The legality of RIRSA was challenged in court by the public employee unions, but a settlement was reached in June 2015.
Under Raimondo's tenure, the pension fund was criticized for underperforming when compared with its peers. Raimondo's critics attributed the underperformance to a sharp increase in fees paid to hedge fund managers while her supporters argued investments in hedge funds stabilize investments during market downturns for more consistent returns over time.
Municipalities
Raimondo created the Ocean State Investment Pool (OSIP), a low-cost investment vehicle intended to help the state and municipalities better manage and improve the investment performance of their liquid assets, which are used for day-to-day operations including payroll and operating expenses. $500 million in funds could be eligible for the program, which would enable Treasury "to extend its expertise to municipalities and improve investment returns by creating economies of scale". The program launched on April 23, 2012.
Payday lending
During the Rhode Island General Assembly's 2012 session, Raimondo advocated for a decrease in the maximum allowable interest rate on payday loans in Rhode Island. She hosted a roundtable discussion with then Providence mayor Angel Taveras and members of the Rhode Island Payday Reform Coalition. Raimondo submitted letters to the Senate and House Corporations Committees in support of payday reform legislation. She wrote "Far too many families are facing financial challenges that might be mitigated or avoided through a greater understanding of personal finance," and "payday loans exploit that lack of understanding... With numerous economic challenges, Rhode Island should not permit the sale of a financial product that traps so many customers in a cycle of debt." Raimondo wrote an op-ed in the edition of May 29, 2012 of The Providence Journal in support of payday lending reform.
Governor of Rhode Island (2015–2021)
Raimondo was elected governor of Rhode Island on November 4, 2014, winning 41% of the vote in a three-way race, defeating challengers Allan Fung (R) and Robert J. Healey of the Moderate Party. Raimondo is the first female governor of Rhode Island. At the time of her resignation, she was one of nine incumbent female governors of the United States.
When she ran for governor, Rhode Island had the nation's highest unemployment rate. Raimondo has cut taxes every year and removed eight thousand pages of regulations (thirty percent of the state's regulations). She raised the state minimum wage to $11.50, created a sick-leave entitlement, financed the largest infrastructure program in the state's history, appointed more judges of color than any previous Rhode Island governor, and made community colleges tuition-free.
Raimondo was elected to serve as the vice chair of the Democratic Governors Association for the 2018 election cycle. She was subsequently elected as chair of the Democratic Governors Association in 2019. Raimondo ran for and won reelection to a second term as Governor of Rhode Island in 2018, becoming the first candidate to secure a majority of votes for that office since 2006.
Approval ratings
Between assuming office and the end of 2019, Raimondo consistently ranked towards the bottom of approval ratings for all governors in the United States.
In April 2020, in the midst of the COVID-19 pandemic, Microsoft News conducted a poll to determine how well governors across the U.S. were handling mitigation of COVID-19. The poll found 76% of Rhode Islanders said they approved of the work done by Raimondo and her administration "to keep people safe" during the ongoing crisis. Partnering with CVS, the nation's largest pharmacy chain, headquartered in Woonsocket, her state has achieved one of the nation's highest per capita levels of testing for COVID-19. Her approval rating has soared during the pandemic.
The poll found majority support across all 50 states for how governors are handling the pandemic. Raimondo was tied with the governors of North Dakota and Utah for the 12th-highest rating.
Criticism
State Health and Human Services computer system failure
A widely criticized rollout of a new computer network system for the Rhode Island Executive Office of Health and Human Services, dubbed the "Unified Health Infrastructure Project" (UHIP), in September 2016 left scores of people without access to government programs such as food stamps and child care due to glitches in the software, designed by Deloitte. This computer crash created a backlog of more than 20,000 cases.
The Raimondo Administration received several letters from the federal government in August and September 2016 warning that UHIP was not ready to be launched. On Raimondo's orders, the UHIP launch occurred as planned despite these federal warnings. The U.S. Food and Nutrition Service's Northeast Regional Administrator, Kurt Messner, urged Raimondo to postpone the launch because it would interrupt or interfere with benefits the agency oversees. Messner said in the letter, which local news outlets described as "strongly worded", that "the transition plan remains inadequate and unacceptable." Messner also pointed out that the state had failed to launch UHIP gradually in phases or administer a live pilot test of UHIP. Messner opined that "Launching a system without having conducted a live pilot is against the intent of the regulations and against our best advice." The Raimondo Administration ultimately ignored the federal warnings, resulting in benefit delays, system downtime, and benefits lost in error.
In December 2016, the federal government gave the state Department of Human Services less than a month to fix the UHIP computer system or risk losing $13 million in federal funding. Federal officials judged that the state was not compliant in reducing a significant case backlog, establishing a sufficient call center, providing adequate staff training, and improving wait times at Health and Human Services field offices.
In February 2017, Executive Secretary of Health and Human Services Elizabeth H. Roberts resigned from her cabinet post in the Raimondo Administration due to the failed roll-out of the UHIP.
In March 2017, Rhode Island Monthly reported that the U.S. Department of Justice opened an investigation into UHIP, specifically false claims and statements made about the Health and Human Services computer network rollout. The investigation was still underway as of summer 2017. In an interview, House Oversight Chair Rep. Patricia Serpa (D-West Warwick) said, "There's plenty of blame to go around. The auditor's report found that [the contract with Deloitte] was poorly written, poorly overseen and poorly executed. They were warned against the implementation because the system was not ready. Not only did they implement it, they displaced all of the most senior workers with the wealth of experience. We pulled all the plugs to make sure this was a failure."
According to documents submitted to the federal government, the cost estimate for UHIP through 2021 is $656 million. State taxpayers will pay $154 million of this amount while the federal government will pay the remainder.
In January 2020, State Senator Sam Bell said a Rhode Island Senate Fiscal Report on Raimondo's budget proved that "a single UHIP update kicked 5,500 Rhode Islanders off their Medicaid" in November 2019 without due process and the decisions were based on a computer update. Bell went on: "Medicaid terminations need to be done with some due process. They should not come from a notoriously glitchy computer system. You should have a chance to fight the decision to rip away your health insurance. When you lose your Medicaid with no warning and no effort to transition you onto the exchange, the consequences can be deadly."
RI DCYF fatalities and near-fatalities
Under Raimondo, the Rhode Island Department of Children, Youth & Families has come under fire for the rate of deaths and near-deaths of children in its care. In a period between January 2016 and December 2017, there were 31 fatalities or near fatalities of children in its care, with eight being confirmed fatal.
Raimondo appointed Trista Piccola as her new DCYF director in January 2017. Piccola's term was marked by the death and near-deaths of children, high staff turnover, votes of no confidence, and high budget deficits. Rep. Patricia Serpa and Rep. Charlene Lima called for Piccola's resignation, which finally occurred in July 2019.
In October 2018, the United States Department of Health and Human Services' Administration for Children and Families ordered the Raimondo Administration DCYF to improve in 33 of 36 areas assessed. The federal report noted that DCYF services were "inadequate, not developed when needed, or lacked consistent monitoring". Harvard Kennedy School professor and former Obama Administration official Jeffrey Liebman agreed with the recommendations and analysis of the report from the U.S. Department of Health and Human Services and claimed that the DCYF is "the most messed-up agency ever".
With Piccola's departure, the interim director is DCYF executive legal counsel Kevin Aucoin. Aucoin has served in an interim director capacity twice before when DCYF was without a permanent Director. Secretary of the Rhode Island Executive Office of Health and Human Services and Raimondo cabinet member Womazetta Jones said in December 2019 that she was "very determined to stay the course of not hiring anybody unless it's the right person". As of December 2020 DCYF does not have a permanent Director.
During Raimondo's tenure as Governor, the Department of Children, Youth and Families (DCYF) has focused on shifting children from congregate settings to licensed foster homes. DCYF has increased its capacity and utilization of licensed foster homes, including an increase in the number of licensed kinship families, from 280 in October 2019 to 576 in June 2020. As of December 2020, 83% of all children placed in out-of-home care are placed in a foster home. Since 2015, the Department’s intensive reforms have resulted in a 43% reduction in the number of youth placed in congregate care and a 39% reduction in the number of youth placed in out-of-state congregate care. At the same time, the Department has increased the number of children placed in licensed foster family homes.
Bloomberg 2020 campaign involvement
In early February 2020, Raimondo appeared alongside former Republican New York City Mayor and Democratic presidential hopeful Michael Bloomberg at the Wexford Innovation Center in Providence to endorse his candidacy, a move she described as "an easy call". Raimondo was named a national co-chair for the Bloomberg campaign.
Press secretary Jennifer Bogdan Jones of the Governor's Office told The Providence Journal "[Raimondo] is prepared to do whatever it takes to support Mike and defeat President Trump." As campaign co-chair, Raimondo would have "provided advice and attended events". Less than a month later, however, Bloomberg dropped out of the race and endorsed former Vice President Joe Biden. On the same day, Raimondo also endorsed Biden. She said Bloomberg "obviously" performed poorly on the debate stage but supporting his candidacy "was an easy decision for me at the beginning. But [supporting Biden] is an easy decision, too." Raimondo concluded that it was now time "to unify behind Joe Biden".
Clash with New York Gov. Cuomo over COVID-19 quarantine
On March 28, 2020, New York Governor Andrew Cuomo threatened Raimondo with a lawsuit over a new state quarantine policy, which would make sure people from New York, which had been hit hard by the COVID-19 pandemic, would self-quarantine for 14 days upon arrival in Rhode Island. On March 29, Raimondo repealed the order that specifically referred to New Yorkers, and broadened it to include any out-of-state traveler entering Rhode Island with intent to stay.
Secretary of Commerce (2021–present)
Nomination
Following the 2020 United States presidential election, Raimondo was routinely mentioned as a possible cabinet secretary in the incoming Biden Administration. Though first seen as a likely Secretary of Health and Human Services, Raimondo announced on December 3, 2020, that she would not be taking that role. She was also considered for Secretary of the Treasury.
On January 7, 2021, Biden announced he would nominate Raimondo to serve as his Secretary of Commerce. She appeared before the Senate Committee on Commerce, Science and Transportation on January 26. On March 1, the Senate voted 84–15 in favor of cloture on the nomination, and confirmed Raimondo to the position the following day by a vote of 84–15.
Tenure
Raimondo was duly sworn in by Vice President Kamala Harris on March 3, 2021. In August 2021, Politico reported that Raimondo has become one of the "administration’s secret weapons on the Hill" due to her role in negotiating the Infrastructure Investment and Jobs Act (IIJA).
Global chip shortage
As Secretary of Commerce, Raimondo has helped lead the U.S. response to the global chip shortage, and has urged Congress to pass legislation that would boost domestic semiconductor manufacturing. Raimondo has argued that the chip shortage presents a national and economic security threat to U.S. interests.
Cybersecurity policy
During Raimondo's tenure, the Department of Commerce sanctioned NSO Group for selling spyware technology. As Secretary of Commerce, Raimondo has worked with other administration officials, such as Secretary of Homeland Security Alejandro Mayorkas, on coordinating cybersecurity policy.
Relations with China
In September 2021, Raimondo accused China of violating the intellectual property (IP) rights of U.S. companies, and stated that the Chinese government has put in place "all kinds of different barriers for American companies to do business in China." In October 2021, Raimondo was criticized by Senator Tom Cotton for stating that "there's no point in talking about decoupling" the U.S. economy from China's.
Other
Raimondo was the only Cabinet member not to attend Joe Biden's first State of the Union address on March 1, 2022, as she was chosen as the designated survivor.
Electoral history
Personal life and recognition
On December 1, 2001, Raimondo married Andrew Kind Moffit, in Providence. The couple have two children, Cecilia and Thompson Raimondo Moffit. The family resides on the east side of Providence. Raimondo is a practicing Roman Catholic.
Raimondo is a member of the Council on Foreign Relations and an Aspen Institute Rodel fellow. She was awarded an honorary degree from Bryant University, in 2012; and has received awards from the northern Rhode Island chamber of commerce and the YWCA of northern Rhode Island. Raimondo was elected alumni fellow at Yale, in 2014.
Community service
Raimondo serves as vice chair of the board of directors of Crossroads Rhode Island, the state's largest homeless services organization. Until 2011, she was an administrator of Women and Infants Hospital and chair of its Quality Committee. She has served on the boards of La Salle Academy and Family Service of Rhode Island.
See also
List of female governors in the United States
List of female United States Cabinet members
COVID-19 pandemic in Rhode Island
References
External links
Biography at the United States Department of Commerce
Campaign website
Inauguration Program from the Rhode Island State Archives
1971 births
21st-century American politicians
21st-century American women politicians
21st-century Roman Catholics
Alumni of New College, Oxford
American business executives
American politicians of Italian descent
American Rhodes Scholars
American venture capitalists
American women business executives
Biden administration cabinet members
Businesspeople from Rhode Island
Catholics from Rhode Island
Democratic Party state governors of the United States
Governors of Rhode Island
La Salle Academy alumni
Living people
Members of the Council on Foreign Relations
People from Smithfield, Rhode Island
Rhode Island Democrats
State treasurers of Rhode Island
The Harvard Crimson people
United States Secretaries of Commerce
Women in Rhode Island politics
Women members of the Cabinet of the United States
Women state governors of the United States
Yale Law School alumni |
38360943 | https://en.wikipedia.org/wiki/Milling%20%28machining%29 | Milling (machining) | Milling is the process of machining using rotary cutters to remove material by advancing a cutter into a workpiece. This may be done by varying the direction on one or several axes, cutter head speed, and pressure. Milling covers a wide variety of different operations and machines, on scales from small individual parts to large, heavy-duty gang milling operations. It is one of the most commonly used processes for machining custom parts to precise tolerances.
Milling can be done with a wide range of machine tools. The original class of machine tools for milling was the milling machine (often called a mill). After the advent of computer numerical control (CNC) in the 1960s, milling machines evolved into machining centers: milling machines augmented by automatic tool changers, tool magazines or carousels, CNC capability, coolant systems, and enclosures. Milling centers are generally classified as vertical machining centers (VMCs) or horizontal machining centers (HMCs).
The integration of milling into turning environments, and vice versa, began with live tooling for lathes and the occasional use of mills for turning operations. This led to a new class of machine tools, multitasking machines (MTMs), which are purpose-built to facilitate milling and turning within the same work envelope.
Process
Milling is a cutting process that uses a milling cutter to remove material from the surface of a work piece. The milling cutter is a rotary cutting tool, often with multiple cutting points. As opposed to drilling, where the tool is advanced along its rotation axis, the cutter in milling is usually moved perpendicular to its axis so that cutting occurs on the circumference of the cutter. As the milling cutter enters the work piece, the cutting edges (flutes or teeth) of the tool repeatedly cut into and exit from the material, shaving off chips (swarf) from the work piece with each pass. The cutting action is shear deformation; material is pushed off the work piece in tiny clumps that hang together to a greater or lesser extent (depending on the material) to form chips. This makes metal cutting somewhat different (in its mechanics) from slicing softer materials with a blade.
The milling process removes material by performing many separate, small cuts. This is accomplished by using a cutter with many teeth, spinning the cutter at high speed, or advancing the material through the cutter slowly; most often it is some combination of these three approaches. The speeds and feeds used are varied to suit a combination of variables. The speed at which the piece advances through the cutter is called feed rate, or just feed; it is most often measured as distance per time (inches per minute [in/min or ipm] or millimeters per minute [mm/min]), although distance per revolution or per cutter tooth are also sometimes used.
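The standard speeds-and-feeds relationships can be illustrated with a short calculation: spindle speed follows from the surface cutting speed and cutter diameter, and the table feed is the product of spindle speed, number of teeth, and feed per tooth. The sketch below applies these formulas to illustrative values; the cutting speed and chip load shown are assumptions for the example, not recommendations for any particular tool or material.

```python
import math

def spindle_rpm(cutting_speed_m_min: float, cutter_diameter_mm: float) -> float:
    """Spindle speed (rev/min) from surface cutting speed and cutter diameter."""
    return (cutting_speed_m_min * 1000.0) / (math.pi * cutter_diameter_mm)

def feed_rate_mm_min(rpm: float, teeth: int, chip_load_mm: float) -> float:
    """Table feed (mm/min) = spindle speed x number of teeth x feed per tooth."""
    return rpm * teeth * chip_load_mm

# Illustrative assumptions: 12 mm four-flute end mill, 100 m/min cutting speed,
# 0.05 mm feed per tooth. These are example numbers only.
rpm = spindle_rpm(cutting_speed_m_min=100.0, cutter_diameter_mm=12.0)
feed = feed_rate_mm_min(rpm, teeth=4, chip_load_mm=0.05)
print(f"spindle speed ~ {rpm:.0f} rev/min, feed rate ~ {feed:.0f} mm/min")
```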
There are two major classes of milling process:
In face milling, the cutting action occurs primarily at the end corners of the milling cutter. Face milling is used to cut flat surfaces (faces) into the work piece, or to cut flat-bottomed cavities.
In peripheral milling, the cutting action occurs primarily along the circumference of the cutter, so that the cross section of the milled surface ends up receiving the shape of the cutter. In this case the blades of the cutter can be seen as scooping out material from the work piece. Peripheral milling is well suited to the cutting of deep slots, threads, and gear teeth.
Milling cutters
Many different types of cutting tools are used in the milling process. Milling cutters such as end mills may have cutting surfaces across their entire end surface, so that they can be drilled into the work piece (plunging). Milling cutters may also have extended cutting surfaces on their sides to allow for peripheral milling. Tools optimized for face milling tend to have only small cutters at their end corners.
The cutting surfaces of a milling cutter are generally made of a hard and temperature-resistant material, so that they wear slowly. A low cost cutter may have surfaces made of high speed steel. More expensive but slower-wearing materials include cemented carbide. Thin film coatings may be applied to decrease friction or further increase hardness.
Cutting tools for milling machines or machining centers (and occasionally other machine tools) remove material either through their movement within the machine (e.g., a ball nose mill) or directly through the cutter's shape (e.g., a form tool such as a hobbing cutter).
As material passes through the cutting area of a milling machine, the blades of the cutter remove swarf at regular intervals. Surfaces cut by the side of the cutter (as in peripheral milling) therefore always contain regular ridges. The distance between ridges and the height of the ridges depend on the feed rate, the number of cutting surfaces, and the cutter diameter. With a narrow cutter and a rapid feed rate, these revolution ridges can amount to significant variations in the surface finish.
The face milling process can in principle produce very flat surfaces. However, in practice the result always shows visible trochoidal marks following the motion of points on the cutter's end face. These revolution marks give the characteristic finish of a face milled surface. Revolution marks can have significant roughness depending on factors such as flatness of the cutter's end face and the degree of perpendicularity between the cutter's rotation axis and feed direction. Often a final pass with a slow feed rate is used to improve the surface finish after the bulk of the material has been removed. In a precise face milling operation, the revolution marks will only be microscopic scratches due to imperfections in the cutting edge.
Gang milling refers to the use of two or more milling cutters mounted on the same arbor (that is, ganged) in a horizontal-milling setup. All of the cutters may perform the same type of operation, or each cutter may perform a different type of operation. For example, if several workpieces need a slot, a flat surface, and an angular groove, a good method to cut these (within a non-CNC context) would be gang milling. All the completed workpieces would be the same, and milling time per piece would be minimized.
Gang milling was especially important before the CNC era, because for duplicate-part production it was a substantial efficiency improvement over manually milling one feature per operation, then changing machines (or changing the setup of the same machine) to cut the next operation. Today, CNC mills with automatic tool change and 4- or 5-axis control have largely obviated gang-milling practice.
Equipment
Milling is performed with a milling cutter in various forms, held in a collet or similar tool holder which, in turn, is held in the spindle of a milling machine.
Types and nomenclature
Mill orientation is the primary classification for milling machines. The two basic configurations are vertical and horizontal, referring to the orientation of the rotating spindle upon which the cutter is mounted. However, there are alternative classifications according to method of control, size, purpose and power source.
Mill orientation
Vertical
In the vertical milling machine the spindle axis is vertically oriented. Milling cutters are held in the spindle and rotate on its axis. The spindle can generally be lowered (or the table can be raised, giving the same relative effect of bringing the cutter closer or deeper into the work), allowing plunge cuts and drilling. There are two subcategories of vertical mills: the bed mill and the turret mill.
A turret mill has a fixed spindle and the table is moved both perpendicular and parallel to the spindle axis to accomplish cutting. Some turret mills have a quill which allows the milling cutter (or a drill) to be raised and lowered in a manner similar to a drill press. This provides two methods of cutting in the vertical (Z) direction: by raising or lowering the quill, and by moving the knee.
In the bed mill, however, the table moves only perpendicular to the spindle's axis, while the spindle itself moves parallel to its own axis.
Turret mills are generally considered the more versatile of the two designs.
A third type also exists, a lighter, more versatile machine, called a mill-drill. The mill-drill is a close relative of the vertical mill and quite popular in light industry and with hobbyists. A mill-drill is similar in basic configuration to a very heavy drill press, but equipped with an X-Y table and a much larger column. They also typically use more powerful motors than a comparably sized drill press; most are multi-speed belt driven, with some models having a geared head or electronic speed control. They generally have quite heavy-duty spindle bearings to deal with the lateral loading on the spindle that is created by a milling operation. A mill-drill also typically raises and lowers the entire head, including motor, often on a dovetailed (sometimes round with rack and pinion) vertical column. A mill-drill also has a large quill that is generally locked during milling operations and released to facilitate drilling functions. Other differences that separate a mill-drill from a drill press may be a fine-tuning adjustment for the Z-axis, a more precise depth stop, the capability to lock the X, Y or Z axis, and often a system of tilting the head or the entire vertical column and powerhead assembly to allow angled cutting or drilling. Aside from size, the principal difference between these lighter machines and larger vertical mills is that the X-Y table is at a fixed elevation; the Z-axis is controlled by moving the head or quill down toward the X-Y table. A mill-drill typically has an internal taper fitting in the quill to take a collet chuck, face mills, or a Jacobs chuck, similar to the vertical mill.
Horizontal
A horizontal mill has the same sort of table, but the cutters are mounted on a horizontal spindle (see Arbor milling) across the table. Many horizontal mills also feature a built-in rotary table that allows milling at various angles; this feature is called a universal table. While endmills and the other types of tools available to a vertical mill may be used in a horizontal mill, their real advantage lies in arbor-mounted cutters, called side and face mills, which have a cross section rather like a circular saw, but are generally wider and smaller in diameter. Because the cutters have good support from the arbor and have a larger cross-sectional area than an end mill, quite heavy cuts can be taken, enabling rapid material removal rates. These are used to mill grooves and slots. Plain mills are used to shape flat surfaces. Several cutters may be ganged together on the arbor to mill a complex shape of slots and planes. Special cutters can also cut grooves, bevels, radii, or indeed any section desired. These specialty cutters tend to be expensive. Simplex mills have one spindle, and duplex mills have two. It is also easier to cut gears on a horizontal mill. Some horizontal milling machines are equipped with a power-take-off provision on the table. This allows the table feed to be synchronized to a rotary fixture, enabling the milling of spiral features such as hypoid gears.
Universal
A universal milling machine has the facility of either a horizontal spindle or a vertical spindle, the latter sometimes being on a two-axis turret enabling the spindle to be pointed in any direction desired. The two options may be driven independently or from one motor through gearing. In either case, as the work is generally placed in the same place for either type of operation, the mechanism for the method not being used is moved out of the way. In smaller machines, 'spares' may be lifted off, while larger machines offer a system to retract the parts not in use.
Comparative merits
The choice between vertical and horizontal spindle orientation in milling machine design usually hinges on the shape and size of a workpiece and the number of sides of the workpiece that require machining. Work in which the spindle's axial movement is normal to one plane, with an endmill as the cutter, lends itself to a vertical mill, where the operator can stand before the machine and have easy access to the cutting action by looking down upon it. Thus vertical mills are most favored for diesinking work (machining a mould into a block of metal). Heavier and longer workpieces lend themselves to placement on the table of a horizontal mill.
Prior to numerical control, horizontal milling machines evolved first, because they evolved by putting milling tables under lathe-like headstocks. Vertical mills appeared in subsequent decades, and accessories in the form of add-on heads to change horizontal mills to vertical mills (and later vice versa) have been commonly used. Even in the CNC era, a heavy workpiece needing machining on multiple sides lends itself to a horizontal machining center, while diesinking lends itself to a vertical one.
Alternative classifications
In addition to horizontal versus vertical, other distinctions are also important:
Variants
Bed mill This refers to any milling machine where the spindle is on a pendant that moves up and down to move the cutter into the work, while the table sits on a stout bed that rests on the floor. These are generally more rigid than a knee mill. Gantry mills can be included in this bed mill category.
Box mill or column mill Very basic hobbyist bench-mounted milling machines that feature a head riding up and down on a column or box way.
C-frame mill These are larger, industrial production mills. They feature a knee and fixed spindle head that is only mobile vertically. They are typically much more powerful than a turret mill, featuring a separate hydraulic motor for integral hydraulic power feeds in all directions, and a twenty to fifty horsepower motor. Backlash eliminators are almost always standard equipment. They use large NMTB 40 or 50 tooling. The tables on C-frame mills are usually 18" by 68" or larger, to allow multiple parts to be machined at the same time.
Floor mill These have a row of rotary tables, and a horizontal pendant spindle mounted on a set of tracks that runs parallel to the table row. These mills have predominantly been converted to CNC, but some can still be found (if one can even find a used machine available) under manual control. The spindle carriage moves to each individual table, performs the machining operations, and moves to the next table while the previous table is being set up for the next operation. Unlike other mills, floor mills have movable floor units. A crane drops massive rotary tables, X-Y tables, etc., into position for machining, allowing large and complex custom milling operations.
Gantry mill The milling head rides over two rails (often steel shafts) which lie at each side of the work surface. Due to its design it usually has a very small footprint compared to the machine travel size. As a downside, gantry mills are usually not as rigid as, for example, C-frame mills.
Horizontal boring mill Large, accurate bed horizontal mills that incorporate many features from various machine tools. They are predominantly used to create large manufacturing jigs, or to modify large, high precision parts. They have a spindle stroke of several (usually between four and six) feet, and many are equipped with a tailstock to perform very long boring operations without losing accuracy as the bore increases in depth. A typical bed has X and Y travel, and is between three and four feet square with a rotary table or a larger rectangle without a table. The pendant usually provides between four and eight feet of vertical movement. Some mills have a large (30" or more) integral facing head. Right angle rotary tables and vertical milling attachments are available for further flexibility.
Jig borer Vertical mills that are built to bore holes, and very light slot or face milling. They are typically bed mills with a long spindle throw. The beds are more accurate, and the handwheels are graduated down to .0001" for precise hole placement.
Knee mill or knee-and-column mill refers to any milling machine whose x-y table rides up and down the column on a vertically adjustable knee. This includes Bridgeports.
Planer-style mill (Plano milling) Large mills built in the same configuration as planers, except with a milling spindle instead of a planing head. This term is growing dated as planers themselves are largely a thing of the past.
Ram-type mill This can refer to any mill that has a cutting head mounted on a sliding ram. The spindle can be oriented either vertically or horizontally. In practice most mills with rams also involve swiveling ability, whether or not it is called "turret" mounting. The Bridgeport configuration can be classified as a vertical-head ram-type mill. Van Norman specialized in ram-type mills through most of the 20th century. Since the wide dissemination of CNC machines, ram-type mills are still made in the Bridgeport configuration (with either manual or CNC control), but the less common variations (such as were built by Van Norman, Index, and others) have died out, their work being done now by either Bridgeport-form mills or machining centers.
Turret mill More commonly referred to as Bridgeport-type milling machines. The spindle can be aligned in many different positions for a very versatile, if somewhat less rigid machine.
Alternative terminology
A milling machine is often called a mill by machinists. The archaic term miller was commonly used in the 19th and early 20th centuries.
Since the 1960s there has developed an overlap of usage between the terms milling machine and machining center. NC/CNC machining centers evolved from milling machines, which is why the terminology evolved gradually with considerable overlap that still persists. The distinction, when one is made, is that a machining center is a mill with features that pre-CNC mills never had, especially an automatic tool changer (ATC) that includes a tool magazine (carousel), and sometimes an automatic pallet changer (APC). In typical usage, all machining centers are mills, but not all mills are machining centers; only mills with ATCs are machining centers.
Computer numerical control
Most CNC milling machines (also called machining centers) are computer controlled vertical mills with the ability to move the spindle vertically along the Z-axis. This extra degree of freedom permits their use in diesinking, engraving applications, and 2.5D surfaces such as relief sculptures. When combined with the use of conical tools or a ball nose cutter, it also significantly improves milling precision without impacting speed, providing a cost-efficient alternative to most flat-surface hand-engraving work.
CNC machines can exist in virtually any of the forms of manual machinery, like horizontal mills. The most advanced CNC milling machines, multiaxis machines, add two more axes in addition to the three normal axes (XYZ). Horizontal milling machines also have a C or Q axis, allowing the horizontally mounted workpiece to be rotated, essentially allowing asymmetric and eccentric turning. The fifth axis (B axis) controls the tilt of the tool itself. When all of these axes are used in conjunction with each other, extremely complicated geometries, even organic geometries such as a human head, can be made with relative ease on these machines. But the skill to program such geometries is beyond that of most operators. Therefore, 5-axis milling machines are practically always programmed with CAM.
The control system of such machines is a closed-loop system that functions on feedback.
These machines developed from basic NC (numerical control) machines; the computerized form of NC is known as CNC. A set of instructions (called a program) is used to guide the machine through the desired operations. Some very commonly used codes in such a program are:
G00 – rapid traverse
G01 – linear interpolation of tool.
G21 – dimensions in metric units.
M03/M04 – spindle start (clockwise/counter clockwise).
T01 M06 – automatic tool change to tool 1
M30 – program end.
Various other codes are also used. A CNC machine is operated by a single operator, called a programmer, and is capable of performing various operations automatically and economically. A minimal illustrative program is sketched below.
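As an illustration only, the following sketch assembles the text of a minimal program from the codes listed above. The tool, spindle speed, feed and coordinates are placeholder assumptions, and the exact syntax accepted varies between controllers.

```python
def simple_slot_program(depth_mm: float, length_mm: float,
                        feed_mm_min: float, rpm: int) -> list:
    """Assemble the text of a minimal CNC program that mills one straight slot.

    Only the codes listed above are used; all values are placeholders, and the
    exact syntax accepted varies between controllers.
    """
    return [
        "T01 M06",                           # automatic tool change to tool 1
        "G21",                               # dimensions in metric units
        f"M03 S{rpm}",                       # spindle start, clockwise
        "G00 X0 Y0 Z5",                      # rapid traverse to the start point, 5 mm above the work
        f"G01 Z-{depth_mm} F{feed_mm_min}",  # feed down to cutting depth (linear interpolation)
        f"G01 X{length_mm}",                 # linear cut along the slot
        "G00 Z5",                            # rapid retract clear of the work
        "M30",                               # program end
    ]

print("\n".join(simple_slot_program(depth_mm=2.0, length_mm=50.0,
                                    feed_mm_min=300.0, rpm=1200)))
```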
With the declining price of computers and open source CNC software, the entry price of CNC machines has plummeted.
Tooling
The accessories and cutting tools used on machine tools (including milling machines) are referred to in aggregate by the mass noun "tooling". There is a high degree of standardization of the tooling used with CNC milling machines, and a lesser degree with manual milling machines. To simplify the organization of the tooling in CNC production, many companies use a tool management solution.
Milling cutters for specific applications are held in various tooling configurations.
CNC milling machines nearly always use SK (or ISO), CAT, BT or HSK tooling. SK tooling is the most common in Europe, while CAT tooling, sometimes called V-Flange Tooling, is the oldest and probably most common type in the USA. CAT tooling was invented by Caterpillar Inc. of Peoria, Illinois, in order to standardize the tooling used on their machinery. CAT tooling comes in a range of sizes designated as CAT-30, CAT-40, CAT-50, etc. The number refers to the Association for Manufacturing Technology (formerly the National Machine Tool Builders Association (NMTB)) taper size of the tool.
An improvement on CAT Tooling is BT Tooling, which looks similar and can easily be confused with CAT tooling. Like CAT Tooling, BT Tooling comes in a range of sizes and uses the same NMTB body taper. However, BT tooling is symmetrical about the spindle axis, which CAT tooling is not. This gives BT tooling greater stability and balance at high speeds. One other subtle difference between these two toolholders is the thread used to hold the pull stud. CAT Tooling is all Imperial thread and BT Tooling is all Metric thread. Note that this affects the pull stud only; it does not affect the tool that they can hold. Both types of tooling are sold to accept both Imperial and metric sized tools.
SK and HSK tooling, sometimes called "Hollow Shank Tooling", is much more common in Europe where it was invented than it is in the United States. It is claimed that HSK tooling is even better than BT Tooling at high speeds. The holding mechanism for HSK tooling is placed within the (hollow) body of the tool and, as spindle speed increases, it expands, gripping the tool more tightly with increasing spindle speed. There is no pull stud with this type of tooling.
For manual milling machines, there is less standardization, because a greater plurality of formerly competing standards exist. Newer and larger manual machines usually use NMTB tooling. This tooling is somewhat similar to CAT tooling but requires a drawbar within the milling machine. Furthermore, there are a number of variations with NMTB tooling that make interchangeability troublesome. The older a machine, the greater the plurality of standards that may apply (e.g., Morse, Jarno, Brown & Sharpe, Van Norman, and other less common builder-specific tapers). However, two standards that have seen especially wide usage are the Morse #2 and the R8, whose prevalence was driven by the popularity of the mills built by Bridgeport Machines of Bridgeport, Connecticut. These mills so dominated the market for such a long time that "Bridgeport" is virtually synonymous with "manual milling machine". Most of the machines that Bridgeport made between 1938 and 1965 used a Morse taper #2, and from about 1965 onward most used an R8 taper.
Accessories
Arbor support
Stop block
CNC pocket milling
Pocket milling has been regarded as one of the most widely used operations in machining. It is extensively used in the aerospace and shipyard industries. In pocket milling, the material inside an arbitrary closed boundary on a flat surface of a work piece is removed to a fixed depth. Generally, flat-bottom end mills are used for pocket milling. First, a roughing operation is done to remove the bulk of the material, and then the pocket is finished with a finishing end mill.
Most industrial milling operations can be handled by 2.5-axis CNC milling. This type of path control can machine up to 80% of all mechanical parts. Because pocket milling is so widely used, effective pocketing approaches can significantly reduce machining time and cost.
NC pocket milling can be carried out mainly by two tool paths, viz. linear and non-linear.
Linear tool path
In this approach, the tool movement is unidirectional. Zig-zag and zig tool paths are examples of linear tool paths.
Zig-zag
In zig-zag milling, material is removed both in forward and backward paths. In this case, cutting is done both with and against the rotation of the spindle. This reduces the machining time but increases machine chatter and tool wear.
Zig
In zig milling, the tool moves only in one direction. The tool has to be lifted and retracted after each cut, which increases machining time. However, zig milling produces a better surface quality. A simple generation scheme for such paths is sketched below.
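For a simple rectangular pocket, a zig-zag path can be generated by stepping across the pocket and alternating the cutting direction on each pass, as in the following sketch. It is a geometric illustration only and ignores tool-radius compensation, entry moves and depth passes.

```python
def zigzag_path(width: float, height: float, stepover: float):
    """Way-points (x, y) of a zig-zag path over a rectangular width x height pocket.

    The cutting direction alternates on each pass (zig-zag); a plain "zig" path
    would instead retract and return to the same side before every pass.
    """
    points, y, left_to_right = [], 0.0, True
    while y <= height:
        x_start, x_end = (0.0, width) if left_to_right else (width, 0.0)
        points.append((x_start, y))
        points.append((x_end, y))
        left_to_right = not left_to_right
        y += stepover
    return points

for x, y in zigzag_path(width=40.0, height=20.0, stepover=5.0):
    print(f"X{x:.1f} Y{y:.1f}")
```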
Non-linear tool path
In this approach, tool movement is multi-directional. One example of non-linear tool path is contour-parallel tool path.
Contour-parallel
In this approach, the required pocket boundary is used to derive the tool path. In this case the cutter is always in contact with the work material, so the idle time spent in positioning and retracting the tool is avoided. For large-scale material removal, the contour-parallel tool path is widely used because the up-cut or down-cut method can be applied consistently during the entire process. There are three different approaches that fall into the category of contour-parallel tool path generation. They are:
Pair-wise intersection approach: In the pair-wise intersection approach, the boundary of the pocket is offset inwards in steps. The offset segments will intersect at concave corners, and to obtain the required contour these intersections have to be trimmed off. At convex corners, on the other hand, the offset segments are extended and thereby connected to make the contour. These operations (offsetting, trimming and extending) are repeated until the entire machining volume is covered with a sufficient number of profile layers.
Voronoi diagram approach: In the Voronoi diagram approach, the pocket boundary is segmented and a Voronoi diagram is constructed for the entire pocket boundary. This diagram is then used to generate the tool path for machining. The method is considered more efficient and robust; moreover, it avoids the topological problems associated with traditional offsetting algorithms. For the simplest case of a rectangular pocket, the offsetting idea behind these approaches is sketched below.
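The offsetting idea can be illustrated for a rectangular pocket, where the inward offsets are simply concentric rectangles and no trimming, extending or Voronoi machinery is needed. The sketch below generates such contours; arbitrary boundaries require the full approaches described above.

```python
def contour_parallel_rectangle(width: float, height: float, stepover: float):
    """Concentric rectangular contours, offset inwards from the pocket boundary.

    A plain rectangle needs no trimming or extending; for an arbitrary boundary
    the offset segments would also have to be trimmed at concave corners and
    extended at convex corners, as described above.
    """
    contours, offset = [], 0.0
    while 2 * offset < width and 2 * offset < height:
        contours.append([
            (offset, offset),
            (width - offset, offset),
            (width - offset, height - offset),
            (offset, height - offset),
            (offset, offset),            # close the loop
        ])
        offset += stepover
    return contours

for i, contour in enumerate(contour_parallel_rectangle(40.0, 20.0, 5.0)):
    print(f"pass {i}: {contour}")
```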
Curvilinear
In this approach, the tool travels along a gradually evolving spiral path. The spiral starts at the center of the pocket to be machined and the tool gradually moves towards the pocket boundary. The direction of the tool path changes progressively and local acceleration and deceleration of the tool are minimized. This reduces tool wear.
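A curvilinear path of this kind can be approximated by an Archimedean spiral sampled from the pocket centre outwards, as in the sketch below. It is purely illustrative; a production tool path would also have to respect the actual pocket boundary and the tool radius.

```python
import math

def spiral_path(max_radius: float, stepover: float, points_per_turn: int = 36):
    """Way-points of an Archimedean spiral whose radius grows by `stepover` per turn.

    The path starts at the pocket centre and works outwards, approximating the
    curvilinear tool path described above.
    """
    points, theta = [], 0.0
    d_theta = 2 * math.pi / points_per_turn
    while True:
        r = stepover * theta / (2 * math.pi)   # radius after theta radians of travel
        if r > max_radius:
            break
        points.append((r * math.cos(theta), r * math.sin(theta)))
        theta += d_theta
    return points

print(len(spiral_path(max_radius=15.0, stepover=3.0)), "way-points generated")
```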
History
1780-1810
Milling machines evolved from the practice of rotary filing—that is, running a circular cutter with file-like teeth in the headstock of a lathe. Rotary filing and, later, true milling were developed to reduce time and effort spent hand-filing. The full story of milling machine development may never be known, because much early development took place in individual shops where few records were kept for posterity. However, the broad outlines are known, as summarized below. From a history-of-technology viewpoint, it is clear that the naming of this new type of machining with the term "milling" was an extension from that word's earlier senses of processing materials by abrading them in some way (cutting, grinding, crushing, etc.).
Rotary filing long predated milling. A rotary file by Jacques de Vaucanson, circa 1760, is well known.
In 1783 Samuel Rehe invented a true milling machine. In 1795, Eli Terry began using a milling machine at Plymouth, Connecticut, in the production of tall case clocks. With the use of his milling machine, Terry was the first to achieve interchangeable parts in the clock industry. Milling wooden parts was effective for producing interchangeable parts, but inefficient in yield: milling wooden blanks gave a low yield of parts because the machine's single blade would cause loss of gear teeth when the cutter hit parallel grains in the wood. Terry later invented a spindle cutting machine to mass-produce parts in 1807. Other Connecticut clockmakers, like James Harrison of Waterbury, Thomas Barnes of Litchfield, and Gideon Roberts of Bristol, also used milling machines to produce their clocks.
1810s–1830s
It is clear that milling machines as a distinct class of machine tool (separate from lathes running rotary files) first appeared between 1814 and 1818. The centers of earliest development of true milling machines were two federal armories of the U.S. (Springfield and Harpers Ferry) together with the various private armories and inside contractors that shared turnover of skilled workmen with them.
Between 1912 and 1916, Joseph W. Roe, a respected founding father of machine tool historians, credited Eli Whitney (one of the private arms makers mentioned above) with producing the first true milling machine. By 1918, he considered it "Probably the first milling machine ever built—certainly the oldest now in existence […]." However, subsequent scholars, including Robert S. Woodbury and others, have improved upon Roe's early version of the history and suggest that just as much credit—in fact, probably more—belongs to various other inventors, including Robert Johnson of Middletown, Connecticut; Captain John H. Hall of the Harpers Ferry armory; Simeon North of the Staddle Hill factory in Middletown; Roswell Lee of the Springfield armory; and Thomas Blanchard. (Several of the men mentioned above are sometimes described on the internet as "the inventor of the first milling machine" or "the inventor of interchangeable parts". Such claims are oversimplified, as these technologies evolved over time among many people.)
Peter Baida, citing Edward A. Battison's article "Eli Whitney and the Milling Machine," which was published in the Smithsonian Journal of History in 1966, exemplifies the dispelling of the "Great Man" image of Whitney by historians of technology working in the 1950s and 1960s. He quotes Battison as concluding that "There is no evidence that Whitney developed or used a true milling machine." Baida says, "The so-called Whitney machine of 1818 seems actually to have been made after Whitney's death in 1825." Baida cites Battison's suggestion that the first true milling machine was made not by Whitney, but by Robert Johnson of Middletown.
The late teens of the 19th century were a pivotal time in the history of machine tools, as the period of 1814 to 1818 is also the period during which several contemporary pioneers (Fox, Murray, and Roberts) were developing the planer, and as with the milling machine, the work being done in various shops was undocumented for various reasons (partially because of proprietary secrecy, and also simply because no one was taking down records for posterity).
James Nasmyth built a milling machine very advanced for its time between 1829 and 1831. It was tooled to mill the six sides of a hex nut that was mounted in a six-way indexing fixture.
A milling machine built and used in the shop of Gay & Silver (aka Gay, Silver, & Co) in the 1830s was influential because it employed a better method of vertical positioning than earlier machines. For example, Whitney's machine (the one that Roe considered the very first) and others did not make provision for vertical travel of the knee. Evidently, the workflow assumption behind this was that the machine would be set up with shims, vise, etc. for a certain part design, and successive parts did not require vertical adjustment (or at most would need only shimming). This indicates that early thinking about milling machines was as production and not as toolroom machines.
In these early years, milling was often viewed as only a roughing operation to be followed by finishing with a hand file. The idea of reducing hand filing was more important than replacing it.
1840s–1860
Some of the key men in milling machine development during this era included Frederick W. Howe, Francis A. Pratt, Elisha K. Root, and others. (These same men during the same era were also busy developing the state of the art in turret lathes. Howe's experience at Gay & Silver in the 1840s acquainted him with early versions of both machine tools. His machine tool designs were later built at Robbins & Lawrence, the Providence Tool Company, and Brown & Sharpe.) The most successful milling machine design to emerge during this era was the Lincoln miller, which rather than being a specific make and model of machine tool is truly a family of tools built by various companies on a common configuration over several decades. It took its name from the first company to put one on the market, George S. Lincoln & Company (formerly the Phoenix Iron Works), whose first one was built in 1855 for the Colt armory.
During this era there was a continued blind spot in milling machine design, as various designers failed to develop a truly simple and effective means of providing slide travel in all three of the archetypal milling axes (X, Y, and Z—or as they were known in the past, longitudinal, traverse, and vertical). Vertical positioning ideas were either absent or underdeveloped. The Lincoln miller's spindle could be raised and lowered, but the original idea behind its positioning was to be set up in position and then run, as opposed to being moved frequently while running. Like a turret lathe, it was a repetitive-production machine, with each skilled setup followed by extensive fairly low skill operation.
1860s
In 1861, Frederick W. Howe, while working for the Providence Tool Company, asked Joseph R. Brown of Brown & Sharpe for a solution to the problem of milling spirals, such as the flutes of twist drills. These were usually filed by hand at the time. (Helical planing existed but was by no means common.) Brown designed a "universal milling machine" that, starting from its first sale in March 1862, was wildly successful. It solved the problem of 3-axis travel (i.e., the axes that we now call XYZ) much more elegantly than had been done in the past, and it allowed for the milling of spirals using an indexing head fed in coordination with the table feed. The term "universal" was applied to it because it was ready for any kind of work, including toolroom work, and was not as limited in application as previous designs. (Howe had designed a "universal miller" in 1852, but Brown's of 1861 is the one considered a groundbreaking success.)
Brown also developed and patented (1864) the design of formed milling cutters in which successive sharpenings of the teeth do not disturb the geometry of the form.
The advances of the 1860s opened the floodgates and ushered in modern milling practice.
1870s to World War I
In these decades, Brown & Sharpe and the Cincinnati Milling Machine Company dominated the American milling machine field. However, hundreds of other firms also built milling machines at the time, and many were significant in various ways. Besides a wide variety of specialized production machines, the archetypal multipurpose milling machine of the late 19th and early 20th centuries was a heavy knee-and-column horizontal-spindle design with power table feeds, indexing head, and a stout overarm to support the arbor. The evolution of machine design was driven not only by inventive spirit but also by the constant evolution of milling cutters that saw milestone after milestone from 1860 through World War I.
World War I and interwar period
Around the end of World War I, machine tool control advanced in various ways that laid the groundwork for later CNC technology. The jig borer popularized the ideas of coordinate dimensioning (dimensioning of all locations on the part from a single reference point); working routinely in "tenths" (ten-thousandths of an inch, 0.0001") as an everyday machine capability; and using the control to go straight from drawing to part, circumventing jig-making. In 1920 the new tracer design of J.C. Shaw was applied to Keller tracer milling machines for die sinking via the three dimensional copying of a template. This made die sinking faster and easier just as dies were in higher demand than ever before, and was very helpful for large steel dies such as those used to stamp sheets in automobile manufacturing. Such machines translated the tracer movements to input for servos that worked the machine leadscrews or hydraulics. They also spurred the development of antibacklash leadscrew nuts. All of the above concepts were new in the 1920s but became routine in the NC/CNC era. By the 1930s, incredibly large and advanced milling machines existed, such as the Cincinnati Hydro-Tel, that presaged today's CNC mills in every respect except for CNC control itself.
Bridgeport milling machine
In 1936, Rudolph Bannow (1897–1962) conceived of a major improvement to the milling machine. His company commenced manufacturing a new knee-and-column vertical mill in 1938. This was the Bridgeport milling machine, often called a ram-type or turret-type mill because its head has sliding-ram and rotating-turret mounting. The machine became so popular that many other manufacturers created copies and variants. Furthermore, its name came to connote any such variant. The Bridgeport offered enduring advantages over previous models. It was small enough, light enough, and affordable enough to be a practical acquisition for even the smallest machine shop businesses, yet it was also smartly designed, versatile, well-built, and rigid. Its various directions of sliding and pivoting movement allowed the head to approach the work from any angle. The Bridgeport's design became the dominant form for manual milling machines used by several generations of small- and medium-enterprise machinists. By the 1980s an estimated quarter-million Bridgeport milling machines had been built, and they (and their clones) are still being produced today.
1940s–1970s
By 1940, automation via cams, such as in screw machines and automatic chuckers, had already been very well developed for decades. Beginning in the 1930s, ideas involving servomechanisms had been in the air, but it was especially during and immediately after World War II that they began to germinate (see also Numerical control > History). These were soon combined with the emerging technology of digital computers. This technological development milieu, spanning from the immediate pre–World War II period into the 1950s, was powered by the military capital expenditures that pursued contemporary advancements in the directing of gun and rocket artillery and in missile guidance—other applications in which humans wished to control the kinematics/dynamics of large machines quickly, precisely, and automatically. Sufficient R&D spending probably would not have happened within the machine tool industry alone; but it was for the latter applications that the will and ability to spend was available. Once the development was underway, it was eagerly applied to machine tool control in one of the many post-WWII instances of technology transfer.
In 1952, numerical control reached the developmental stage of laboratory reality. The first NC machine tool was a Cincinnati Hydrotel milling machine retrofitted with a scratch-built NC control unit. It was reported in Scientific American, just as another groundbreaking milling machine, the Brown & Sharpe universal, had been in 1862.
During the 1950s, numerical control moved slowly from the laboratory into commercial service. For its first decade, it had rather limited impact outside of aerospace work. But during the 1960s and 1970s, NC evolved into CNC, data storage and input media evolved, computer processing power and memory capacity steadily increased, and NC and CNC machine tools gradually disseminated from an environment of huge corporations and mainly aerospace work to the level of medium-sized corporations and a wide variety of products. NC and CNC's drastic advancement of machine tool control deeply transformed the culture of manufacturing. The details (which are beyond the scope of this article) have evolved immensely with every passing decade.
1980s–present
Computers and CNC machine tools continue to develop rapidly. The personal computer revolution had a great impact on this development. By the late 1980s small machine shops had desktop computers and CNC machine tools. Soon after, hobbyists, artists, and designers began obtaining CNC mills and lathes. Manufacturers have started producing economically priced CNC machines small enough to sit on a desktop which can cut materials softer than stainless steel at high resolution. They can be used to make anything from jewelry to printed circuit boards to gun parts, even fine art.
Standards
National and international standards are used to standardize the definitions, environmental requirements, and test methods used for milling. Selection of the standard to be used is an agreement between the supplier and the user and has some significance in the design of the mill. In the United States, ASME has developed the standards B5.45-1972 Milling Machines and B94.19-1997 Milling Cutters and End Mills.
General tolerances include: +/-0.005" for local tolerances across most geometries, +/-0.010" for plastics with variation depending on the size of the part, 0.030" minimum wall thickness for metals, and 0.060" minimum wall thickness for plastics.
See also
Arbor milling
CNC router
Cryomilling
Electrical discharge machining
Milling cutter
Multiaxis machining
Photochemical machining
Printed circuit board milling
Router (woodworking)
3D printing
References
Notes
Bibliography
.
.
Further reading
Computer-aided engineering
Machine tools
Metalworking
Metalworking terminology |
35409949 | https://en.wikipedia.org/wiki/OpenShift | OpenShift | OpenShift is a family of containerization software products developed by Red Hat. Its flagship product is the OpenShift Container Platform — an on-premises platform as a service built around Linux containers orchestrated and managed by Kubernetes on a foundation of Red Hat Enterprise Linux. The family's other products provide this platform through different environments: OKD serves as the community-driven upstream (akin to the way that Fedora is upstream of Red Hat Enterprise Linux), OpenShift Online is the platform offered as software as a service, and OpenShift Dedicated is the platform offered as a managed service.
The OpenShift Console has developer and administrator oriented views. Administrator views allow one to monitor container resources and container health, manage users, work with operators, etc. Developer views are oriented around working with application resources within a namespace. OpenShift also provides a CLI that supports a superset of the actions that the Kubernetes CLI provides.
History
OpenShift originally came from Red Hat's acquisition of Makara, a company marketing a platform as a service (PaaS) based on Linux containers, in November 2010.
OpenShift was announced in May 2011 as proprietary technology and did not become open-source until May 2012. Up until v3, OpenShift relied on custom-developed container and container-orchestration technologies. This changed in v3 with the adoption of Docker as the container technology and Kubernetes as the container orchestration technology. The v4 product has many other architectural changes, a prominent one being a shift to using CRI-O as the container runtime (and Podman for interacting with pods and containers) and Buildah as the container build tool, thus breaking the exclusive dependency on Docker.
Architecture
The main difference between OpenShift and vanilla Kubernetes is the concept of build-related artifacts. In OpenShift, such artifacts are considered first-class Kubernetes resources upon which standard Kubernetes operations can apply. OpenShift's client program, "oc", offers a superset of the standard capabilities bundled in the mainline "kubectl" client program of Kubernetes. Using this client, one can directly interact with the build-related resources using sub-commands (such as "new-build" or "start-build"). In addition to this, an OpenShift-native pod build technology called Source-to-Image (S2I) is available out of the box, though this is slowly being phased out in favor of Tekton, which is a cloud-native way of building and deploying to Kubernetes. For the OpenShift platform, this provides capabilities equivalent to what Jenkins can do.
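Because build-related artifacts are ordinary Kubernetes resources, they can also be read with standard Kubernetes tooling rather than only through "oc". The sketch below uses the generic custom-objects API of the official Kubernetes Python client to list BuildConfig objects; the namespace name is an assumption for the example, and build.openshift.io/v1 is the API group and version OpenShift commonly serves build resources under.

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig (assumes a prior "oc login" or
# equivalent has written one).
config.load_kube_config()

api = client.CustomObjectsApi()

# BuildConfig is an OpenShift-specific resource, but it is served through the
# ordinary Kubernetes API machinery, so the generic custom-objects API can list it.
build_configs = api.list_namespaced_custom_object(
    group="build.openshift.io",   # API group OpenShift uses for build resources
    version="v1",
    namespace="myproject",        # assumed project/namespace name
    plural="buildconfigs",
)

for item in build_configs.get("items", []):
    print(item["metadata"]["name"])
```

The same resources can equally be created or started with the "oc" sub-commands mentioned above.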
Some other differences when OpenShift is compared to Kubernetes:
The v4 product line uses the CRI-O runtime - which means that docker daemons are not present on the master or worker nodes. This improves the security posture of the cluster.
The out-of-the-box install of OpenShift comes with an image repository.
ImageStreams (a sequence of pointers to images which can be associated with deployments) and Templates (a packaging mechanism for application components) are unique to OpenShift and simplify application deployment and management.
The "new-app" command which can be used to initiate an application deployment automatically applies the app label (with the value of the label taken from the --name argument) to all resources created as a result of the deployment. This can simplify the management of application resources.
In terms of platforms, OpenShift used to be limited to Red Hat's own offerings, but with OpenShift 4 it supports (as of 2020) other platforms such as AWS, IBM Cloud, vSphere, and bare-metal deployments.
OpenShift's implementation of Deployment, called DeploymentConfig, is logic-based, in comparison to Kubernetes' controller-based Deployment objects. As of v4.5, OpenShift is steering more towards Deployments by changing the default behavior of its CLI.
An embedded OperatorHub. This is a web GUI where users can browse and install a library of Kubernetes Operators that have been packaged for easy lifecycle management. These include Red Hat authored Operators, Red Hat Certified Operators and Community Operators
OpenShift tightly controls the operating systems used. The "master" components have to be running Red Hat CoreOS. This level of control enables the cluster to support upgrades and patches of the master nodes with minimal effort. The worker nodes can run other variants of Linux or even Windows.
OpenShift introduced the concept of routes - points of traffic ingress into the Kubernetes cluster. The Kubernetes ingress concept was modeled after this.
OpenShift includes other software such as application runtimes as well as infrastructure components from the Kubernetes ecosystem. For example, for observability needs, Prometheus, Hawkular, and Istio (and their dependencies) are included. The Red Hat branding of Istio is called Red Hat Service Mesh, and is based on an open-source project called Maistra, which aligns base Istio to the needs of open-source OpenShift.
Products
OpenShift Container Platform
OpenShift Container Platform (formerly known as OpenShift Enterprise) is Red Hat's on-premises private platform as a service product, built around application containers powered by Docker, with orchestration and management provided by Kubernetes, on Red Hat Enterprise Linux and Container Linux (formerly known as CoreOS or RHCOS).
OKD
OKD, known until August 2018 as OpenShift Origin (Origin Community Distribution) is the upstream community project used in OpenShift Online, OpenShift Dedicated, and OpenShift Container Platform. Built around a core of Docker container packaging and Kubernetes container cluster management, OKD is augmented by application lifecycle management functionality and DevOps tooling. OKD provides an open source application container platform. All source code for the OKD project is available under the Apache License (Version 2.0) on GitHub.
Red Hat OpenShift Online
Red Hat OpenShift Online (RHOO) is Red Hat's public cloud application development and hosting service which runs on AWS and IBM Cloud.
Online offered version 2 of the OKD project source code, which is also available under the Apache License Version 2.0. This version supported a variety of languages, frameworks, and databases via pre-built "cartridges" running under resource-quota "gears". Developers could add other languages, databases, or components via the OpenShift Cartridge application programming interface. This was deprecated in favour of OpenShift 3 and was withdrawn on 30 September 2017 for non-paying customers and 31 December 2017 for paying customers.
OpenShift 3 is built around Kubernetes. It can run any Docker-based container, but Openshift Online is limited to running containers that do not require root.
Red Hat OpenShift 4 for IBM Z and IBM LinuxONE supports on-premise, cloud, and hybrid environments.
OpenShift Dedicated
OpenShift Dedicated (OSD) is Red Hat's managed private cluster offering, built around a core of application containers powered by Docker, with orchestration and management provided by Kubernetes, on a foundation of Red Hat Enterprise Linux. It is available on the Amazon Web Services (AWS), IBM Cloud, Google Cloud Platform (GCP) and Microsoft Azure marketplaces since December 2016.
OpenShift Data Foundation
OpenShift Data Foundation (ODF) provides cloud-native storage, data management and data protection for applications running with OpenShift Container Platform in the cloud, on-premises, and in hybrid/multi-cloud environments.
OpenShift Database Access
Red Hat OpenShift Database Access (RHODA) is a capability in managed OpenShift Kubernetes environments enabling administrators to set up connections to database-as-a-service offerings from different providers. RHODA is an add-on service to OSD and Red Hat OpenShift Service on AWS (ROSA). RHODA's initial alpha release included support for MongoDB Atlas for MongoDB and Crunchy Bridge for PostgreSQL.
See also
Ceph
OpenStack
Jelastic
Apache ServiceMix
References
Further reading
External links
OpenShift Commons
OpenShift User Group (German speaking)
Cloud computing providers
Cloud platforms
Cloud storage
Containerization software
File hosting
Free software for cloud computing
Open-source cloud hosting services
Red Hat software
Web hosting
Web services
Free software programmed in Go |
4240094 | https://en.wikipedia.org/wiki/History%20of%20Microsoft | History of Microsoft | Microsoft is a multinational computer technology corporation. Microsoft was founded on April 4, 1975, by Bill Gates and Paul Allen in Albuquerque, New Mexico. Its current best-selling products are the Microsoft Windows operating system; Microsoft Office, a suite of productivity software; Xbox, a line of entertainment products spanning games, music, and video; Bing, a line of search engines; and Microsoft Azure, a cloud services platform.
In 1980, Microsoft formed a partnership with IBM to bundle Microsoft's operating system with IBM computers; with that deal, IBM paid Microsoft a royalty for every sale. In 1985, IBM requested Microsoft to develop a new operating system for their computers called OS/2. Microsoft produced that operating system, but also continued to sell their own alternative, which proved to be in direct competition with OS/2. Microsoft Windows eventually overshadowed OS/2 in terms of sales. When Microsoft launched several versions of Microsoft Windows in the 1990s, they had captured over 90% market share of the world's personal computers.
As of June 30, 2015, Microsoft had a global annual revenue of US$86.83 billion and 128,076 employees worldwide. It develops, manufactures, licenses, and supports a wide range of software products for computing devices.
1975–1985: The founding of Microsoft
In late 1974, Paul Allen, a programmer at Honeywell, was walking through Harvard Square when he saw the cover of the January 1975 issue of Popular Electronics that demonstrated the Altair 8800, the first microcomputer. Allen bought the magazine and rushed to Currier House at Harvard College, where he showed it to high school friend Bill Gates. They saw potential to develop an implementation of BASIC for the system.
Gates called Altair manufacturer Micro Instrumentation and Telemetry Systems (MITS), offering to demonstrate the implementation. Allen and Gates had neither an interpreter nor an Altair system, yet in the eight weeks before the demo, they developed an interpreter with the help of Monte Davidoff. When Allen flew to Albuquerque to meet with MITS, the interpreter worked and MITS agreed to distribute Altair BASIC. Allen moved to Albuquerque, Gates soon quit Harvard to join him, and they co-founded Microsoft there. Revenues of the company totalled $16,005 by the end of 1976.
Allen came up with the original name of Micro-Soft, a portmanteau of microcomputer and software. Hyphenated in its early incarnations, the name was registered with the Secretary of State of New Mexico on November 26, 1976. The company's first international office was founded on November 1, 1978, in Japan, under the name "ASCII Microsoft" (now called "Microsoft Japan"), and on November 29, 1979, the term "Microsoft" was first used by Bill Gates. On January 1, 1979, the company moved from Albuquerque to a new home in Bellevue, Washington, since it was hard to recruit top programmers to Albuquerque. Shortly before the move, eleven of the then-thirteen employees posed for a staff photo.
Steve Ballmer joined the company on June 11, 1980, and would later succeed Bill Gates as CEO from January 2000 until February 2014. The company restructured on June 25, 1981, to become an incorporated business in its home state of Washington (with a further change of its name to "Microsoft Corporation, Inc."). As part of the restructuring, Bill Gates became president of the company and chairman of the board, and Paul Allen became executive vice president. In 1983, Allen left the company after receiving a Hodgkin lymphoma diagnosis, though he remained on the board as vice-chairman.
Microsoft's early products were different variants of Microsoft BASIC which was the dominant programming language in late 1970s and early 1980s home computers such as Apple II (Applesoft BASIC) and Commodore 64 (Commodore BASIC), and were also provided with early versions of the IBM PC as the IBM Cassette BASIC.
Microsoft also marketed, through an Apple dealer in West Palm Beach, Florida, two products for the Radio Shack TRS-80. One was "Typing Tutor", which led the user through learning to use a keyboard. The other, "MuMATH", was authored by a professor at the University of Hawaii and could do mathematics in long-integer arithmetic to avoid floating-point numbers.
The first hardware product was the Z-80 SoftCard which enabled the Apple II to run the CP/M operating system, at the time an industry-standard operating system for running business software and many compilers and interpreters for several high-level languages on microcomputers. The SoftCard was first demonstrated publicly at the West Coast Computer Faire in March 1980. It was an immediate success; 5,000 cards, a large number given the microcomputer market at the time, were purchased in the initial three months at $349 each and it was Microsoft's number one revenue source in 1980.
The first operating system publicly released by the company was a variant of Unix announced on August 25, 1980. Acquired from AT&T through a distribution license, Microsoft dubbed it Xenix, and hired Santa Cruz Operation in order to port/adapt the operating system to several platforms. This Unix variant would become home to the first version of Microsoft's word processor, Microsoft Word. Originally titled "Multi-Tool Word", Microsoft Word became notable for its use of "What You See Is What You Get", or WYSIWYG pioneered by the Xerox Alto and the Bravo text editor in the 1970s.
Word was first released in the spring of 1983, and free demonstration copies of the application were bundled with the November 1983 issue of PC World, making it one of the first programs to be distributed on-disk with a magazine. (Earlier magazine on-disk distributions included Robert Uiterwyk's BASIC in the May 1977 issue of Information Age.) However, Xenix was never sold to end users directly although it was licensed to many software OEMs for resale. It grew to become the most popular version of Unix, measured by the number of machines running it (note that Unix is a multi-user operating system, allowing simultaneous access to a machine by several users). By the mid-1980s Microsoft had gotten out of the Unix business, except for its ownership stake in SCO.
IBM first approached Microsoft about its upcoming IBM Personal Computer (IBM PC) in July 1980, shortly after Gates's mother began working on United Way's executive board with IBM CEO John Opel. On August 12, 1981, after negotiations with Digital Research failed, IBM awarded a contract to Microsoft to provide a version of the CP/M operating system, which was set to be used in the IBM PC. For this deal, Microsoft purchased a CP/M clone called 86-DOS from Tim Paterson of Seattle Computer Products for less than US$100,000, which IBM renamed to IBM PC DOS. The original CP/M was made by Gary Kildall of Digital Research, Inc. Due to potential copyright infringement problems with CP/M, IBM marketed both CP/M and PC DOS for US$240 and US$40, respectively, with PC DOS eventually becoming the standard because of its lower price. Thirty-five of the company's 100 employees worked on the IBM project for more than a year. When the IBM PC debuted, Microsoft was the only company that offered operating system, programming language, and application software for the new computer. The IBM PC DOS is also known as MS-DOS.
InfoWorld stated in 1984 that Microsoft, with $55 million in 1983 sales,
In 1983, in collaboration with numerous companies, Microsoft created a home computer system, MSX, which contained its own version of the DOS operating system, called MSX-DOS; this became relatively popular in Japan, Europe and South America. Later, the market saw a flood of IBM PC clones after Columbia Data Products successfully cloned the IBM BIOS, quickly followed by Eagle Computer and Compaq. The deal with IBM allowed Microsoft to have control of its own QDOS derivative, MS-DOS, and through aggressive marketing of the operating system to manufacturers of IBM-PC clones Microsoft rose from a small player to one of the major software vendors in the home computer industry. With the release of the Microsoft Mouse on May 2, 1983, Microsoft continued to expand its product line in other markets. This expansion included Microsoft Press, a book publishing division, on July 11 the same year, which debuted with two titles: Exploring the IBM PCjr Home Computer by Peter Norton, and The Apple Macintosh Book by Cary Lu.
1985–1994: Windows and Office
Ireland became home to one of Microsoft's international production facilities in 1985, and on November 20 Microsoft released its first retail version of Microsoft Windows (Windows 1.0), originally a graphical extension for its MS-DOS operating system. In August, Microsoft and IBM partnered in the development of a different operating system called OS/2. OS/2 was marketed in connection with a new hardware design proprietary to IBM, the PS/2. On February 16, 1986, Microsoft relocated its headquarters to a corporate office campus in Redmond, Washington. Around one month later, on March 13, the company went public with an IPO, raising US$61 million at US$21.00 per share. By the end of the trading day, the price had risen to US$28.00. In 1987, Microsoft eventually released its first version of OS/2 to OEMs. By then the company was the world's largest producer of software for personal computers—ahead of former leader Lotus Development—and published the three most-popular Macintosh business applications. On July 30, 1987, the company purchased Forethought, the developer of PowerPoint, in Microsoft's first major software acquisition.
Meanwhile, Microsoft began introducing its most prominent office products. Microsoft Works, an integrated office program which combined features typically found in a word processor, spreadsheet, database and other office applications, saw its first release as an application for the Apple Macintosh towards the end of 1986. Microsoft Works would later be sold with other Microsoft products including Microsoft Word and Microsoft Bookshelf, a reference collection introduced in 1987 that was the company's first CD-ROM product. Later, on August 8, 1989, Microsoft introduced its most successful office product, Microsoft Office. Unlike the model of Microsoft Works, Microsoft Office was a bundle of separate office productivity applications, such as Microsoft Word, Microsoft Excel and so forth. While Microsoft Word and Microsoft Office were mostly developed internally, Microsoft also continued its trend of rebranding products from other companies, such as Microsoft SQL Server on January 13, 1988, a relational database management system for companies that was based on technology licensed from Sybase.
On May 22, 1990, Microsoft launched Windows 3.0. The new version of Microsoft's operating system boasted new features such as a streamlined graphical user interface (GUI) and improved protected mode ability for the Intel 386 processor; it sold over 100,000 copies in two weeks. Windows at the time generated more revenue for Microsoft than OS/2, and the company decided to move more resources from OS/2 to Windows. In an internal memo to Microsoft employees on May 16, 1991, Bill Gates announced that the OS/2 partnership was over, and that Microsoft would henceforth focus its platform efforts on Windows and the Windows NT kernel. Some people, especially developers who had ignored Windows and committed most of their resources to OS/2, were taken by surprise, and accused Microsoft of deception. This changeover from OS/2 was frequently referred to in the industry as "the head-fake". In the following years, the popularity of OS/2 declined, and Windows quickly became the favored PC platform. 1991 also marked the founding of Microsoft Research, an organization in Microsoft for researching computer science subjects, and the release of Microsoft Visual Basic, a popular development product for companies and individuals.
During the transition from MS-DOS to Windows, the success of Microsoft's product Microsoft Office allowed the company to gain ground on application-software competitors, such as WordPerfect and Lotus 1-2-3. Novell, an owner of WordPerfect for a time, alleged that Microsoft used its inside knowledge of the DOS and Windows kernels and of undocumented Application Programming Interface features to make Office perform better than its competitors. Eventually, Microsoft Office became the dominant business suite, with a market share far exceeding that of its competitors. In March 1992, Microsoft released Windows 3.1 along with its first promotional campaign on TV; the software sold over three million copies in its first two months on the market. In October, Windows for Workgroups 3.1 was released with integrated networking abilities such as peer-to-peer file and printing sharing. In November, Microsoft released the first version of their popular database software Microsoft Access.
By 1993, Windows had become the most widely used GUI operating system in the world. Fortune Magazine named Microsoft as the "1993 Most Innovative Company Operating in the U.S." The year also marked the end of a five-year copyright infringement legal case brought by Apple Computer, dubbed Apple Computer, Inc. v. Microsoft Corp., in which the ruling was in Microsoft's favor. Microsoft also released Windows for Workgroups 3.11, a new version of the consumer line of Windows, and Windows NT 3.1, a server-based operating system with a similar user interface to consumer versions of the operating system, but with an entirely different kernel. As part of its strategy to broaden its business, Microsoft released Microsoft Encarta on March 22, 1993, the first encyclopedia designed to run on a computer. Soon after, the Microsoft Home brand was introduced, encompassing Microsoft's new multimedia applications for Windows 3.x. Microsoft changed its slogan to "Where do you want to go today?" in 1994 as part of an attempt to appeal to nontechnical audiences in a US$100 million advertising campaign.
1995–2007: Foray into the Web, Windows 95, Windows XP, and Xbox
Microsoft continued to make strategic decisions directed at consumers. The company released Microsoft Bob, a graphical user interface designed for novice computer users, in March 1995. The interface was discontinued in 1996 due to poor sales; Bill Gates later attributed its failure to hardware requirements that were too high for typical computers, and it is widely regarded as one of Microsoft's most unsuccessful products. DreamWorks SKG and Microsoft formed a new company, DreamWorks Interactive (in 2000 acquired by Electronic Arts which named it EA Los Angeles), to produce interactive and multimedia entertainment properties. On August 24, 1995, Microsoft released Windows 95, a new version of the company's flagship operating system which featured a completely new user interface, including a novel start button; more than a million copies were sold in the first four days after its release.
Windows 95 was released without a web browser as Microsoft had not yet developed one. The success of the web caught the company by surprise, and it subsequently approached Spyglass to license its browser as Internet Explorer. Spyglass later disputed the terms of the agreement, as Microsoft was to pay a royalty for every copy sold. However, Microsoft sold no copies of Internet Explorer, choosing instead to bundle it for free with the operating system.
Internet Explorer was first included in the Windows 95 Plus! Pack that was released in August 1995. In September, the Chinese government chose Windows to be the operating system of choice in that country, and entered into an agreement with the company to standardize a Chinese version of the operating system. Microsoft also released the Microsoft Sidewinder 3D Pro joystick in an attempt to further expand its profile in the computer hardware market.
On May 26, 1995, Bill Gates sent the "Internet Tidal Wave" memorandum to Microsoft executives. The memo described Netscape with their Netscape Navigator as a "new competitor 'born' on the Internet". The memo outlines Microsoft's failure to grasp the Internet's importance, and in it Gates assigned "the Internet the highest level of importance" from then on. Microsoft began to expand its product line into computer networking and the World Wide Web. On August 24, 1995, it launched a major online service, MSN (Microsoft Network), as a direct competitor to AOL. MSN became an umbrella service for Microsoft's online services, using Microsoft Passport (now called a Microsoft account) as a universal login system for all of its web sites. The company continued to branch out into new markets in 1996, starting with a joint venture with NBC to create a new 24-hour cable news television station, MSNBC. The station was launched on July 15, 1996, to compete with similar news outlets such as CNN. Microsoft also launched Slate, an online magazine edited by Michael Kinsley, which offered political and social commentary along with the cartoon Doonesbury. In an attempt to extend its reach in the consumer market, the company acquired WebTV, which enabled consumers to access the Web from their televisions. Microsoft entered the personal digital assistant (PDA) market in November with Windows CE 1.0, a new built-from-scratch version of their flagship operating system, designed to run on low-memory, low-performance machines, such as handhelds and other small computers. 1996 saw the release of Windows NT 4.0, which brought the Windows 95 GUI and Windows NT kernel together.
While Microsoft largely failed to participate in the rise of the Internet in the early 1990s, some of the key technologies in which the company had invested to enter the Internet market started to pay off by the mid-90s. One of the most prominent of these was ActiveX, an application programming interface built on the Microsoft Component Object Model (COM); this enabled Microsoft and others to embed controls in many programming languages, including the company's own scripting languages, such as JScript and VBScript. ActiveX included frameworks for documents and server solutions. The company also released the Microsoft SQL Server 6.5, which had built-in support for internet applications. In November 1996, Microsoft Office 97 was released, which is the first version to include Office Assistant. In 1997, Internet Explorer 4.0 was released, marking the beginning of the takeover of the browser market from rival Netscape, and by agreement with Apple Computer, Internet Explorer was bundled with the Apple Macintosh operating system as well as with Windows. Windows CE 2.0, the handheld version of Windows, was released this year, including a host of bug fixes and new features designed to make it more appealing to corporate customers. In October, the Justice Department filed a motion in the federal district court in which they stated that Microsoft had violated an agreement signed in 1994, and asked the court to stop the bundling of Internet Explorer with Windows.
The year 1998 was significant in Microsoft's history, with Bill Gates appointing Steve Ballmer as president of Microsoft but remaining as Chair and CEO himself. The company released an update to the consumer version of Windows, Windows 98. Windows 98 came with Internet Explorer 4.0 SP1 (which had Windows Desktop Update bundled), and included new features from Windows 95 OSR 2.x including the FAT32 file system, and new features designed for Windows 98, such as support for multiple displays. Microsoft launched its Indian headquarters as well, which would eventually become the company's second largest after its U.S. headquarters. Finally, a great deal of controversy took place when a set of internal memos from the company were leaked on the Internet. These documents, colloquially referred to as "The Halloween Documents", were widely reported by the media and went into detail about the threats that free software / open source software posed to Microsoft's own software, previously voiced mainly by analysts and advocates of open source software. The documents also alluded to legal and other actions against Linux as well as other open source software. While Microsoft acknowledged the documents, it claimed that they were merely engineering studies. Despite this, some believe that these studies were used in the real strategies of the company.
Microsoft, in 2000, released new products for all three lines of the company's flagship operating system, and saw the beginning of the end of one of its most prominent legal cases. On February 17, Microsoft released an update to its business line of software in Windows 2000. It provided a high level of stability similar to that of its Unix counterparts due to its usage of the Windows NT kernel, and matching features found in the consumer line of the Windows operating system including a DOS emulator that could run many legacy DOS applications.
On April 3, 2000, a judgment was handed down in the case of United States v. Microsoft Corp., calling the company an "abusive monopoly" and forcing the company to split into two separate units. Part of this ruling was later overturned by a federal appeals court, and eventually settled with the U.S. Department of Justice in 2001. On June 15, 2000, the company released a new version of its hand-held operating system, Windows CE 3.0. The main change was the new programming APIs of the software. Previous versions of Windows CE supported only a small subset of the WinAPI, the main development library for Windows, and with Version 3 of Windows CE, the operating system now supported nearly all of the core functionality of the WinAPI. The next update to the consumer line, Windows ME (or Windows Millennium Edition), was released on September 14, 2000. It sported several new features such as enhanced multimedia abilities and consumer-oriented PC maintenance options, but is often regarded as one of the worst versions of Windows due to stability problems, restricted real mode DOS support and other issues.
Microsoft released Windows XP and Office XP in 2001, a version that aimed to encompass the features of both its business and home product lines. The release included an updated version of the Windows 2000 kernel, enhanced DOS emulation abilities, and many of the home-user features found in previous consumer versions. XP introduced a new graphical user interface, the first such change since Windows 95. The operating system was the first to require Microsoft Product Activation, an anti-piracy mechanism that requires users to activate the software with Microsoft within 30 days. Later, Microsoft would enter the multibillion-dollar game console market dominated by Sony and Nintendo, with the release of the Xbox. The Xbox finished behind the dominant PlayStation 2, selling 24 million units compared to 155 million; however, it managed to outsell the GameCube, which sold 21 million units. Microsoft launched its second console, the Xbox 360, in 2005, which was more successful than the original. By 2017 the Xbox 360 had sold 84 million units but failed to outsell its main rival, the PlayStation 3, which had sold 87 million units when discontinued. The console was also outsold by the Wii, which introduced gesture control and opened up a new market for video games. Microsoft later used its popular controller-free Kinect peripheral to increase the popularity of the Xbox. The Kinect was very successful: it was the fastest-selling consumer electronics product in history, selling 8 million units from November 4, 2010, to January 3, 2011 (its first 60 days), an average of 133,333 units per day, outselling the iPhone and iPad over equivalent post-launch periods.
In 2002, Microsoft launched the .NET initiative, along with new versions of some of its development products, such as Microsoft Visual Studio. The initiative was an entirely new development API for Windows programming, and included a new programming language, C#. Windows Server 2003 was launched, featuring enhanced administration abilities, such as new user interfaces to server tools. In 2004, the company released Windows XP Media Center Edition 2005, a version of Windows XP designed for multimedia abilities, and Windows XP Starter Edition, a version of Windows XP with a smaller feature set designed for entry-level consumers. However, Microsoft encountered more turmoil in March 2004 when antitrust legal action was brought against it by the European Union for allegedly abusing its market dominance (see Microsoft Corp v Commission). Eventually Microsoft was fined €497 million (US$613 million), ordered to divulge certain protocols to competitors, and to produce a new version of its Windows XP platform—called Windows XP Home Edition N—that did not include its Windows Media Player. Microsoft was also ordered to produce separate packages of Windows after South Korea also ruled against the company in 2005. It had to pay out US$32 million and produce more than one version of Windows for the country, in the same vein as the European Union ruling: one with Windows Media Player and Windows Messenger and one without the two programs.
To compete with other Internet companies, such as the search service Google, in 2005 Microsoft announced a new version of its MSN search service. Later, in 2006, the company launched Microsoft adCenter, a service that offers pay per click advertisements, in an effort to further develop their search marketing revenue. Soon afterward, Microsoft created the CodePlex collaborative development site for hosting open source projects. Activity grew quickly as developers from around the world began to participate, and by early 2007 commercial open source companies, such as Aras Corp., began to offer enterprise open source software exclusively on the Microsoft platform.
On June 15, 2006, Bill Gates announced his plans for a two-year transition period out of a day-to-day role with Microsoft until July 31, 2008. After that date, Gates would continue in his role as the company's chairman, head of the board of directors, and act as an adviser on key projects. His role as Chief Software Architect was filled immediately by Ray Ozzie, the company's Chief Technical Officer as of June 15, 2006. Bill Gates stated "My announcement is not a retirement – it's a reordering of my priorities."
2007–2011: Microsoft Azure, Windows Vista, Windows 7, and Microsoft Stores
Formerly codenamed "Longhorn" in the early development stages, Windows Vista was released to consumers on January 30, 2007. Microsoft also released a new version of its Office suite, called Microsoft Office 2007, alongside Windows Vista. Windows Server 2008 and Visual Studio 2008, the next versions of the company's server operating system and development suite, respectively, were released on February 27, 2008. Windows Vista was criticized for being resource-heavy, requiring powerful hardware to run its desktop widgets and the Aero theme. Many people continued to use Windows XP for many years after, due to its stability and low processing needs.
On December 19, 2007, Microsoft signed a five-year, $500 million contract with Viacom that included content sharing and advertisements. The deal allowed Microsoft to license many shows from Viacom owned cable television and film studios for use on Xbox Live and MSN. The deal also made Viacom a preferred publisher partner for casual game development and distribution through MSN and Windows. On the advertisement side of the deal, Microsoft's Atlas ad-serving division became the exclusive provider of previously unsold advertising inventory on Viacom owned web sites. Microsoft also purchased a large amount of advertising on Viacom owned broadcasts and online networks, and collaborated on promotions and sponsorships for MTV and BET award shows, two Viacom owned cable networks.
In 2008, Microsoft wanted to purchase Yahoo (first completely, later partially) in order to strengthen its position in the search engine market vis-à-vis Google. Yahoo rejected the offer, saying that it undervalued the company. In response, Microsoft withdrew its offer.
In 2009, the opening show of the Consumer Electronics Show (CES) was hosted by Steve Ballmer for the first time. In past years, it had been hosted by Bill Gates. During the show, Ballmer announced the first public beta test of Windows 7 for partners and developers on January 8, and for the general public two days later. On June 26, 2009, Microsoft started taking pre-orders at a discounted price for Windows 7, which was launched on October 22, 2009. Windows 7 had several editions, which acknowledged the rise of netbook computers with reduced processing power.
On April 12, 2010, Microsoft launched their Kin phone line, a result of their acquisition of Danger Incorporated in 2008. The phones became available May 14, 2010, but were discontinued within two months because of poor sales.
On May 10, 2011, the company acquired Skype Technologies for US$8.5 billion.
2011–2014: Windows 8, Xbox One, Outlook.com, and Surface devices
Following the release of Windows Phone, Microsoft underwent a gradual rebranding of its product range throughout 2011 and 2012—the corporation's logos, products, services and websites adopted the principles and concepts of the Metro design language. Microsoft previewed Windows 8, an operating system designed to power both personal computers and tablet computers, in Taipei in June 2011. A developer preview was released on September 13, and was replaced by a consumer preview on February 29, 2012. On May 31, 2012, the preview version was released. On June 18, 2012, Microsoft unveiled the Surface, the first computer in the company's history to have its hardware made by Microsoft. On June 25, Microsoft paid US$1.2 billion to buy the social network Yammer. On July 31, 2012, Microsoft launched the Outlook.com webmail service to compete with Gmail. On September 4, 2012, Microsoft released Windows Server 2012.
In July 2012, Microsoft sold its 50% stake in MSNBC.com, which it had run as a joint venture with NBC since 1996. On October 1, Microsoft announced its intention to launch a news operation, part of a new-look MSN, at the time of the Windows 8 launch that was later in the month. On October 26, 2012, Microsoft launched Windows 8 and the Microsoft Surface. Three days later, Windows Phone 8 was launched. To cope with the potential for an increase in demand for products and services, Microsoft opened a number of "holiday stores" across the U.S. to complement the increasing number of "bricks-and-mortar" Microsoft Stores that opened in 2012. On March 29, 2013, Microsoft launched a Patent Tracker.
The Kinect, a motion-sensing input device made by Microsoft and designed as a video game controller, was first introduced in November 2010, and was upgraded for the 2013 release of the eighth-generation Xbox One video game console. Kinect's capabilities were revealed in May 2013. The new Kinect uses an ultra-wide 1080p camera, which can function in the dark due to an infrared sensor. It employs higher-end processing power and new software, can distinguish between fine movements (such as thumb movements), and can determine a user's heart rate by looking at his/her face. Microsoft filed a patent application in 2011 that suggests that the corporation may use the Kinect camera system to monitor the behavior of television viewers as part of a plan to make the viewing experience more interactive. On July 19, 2013, Microsoft's stock suffered its biggest one-day percentage sell-off since the year 2000, after its fourth-quarter report raised concerns among investors about the poor showings of both Windows 8 and the Surface tablet; the stock declined more than 11 percent, erasing more than US$32 billion of the company's market value. For the 2010 fiscal year, Microsoft had five product divisions: Windows Division, Server and Tools, Online Services Division, Microsoft Business Division and Entertainment and Devices Division.
On September 3, 2013, Microsoft agreed to buy Nokia's mobile unit for $7 billion. Also in 2013, Amy Hood became the CFO of Microsoft. The Alliance for Affordable Internet (A4AI) was launched in October 2013 and Microsoft was part of the coalition of public and private organizations that also included Facebook, Intel and Google. Led by World Wide Web inventor Tim Berners-Lee, the A4AI seeks to make Internet access more affordable so that access is broadened in the developing world, where only 31% of people are online. Google will help to decrease Internet access prices so that they fall below the UN Broadband Commission's worldwide target of 5% of monthly income. In line with the maturing PC business, in July 2013, Microsoft announced that it would reorganize the business into four new business divisions by function: Operating System, Apps, Cloud, and Devices. All previous divisions were diluted into new divisions without any workforce cuts.
In 2014, Microsoft exhibited a snapshot of their 1994 website as a twenty-year anniversary.
2014–ongoing: Windows 10, Windows 10 Mobile, Microsoft Edge and HoloLens
On February 4, 2014, Steve Ballmer stepped down as CEO of Microsoft and was succeeded by Satya Nadella, who previously led Microsoft's Cloud and Enterprise division. On the same day, John W. Thompson took on the role of chairman, with Bill Gates stepping down from the position, while continuing to participate as a technology advisor. On April 25, 2014, Microsoft acquired Nokia Devices and Services for $7.2 billion. The new subsidiary was renamed Microsoft Mobile Oy. In May 2016, the company announced it would lay off 1,850 workers, taking an impairment and restructuring charge of $950 million. During the previous summer of 2015 the company wrote down $7.6 billion related to its mobile-phone business and fired 7,800 employees from those operations. On September 15, 2014, Microsoft acquired the video game development company Mojang, best known for its wildly popular flagship game Minecraft, for $2.5 billion.
On January 21, 2015, Microsoft announced the release of its first interactive whiteboard, Microsoft Surface Hub (part of the Surface family). On July 29, 2015, Microsoft released the next version of the Windows operating system, Windows 10. The successor to Windows Phone 8.1, Windows 10 Mobile, was released November 20, 2015. In the first quarter of 2015, Microsoft was the third largest maker of mobile phones, selling 33 million units (7.2% of all); a large majority (at least 75%) of them did not run any version of Windows Phone and are not categorized as smartphones by Gartner. In the same time frame, 8 million Windows smartphones (2.5% of all smartphones) were made by all manufacturers, mostly by Microsoft. Microsoft's share of the U.S. smartphone market in January 2016 was 2.7%.
On March 1, 2016, Microsoft announced the merger of its PC and Xbox divisions, with Phil Spencer announcing that Universal Windows Platform (UWP) apps would be the focus for Microsoft's gaming in the future. In June 2016, Microsoft announced a project named Microsoft Azure Information Protection. It aims to help enterprises protect their data as it moves between servers and devices. The server sibling to Windows 10, Windows Server 2016, was released in September 2016. In November 2016, Microsoft joined the Linux Foundation as a Platinum member during Microsoft's Connect(); developer event in New York. The cost of each Platinum membership is US$500,000 per year. Some analysts had deemed this unthinkable ten years prior; in 2001, then-CEO Steve Ballmer had called Linux a "cancer".
On January 24, 2017, Microsoft showcased Intune for Education at the BETT 2017 education technology conference in London. Intune for Education is a new cloud-based application and device management service for the education sector. Microsoft said it would launch a preview of Intune for Education "in the coming weeks", with general availability scheduled for spring 2017, priced at $30 per device, or through volume licensing agreements. On June 8, 2017, Microsoft acquired Hexadite, an Israeli security firm, for $100 million.
In August 2018, Toyota Tsusho began a partnership with Microsoft to create fish farming tools using the Microsoft Azure application suite for IoT technologies related to water management. Developed in part by researchers from Kindai University, the water pump mechanisms use artificial intelligence to count the number of fish on a conveyor belt, analyze the number of fish, and deduce the effectiveness of water flow from the data the fish provide. The specific computer programs used in the process fall under the Azure Machine Learning and the Azure IoT Hub platforms. On October 8, 2017, Joe Belfiore announced that work on Windows 10 Mobile was drawing to a close due to lack of market penetration and resultant lack of interest from app developers. On October 10, 2018, Microsoft joined the Open Invention Network community despite holding more than 60,000 patents. On October 15, 2018, Paul Allen, the co-founder of Microsoft, died from complications of non-Hodgkin's lymphoma. In November 2018, Microsoft agreed to supply 100,000 HoloLens headsets to the United States military in order to "increase lethality by enhancing the ability to detect, decide and engage before the enemy." In December 2018, Microsoft announced Project Mu, an open source release of the UEFI core used in Microsoft Surface and Hyper-V products. The project promotes the idea of Firmware as a Service. In the same month, Microsoft announced the open source implementation of Windows Forms and the Windows Presentation Foundation (WPF), which would allow for further movement of the company toward the transparent release of key frameworks used in developing Windows desktop applications and software. December also saw the company announce that Microsoft Edge would be rebuilt on the Chromium browser engine, replacing its own EdgeHTML engine.
In January 2019, Microsoft announced that support for Windows 10 Mobile would end on December 10, 2019, and that Windows 10 Mobile users should migrate to iOS or Android phones. On February 20, 2019, Microsoft said it would offer its cyber security service AccountGuard to 12 new markets in Europe, including Germany, France and Spain, to close security gaps and protect customers in the political sphere from hacking. In February 2019, hundreds of Microsoft employees protested the company's $480 million contract to develop VR headsets for the United States army, calling it war profiteering.
See also
History of Microsoft Windows
History of Microsoft Word
Microsoft litigation
Embrace, extend, and extinguish
References
External links
The History of Microsoft at Channel 9
Bill Gates Money In Realtime
Inside The Deal That Made Bill Gates $350,000,000, Bro Uttal, Fortune, July 21, 1986, reprinted on March 13, 2011
The History of Microsoft and Bill Gates – Timeline, Rahul Vijay Manekari, February 2, 2013
Microsoft
Microsoft
History of the Internet |
2841222 | https://en.wikipedia.org/wiki/Direct%20stiffness%20method | Direct stiffness method | As one of the methods of structural analysis, the direct stiffness method, also known as the matrix stiffness method, is particularly suited for computer-automated analysis of complex structures including the statically indeterminate type. It is a matrix method that makes use of the members' stiffness relations for computing member forces and displacements in structures. The direct stiffness method is the most common implementation of the finite element method (FEM). In applying the method, the system must be modeled as a set of simpler, idealized elements interconnected at the nodes. The material stiffness properties of these elements are then, through matrix mathematics, compiled into a single matrix equation which governs the behaviour of the entire idealized structure. The structure’s unknown displacements and forces can then be determined by solving this equation. The direct stiffness method forms the basis for most commercial and free source finite element software.
The direct stiffness method originated in the field of aerospace. Researchers looked at various approaches for analysis of complex airplane frames. These included elasticity theory, energy principles in structural mechanics, flexibility method and matrix stiffness method. It was through analysis of these methods that the direct stiffness method emerged as an efficient method ideally suited for computer implementation.
History
Between 1934 and 1938 A. R. Collar and W. J. Duncan published the first papers with the representation and terminology for matrix systems that are used today. Aeroelastic research continued through World War II but publication restrictions from 1938 to 1947 make this work difficult to trace. The second major breakthrough in matrix structural analysis occurred through 1954 and 1955 when professor John H. Argyris systemized the concept of assembling elemental components of a structure into a system of equations. Finally, on November 6, 1959, M. J. Turner, head of Boeing's Structural Dynamics Unit, published a paper outlining the direct stiffness method as an efficient model for computer implementation.
Member stiffness relations
A typical member stiffness relation has the following general form:

Q^m = k^m q^m + Q^{om}     (1)

where
m = member number m.
Q^m = vector of member's characteristic forces, which are unknown internal forces.
k^m = member stiffness matrix which characterizes the member's resistance against deformations.
q^m = vector of member's characteristic displacements or deformations.
Q^{om} = vector of member's characteristic forces caused by external effects (such as known forces and temperature changes) applied to the member while q^m = 0.
If q^m are member deformations rather than absolute displacements, then Q^m are independent member forces, and in such case (1) can be inverted to yield the so-called member flexibility matrix, which is used in the flexibility method.
System stiffness relation
For a system with many members interconnected at points called nodes, the members' stiffness relations such as Eq.(1) can be integrated by making use of the following observations:
The member deformations q^m can be expressed in terms of system nodal displacements r in order to ensure compatibility between members. This implies that r will be the primary unknowns.
The member forces Q^m help to keep the nodes in equilibrium under the nodal forces R. This implies that the right-hand-side of (1) will be integrated into the right-hand-side of the following nodal equilibrium equations for the entire system:

R = K r + R^o     (2)

where
R = vector of nodal forces, representing external forces applied to the system's nodes.
K = system stiffness matrix, which is established by assembling the members' stiffness matrices k^m.
r = vector of system's nodal displacements that can define all possible deformed configurations of the system subject to arbitrary nodal forces R.
R^o = vector of equivalent nodal forces, representing all external effects other than the nodal forces which are already included in the preceding nodal force vector R. This vector is established by assembling the members' Q^{om}.
Solution
The system stiffness matrix K is square since the vectors R and r have the same size. In addition, it is symmetric because k^m is symmetric. Once the supports' constraints are accounted for in (2), the nodal displacements are found by solving the system of linear equations (2), symbolically:

r = K^{-1} (R - R^o)

Subsequently, the members' characteristic forces may be found from Eq.(1), where q^m can be found from r by compatibility consideration.
The direct stiffness method
It is common to have Eq.(1) in a form where q^m and Q^m are, respectively, the member-end displacements and forces matching in direction with r and R. In such case, K and R^o can be obtained by direct summation of the members' matrices k^m and Q^{om}. The method is then known as the direct stiffness method.
The advantages and disadvantages of the matrix stiffness method are compared and discussed in the flexibility method article.
Example
Breakdown
The first step when using the direct stiffness method is to identify the individual elements which make up the structure.
Once the elements are identified, the structure is disconnected at the nodes, the points which connect the different elements together.
Each element is then analyzed individually to develop member stiffness equations. The forces and displacements are related through the element stiffness matrix which depends on the geometry and properties of the element.
A truss element can only transmit forces in compression or tension. This means that in two dimensions, each node has two degrees of freedom (DOF): horizontal and vertical displacement. The resulting equation contains a four by four stiffness matrix.
A frame element is able to withstand bending moments in addition to compression and tension. This results in three degrees of freedom: horizontal displacement, vertical displacement and in-plane rotation. The stiffness matrix in this case is six by six.
Other elements such as plates and shells can also be incorporated into the direct stiffness method and similar equations must be developed.
Assembly
Once the individual element stiffness relations have been developed they must be assembled into the original structure. The first step in this process is to convert the stiffness relations for the individual elements into a global system for the entire structure. In the case of a truss element, the global form of the stiffness method depends on the angle of the element with respect to the global coordinate system (This system is usually the traditional Cartesian coordinate system).
K = \frac{EA}{L} \begin{bmatrix} c^2 & cs & -c^2 & -cs \\ cs & s^2 & -cs & -s^2 \\ -c^2 & -cs & c^2 & cs \\ -cs & -s^2 & cs & s^2 \end{bmatrix}     (for a truss element at angle β, with c = cos β and s = sin β)

Equivalently,

K = \frac{EA}{L} \begin{bmatrix} c_x^2 & c_x c_y & -c_x^2 & -c_x c_y \\ c_x c_y & c_y^2 & -c_x c_y & -c_y^2 \\ -c_x^2 & -c_x c_y & c_x^2 & c_x c_y \\ -c_x c_y & -c_y^2 & c_x c_y & c_y^2 \end{bmatrix}

where c_x and c_y are the direction cosines of the truss element (i.e., they are components of a unit vector aligned with the member). This form reveals how to generalize the element stiffness to 3-D space trusses by simply extending the pattern that is evident in this formulation.
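To make the element-level step concrete, the following is a minimal Python sketch (using NumPy) of the 2-D truss element stiffness matrix in global coordinates, as given above. The function and argument names are illustrative only and do not refer to any particular finite element package.

```python
import numpy as np

def truss_element_stiffness(E, A, L, beta):
    """Global 4x4 stiffness matrix of a 2-D truss element.

    E    - Young's modulus
    A    - cross-sectional area
    L    - element length
    beta - angle between the element and the global x-axis (radians)
    DOF order: [u1, v1, u2, v2], i.e. horizontal and vertical
    displacement at each of the element's two nodes.
    """
    c = np.cos(beta)   # direction cosine c_x
    s = np.sin(beta)   # direction cosine c_y
    return (E * A / L) * np.array([
        [ c*c,  c*s, -c*c, -c*s],
        [ c*s,  s*s, -c*s, -s*s],
        [-c*c, -c*s,  c*c,  c*s],
        [-c*s, -s*s,  c*s,  s*s],
    ])
```

For a horizontal member (β = 0) this reduces to the familiar axial stiffness EA/L acting only on the horizontal degrees of freedom.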
After developing the element stiffness matrix in the global coordinate system, they must be merged into a single “master” or “global” stiffness matrix. When merging these matrices together there are two rules that must be followed: compatibility of displacements and force equilibrium at each node. These rules are upheld by relating the element nodal displacements to the global nodal displacements.
The global displacement and force vectors each contain one entry for each degree of freedom in the structure. The element stiffness matrices are merged by augmenting or expanding each matrix in conformation to the global displacement and load vectors.
(for element (1) of the above structure)
Finally, the global stiffness matrix is constructed by adding the individual expanded element matrices together.
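As an illustration of the merging step, here is a hedged Python sketch that assembles the global stiffness matrix by scattering each element matrix into the positions given by a degree-of-freedom map. The data layout (a list of (k_e, dof_map) pairs) is an assumption made purely for this example.

```python
import numpy as np

def assemble_global_stiffness(n_dof, elements):
    """Assemble the global stiffness matrix.

    n_dof    - total number of degrees of freedom in the structure
    elements - iterable of (k_e, dof_map) pairs, where k_e is an element
               stiffness matrix already expressed in global coordinates
               and dof_map lists the global DOF index of each of its rows.
    """
    K = np.zeros((n_dof, n_dof))
    for k_e, dof_map in elements:
        for i_local, i_global in enumerate(dof_map):
            for j_local, j_global in enumerate(dof_map):
                # Contributions from elements sharing a DOF accumulate,
                # which enforces force equilibrium at the shared node.
                K[i_global, j_global] += k_e[i_local, j_local]
    return K
```

Indexing directly into K with the DOF map is equivalent to expanding each element matrix to the full system size and summing the expanded matrices, but avoids forming them explicitly.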
Solution
Once the global stiffness matrix, displacement vector, and force vector have been constructed, the system can be expressed as a single matrix equation relating them, of the form given in Eq. (2).
For each degree of freedom in the structure, either the displacement or the force is known.
After inserting the known value for each degree of freedom, the master stiffness equation is complete and ready to be evaluated. There are several different methods available for evaluating a matrix equation including but not limited to Cholesky decomposition and the brute force evaluation of systems of equations. If a structure isn’t properly restrained, the application of a force will cause it to move rigidly and additional support conditions must be added.
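One common way to insert the known values and evaluate the system is to partition the degrees of freedom into free and supported sets, as in the following illustrative Python sketch (assuming zero prescribed displacements at the supports); this is only one of several possible solution strategies.

```python
import numpy as np

def solve_structure(K, F, fixed_dofs):
    """Solve K r = F for the unknown displacements.

    K          - assembled global stiffness matrix (n x n NumPy array)
    F          - vector of applied nodal forces (NumPy array of length n)
    fixed_dofs - indices of DOFs whose displacement is prescribed as zero
    Returns (r, reactions): the full displacement vector and the support reactions.
    """
    n = K.shape[0]
    free = np.setdiff1d(np.arange(n), fixed_dofs)

    r = np.zeros(n)
    # Solve the reduced system for the unknown (free) displacements.
    r[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])

    # Reactions at the supports follow from the full stiffness relation.
    reactions = K[fixed_dofs, :] @ r - F[fixed_dofs]
    return r, reactions
```

If the structure is not properly restrained, the reduced matrix K[free, free] is singular and the solve fails, which is the numerical counterpart of the rigid-body motion mentioned above.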
The method described in this section is meant as an overview of the direct stiffness method. Additional sources should be consulted for more details on the process as well as the assumptions about material properties inherent in the process.
Applications
The direct stiffness method was developed specifically so that it could be implemented effectively and easily in computer software to evaluate complicated structures that contain a large number of elements. Today, nearly every finite element solver available is based on the direct stiffness method. While each program utilizes the same process, many have been streamlined to reduce computation time and reduce the required memory. In order to achieve this, shortcuts have been developed.
One of the largest areas to utilize the direct stiffness method is the field of structural analysis where this method has been incorporated into modeling software. The software allows users to model a structure and, after the user defines the material properties of the elements, the program automatically generates element and global stiffness relationships. When various loading conditions are applied the software evaluates the structure and generates the deflections for the user.
See also
Finite element method
Finite element method in structural mechanics
Structural analysis
Flexibility method
List of finite element software packages
External links
Application of direct stiffness method to a 1-D Spring System
Matrix Structural Analysis
Animations of Stiffness Analysis Simulations
References
Felippa, Carlos A. Introduction to Finite Element Method. Fall 2001. University of Colorado. 18 Sept. 2005
Robinson, John. Structural Matrix Analysis for the Engineer. New York: John Wiley & Sons, 1966
Rubinstein, Moshe F. Matrix Computer Analysis of Structures. New Jersey: Prentice-Hall, 1966
McGuire, W., Gallagher, R. H., and Ziemian, R. D. Matrix Structural Analysis, 2nd Ed. New York: John Wiley & Sons, 2000.
Structural analysis
Numerical differential equations |
65315590 | https://en.wikipedia.org/wiki/Ferranti%20F100-L | Ferranti F100-L | The Ferranti F100-L was a 16-bit microprocessor family announced by Ferranti in 1976 which entered production in 1977. It was the first microprocessor designed in Europe, and among the first 16-bit single-chip CPUs. It was designed with military use in mind, able to work in a very wide temperature range and radiation hardened. To deliver these capabilities, the F100 was implemented using bipolar junction transistors, as opposed to the metal oxide semiconductor (MOS) process used by most other processors of the era. The family included a variety of support chips including a multiply/divide unit, various memory support chips, timers and serial bus controllers.
The F100 was priced at £39 in 1978 in 100-off quantities. Three models were offered at the same price; the commercial spec was rated at 8 MHz, industrial at 6.5 MHz at an extended temperature range, and military spec at 3.5 or 5 MHz with a temperature range from -55 C to +125 C. It was very cost competitive in the industrial and military markets, but less so in the commercial market where processors like the MOS 6502 were about $11 in the same 100 unit quantity.
The line was updated with the F200-L in 1984. This was software compatible with the F100, but included the maths processor on the same die, expanded addressing to 128 kB, and allowed up to 1 MB of memory when paired with the new F220 memory management unit. Shortly after the F200 came to market, in 1987 Ferranti purchased International Signal and Control, a company soon discovered to be committing large amounts of fraud; this drove Ferranti into bankruptcy.
The chip division was purchased by Plessey who continued producing some of the F100 family support chips as late as 1995. Owing to it being used almost entirely in the military realm, the F100 is little known in the wider retrocomputing field and few examples remain.
History
Previous computers
Ferranti was among the first companies to introduce a commercial computer, the Ferranti Mark 1 of 1951. They followed this with several other commercial designs, most notably the Ferranti Atlas of 1962, for a time the fastest computer in the world. In 1963 they used the Ferranti-Packard 6000, developed independently at their Canadian division, as the "golden brick" in the sale of their entire commercial computing line to International Computers and Tabulators (ICT). ICT used the FP6000 as the basis for their 1900 line, which sold for years. Prior to the sale, Ferranti sold about 24% of all computing hardware in the UK.
As part of the deal with ICT, Ferranti were barred from sales into the commercial computer market. This left them with two existing architectures that had been developed for military uses, the small Ferranti Argus that had already become a success in the industrial controller market, and the FM1600, a larger machine used for realtime simulations. Both were built of individual transistors and small scale integration integrated circuits using Ferranti's MicroNor bipolar transistor process. These were both very successful in the market, generating hundreds of millions of pounds of sales through the late 1960s.
CDI
A significant problem with the MicroNor process was that a logic gate implemented using bipolar layout was significantly larger than one using the contemporary MOSFET process, by a factor of about six. In typical designs, the bipolar layout also required three or four extra masking steps, each of which was time-consuming and led to the possibility of the chip being damaged. Experience with MicroNor suggested that a maximum of about 100 gates was the limit for a single chip, in contrast to MOS, which was being used for designs with thousands of gates. However, the MOS system was more sensitive to impurities in the semiconductor feedstock, which led to electrical noise that reduced performance and also limited its operating conditions. Neither was acceptable in the military market.
In 1971, Ferranti licensed the new collector-diffusion-isolation (CDI) process from Fairchild Semiconductor. This process, originally developed at Bell Labs, produced a dramatically simplified bipolar gate which required fewer masking steps and was only slightly larger than the equivalent MOS. This was of little interest to either Bell or Fairchild, who were happy with their MOS processes, and neither had progressed beyond experimental systems.
Ferranti invested heavily in the CDI process, working to raise the operating voltage from 3 to 5V for compatibility with their existing transistor-transistor logic (TTL) devices that were already widely used in military applications. This led to a series of medium scale integration parts using the process. Most well known among these was a series of uncommitted logic arrays (ULA, or gate array), chips with no pre-set logic design that could be programmed by the developer to produce any required circuit. These became very popular, and by 1986 the company held about 20% of the worldwide market for ULAs.
F100-L
The introduction of the first microprocessors in the early 1970s cut into Ferranti's military computing business. While these early designs were not competitive in performance terms, their price/performance ratio was orders of magnitude better than Ferranti's discrete designs, in spite of several rounds of cost-reduction in the MicroNor line in the late 1960s. Convinced that the microprocessor represented a strategic change in military applications, in 1974 the UK Ministry of Defence agreed to sponsor an effort by Ferranti to produce a military-grade microprocessor design using the CDI process, whose high power-handling allowed them to operate in electrically noisy environments.
An internal survey within the company suggested that an 8-bit part would not have the capability needed by the various divisions, and the decision was made to produce a 16-bit part. Based on studies of the economics of chip fabrication, Ferranti concluded that they had a budget of about 1,000 gates before the design would be too expensive. To produce a 16-bit design with this limited gate count, the arithmetic logic unit, or ALU, had to operate in a bit-serial fashion. This slows the performance of mathematical operations, so that the minimum time needed to complete an instruction is 36 clock cycles. This performance hit is offset somewhat by the 8 MHz clock speed, roughly double that of the fastest CPUs of the era.
With 16-bit data and 15-bit addresses, normally 31 pins would be required to interface the design to the computer as a whole. Desiring a low-cost solution, Ferranti required the chip to fit into a conventional 40-pin dual in-line package (DIP). To accomplish this, the data and address lines share pins, and thus require multiple cycles to complete the reading of a single instruction. For comparison, the Texas Instruments TMS9900, another 16-bit design introduced the same year, had double the gate count and was packaged in an expensive custom 64-pin DIP.
Ultimately the F100 failed to meet its 1,000 gate limitation and was built with about 1,500 gates on a 5.8 mm square surface. This was larger than their existing mask-production system and required them to develop a new version with a larger optical reduction ratio. The timing of the design effort also produced one advantage; the F100 was beginning to be readied for production just as the Micralign system was coming to market, and Ferranti adopted this projection alignment system for production, thereby greatly improving yields.
As was common at the time, the F100 was introduced along with a family of support chips, including memory bus interfaces, interrupt controller, a direct memory access controller and a basic serial bus controller. Most of these were built using their ULA chips. Perhaps most interesting among these was the F101-L, released shortly after the CPU, which performed hardware multiplication and division. This became so common that the CPU was soon offered with the F101 on the same die, as the FBH5092.
While the F100 was being developed, Ferranti produced a multi-card rackmount version of the CPU, the F100-M. This was used as a development platform and saw some civilian use as well. Programming tools were initially written in FORTRAN, but most projects were written in CORAL once a compiler for that language became available.
When it was first announced in 1977, 100-unit lots were priced at £57, but that was soon reduced to £39 by 1978. A set containing an F100 along with the F111-L control interface and two F112-L DMA controllers was available for an additional £18. While this made it uncompetitive with MOS-based commercial processors like the $25 Zilog Z80 or $11 MOS 6502 in the same 100-unit lots, it was very competitive with other military-spec designs like the Z80's military-rated unit at $165.
The F100 quickly found use in UK defense projects. Among the more well-known successes was the guidance unit for the Sea Eagle missile. Other examples include the gunnery computer for the Falcon self-propelled anti-aircraft gun, a variety of ballistics computers used in various tanks, the CPU for the UoSAT-1 satellite, and a number of naval computer applications. It was also used in the civilian field in engine management systems from Ultra Electronic Controls, a propeller speed limiter from Dowty Group, and even control of nuclear test equipment using the CAMAC protocol.
F200-L
The F100 line was updated in 1984 with the introduction of the F200-L, which was software and pin-compatible with the F100. The primary changes were to include the math processor, formerly the F101, as part of the base CPU. Improvements in fabrication also allowed the F200-L to run up to 20 MHz. The F200 also supported the 16th bit in addresses, expanding the memory to 64 kW (128 kB). The new F220-L memory management unit, launched at the same time, provided address lookup within a 1 MW (2 MB) memory space.
Plessey purchase
During the 1980s, Ferranti was very successful and cash-flush. Desiring to make more sales into the United States, the company began looking for an established US military supplier it could buy and use as the basis for its own division in the country. This process eventually led it to purchase International Signal and Control (ISC) in 1987; along with the purchase, the company changed its name to Ferranti International.
Unfortunately, ISC's major business, unrevealed at the time, was illegal arms sales. This source of income evaporated with the purchase, leaving them with practically no ongoing business. A lengthy court process ensued, and the debt load of the purchase along with the cost of the litigation drove Ferranti into bankruptcy in December 1993.
As part of the bankruptcy proceedings, the company was broken up, and the semiconductor division was purchased by Plessey. This was subsequently part of the Siemens Plessey unit after Siemens purchased the company in 1989. The line continued to be produced through this period, with the F100/200 itself being produced until at least 1992, and some of the other members until 1995.
Today
Used primarily in military systems, few F100 systems remain today. Among the few are an F100-L chip on display at the Museum of Science and Industry in Manchester, and a small number of cards from an F100 microcomputer at the Centre for Computing History.
Description
Registers
Most microprocessors of the 1970s used internal 8-bit wide processor registers, an 8-bit data bus and a 16-bit address bus. The F100 used 16-bit registers and a 15-bit address bus; because these addresses referred to 16-bit words, the total addressable memory was 64 kB, as was the case with most 8-bit processors with 16-bit addressing. At the time the F100 was designed, memory was extremely expensive and typical machines of the era generally featured only 4 kB of SRAM, so the missing 16th bit in the address was not an important consideration.
There are three main user registers. The 16-bit ACC (accumulator) and OR (operand register) are used to hold values being manipulated by the arithmetic logic unit (ALU) during calculations and comparisons. The results of these operations set bits in the 7-bit CR (condition register). Two additional registers are used internally; the 15-bit PC (program counter) holds the address of the currently executing instruction and has an auto-increment feature, while the 16-bit IR (instruction register) is used to hold the actual instruction itself. If the instruction operates on a memory address, the value in the IR is moved to internal latches and the IR is then loaded with the address value.
The CR contained a set of seven bits.
Addressing modes
The F100 had a total of four addressing modes; direct, immediate, pointer and immediate indirect.
Direct mode encoded a constant value directly into the instruction. To do this, only the upper five bits were available for the opcode, allowing a total of 32 possible direct instructions, while the remaining lower 11 bits stored the numeric value. In the standard assembler mnemonics, this was indicated by placing the value directly after the instruction. For instance, AND 0x444 would perform a bitwise AND operation between the current value in the ACC and the 16-bit constant 0x444. Immediate mode was similar to direct, but the value to be accessed is placed in the 16-bits following the instruction in order to allow larger constants. This was indicated with a comma, for instance, AND ,0x4444.
As was common at the time, the F100 featured a form of zero page addressing they referred to as Pointer Indirect Addressing, or simply pointer. Address zero, a 16-bit word, was used as the stack pointer, which lacked its own register. This had to be set to an odd number. Locations 1 through 255 were available for the user. Pointer addressing used the lower 8 bits of the instruction to indicate one of the zero page addresses, whose value would be read as an address, and then the value at that address would be loaded. Pointer addressing was indicated with a slash, for instance, AND /0x44.
Additionally, the F100 had alternate forms of the pointer addressing instructions that performed a pre-increment or post-decrement of the value in the pointer in the zero page. These make it easy to perform loops over blocks of data in main memory without needing a separate increment operation to be read and performed. These were indicated using the + or - at the end of the pointer value, for instance, AND /0x44+ or AND /0x44-.
Finally, indirect addressing was similar to pointer addressing but allowed any value in memory to hold the pointer, rather than just the zero page. This was more flexible, but as the address was stored in the 16 bits following the instruction, this method was slower than zero-page addressing because two memory addresses had to be read instead of one. This mode was specified with a dot, for instance, AND .0x4444.
Some of the indirect addressing mode instructions also took a third value, indicating another location in memory. This was used for bitwise comparisons; the instruction specified the bit to be tested as its first operand, the location in memory as the second, and the address to jump to as the third. For instance, JBS 0x2 0x4444 0x5555 would test the second bit of the value in location 0x4444 and then jump to location 0x5555 if it was set, or continue on if it was not.
Because the addressing format in the instructions varied in length, memory was naturally broken into segments. The first was the stack pointer in location zero, next was the remaining 255 locations of the zero page, then the maximum 2048 locations of the direct mode (which included the zero page), and finally the remaining memory which could be accessed by the 15-bit addresses.
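The behaviour of these modes can be summarised with an emulator-style sketch. The Python fragment below simply follows the descriptions above; the function name, memory model and mode labels are invented for illustration and are not taken from Ferranti documentation or from the F100-L emulator mentioned in the external links.

```python
# Illustrative only: an emulator-style sketch of the operand addressing
# modes as described above.  All names and conventions here are invented
# for the example; '+' is taken as pre-increment and '-' as post-decrement.

def fetch_operand(mode, mem, pc, field):
    """Return (operand_value, next_pc).

    mem   - memory as a list of 16-bit words
    pc    - address of the word following the instruction
    field - the low bits of the instruction word (an 11-bit value for
            direct mode, an 8-bit zero-page address for pointer modes)
    """
    if mode == "direct":        # AND 0x444   - constant held in the instruction
        return field, pc
    if mode == "immediate":     # AND ,0x4444 - constant in the next word
        return mem[pc], pc + 1
    if mode == "pointer":       # AND /0x44   - zero-page word holds the operand address
        return mem[mem[field & 0xFF]], pc
    if mode == "pointer+":      # AND /0x44+  - pre-increment the zero-page pointer
        mem[field & 0xFF] = (mem[field & 0xFF] + 1) & 0xFFFF
        return mem[mem[field & 0xFF]], pc
    if mode == "pointer-":      # AND /0x44-  - use the pointer, then decrement it
        addr = mem[field & 0xFF]
        mem[field & 0xFF] = (addr - 1) & 0xFFFF
        return mem[addr], pc
    if mode == "indirect":      # AND .0x4444 - next word names the location holding the pointer
        pointer = mem[mem[pc]]
        return mem[pointer], pc + 1
    raise ValueError("unknown addressing mode")
```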
Instructions
The F100 had a total of 29 instructions, which, combined with the various addressing modes, result in 153 opcodes. The instructions generally fall into six main categories: math and logical, double-length (32-bit) math and logical, bit tests and conditional branches, interrupt handling, and external functions. The latter allowed unused bits of the instruction to be passed to external chips for processing.
The instructions were relatively common but had some variations. For instance, and had alternate versions, and , which performed the operation and then stored the result back into the operand address. performed an unconditional jump, while called a subroutine, what most assemblers would call a , and performed a return. Conditional branches allowed test-and-jump.
The instruction format used various fields to encode instruction classes. The four most significant bits, 15 through 12, selected the actual instruction, for instance, 1001 was . The rest of the bits varied depending on the addressing mode. For instance, if direct addressing was being used, bit 11 was set to 0, 10 and 9 to 1, and the remaining 11 bits encoded the address of the operand. If the 11 bits were all set to zero, it instead read the operand from the next 16 bits in memory.
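As a rough illustration of that layout, the fields could be pulled apart as follows; the function name is arbitrary and the treatment of the low-order bits glosses over the mode-selection details described above:

def split_fields(word):
    # word is a single 16-bit instruction word.
    opcode = (word >> 12) & 0xF     # bits 15-12 select the instruction, e.g. 0b1001
    mode_bit = (word >> 11) & 0x1   # bit 11: 0 indicates direct addressing
    operand = word & 0x7FF          # low 11 bits: address field in direct mode
    return opcode, mode_bit, operand

A direct-mode word whose 11-bit operand field is all zeros would then signal that the operand instead follows in the next word, as noted above.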
Start up
On startup or reset, the processor examines the AdSel pin (address select). If the pin voltage represents a zero, it jumps to location 16384 decimal (0x4000), while if the pin is 1, it jumps to 2048 (0x800). By placing startup code in ROM at those locations, the boot process can be automated.
Notes
References
Citations
Bibliography
External links
The Ferranti F100-L Microprocessor, source code for a F100-L emulator written in Python.
16-bit microprocessors
Ferranti computers |
9914431 | https://en.wikipedia.org/wiki/Linux%20gaming | Linux gaming | Linux gaming refers to playing video games on a Linux operating system.
History
Linux gaming started largely as an extension of the already present Unix gaming scene, with both systems sharing many similar titles. These games were mostly either original or clones of arcade games and text adventures. A notable example of this was the so-called "BSD Games", a collection of interactive fiction and other text-mode titles. The free software and open source methodologies which spawned the development of the operating system in general also spawned the creation of various early free games. Popular early titles included NetHack, Netrek, XBill, XEvil, xbattle, Xconq and XPilot. As the operating system itself grew and expanded, free and open-source games also grew in number, scale and complexity.
1990–1998
The beginning of Linux as a gaming platform for commercial video games is widely credited to have begun in 1994 when Dave D. Taylor ported the game Doom to Linux, as well as many other systems, during his spare time. From there he would also help found the development studio Crack dot Com, which released the video game Abuse, with the game's Linux port even being published by Linux vendor Red Hat. id Software, the original developers of Doom, also continued to release their products for Linux. Their game Quake was ported to Linux in 1996, once again by Dave D. Taylor working in his free time. Later id products continued to be ported by David Kirsch and Timothee Besset, a practice that continued until the studio's acquisition by ZeniMax Media in 2009. In 1991 DUX Software contracted Don Hopkins to port SimCity to Unix, which he later ported to Linux and eventually released as open source for the OLPC XO Laptop. Other early commercial Linux games included Hopkins FBI, an adventure game released in 1998 by MP Entertainment, and Inner Worlds in 1996, which was released for and developed on Linux. In 1998, two programmers from Origin ported Ultima Online to Linux. A website called The Linux Game Tome began to catalog games created for or ported to Linux in 1995.
1998–2002
On November 9, 1998, a new software firm called Loki Software was founded by Scott Draeker, a former lawyer who became interested in porting games to Linux after being introduced to the system through his work as a software licensing attorney. Loki, although a commercial failure, is credited with the birth of the modern Linux game industry. Loki developed several free software tools, such as the Loki installer (also known as Loki Setup), and supported the development of the Simple DirectMedia Layer, as well as starting the OpenAL audio library project. These are still often credited as being the cornerstones of Linux game development. They were also responsible for bringing nineteen high-profile games to the platform before its closure in 2002. Loki's initial success also attracted other firms to invest in the Linux gaming market, such as Tribsoft, Hyperion Entertainment, Macmillan Digital Publishing USA, Titan Computer, Xatrix Entertainment, Philos Laboratories, and Vicarious Visions. During this time Michael Simms founded Tux Games, one of the first online Linux game retailers.
2002–2010
After Loki's closure, the Linux game market experienced some changes. Although some new firms, such as Linux Game Publishing and RuneSoft, would largely continue the role of a standard porting house, the focus began to change with Linux game proponents encouraging game developers to port their game products themselves or through individual contractors. Influential to this was Ryan C. Gordon, a former Loki employee who would over the next decade port several game titles to multiple platforms, including Linux. Around this time many companies, starting with id Software, also began to release legacy source code leading to a proliferation of source ports of older games to Linux and other systems. This also helped expand the already existing free and open-source gaming scene, especially with regards to the creation of free first person shooters.
The Linux gaming market also started to experience some growth towards the end of the decade with the rise of independent video game development, with many "indie" developers favouring support for multiple platforms. The Humble Indie Bundle initiatives helped to formally demonstrate this trend, with Linux users representing a sizable population of their purchase base, as well as consistently being the most financially generous in terms of actual money spent. The release of a Linux version of Desura, a digital distribution platform with a primary focus on small independent developers, was also heralded by several commentators as an important step to greater acknowledgement of Linux as a gaming platform. In 2009, the small indie game company Entourev LLC published Voltley to Linux which is the first commercial exclusive game for this operating system. In the same year, LGP released Shadowgrounds which was the first commercial game for Linux using the Nvidia PhysX middleware.
2010–present
In July 2012, game developer and content distributor Valve announced a port of their Source engine for Linux as well as stating their intention to release their Steam digital distribution service for Linux. The potential availability of a Linux Steam client attracted other developers to consider porting their titles to Linux, including previously Mac OS-only porting houses such as Aspyr Media and Feral Interactive.
In November 2012, Unity Technologies ported their Unity engine and game creation system to Linux starting with version 4. All of the games created with the Unity engine can now be ported to Linux easily.
In September 2013 Valve announced that they were releasing a gaming oriented Linux based operating system called SteamOS with Valve saying they had "come to the conclusion that the environment best suited to delivering value to customers is an operating system built around Steam itself."
In March 2014 GOG.com announced they would begin to support Linux titles on their DRM-free store starting the same year, after previously stating they would not be able to do so because of the large number of distributions. GOG.com began their initial roll-out on July 24, 2014, by offering 50 Linux-supporting titles, including several new to the platform.
In March and April 2014 two major developers Epic Games and Crytek announced Linux support for their next generation engines Unreal Engine 4 and CryEngine respectively.
On August 22, 2018, Valve released their fork of Wine called Proton, aimed at gaming. It features some improvements over vanilla Wine, such as a Vulkan-based DirectX 11 implementation, Steam integration, better full-screen and game controller support, and improved performance for multi-threaded games. It has since grown to include support for DirectX 9 and DirectX 12 over Vulkan.
On February 25, 2022, Valve released Steam Deck, a handheld game console running SteamOS 3.0.
Market share
The Steam Hardware Survey reports that as of July 2021, 1% of users are using some form of Linux as their platform's primary operating system. The Unity game engine used to make its statistics available, and in March 2016 reported that Linux users accounted for 0.4% of players. In 2010, in the first Humble Bundle sales, Linux accounted for 18% of purchases.
Supported hardware
Linux as a gaming platform can also refer to operating systems based on the Linux kernel and specifically designed for the sole purpose of gaming. Examples are SteamOS, which is an operating system for Steam Machines, Steam Deck and general computers, video game consoles built from components found in the classical home computer, (embedded) operating systems like Tizen and Pandora, and handheld game consoles like GP2X, and Neo Geo X. The Nvidia Shield runs Android as an operating system, which is based on a modified Linux kernel.
The open source design of the Linux software platform allows the operating system to be compatible with various computer instruction sets and many peripherals, such as game controllers and head-mounted displays. As an example, HTC Vive, which is a virtual reality head-mounted display, supports the Linux gaming platform.
Performance
In 2013, tests by Phoronix showed real-world performance of games on Linux with proprietary Nvidia and AMD drivers were mostly comparable to results on Windows 8.1. Phoronix found similar results in 2015, though Ars Technica described a 20% performance drop with Linux drivers.
Software architecture
An operating system based on the Linux kernel and customized specifically for gaming could adopt the vanilla Linux kernel with only small changes, or—like the Android operating system—be based on a more extensively modified Linux kernel. It could adopt the GNU C Library, Bionic, or something similar. The middleware, in whole or in part, could very well be closed-source and proprietary software; the same is true for the video games. There are free and open-source video games available for the Linux operating system, as well as proprietary ones.
Linux kernel
The subsystems already mainlined and available in the Linux kernel are most probably performant enough not to impede the gaming experience in any way; however, additional software is available, such as the Brain Fuck Scheduler (a process scheduler) or the Budget Fair Queueing (BFQ) scheduler (an I/O scheduler).
Similar to the way the Linux kernel can be, for example, adapted to run better on supercomputers, there are adaptations targeted at improving the performance of games. A project concerning itself with this issue is called Liquorix.
Available software for video game designers
Debuggers
Several game development tools have been available for Linux, including GNU Debugger, LLDB, Valgrind, glslang and others. VOGL, a debugger for OpenGL, was released on 12 March 2014. An open-source, cross-platform clone of Enterbrain's RPG Maker (2000, 2003, XP, VX), called OpenRPG Maker, is currently in development.
Available interfaces and SDKs
There are multiple interfaces and software development kits available for Linux, and almost all of them are cross-platform. Most are free and open-source software subject to the terms of the zlib License, making it possible to statically link against them from fully closed-source proprietary software. One consequence of this abundance of interfaces is that programmers can find it difficult to choose the most suitable audio API for their purpose. The main developer of the PulseAudio project, Lennart Poettering, commented on this issue.
Physics engines and audio libraries that are available as modules for game engines have been available for Linux for a long time.
The book Programming Linux Games covers a couple of the available APIs suited for video game development for Linux, while The Linux Programming Interface covers the Linux kernel interfaces in much greater detail.
Available middleware
Besides the majority of the software, which acts as an interface to various subsystems of the operating system, there is also software which can simply be described as middleware. A multitude of companies exist worldwide whose main or only product is software that is meant to be licensed and integrated into a game engine. Their primary target is the video game industry, but the film industry also utilizes such software for special effects. Some well-known examples are:
classical physics: Havok, Newton Game Dynamics and PhysX
audio: Audiokinetic Wwise, FMOD
other: SpeedTree
A significant share of the available middleware already runs natively on Linux; only a very few packages run exclusively on Linux.
Available IDEs and source code editors
Numerous source code editors and IDEs are available for Linux, among which are Visual Studio Code, Sublime Text, Code::Blocks, Qt Creator, Emacs, and Vim.
Multi-monitor
A multi-monitor setup is supported on Linux at least by AMD Eyefinity & AMD Catalyst, Xinerama and RandR on both X11 and Wayland. Serious Sam 3: BFE is one example of a game that runs natively on Linux and supports very high resolutions and is validated by AMD to support their Eyefinity. Civilization V is another example; it even runs on a "Kaveri" desktop APU in 3x1 portrait mode.
Voice over IP
The specifications of the Mumble protocol are freely available and there are BSD-licensed implementations for both servers and clients. The positional audio API of Mumble is supported by e.g. Cube 2: Sauerbraten.
Wine
Wine is a compatibility layer that provides binary compatibility and makes it possible to run software that was written and compiled for Microsoft Windows on Linux. The Wine project hosts a user-submitted application database (known as Wine AppDB) that lists programs and games along with ratings and reviews which detail how well they run with Wine. Wine AppDB also has a commenting system, which often includes instructions on how to modify a system to run a certain game which cannot run on a normal or default configuration. Many games are rated as running flawlessly, and there are also many other games that can be run with varying degrees of success. The use of Wine for gaming has proved controversial in the Linux community as some feel it is preventing, or at least hindering, the further growth of native gaming on the platform.
Emulators
There are numerous emulators for Linux. There are also APIs, virtual machines, and machine emulators that provide binary compatibility:
Basilisk II for the 68040 Macintosh;
DOSBox and DOSEMU for MS-DOS/PC DOS and compatibles;
DeSmuME and melonDS for the Nintendo DS;
Dolphin for the Nintendo GameCube, Wii, and the Triforce;
FCEUX, Nestopia and TuxNES for the Nintendo Entertainment System;
Frotz for Z-Machine text adventures;
Fuse for the Sinclair ZX Spectrum;
Hatari for the Atari ST, STe, TT and Falcon;
gnuboy for the Nintendo Game Boy and Game Boy Color;
MAME for arcade games;
Mednafen and Xe emulating multiple hardware platforms including some of the above;
Mupen64Plus and the no longer actively developed original Mupen64 for the Nintendo 64;
PCSX-Reloaded, pSX and the Linux port of ePSXe for the PlayStation;
PCSX2 for the PlayStation 2;
PPSSPP for the PlayStation Portable;
ScummVM for LucasArts and various other adventure games;
SheepShaver for the PowerPC Macintosh;
Snes9x, higan and ZSNES for the Super NES;
Stella for the Atari 2600;
UAE for the Amiga;
VICE for the Commodore 64;
VisualBoyAdvance for the Game Boy Advance;
vMac for the 680x0 Macintosh;
Linux homebrew on consoles
Linux has been ported to several game consoles, including the Xbox, PlayStation 2, PlayStation 3, PlayStation 4, GameCube, and Wii which allows game developers without an expensive game development kit to access console hardware. Several gaming peripherals also work with Linux.
Linux adoption
Adoption by game engines
The game engine is the software solely responsible for the game mechanics, or rules defining game play. There are different game engines for first-person shooters, strategy video games, etc. Besides the game mechanics, software is also needed to handle graphics, audio, physics, input handling, and networking.
Game engines that are used by many video games and run on top of Linux include:
C4 Engine (Terathon Software)
CryEngine (Crytek)
Diesel 2.0 (Grin)
HPL Engine 1–3 (Frictional Games)
id Tech (id Software)
Serious Engine (Croteam)
Source (Valve)
Unigine (Unigine Corp)
Unity 5 (Unity Technologies)
Unreal Engine 1-4 (Epic Games)
Godot engine
Adoption by video games
There are many free and open-source video games as well as commercially distributed proprietary video games that run natively on Linux. Some independent companies have also begun porting prominent video games from Microsoft Windows to Linux.
Free and open-source games
Original games
A few original open source video games have attained notability:
0 A.D. is a real-time strategy game of ancient warfare, similar to Age of Empires.
AssaultCube is a first-person shooter.
AstroMenace is a 3D scroll-shooter.
BZFlag is a 3D first-person tank shooter (with jumping).
Battle for Wesnoth is a turn-based strategy game.
Blob Wars: Metal Blob Solid is a 2D platform game.
Chromium B.S.U. is a fast-paced, arcade-style, top-scrolling space shooter.
CodeRED: Alien Arena is a sci-fi first-person shooter derived from the Quake II engine.
Crimson Fields is a turn-based tactical wargame.
Cube 2: Sauerbraten is a 3D first-person shooter with an integrated map editing mode.
Danger from the Deep is a submarine simulator set in World War II.
Glest is a real-time strategy game, with optional multiplayer.
NetHack and Angband are text-based computer role-playing games.
Netrek is a Star Trek themed multiplayer 2D space battle game.
Nexuiz is a first-person shooter, although it has since been replaced by Xonotic.
Project: Starfighter is a multi-directional, objective-based shoot-em-up.
TORCS (The Open Racing Car Simulator) – considered one of the best open-source racing simulators, with realistic graphics and vehicle handling.
Tremulous is a 3D first-person shooter/real-time strategy game.
Tux Racer is a 3D racing game featuring Tux.
Urban Terror is a standalone Quake III Arena first-person shooter. (Proprietary mod).
Vega Strike is a space flight simulation.
Warsow is a Quake-like, fast-paced first-person shooter.
Clones and remakes
There are a larger number of open source clones and remakes of classic games:
FreeCiv is a clone of Civilization II.
FreeOrion is inspired by Master of Orion.
Frets on Fire is a clone of Guitar Hero.
Frozen Bubble is a clone of Puzzle Bobble.
Grid Wars is a clone of Geometry Wars.
Head Over Heels, a ZX-Spectrum action platformer, was remade for Linux, Windows, Mac OS X, and BeOS.
Oolite is a free and open-source remake of Elite.
OpenClonk is a free and open-source remake of Clonk.
OpenTTD is a remake of Transport Tycoon Deluxe.
OpenMW game engine reimplementation of Morrowind.
Performous is a remix of the ideas behind Guitar Hero, SingStar and Dance Dance Revolution.
Pingus is a clone of Lemmings.
Scorched 3D is a 3D adaptation of Scorched Earth.
Spring was originally a clone of Total Annihilation, but has since become a platform for real-time strategy games.
StepMania is a clone of Dance Dance Revolution
SuperTuxKart and TuxKart are clones of Mario Kart.
SuperTux and Secret Maryo Chronicles are both clones of Super Mario Bros.
The Dark Mod is a stealth game inspired by the Thief (series) games (particularly 1 and 2) from Looking Glass Studios
The Zod Engine is an actively developed open source remake of the game Z.
UFO: Alien Invasion is heavily influenced by the X-COM series, mostly by UFO: Enemy Unknown.
UltraStar is an open source clone of SingStar
Ur-Quan Masters is based on the original source code for Star Control II
Warzone 2100 is a real-time strategy and real-time tactics hybrid computer game. Originally published by Eidos Interactive and later released as open source.
Widelands is a clone of The Settlers II.
Bill Kendrick has developed many free software games, most inspired by games for the Atari 8-bit and other classic systems.
Proprietary games
Available on Steam
Valve officially released Steam for Linux on February 14, 2013; the number of Linux-compatible games on Steam now exceeds 6,500. With the launch of SteamOS, a distribution of Linux made by Valve intended to be used for HTPC gaming, that number is quickly growing. Listed below are some notable games available on Steam for Linux:
Age of Wonders III
Alien: Isolation
American Truck Simulator
And Yet It Moves
Another World
Aquaria
Bastion
The Binding of Isaac
BioShock Infinite
Borderlands 2
Borderlands: The Pre-Sequel!
Braid
Brütal Legend
Cave Story+
Civilization V
Civilization VI
Civilization: Beyond Earth
Counter-Strike
Counter-Strike: Global Offensive
Counter-Strike: Source
Day of the Tentacle Remastered
Dead Island
Deus Ex: Mankind Divided
Dirt Rally
Don't Starve
Dota 2
Empire: Total War
Fez
Freedom Planet
GRID Autosport
Grim Fandango Remastered
Half-Life
Half-Life 2
Hitman
Hitman Go
Kerbal Space Program
Lara Croft Go
Left 4 Dead 2
Life Is Strange
Life Is Strange 2
Limbo
Mad Max
Madout Big City Online
Metro 2033
Metro: Last Light
Middle-earth: Shadow of Mordor
Mini Metro
Pillars of Eternity
Portal
Portal 2
Rocket League
Saints Row 2
Saints Row IV
Saints Row: The Third
Shovel Knight
Skullgirls
Spec Ops: The Line
Star Wars Knights of the Old Republic II: The Sith Lords
Super Meat Boy
System Shock 2
The Talos Principle
Tank Force
Team Fortress 2
Tomb Raider
Total War: Warhammer
TowerFall Ascension
Undertale
VVVVVV
The Witcher 2: Assassins of Kings
XCOM: Enemy Unknown
XCOM 2
Independent game developers
Independent developer 2D Boy released World of Goo for Linux. Role-playing video game titles like Eschalon: Book I, Eschalon: Book II and Penny Arcade Adventures: On the Rain-Slick Precipice of Darkness were developed cross-platform from the start of development, including a Linux version. Sillysoft released Linux versions of their game Lux and its various versions.
Hemisphere Games has released a Linux version of Osmos. Koonsolo has released a Linux version of Mystic Mine. Amanita Design has released Linux versions of Machinarium and Samorost 2. Irrgheist released a Linux version of their futuristic racing game H-Craft Championship. Gamerizon has released a Linux version of QuantZ. InterAction Studios has several titles mostly in the Chicken Invaders series.
Kristanix Games has released Linux versions of Crossword Twist, Fantastic Farm, Guess The Phrase!, Jewel Twist, Kakuro Epic, Mahjong Epic, Maxi Dice, Solitaire Epic, Sudoku Epic, Theseus and the Minotaur. Anawiki Games has released Linux versions of Path of Magic, Runes of Avalon, Runes of Avalon 2, Soccer Cup Solitaire, The Perfect Tree and Dress-Up Pups. Gaslamp Games released a Linux version of Dungeons of Dredmor. Broken Rules has released a Linux version of And Yet It Moves.
Frictional Games released Linux versions of both Penumbra: Black Plague and Penumbra: Overture, as well as the expansion pack Penumbra: Requiem. They also released Amnesia: The Dark Descent for Linux simultaneously with the Windows and Mac OS X versions. S2 Games released Linux clients for their titles Savage: The Battle for Newerth, Savage 2: A Tortured Soul and Heroes of Newerth. Wolfire Games released a Linux version of their game Lugaru and they will release its sequel Overgrowth for Linux. David Rosen's Black Shades was also ported to Linux. Arctic Paint has released a Linux version of Number Drill. Charlie's Games has released a Linux version of Bullet Candy Perfect, Irukandji, Space Phallus and Scoregasm.
Illwinter Game Design released Conquest of Elysium II, Dominions: Priests, Prophets and Pretenders, Dominions II: The Ascension Wars, and Dominions 3: The Awakening for Linux. Introversion Software released Darwinia, Uplink, and DEFCON. Cartesian Theatre is a Vancouver, British Columbia, Canada, based software house specializing in free, commercial, games for Linux. They have one title currently under active development, Avaneya. Kot-in-Action Creative Artel released their Steel Storm games for Linux. Hazardous Software have released their game Achron for Linux.
Unigine Corp developed Oil Rush using its Unigine engine technology that works on Linux. Unigine Corp was also developing a "shooter-type game" that would have been released for Linux; development on this game is currently frozen until Oil Rush is released. The MMORPG game Syndicates of Arkon is also supposed to be coming to Linux. The game Dilogus: The Winds of War is also being developed with Unigine and is planned to have a Linux client.
A number of visual novel developers support Linux. Winter Wolves has released titles such as Spirited Heart, Heileen, The Flower Shop, Bionic Heart, Card Sweethearts, Vera Blanc, Planet Stronghold, and Loren The Amazon Princess for Linux. Hanako Games has released Science Girls, Summer Session, Date Warp, Cute Knight Kingdom, and are considering porting Fatal Hearts to Linux. sakevisual has brought Jisei, Kansei, Yousei, RE: Alistair and Ripples to Linux. Four Leaf Studios has also released Katawa Shoujo for Linux and Christine Love released Digital: A Love Story, both of which, along with Summer Session mentioned previously, are powered by the free software Ren'Py tool.
The Java-based sandbox game Minecraft by Indie developer Mojang is available on Linux, as is any other video game compiled for the Java virtual machine.
Dwarf Fortress, a sandbox management simulator / roguelike, has been made available for Linux by Tarn Adams.
The voxel-based space sandbox game, ScrumbleShip by Indie developer Dirkson is currently under development for Linux, Mac OS X, and Windows.
The realistic replay baseball simulation Out of the Park Baseball by OOTP Developments is currently available for Linux, Mac OS X, and Windows, for single player and multiplayer online leagues.
Grappling Hook, a first-person shooter-like puzzle game.
The German indie-studio Pixel Maniacs has released both of their games, ChromaGun and Can't Drive This for Linux.
In the Walking Simulator space, Dan Ruscoe's Dark Hill Museum of Death is available for Linux.
Game porters
Independent companies have also taken on the task of porting prominent Windows games to Linux. Loki Software was the first such company, and between 1998 and 2002 ported Civilization: Call to Power, Descent³, Eric's Ultimate Solitaire, Heavy Gear II,
Heavy Metal: F.A.K.K.², Heretic II, Heroes of Might and Magic III, Kohan: Immortal Sovereigns, Myth II: Soulblighter, Postal, Railroad Tycoon II, Quake III Arena, Rune, Sid Meier's Alpha Centauri, Sim City 3000, Soldier of Fortune, Tribes 2, and MindRover to Linux.
Tribsoft created a Linux version of Jagged Alliance 2 by Sir-Tech Canada before shutting down in 2002. Linux Game Publishing was founded in 2001 in response to the impending demise of Loki, and has brought Creatures: Internet Edition, Candy Cruncher, Majesty: Gold Edition, NingPo MahJong, Hyperspace Delivery Boy!, Software Tycoon, Postal²: Share The Pain, Soul Ride, X2: The Threat, Gorky 17, Cold War, Knights and Merchants: The Shattered Kingdom, Ballistics, X3: Reunion, Jets'n'Guns, Sacred: Gold, Shadowgrounds, and Shadowgrounds Survivor to Linux. Some of these games were ported for them by Gordon.
LGP-associated but freelance consultant Frank C. Earl is porting the game Caster to Linux and has released the first episode and also developed the Linux version of Cortex Command being included in the second Humble Indie Bundle. He is also working towards other porting projects such as the entire Myth series. He is largely taking recommendations and he comments as part of the Phoronix community. icculus.org has ported beta releases for Medal of Honor: Allied Assault and Devastation, versions of America's Army, and the titles Prey, Aquaria, Braid, Hammerfight and Cogs.
The German publisher RuneSoft was founded in 2000. They ported the games Northland,
Robin Hood: The Legend of Sherwood, Airline Tycoon Deluxe, Ankh, Ankh: Heart of Osiris, Barkanoid 2, and Jack Keane to Linux, as well as porting Knights and Merchants: The Shattered Kingdom and Software Tycoon, for Linux Game Publishing. Hyperion Entertainment ported games to several systems, they have ported Shogo: Mobile Armor Division and SiN to Linux, as well as porting Gorky 17 for Linux Game Publishing. Wyrmkeep Entertainment has brought the games The Labyrinth of Time and Inherit the Earth: Quest for the Orb to Linux. Alternative Games brought Trine and Shadowgrounds, and Shadowgrounds Survivor for Linux Game Publishing.
Aspyr Media released their first Linux port in June 2014; they stated that they were porting to Linux because Valve had brought out SteamOS. Aspyr Media later ported Borderlands 2 to Linux in September 2014.
Having ported games to Mac OS X since 1996, video game publisher Feral Interactive released XCOM: Enemy Unknown, its first game for Linux, in June 2014. Feral Interactive stated they port games to Linux thanks to SteamOS.
Other developers
Some id Software employees ported the Doom series, the Quake series, Return to Castle Wolfenstein, Wolfenstein: Enemy Territory and Enemy Territory: Quake Wars. Some games published by GarageGames which have Linux versions include Bridge Builder, Marble Blast Gold, Gish, Tribal Trouble, and Dark Horizons: Lore Invasion.
MP Entertainment released Hopkins FBI and Crack dot com released Abuse for Linux, becoming one of the first developers to release a native port. Inner Worlds, another early commercial Linux title, was released for and developed on Linux. Philos Laboratories released a Linux version of Theocracy on the retail disk. Absolutist has supported Linux for a number of years. GLAMUS GmbH released a Linux version of their game Mobility. Vicarious Visions ported the space-flight game Terminus to Linux.
Lava Lord Games released their game Astro Battle for Linux. Xatrix Entertainment released a Linux version of Kingpin: Life of Crime. BioWare released Neverwinter Nights for Linux. Croteam released the Serious Sam series, with the first game ported by Gordon and with the second self-ported. Gordon also ported Epic Games' shooter games Unreal Tournament 2003 and Unreal Tournament 2004.
Revolution System Games released their game Decadence: Home Sweet Home through Steam only for Linux for a period of time after the Mac or Windows release.
On 12 October 2013 Lars Gustavsson, creative director at DICE, said to polygon.com
Commercial games for non-x86 instruction sets
Some companies ported games to Linux running on instruction sets other than x86, such as Alpha, PowerPC, Sparc, MIPS or ARM. Loki Entertainment Software ported Civilization: Call to Power, Eric's Ultimate Solitaire, Heroes of Might and Magic III, Myth II: Soulblighter, Railroad Tycoon II Gold Edition and Sid Meier's Alpha Centauri with Alien Crossfire expansion pack to Linux PowerPC. They also ported Civilization: Call to Power, Eric's Ultimate Solitaire and Sid Meier's Alpha Centauri with Alien Crossfire expansion pack to Linux Alpha, and Civilization: Call to Power and Eric's Ultimate Solitaire to Linux SPARC. Linux Game Publishing published Candy Cruncher, Majesty Gold, NingPo MahJong and Soul Ride for Linux PowerPC. They also ported Candy Cruncher and Soul Ride to Linux SPARC, and Soul Ride to Linux Alpha. Illwinter Game Design ported Dominions: Priests, Prophets & Pretenders, Dominions II: The Ascension Wars and Dominions 3 to Linux PowerPC, and Conquest of Elysium 3 and Dominions 4: Thrones of Ascension to the Raspberry Pi. Hyperion Entertainment ported SiN to Linux PowerPC, published by Titan Computer, and Gorky 17 to Linux PowerPC, which was later published by LGP. Runesoft hired Gunnar von Boehn, who ported Robin Hood – The Legend of Sherwood to Linux PowerPC. Later Runesoft ported Airline Tycoon Deluxe to the Raspberry Pi running Debian GNU/Linux.
Source ports
Several developers have released the source code to many of their legacy titles, allowing them to be run as native applications on many alternative platforms, including Linux. Examples of games which were ported to Linux this way include Duke Nukem 3D, Shadow Warrior, Rise of the Triad, Ken's Labyrinth, Seven Kingdoms, Warzone 2100, Homeworld, Call to Power II, Wolfenstein 3D, Heretic, Hexen, Hexen II, Aliens versus Predator, Descent, Descent II and Freespace 2. Several game titles that were previously released for Linux were also able to be expanded or updated because of the availability of game code, including Doom, Abuse, Quake, Quake II, Quake III Arena and Jagged Alliance 2. Some derivatives based on released source code have also been released for Linux, such as Aleph One and Micropolis for Marathon 2: Durandal and SimCity respectively.
Certain game titles were even able to be ported due to availability of shared engine code even though the game's code itself remains proprietary or otherwise unavailable, such as the video game Strife or the multiplayer component of Star Trek: Voyager – Elite Force. Some games have even been ported entirely or partially by reverse engineering and game engine recreation such as WarCraft II through Wargus or Commander Keen. Another trick is to attempt hacking the game to work as a mod on another native title, such as with the original Unreal. Additionally, some games can be run through the use of Linux specific runtime environments, such as the case of certain games made with Adventure Game Studio such as the Chzo Mythos or certain titles made with the RPG Maker tool. Games derived from released code, with both free and proprietary media, that are released for Linux include Urban Terror, OpenArena, FreeDoom, World of Padman, Nexuiz/Xonotic, War§ow and Excalibur: Morgana's Revenge.
Massively multiplayer online role-playing games
This is a selected list of MMORPGs that are native on Linux:
A Tale in the Desert III (2003, eGenesis) – A trading and crafting game, set in ancient Egypt, pay-to-play.
Crossfire (1992) – A medieval fantasy 2D game.
Diaspora (1999, Altitude Productions) – 2D Space trading MMORPG. (Project Diaspora version has a Linux client.)
Dofus (2005, Ankama Games) – A 2D fantasy MMORPG.
Eternal Lands (2003, Radu Privantu) – A 3D fantasy free-to-play MMORPG.
PlaneShift – A free 3D fantasy game.
Regnum Online – A 3D fantasy game, free-to-play with premium content.
RuneScape – Java fantasy 3rd person MMORPG.
Salem – An isometric, 3D fantasy game with a focus on crafting and permadeath.
Shroud of the Avatar – An isometric, 3D fantasy game and the spiritual successor to Ultima Online.
Spiral Knights – Java fantasy 3rd person game.
The Saga of Ryzom – has a Linux client and source code available.
Tibia – A 2D medieval fantasy MMORPG, free-to-play with premium content. One of the oldest MMORPGs, created in January 1997, with an official Linux client.
Ultima Online has an unofficial Linux client.
Vendetta Online – A 3D spacecraft MMOFPS with growing RPG elements, pay to play. Maintains both Linux/32 and Linux/64 clients.
WorldForge – A game engine. There are Linux clients available.
Wyvern – A 2D fantasy MMORPG that runs on Java.
Yohoho! Puzzle Pirates – A puzzle game which runs on Java.
Many virtual worlds (such as Second Life) also have Linux clients.
Types of Linux gaming
Libre gaming
Libre gaming is a form of Linux gaming that emphasizes libre software.
See also
Directories and lists
Free Software Directory
List of emulators
List of open source games
List of video game console emulators
Linux gaming software
Direct3D (alternative implementation)
Lutris
Proton (software)
Vulkan (API)
Wine (software)
Other articles
Linux for PlayStation 2
Sega Lindbergh
References
Gaming
Video game platforms |
9248668 | https://en.wikipedia.org/wiki/MacScan | MacScan | MacScan is anti-malware software for macOS developed by SecureMac.
Features
MacScan runs on Apple macOS. It scans for and removes malware (including spyware, Trojan horses, keystroke loggers, and tracking cookies). It also scans for remote administration programs, like Apple Remote Desktop, allowing users to verify that such programs are installed only with their authorization.
The full version is available as shareware.
Unlike other anti-malware applications available for Mac OS X (and other systems), MacScan scans exclusively for malware that affects Macs, as opposed to scanning for all forms of known threats, which would include Windows malware. Given that there is considerably less macOS malware than Windows-based malware, MacScan's definition files are smaller and more optimized.
See also
List of Macintosh software
References
External links
'Review: MacScan 3', Macworld, May 3, 2016
'Review: MacScan 2.9.4', CNET Editor Review, February 26, 2013
'Review: MacScan 2.6', Softpedia, February 15, 2009
'Review: MacScan 2.6', Brighthub, January 15, 2009
'6 hot Macworld apps for business', CNN Money, January 7, 2009
'Review: MacScan 2.1', Macworld, July 17, 2006
'Review: MacScan 2.5', Laptop Magazine, April 22, 2008
'Review MacScan 2', MacWorld UK, May 14, 2008
'MacScan and Your Trojan Lesson', Apple Matters, April 27, 2006
MacOS security software
Spyware removal |
44503836 | https://en.wikipedia.org/wiki/Denuvo | Denuvo | Denuvo Anti-Tamper is an anti-tamper technology and digital rights management (DRM) scheme developed by Austrian software company Denuvo Software Solutions GmbH, a subsidiary of Irdeto. The company also developed an anti-cheat counterpart.
History
Denuvo is developed by Denuvo Software Solutions GmbH, a software company based in Salzburg, Austria. The company was formed through a management buyout of DigitalWorks, the arm of the Sony Digital Audio Disc Corporation that developed the SecuROM DRM technology. It originally employed 45 people. In January 2018, the company was acquired by larger software company Irdeto. Development of the Denuvo software started in 2014. FIFA 15, released in September 2014, was the first game to use Denuvo.
3DM, a Chinese warez group, first claimed to have breached Denuvo's technology in a blog post published on 1 December 2014, wherein they announced that they would release cracked versions of Denuvo-protected games FIFA 15, Dragon Age: Inquisition and Lords of the Fallen. Following onto this, 3DM released the version of Dragon Age: Inquisition about two weeks after that game had shipped. The overall cracking progress took about a month, an unusually long time in the game cracking scene. When asked about this development, Denuvo Software Solutions acknowledged that "every protected game eventually gets cracked". However, technology website Ars Technica noted that most sales for major games happen within 30 days of release, and so publishers may consider Denuvo a success if it meant a game took significantly longer to be cracked. In January 2016, 3DM's founder, Bird Sister, revealed that they were to give up on trying to break the Denuvo implementation for Just Cause 3, and warned that, due to the ongoing trend for the implementation, there would be "no free games to play in the world" in the near future. Subsequently, 3DM opted to not crack any games for one year to examine whether such a move would have any influence on game sales.
By October 2017, crackers were able to bypass Denuvo's protection within hours of a game's release, with notable examples being South Park: The Fractured but Whole, Middle-earth: Shadow of War, Total War: Warhammer 2 and FIFA 18, all being cracked on their release dates. In another notable case, Assassin's Creed Origins, which wrapped Denuvo within security tool VMProtect as well as Ubisoft's proprietary DRM used for their Uplay distribution software, had its security features bypassed by Italian collective CPY in February 2018, three months after the game's release. In December 2018, Hitman 2 protection was bypassed three days before its official release date due to exclusive pre-order access, drawing comparisons to Final Fantasy XV, which had its protection removed four days before release.
By 2019, several products like Devil May Cry 5, Metro Exodus, Resident Evil 2, Far Cry New Dawn, Football Manager 2019 and Soul Calibur 6, were cracked within their first week of release, with Ace Combat 7 taking thirteen days. In the case of Rage 2, which was released on Steam as well as Bethesda Softworks' own Bethesda Launcher, the Steam version was protected by Denuvo, whereas the Bethesda Launcher version was not, leading to the game being cracked immediately, and Denuvo being removed from the Steam release two days later.
A sister product, Denuvo Anti-Cheat, was announced in March 2019, and first used with Doom Eternal following a patch on 14 May 2020. However, less than a week later Doom developer id Software announced they would be removing it from the game following negative response from players.
Technology
Games protected by Denuvo require an online activation. The software uses a "64-bit encryption machine". Denuvo's marketing director, Thomas Goebl, stated that some console-exclusive games get PC releases due to this technology.
Criticism
Denuvo has been criticised for high central processing unit (CPU) usage and excessive writing operations on storage components, the latter causing significant life-span reductions for solid-state drives (SSDs). Denuvo Software Solutions has denied both claims. In the case of Tekken 7 and Sonic Mania Plus, Denuvo caused a significant decrease in performance in several parts of these games. Sam Machkovech of Ars Technica reviewed in-depth how Denuvo was causing performance penalties, releasing an article on the matter in December 2018. In December 2018, Joel Hruska of ExtremeTech compared the performance of multiple games with Denuvo enabled and disabled, and found that the games tested had significantly higher frame rates and lower loading times when Denuvo was not used. Richard Leadbetter of Digital Foundry compared the performance of a pirated version of Resident Evil Village which had stripped out Denuvo and Capcom's additional copy protection against the release version for Windows, and found that the DRM-stripped version performed far better than the released game. It has been confirmed that the stuttering was caused by CAPCOM's DRM and not by Denuvo.
In July 2018, Denuvo Software Solutions filed a lawsuit against Voksi, a 21-year-old Bulgarian hacker who had cracked several Denuvo-protected games. Voksi was arrested by Bulgarian authorities, and his website, Revolt, was taken offline.
In May 2020, Kaspersky Anti-Virus detected the now removed Denuvo implementation in Doom Eternal as malware, possibly due to its kernel-level access.
In November 2021, outrage occurred when many recent games were rendered unplayable due to a Denuvo-owned domain name expiring. The same month it was discovered that many Denuvo games might not work with Intel 12th Gen Alder Lake CPUs. However, as of January 12, 2022, the Alder Lake incompatibility issue has been addressed, bringing the list of 90 incompatible titles down to zero.
References
External links
2014 software
DRM for Windows
Proprietary software
Video game controversies |
58408632 | https://en.wikipedia.org/wiki/Cyber%20Security%20Collaborative%20Research%20Alliance | Cyber Security Collaborative Research Alliance | Cyber Security Collaborative Research Alliance (CSCRA) was a research program initiated and sponsored by the US Army Research Laboratory (ARL). The objective of the program was “to develop a fundamental understanding of cyber phenomena, including aspects of human attackers, cyber defenders, and end users, so that fundamental laws, theories, and theoretically grounded and empirically validated models can be applied to a broad range of Army domains, applications, and environments.”
Collaborative Technology and Research Alliances is a term for partnerships between Army laboratories and centers, private industry and academia for performing research and technology development intended to benefit the US Army. The partnerships are funded by the US Army.
History
Since approximately 1992, ARL formed a number of partnerships that involved the triad of industry, academia and government. One of them was the Cyber Security Collaborative Research Alliance (CSCRA) which was awarded on September 20, 2013. The program was expected to be completed in September 2022.
Objectives
Recognizing the need to address the growing threat of attacks on its cyber networks, the U.S. Army launched CSCRA. The alliance conducted research to advance the theoretical foundations of cyber science in the context of U.S. Army networks. According to the Army, research into cybersecurity is critical due to “the growing number and sophistication of attacks on military cyber networks coupled with the ever-increasing reliance on cyber systems to conduct the Army’s mission.” The ultimate goal of this research was the rapid development of cyber tools that could be used to dynamically assess cyber risks, detect hostile activities on friendly networks, and support agile maneuvers in cyber space in addressing novel threats.
Objectives of CSCRA included development of the following:
Fundamental understanding of cyber phenomena, including human aspects
Laws, theories, and theoretically grounded and empirically validated models
Concepts applicable to a broad array of Army domains, applications, and environments
Research Thrusts
The CSCRA program was organized around several research thrusts, including the following:
Risk, Detection, Agility
Participants
The research under this program was performed collaboratively by scientists of the US Army Research Laboratory and by scientists and engineers of the following institutions:
Army Research Laboratory
Pennsylvania State University
Carnegie Mellon University
Indiana University
University of California at Davis
University of California Riverside
Applied Communication Sciences
Results
Examples of research results developed by the CSCRA program include the following:
Four publicly available datasets generated using a testbed with simulated benign users and a manual attacker. The datasets were created to provide examples of cyber exploitations and aid in the production of reproducible research that addresses cyber security challenges.
An adaptive cyber deception system that provided a virtual network view to each host in an enterprise network, capable of detecting malicious activities resulting from intrusions and probing.
A common vocabulary and context for Cyber-Physical Systems (CPS) intended to support research, assessment and responses to threats in this area.
A finding that face-to-face interactions deter the success of cybersecurity teams. High-performing teams relied on leadership and functional specialization.
References
Military research
Computer security organizations
Cyberwarfare in the United States
2013 establishments in the United States |
3459930 | https://en.wikipedia.org/wiki/Saarland%20University | Saarland University | Saarland University is a public research university located in Saarbrücken, the capital of the German state of Saarland. It was founded in 1948 in Homburg in co-operation with France and is organized in six faculties that cover all major fields of science. In 2007, the university was recognized as an excellence center for computer science in Germany.
Thanks to bilingual German and French staff, the University has an international profile, which has been underlined by its proclamation as "European University" in 1950 and by the establishment of the Europa-Institut as its "crown and symbol" in 1951.
Nine academics have been honored with the highest German research prize, the Gottfried Wilhelm Leibniz Prize, while working at Saarland University.
History
Saarland University, the first to be established after World War II, was founded in November 1948 with the support of the French Government and under the auspices of the University of Nancy.
At the time the Saarland found itself in the special situation of being partly autonomous and linked to France by economic and monetary union. With its combination of the German and French educational traditions and the dual languages of instruction, the university had a European perspective right from the start. Prior to the foundation of the university, clinical training courses for medical students at the state hospital, Saarland University Hospital, in Homburg, Saarland, had been introduced in January 1946 and the "Centre Universitaire d'Etudes Supérieures de Hombourg" established on 8 May 1947 under the patronage of the University of Nancy. Students in certain disciplines can obtain degree certificates from both universities.
The first president of the independent university in 1948 was Jean Barriol. In the same year the university introduced the first courses in law, philosophy and languages.
In the 1950s Saarland University joined the Association of West German Universities and accepted a new, more centralized organizational structure, and the Europa-Institut was established as a European politics and law think tank.
Organization and administration
The university is headed by a board, which includes a president and five vice presidents, responsible for planning and strategy, research and technology transfer, education, and administration and finance, respectively. The president is elected by both the senate and the council in separate votes.
The senate, consisting of nine professors, three students, three academic and two administrative staff members, acts as the legislative branch. Further, the university has a council which makes strategic decisions, allocates funding, and supervises the board. The council's members are representatives of private companies and academic institutions including other universities, in addition to representatives of the university's professors, staff members, and students.
The university is divided into six faculties:
Faculty of Human and Business Sciences
Faculty of Medicine
Faculty of Mathematics and Computer Science
Faculty of Natural Sciences and Technology
Faculty of Humanities
Faculty of Law
Academic profile
Research
Saarland University is known for research in Computer Science, nano technology, medicine, European relations, politics and law. The university campus and the surrounding area is home to several specialized research institutes, affiliated with various high-profile independent research societies and private companies, focused on primary and applied research.
Max Planck Institute for Computer Science
Max Planck Institute for Software Systems
German Research Centre for Artificial Intelligence - DFKI
CISPA – Helmholtz Center for Information Security
Dagstuhl, the Leibniz Center for Informatics
Fraunhofer IZFP
Fraunhofer Institute for Biomedical Engineering
Society for Environmentally Compatible Process Technology
Institut of the society for the promotion of the applied information research
Leibniz-Institute for New Materials INM
KIST - Korea Institute of Science and Technology Europe Research Society.
Intel Visual Computing Institute
Centre for Bio-informatics Saar
Institute for Formal Ontology and Medical Information Science - IFOMIS
HIPS – Helmholtz Institute for Pharmaceutical Research Saarland
The university science park provides a startup incubator and a technology/research transfer environment for companies mostly focused on IT, nanotechnology and biotechnology.
Education
With its numerous degree programmes and the variety of final qualifications offered (Diplom, Magister, Ph.D., state examinations and, increasingly, bachelor and master qualifications), Saarland University provides the broad spectrum of disciplines typical of a classical universitas litterarum. The more traditional subjects such as business administration and economics, law and medicine are just as much a part of Saarland University as the new degree programmes that have developed from modern interdisciplinary collaborations and which reflect the increasing demand for such qualifications in today's job market. Examples of these new courses include 'Biology with Special Focus on Human Biology and Molecular Biology', 'Bioinformatics /Computational Biology', 'Mechatronics Engineering', 'Micro- and Nanostructured Materials', 'Computer and Communications Technology', 'Historically-oriented Cultural Studies' and 'French Cultural Science and Intercultural Communication'.
Integrated degree courses, which can lead to the award of a joint degree, are organized by Saarland University and foreign partner universities in the fields of business administration, physics, chemistry, materials science and in the interdisciplinary programme 'Cross-border Franco-German Studies'. In the area of teacher training, Saarland University offers an integrated bilingual (French-German) course for prospective teachers of geography and history. A further distinctive feature of Saarland University is the fact that the university is able to award French degrees in subjects such as Droit, Allemand and Lettres modernes. Additional qualifications may also be obtained in numerous postgraduate courses.
The Europa-Institut is among the very few socio-economic research centers to focus primarily on European integration. Its European law and MBA in European management programmes uniquely focus on opportunities emerging from an expanding and more integrated Europe.
The university is also responsible for conducting Computer Science related courses for students enrolled in the graduate programmes of the MPI for Computer Science and MPI for Software Systems. Saarland University is one of the few universities in Germany where the entire master's programme in Computer Science is taught in English.
Cooperation
Saarland University is part of the Software-Cluster, a local association of universities, research institutes and IT companies in Karlsruhe, Darmstadt, Kaiserslautern, Walldorf and Saarbrücken with the purpose of fostering business software development.
Notable people
Leibniz Prize winners
Rolf Müller, Biotechnology (2021)
Joachim Weickert, Digital image processing (2010)
Hans-Peter Seidel, Computer Graphics (2003)
Manfred Pinkal, Computational Linguistics (2000)
Johannes Buchmann, Information Theory (1993)
Michael Veith, Inorganic Chemistry (1991)
Herbert Gleiter, Material Science (1989)
Günter Hotz, Kurt Mehlhorn and Wolfgang Paul, Computer Science (1987)
Alumni
David Bardens (born 1984), Physician
Susanne Albers (born 1965), Scientist
Peter Altmaier (born 1958), Politician (CDU)
Karl-Otto Apel (born 1922), Philosopher
Hans Hermann Hoppe (born 1949), Philosopher and Economist
Peter Bofinger (born 1954), Economist
F. Thomas Bruss (born 1949), Mathematician
Ralf Dahrendorf (1929–2009), Politician
Lars Feld (born 1966), Economist
Jürgen W. Falter (born 1944), Political Scientist
Winfried Hassemer (born 1940), Scientist
Philip Hall (born 1967), British diplomat
Werner Jeanrond (born 1955), Theologian
Alexandra Kertz-Welzel (born 1970), Professor of Music Education at LMU Munich
Reinhard Klimmt (born 1942), Politician (SPD)
Christian Graf von Krockow (1927–2002), Political Scientist and Author
Daniel Kroening, computer scientist
Oskar Lafontaine (born 1943), Politician (Linkspartei)
Wilfried Loth (born 1948), Historian
Werner Maihofer (1918–2009), Lawyer and Politician (FDP)
Alfred Werner Maurer (born 1945), architect, archaeologist, art historian, and excavation director at Mumbaqat, Syria
Bernhard Nebel (born 1956), Scientist
August-Wilhelm Scheer (born 1941), Scientist and Entrepreneur
Claus-Peter Schnorr (born 1943), Scientist
Ottmar Schreiner (born 1942), Politician (SPD)
Diana Stöcker (born 1970), Politician (CDU)
Christina Weiss (born 1953), Journalist and Politician
Michael Wolffsohn (born 1947), Historian
Johanna Narten (1930-2019), historical linguist and first woman member of the Bavarian Academy of Sciences and Humanities
Points of interest
The main campus in Saarbrücken is just outside the city, set between picturesque hills. Cycling from the university to the city or taking a short wander in the forest close to campus is a favorite pastime of students and faculty.
Botanischer Garten der Universität des Saarlandes, the university's botanical garden
The Hermann-Neuberger-Sportschule is located next to the campus and hosts the Olympiastützpunkt Rheinland-Pfalz/Saarland that is the Olympic Training Center for Rheinland-Pfalz and Saarland.
There is also a recreation center called Uni-Fit.
University hospital
The University Hospital of the Saarland (in German: Universitätsklinikum des Saarlandes or UKS) is the hospital of the University of Saarland in Homburg, Saarland, Germany.
It is concentrated on a campus south of the city center, with more than 100 clinic buildings scattered across more than 200 hectares of forest. In the course of the project UKS Projekt Zukunft, which was started in 2009, numerous new buildings are being built and the clinics for internal medicine are being combined in a large building complex. Affiliated with it are the medical faculty of Saarland University, with about 2,000 medical students, and a school center with eleven schools for the health professions.
See also
BALL
Europa-Institut of Saarland University
Hochschule für Musik Saar
Homburg
References
External links
Saarland University Website
Universities and colleges in Saarland
Education in Saarbrücken
Buildings and structures in Saarbrücken
Educational institutions established in 1948
1948 establishments in Germany
1948 establishments in Saar
Universities established in the 1940s |
45679761 | https://en.wikipedia.org/wiki/Rainer%20Waser | Rainer Waser | Rainer Waser (born September 16, 1955, in Frankfurt) is a German professor of Electrical Engineering at RWTH Aachen University. He is also director of the section Electronic Materials at the Peter Grünberg Institute which is located on the campus of Jülich Research Center (Forschungszentrum Jülich). His research and teaching is on solid-state chemistry and defect chemistry to electronic properties and modelling, the technology of new materials and the physical properties of construction components.
Important findings include insights in the functioning of the so-called memristors.
Waser grew up in Heusenstamm near Frankfurt. He studied Physical Chemistry at Darmstadt University of Technology, where he received a diploma degree in 1979. He then went to the University of Southampton to conduct research at the Institute of Electrochemistry. After that he returned to Darmstadt and worked as a scientific assistant until he completed his PhD.
Career
Waser joined the Philips research laboratories (research group Electronic Ceramics) at Aachen. In 1992, Waser accepted a Chair for Electronic Materials in the Faculty of Electrical Science and Information Technology at RWTH Aachen University. In 2012, Waser was elected to the post of Speaker of the Department of Electrical Engineering and Information Technology at Aachen university. Waser was awarded the renowned Gottfried Wilhelm Leibniz Prize in 2014.
Awards and honors
A comprehensive list can be found in the cv on the institute's website.
2015 – Honorary doctorate from the University of Silesia in Katowice
2014 – Tsungming-Tu Prize, awarded by the National Science Council in Taiwan (the country’s highest academic distinction which can be bestowed on non-Taiwanese citizens)
2014 – Gottfried Wilhelm Leibniz Prize
2007 – Masao Ikeda Award, Ikeda Memorial Foundation, Kyoto, Japan
2001 – Outstanding Achievements Award, International Symposium on Integrated Ferroelectrics (ISIF)
2000 – Ferroelectrics Recognition Award, IEEE Ultrasonics, Ferroelectrics, and Frequency Control Society
Fellowships and Academy Membership
Fellow of the North Rhine-Westphalian Academy of Sciences, Humanities and the Arts.
Spokesperson of the section Future information technology (FIT) within the Helmholtz-Zentrum Berlin
Other Functions
Executive Advisory Board Member of the journal Advanced Functional Materials
Selected works
External links
Website of the Electronic Materials Research Laboratory (Waser's Institute at RWTH Aachen)
References
1955 births
German electrical engineers
German physical chemists
RWTH Aachen University faculty
Gottfried Wilhelm Leibniz Prize winners
Living people
Technische Universität Darmstadt alumni
Engineers from Frankfurt |
15945217 | https://en.wikipedia.org/wiki/Michael%20M.%20Richter | Michael M. Richter | Michael M. Richter (June 21, 1938 – July 10, 2020) was a German mathematician and computer scientist. Richter is well known for his career in mathematical logic, in particular non-standard analysis, and in artificial intelligence, in particular in knowledge-based systems and case-based reasoning (CBR, Fallbasiertes Schließen). He is worldwide known as pioneer in case-based reasoning.
Life
Richter was born in Berlin into an educated family: his father was Dr. Paul Kurt Richter, a literary scientist; his grandfather was Dr. Carl Greiff, a medical scientist (in 1940, Greiff published a 544-page book called Diabetes-Probleme with the publisher Johann Ambrosius Barth). Richter studied mathematics from 1959 to 1965 at the University of Münster and the University of Freiburg, where he completed his Ph.D. in Mathematical Logic under the supervision of Walter Felscher, and he obtained his Habilitation in Mathematics at the University of Tübingen in 1973. After teaching at the University of Texas at Austin, he was Professor for Mathematics at the RWTH Aachen from 1975 to 1986. In 1986, he accepted a chair for Computer Science at the University of Kaiserslautern, where he taught until his retirement in 2003.
During his academic career, he held visiting positions at Austin, Florianópolis and Calgary; he also taught at the University of St. Gallen from 1994 to 2000. Finally, he was Adjunct Professor at the University of Calgary and Visiting Professor at the Universidade Federal de Santa Catarina, Florianópolis, Brazil. He supervised 65 doctoral students and 296 master's students during his career, many of whom now hold tenured academic positions in various parts of the world. He is the author of nine books, the most recent of which is Case-Based Reasoning: A Textbook, published with Springer Verlag.
His son Peter P. Richter (born 1976) is a geologist with a doctoral degree from the University of Mainz, currently employed at the University of Kiel.
He died on 10 July 2020 at the age of 82.
Activities
From 1981 to 1985 Michael Richter was President of the Deutsche Vereinigung für mathematische Logik und für Grundlagenforschung der exakten Wissenschaften (DVMLG). Starting in 1987, he was for five years co-initiator and co-chair of an annual conference series on Logic in Computer Science.
In 1989 Michael Richter became head of the research group Mathematical Logic (until 2004) of the Heidelberg Academy of Sciences (Heidelberger Akademie der Wissenschaften). There he continued and extended the Omega Bibliography, a unique scientific collection containing, in classified form, all publications in Mathematical Logic since 1889.
In Kaiserslautern he was a member of the managing committee of two consecutive special research groups of the Deutsche Forschungsgemeinschaft (DFG): Artificial Intelligence and Development of Large Systems with Generic Methods.
In 1988 he was one of the founders of the DFKI (German Research Center for Artificial Intelligence) at Kaiserslautern, its first scientific director and later head of the Intelligent Engineering Group. He was one of the forerunners in turning static expert systems into flexible assistant systems. An outstanding project was ARC-TEC: Acquisition, Representation and Compilation of Technical Knowledge.
After 1990, his university group participated in virtually all major European projects on Case-Based Reasoning. The most influential project was Highlights of the European INRECA Projects (Inductive Reasoning on Cases), where a basic methodology was developed. In 1993 the group initiated the first European Workshop on Case-Based Reasoning (EWCBR) in Kaiserslautern, which subsequently became a biennial event, complemented by the International Conferences on CBR (ICCBR 2007).
Work
In logic Michael Richter specialized in non-standard analysis, where he wrote a monograph and, with his student B. Benninghofen, created the Theory of Superinfinitesimals. Under the influence of W.W. Bledsoe he became interested in Artificial Intelligence. In Aachen he developed the first and still only program to apply rewrite rules to group theory. In Software Engineering his group concentrated on process modeling. His group developed the MILOS system, which was leading in process modeling and has since been substantially extended by Frank Maurer in Calgary into the system MASE. Together with his student Aldo v. Wangenheim he created the Cyclops group, which worked on image understanding and developed new tools based on a configuration system. This research has since given rise to various applications and is being actively continued in Florianópolis, Brazil. Around 1990 Michael Richter started to work on Case-Based Reasoning. Initially, this was an extension of the work on technical expert systems. He introduced several basic concepts and views in CBR. A very influential one was the notion of knowledge containers, which is fundamental for building and maintaining CBR systems. He made several important and systematic contributions to the notion of similarity. These include the relation of similarity measures to general concepts of uncertainty and the knowledge contained in similarity measures. On the foundational side his group related similarity to utility, and Michael Richter gave a formal semantics of similarity in terms of utilities. Since 1990 Michael Richter was concerned with combining basic research and useful applications. As an example, his group founded the tecinno company (now empolis), which has been very successful in selling CBR and knowledge management.
Some major publications
Michael M. Richter has written numerous publications in Mathematics, General Computer Science, Artificial Intelligence, Medical Informatics and Operations Research. He has written and/or edited 25 books. Some influential publications are:
Michael M. Richter: Logikkalküle. Teubner Studienbücher Informatik (Leitfäden der angewandten Mathematik und Mechanik). Stuttgart 1978, p. 232
Michael M. Richter: Ideale Punkte, Monaden und Nichtstandardmethoden. Vieweg-Verlag, Wiesbaden 1982, p. 269
B. Benninghofen, Michael M. Richter: A general theory of superinfinitesimals. Fundamenta Mathematicae 128 (1987), pp. 199–215.
The Knuth-Bendix Completion Procedure, the Growth Function and Polycyclic Groups. In: Proc. Logic Colloquium ’86, ed. F. Drake, J. Truss, North-Holland Publ. Co. pp. 261–275.
B. Benninghofen, S. Kemmerich, Michael M. Richter: Systems of Reductions. SLN in Computer Science 277 (1987); 265 + VII p.
Michael M. Richter: Prinzipien der Künstlichen Intelligenz. Teubner Studienbücher Informatik, Stuttgart 1989, p. 355
Michael M. Richter: Prinzipien der Künstlichen Intelligenz (2nd Edition). Teubner Studienbücher Informatik, Stuttgart 1991, p. 355
Michael M. Richter, S. Wess: Similarity, Uncertainty and Case-Based Reasoning in PATDEX. In: R. S. Boyer (Ed.), Automated Reasoning, Essays in Honor of Woody Bledsoe, Kluwer Academic Publishers, 1991.
T. Pfeifer, Michael M. Richter: Diagnose von Technischen Systemen. Deutscher Universitätsverlag 1993
Recent Developments in Case-Based Reasoning: Improvements of Similarity Measures. In: New Approaches in Classification and Data Analysis, ed. E. Diday, Y. Lechevallier, M. Schader, P. Bertrand, B. Burtschy, Springer Verlag 1994, pp. 594–601.
R. Kühn, R. Menzel, W. Menzel, U. Ratsch, Michael M. Richter, I. O. Stamatescu: Adaptivity and Learning: An Interdisciplinary Debate. Springer Verlag, 2003
Michael M. Richter, Agnar Aamodt: Case-based reasoning foundations. Knowledge Engineering Review, 20:3, Cambridge University Press, pp. 203–207 (2006).
Foundations of Similarity and Utility. Proc. Flairs 07, AAAI Press
Similarity. In: Case-Based Reasoning for Signals and Imaging, ed. Petra Perner, Springer Verlag 2007, pp. 25–90.
Michael M. Richter, Rosina Weber: Case-Based Reasoning. A Textbook. Springer Verlag 2013, p. 546
References
External links
https://web.archive.org/web/20050316020146/http://wwwagse.informatik.uni-kl.de/research/sfb501/a2/pubs.html
https://web.archive.org/web/20160303235536/http://www.dfki.de/web/research/km/publications/base_view?pubid=2115
DOI.org
1938 births
2020 deaths
20th-century German mathematicians
German computer scientists
21st-century German mathematicians
Technical University of Kaiserslautern faculty
RWTH Aachen University faculty
University of Freiburg alumni
University of Münster alumni
University of Texas at Austin faculty
University of Tübingen alumni |
66873835 | https://en.wikipedia.org/wiki/Tahj%20Eaddy | Tahj Eaddy | Tahj Eaddy (born July 5, 1996) is an American professional basketball player for Labas GAS Prienai of the Lithuanian Basketball League. He played college basketball for the Southeast Missouri State Redhawks, the Santa Clara Broncos, and the USC Trojans.
High school career
Eaddy played basketball at Notre Dame High School in West Haven, Connecticut for three years. He moved to Tennessee Preparatory Academy in Memphis, Tennessee for his senior year. Eaddy averaged 24.6 points and was named MVP of the National Association of Christian Athletes Elite Division. He attended The Skill Factory in Atlanta, Georgia for a prep year. He committed to playing college basketball for Southeast Missouri State.
College career
As a freshman at Southeast Missouri State, Eaddy averaged 7.5 points and shot a team-high 42 percent from three-point range. For his sophomore season, he transferred to Santa Clara. After sitting out for one year due to NCAA transfer rules, Eaddy averaged 15 points and 3.2 assists per game, earning Second Team All-West Coast Conference honors. He scored a season-high 30 points in a 68–56 win over San Diego on January 3, 2019. During his junior season, Eaddy received less playing time and averaged 9.1 points. He transferred to USC for his senior season as a graduate transfer. On February 13, 2021, Eaddy scored 29 points in a 76–65 win over Washington State. As a senior, Eaddy averaged 13.6 points, 2.9 rebounds and 2.8 assists per game. He was named to the Second Team All-Pac-12. Following the season, he declared for the 2021 NBA draft instead of taking advantage of the NCAA's offer of an additional year of eligibility.
Professional career
After going undrafted in the 2021 NBA draft, Eaddy signed with the Orlando Magic.
Eaddy was drafted with the first pick in the second round by the Raptors 905 in the 2021 NBA G League draft. However, he did not make the final roster. On December 17, 2021, Eaddy signed with BC Prienai of the Lithuanian Basketball League.
Career statistics
College
{| class="wikitable sortable" style="text-align:center;"
|-
! Year !! Team !! GP !! GS !! MPG !! FG% !! 3P% !! FT% !! RPG !! APG !! SPG !! BPG !! PPG
|-
| style="text-align:left;"| 2016–17
| style="text-align:left;"| Southeast Missouri State
| 30 || 12 || 22.2 || .372 || .424 || .925 || 2.3 || 2.3 || .8 || .0 || 7.5
|-
| style="text-align:left;"| 2017–18
| style="text-align:left;"| Santa Clara
| style="text-align:center;" colspan="11"| Redshirt
|-
| style="text-align:left;"| 2018–19
| style="text-align:left;"| Santa Clara
| 31 || 31 || 35.5 || .401 || .379 || .802 || 2.7 || 3.2 || .9 || .0 || 15.0
|-
| style="text-align:left;"| 2019–20
| style="text-align:left;"| Santa Clara
| 33 || 14 || 25.2 || .407 || .333 || .830 || 2.1 || 2.1 || .9 || .0 || 9.1
|-
| style="text-align:left;"| 2020–21
| style="text-align:left;"| USC
| 33 || 32 || 32.3 || .448 || .388 || .776 || 2.9 || 2.8 || .6 || .0 || 13.6
|- class="sortbottom"
| style="text-align:center;" colspan="2"| Career
| 127 || 89 || 28.8 || .412 || .378 || .818 || 2.5 || 2.6 || .8 || .0 || 11.3
|}
Personal life
Eaddy is the son of Tanisha Younger-Eaddy and Emery Eaddy. His father played college basketball at Norfolk State.
References
External links
USC Trojans bio
Santa Clara Broncos bio
Southeast Missouri State Redhawks bio
1996 births
Living people
American men's basketball players
American expatriate basketball people in Lithuania
Basketball players from Connecticut
BC Prienai players
People from West Haven, Connecticut
Point guards
Santa Clara Broncos men's basketball players
Shooting guards
Southeast Missouri State Redhawks men's basketball players
USC Trojans men's basketball players |
25852537 | https://en.wikipedia.org/wiki/Computer%20science%20in%20sport | Computer science in sport | Computer science in sport is an interdisciplinary discipline that has its goal in combining the theoretical as well as practical aspects and methods of the areas of informatics and sport science. The main emphasis of the interdisciplinarity is placed on the application and use of computer-based but also mathematical techniques in sport science, aiming in this way at the support and advancement of theory and practice in sports. The reason why computer science has become an important partner for sport science is mainly connected with "the fact that the use of data and media, the design of models, the analysis of systems etc. increasingly requires the support of suitable tools and concepts which are developed and available in computer science".
Historical background
Going back in history, computers were first used in sports in the 1960s, when the main purpose was to accumulate sports information. Databases were created and expanded in order to launch documentation and dissemination of publications such as articles or books containing any kind of knowledge related to sports science. By the mid-1970s the first organization in this area, the International Association for Sports Information (IASI), had also been formally established. Congresses and meetings were organized more often with the aim of standardizing and rationalizing sports documentation. Since this field was at the time less computer-oriented, specialists speak of sports information rather than sports informatics when referring to the beginnings of this field of science.
With the progress of computer science and the advent of more powerful computer hardware in the 1970s, the history of computer science in sport proper began. This was also the first time the term was officially used, and it marked the start of an important evolution in sports science.
In the early stages of this area, statistics on biomechanical data, like different kinds of forces or rates, played a major role. Scientists started to analyze sports games by collecting and looking at such values and features in order to interpret them. Later on, with the continuous improvement of computer hardware – in particular microprocessor speed – many new scientific and computing paradigms were introduced, which were also integrated in computer science in sport. Specific examples are modeling as well as simulation, but also pattern recognition, design, and (sports) data mining.
As another result of this development, the term 'computer science in sport' was added to the encyclopedia of sports science in 2004.
Areas of research
The importance and strong influence of computer science as an interdisciplinary partner for sport and sport science is mainly proven by the research activities in computer science in sport. The following IT concepts are thereby of particular interest:
Data acquisition and data processing
Databases and expert systems
Modelling (mathematical, IT based, biomechanical, physiological)
Simulation (interactive, animation etc.)
Presentation
Based on the fields from above, the main areas of research in computer science in sport include amongst others:
Training and coaching
Biomechanics
Sports equipment and technology
Computer-aided applications (software, hardware) in sports
Ubiquitous computing in sports
Multimedia and Internet
Documentation
Education
Research communities
A clear demonstration of the evolution and spread of computer science in sport is the fact that research in this area is now conducted all over the world. Since the 1990s many new national and international organizations on the topic of computer science in sport have been established. These associations regularly organize congresses and workshops with the aim of disseminating and exchanging scientific knowledge and information on all sorts of topics regarding the interdisciplinary discipline.
Historical survey
As a first example, scientists in Australia and New Zealand have built up the MathSport group of ANZIAM (Australia and New Zealand Industrial and Applied Mathematics), which has organized biennial meetings since 1992, initially under the name "Mathematics and Computers in Sport Conferences" and now as "MathSport". Main topics are mathematical models and computer applications in sports, as well as coaching and teaching methods based on informatics.
The European community was also among the leading drivers of the emergence of the field. Workshops on this topic had been organized successfully in Germany since the late 1980s. In 1997 the first international meeting on computer science in sport was held in Cologne. The main aim was to disseminate and share applications, ideas and concepts of the use of computers in sports, which should also contribute to internationalization and thus boost research work in this area.
Since then, such international symposia have taken place every two years all over Europe. As the first conferences were a resounding success, it was decided to go even further, and the foundation of an organization was the logical consequence. This step was accomplished in 2003, when the International Association of Computer Science in Sport (IACSS) was established during the 4th international symposium in Barcelona, with Prof. Jürgen Perl chosen as the first president. A few years earlier, the first international e-journal on this topic (International Journal of Computer Science in Sport) had already been launched. The internationalization is further confirmed by the fact that three conferences have already taken place outside Europe – in Calgary (Canada) in 2007, Canberra (Australia) in 2009 and Shanghai (China) in 2011. During the symposium in Calgary the presidency also changed: it was assigned to Prof. Arnold Baca, who was re-elected in 2009 and 2011. The following Symposia on Computer Science in Sport took place in Europe again, in Istanbul (Turkey) in 2013 and in Loughborough (UK) in 2015. In 2017 the 11th Symposium of Computer Science in Sport took place in Constance (Germany). During the conference in Istanbul Prof. Martin Lames was elected as president of the IACSS. He was re-elected in 2015, 2017 and 2019.
The 12th International Symposium of Computer Science in Sports was held in Moscow (Russia) from 8 to 10 July 2019: https://iacss2019.ru/
National organizations
In addition to the international associations from above, currently the following national associations on computer science in sport exist (if available, the web addresses are also given):
Austrian Association of Computer Science in Sport - http://www.sportinformatik.at
British Association of Computer Science in Sport and Exercise
Chinese Association of Computer Science in Sport
Croatian Association of Computer Science in Sport
Section Computer Science in Sport of the German Association of Sport Science - http://www.dvs-sportinformatik.de (in German)
Swiss Association of Computer Science in Sport SACSS - http://sacss.org
Indian Federation of Computer Science in Sport - http://www.ifcss.in
Portuguese Association of Computer Science in Sport
Turkish Association of Computer Science in Sport
Russian Association of Computer Science in Sport - https://www.racss.ru/
References
Further reading
Baca, A. (2015). Computer Science in Sport - Research and practice, Routledge.
External links
MathSport - ANZIAM (Australia and New Zealand Industrial and Applied Mathematics)
ECSS (European College of Sport Science)
ISEA (International Sports Engineering Association)
IACSS (International Association of Computer Science in Sport)
Sport
Sports science |
24006727 | https://en.wikipedia.org/wiki/Tyron%20Smith | Tyron Smith | Tyron Jerrar Smith (born December 12, 1990) is an American football offensive tackle for the Dallas Cowboys of the National Football League (NFL). He played college football at USC where he won the Morris Trophy, recognizing the best offensive and defensive linemen on the West Coast, in 2010. Smith was drafted by the Cowboys with the ninth overall pick in the 2011 NFL Draft.
High school career
Smith attended Rancho Verde High School in Moreno Valley, California, where he played on the offensive and defensive line. He earned All-American honors by Parade, SuperPrep, PrepStar, Scout.com, and EA Sports, while also receiving numerous other All-Region honors. As a junior in 2006, he made Cal-Hi Sports All-State Underclass second team, All-CIF Central Division first team, and Riverside Press-Enterprise All-Riverside County second team. Smith played in the 2008 U.S. Army All-American Bowl. Also a standout in track & field at Rancho Verde, Smith notched top-throws of 14.23 meters (46 feet, 7 inches) in the shot put and 46.62 meters (152 feet, 10 inches) in the discus.
Considered a five-star recruit and described as "an amazing right tackle prospect" by Rivals.com, Smith was ranked as the No. 6 offensive tackle prospect. Scout.com, who also viewed Smith as a five-star recruit, listed him as the No. 1 offensive tackle prospect in the nation.
College career
Smith played three seasons with the USC Trojans from 2008 to 2010. As a freshman, he was the backup left offensive tackle. He appeared in 10 games. As a sophomore, he started the first twelve games at right offensive tackle. He earned All-Pac-10 honorable mention and CollegeFootballNews.com Sophomore All-American honorable mention for the 2009 season. As a junior, he appeared in twelve games.
Professional career
2011 NFL Draft
Smith was considered one of the top offensive tackle prospects in the 2011 NFL Draft, along with Gabe Carimi, Anthony Castonzo, and Nate Solder. Selected by the Dallas Cowboys with the ninth overall pick, he was the first offensive lineman drafted in the first round by the Cowboys since Jerry Jones bought the team in 1989, and the highest in franchise history since John Niland went fifth overall in 1966. He signed a four-year, $12.5 million contract.
Dallas Cowboys
2011 season
Entering the league as a 20-year-old rookie, Smith was named a starter at right tackle from the first day of Organized Team Activities, with Doug Free taking over the left tackle spot. His role became even more important after the Cowboys released veteran offensive linemen Marc Colombo, Leonard Davis, Andre Gurode, and Montrae Holland during the preseason. Smith started every game and earned praise for his play, prompting the media to speculate on a possible move to left tackle in the next season. He was named to the NFL All-Rookie Team.
2012 season
Starting the 2012 season, Smith moved to left tackle, switching sides on the offensive line with Free. On September 12, Smith was fined $15,000 for a horse-collar tackle he made during the season opener against the New York Giants. Little noted, however, was the fact that the tackle was touchdown-saving, coming after an interception, and led to a goal-line stand by the Dallas defense. He started 15 games for the Cowboys in the 2012 season.
2013 season
In his third year with the Cowboys, Smith committed just one holding penalty and allowed only one sack in his 16 starts. He was named to the 2014 Pro Bowl on Team Rice. He was ranked 78th by his fellow players on the NFL Top 100 Players of 2014.
2014 season
Smith signed an eight-year, $109 million contract extension with the Cowboys in July, making him the highest-paid offensive lineman in the league at the time. He was widely considered one of the top three offensive tackles in the league, and for his play against the Seattle Seahawks, he became the first offensive lineman in 10 years to be named Offensive Player of the Week. He started all 16 games for the NFL's second ranked rushing offense, while helping DeMarco Murray become the league's rushing leader. He was ranked 36th by his fellow players on the NFL Top 100 Players of 2015.
2015 season
Smith started all 16 games, helped clear the way for the NFL's fourth leading rusher (Darren McFadden) and earned his third Pro Bowl selection. He was ranked 42nd by his fellow players on the NFL Top 100 Players of 2016.
2016 season
Forced to play through nagging injuries throughout the season, Smith helped lead the Cowboys to a 13–3 record, and aided rookie Ezekiel Elliott in becoming the league's leading rusher. Smith was named the first team left tackle for the 2016 All-Pro Team, the second time he carried this honor in his career. He was named to his fourth consecutive Pro Bowl and was named First-team All-Pro, both honors being shared with fellow Cowboy offensive linemen Travis Frederick and Zack Martin. He was ranked 18th by his peers on the NFL Top 100 Players of 2017 as the highest ranked offensive lineman.
2017 season
Smith was named to his fifth straight Pro Bowl alongside guard Zack Martin and center Travis Frederick for the second straight year. Smith's 2017 season was marred by multiple injuries, which included to his knee, back, groin, and hip. He started and played in 13 games. He was placed on injured reserve on December 29, meaning that he would not play in the season finale against the Philadelphia Eagles. He was ranked 39th by his fellow players on the NFL Top 100 Players of 2018.
2018 season
Smith started 13 games at left tackle, missing three with injury, on his way to his sixth straight Pro Bowl. He was ranked 52nd by active NFL players on the NFL Top 100 Players of 2019.
2019 season
Smith started 13 games at left tackle in 2019, as in the 2018 campaign. He earned a solid 76.5 grade from Pro Football Focus (PFF), which charged him with 8 penalties; however, he allowed only 1 sack. This led to his 7th straight Pro Bowl selection since 2013. He was named #78 on the NFL Top 100 Players of 2020.
2020 season
Entering 2020, Smith had been bothered by a neck issue spanning the past several years. On October 9, Smith announced that he would forgo the rest of the 2020 season after choosing to have surgery on his neck. He was subsequently placed on injured reserve.
Personal life
Smith has one son, Jaxson.
References
External links
USC Trojans bio
Dallas Cowboys bio
1990 births
Living people
Sportspeople from Riverside County, California
People from Moreno Valley, California
Players of American football from California
American football offensive tackles
USC Trojans football players
Dallas Cowboys players
Unconferenced Pro Bowl players
National Conference Pro Bowl players |
1609808 | https://en.wikipedia.org/wiki/Data%20quality | Data quality | Data quality refers to the state of qualitative or quantitative pieces of information. There are many definitions of data quality, but data is generally considered high quality if it is "fit for [its] intended uses in operations, decision making and planning". Moreover, data is deemed of high quality if it correctly represents the real-world construct to which it refers. Furthermore, apart from these definitions, as the number of data sources increases, the question of internal data consistency becomes significant, regardless of fitness for use for any particular external purpose. People's views on data quality can often be in disagreement, even when discussing the same set of data used for the same purpose. When this is the case, data governance is used to form agreed upon definitions and standards for data quality. In such cases, data cleansing, including standardization, may be required in order to ensure data quality.
Definitions
Defining data quality in a sentence is difficult due to the many contexts data are used in, as well as the varying perspectives among end users, producers, and custodians of data.
From a consumer perspective, data quality is:
"data that are fit for use by data consumers"
data "meeting or exceeding consumer expectations"
data that "satisfies the requirements of its intended use"
From a business perspective, data quality is:
data that is "'fit for use' in their intended operational, decision-making and other roles" or that exhibits "'conformance to standards' that have been set, so that fitness for use is achieved"
data that "are fit for their intended uses in operations, decision making and planning"
"the capability of data to satisfy the stated business, system, and technical requirements of an enterprise"
From a standards-based perspective, data quality is:
the "degree to which a set of inherent characteristics (quality dimensions) of an object (data) fulfills requirements"
"the usefulness, accuracy, and correctness of data for its application"
Arguably, in all these cases, "data quality" is a comparison of the actual state of a particular set of data to a desired state, with the desired state being typically referred to as "fit for use," "to specification," "meeting consumer expectations," "free of defect," or "meeting requirements." These expectations, specifications, and requirements are usually defined by one or more individuals or groups, standards organizations, laws and regulations, business policies, or software development policies. Drilling down further, those expectations, specifications, and requirements are stated in terms of characteristics or dimensions of the data, such as:
accessibility or availability
accuracy or correctness
comparability
completeness or comprehensiveness
consistency, coherence, or clarity
credibility, reliability, or reputation
flexibility
plausibility
relevance, pertinence, or usefulness
timeliness or latency
uniqueness
validity or reasonableness
A systematic scoping review of the literature suggests that data quality dimensions and methods with real world data are not consistent in the literature, and as a result quality assessments are challenging due to the complex and heterogeneous nature of these data.
In 2021, the work group Data Quality of DAMA Netherlands carried out research into definitions of dimensions of data quality. It collected definitions from various sources and compared them with each other. The working group also tested the definitions against criteria derived from a standard for concepts and definitions: ISO 704. The result is a list of 60 dimensions of data quality and their definitions.
History
Before the rise of inexpensive computer data storage, massive mainframe computers were used to maintain name and address data for delivery services. This was so that mail could be properly routed to its destination. The mainframes used business rules to correct common misspellings and typographical errors in name and address data, as well as to track customers who had moved, died, gone to prison, married, divorced, or experienced other life-changing events. Government agencies began to make postal data available to a few service companies to cross-reference customer data with the National Change of Address registry (NCOA). This technology saved large companies millions of dollars in comparison to manual correction of customer data. Large companies saved on postage, as bills and direct marketing materials made their way to the intended customer more accurately. Initially sold as a service, data quality moved inside the walls of corporations as low-cost and powerful server technology became available.
Companies with an emphasis on marketing often focused their quality efforts on name and address information, but data quality is recognized as an important property of all types of data. Principles of data quality can be applied to supply chain data, transactional data, and nearly every other category of data found. For example, making supply chain data conform to a certain standard has value to an organization by: 1) avoiding overstocking of similar but slightly different stock; 2) avoiding false stock-out; 3) improving the understanding of vendor purchases to negotiate volume discounts; and 4) avoiding logistics costs in stocking and shipping parts across a large organization.
For companies with significant research efforts, data quality can include developing protocols for research methods, reducing measurement error, bounds checking of data, cross tabulation, modeling and outlier detection, verifying data integrity, etc.
Overview
There are a number of theoretical frameworks for understanding data quality. A systems-theoretical approach influenced by American pragmatism expands the definition of data quality to include information quality, and emphasizes the inclusiveness of the fundamental dimensions of accuracy and precision on the basis of the theory of science (Ivanov, 1972). One framework, dubbed "Zero Defect Data" (Hansen, 1991) adapts the principles of statistical process control to data quality. Another framework seeks to integrate the product perspective (conformance to specifications) and the service perspective (meeting consumers' expectations) (Kahn et al. 2002). Another framework is based in semiotics to evaluate the quality of the form, meaning and use of the data (Price and Shanks, 2004). One highly theoretical approach analyzes the ontological nature of information systems to define data quality rigorously (Wand and Wang, 1996).
A considerable amount of data quality research involves investigating and describing various categories of desirable attributes (or dimensions) of data. Nearly 200 such terms have been identified and there is little agreement in their nature (are these concepts, goals or criteria?), their definitions or measures (Wang et al., 1993). Software engineers may recognize this as a similar problem to "ilities".
MIT has an Information Quality (MITIQ) Program, led by Professor Richard Wang, which produces a large number of publications and hosts a significant international conference in this field (International Conference on Information Quality, ICIQ). This program grew out of the work done by Hansen on the "Zero Defect Data" framework (Hansen, 1991).
In practice, data quality is a concern for professionals involved with a wide range of information systems, ranging from data warehousing and business intelligence to customer relationship management and supply chain management. One industry study estimated the total cost to the U.S. economy of data quality problems at over U.S. $600 billion per annum (Eckerson, 2002). Incorrect data – which includes invalid and outdated information – can originate from different data sources – through data entry, or data migration and conversion projects.
In 2002, the USPS and PricewaterhouseCoopers released a report stating that 23.6 percent of all U.S. mail sent is incorrectly addressed.
One reason contact data becomes stale very quickly in the average database is that more than 45 million Americans change their address every year.
In fact, the problem is such a concern that companies are beginning to set up a data governance team whose sole role in the corporation is to be responsible for data quality. In some organizations, this data governance function has been established as part of a larger Regulatory Compliance function - a recognition of the importance of Data/Information Quality to organizations.
Problems with data quality don't only arise from incorrect data; inconsistent data is a problem as well. Eliminating data shadow systems and centralizing data in a warehouse is one of the initiatives a company can take to ensure data consistency.
Enterprises, scientists, and researchers are starting to participate within data curation communities to improve the quality of their common data.
The market is going some way to providing data quality assurance. A number of vendors make tools for analyzing and repairing poor quality data in situ, service providers can clean the data on a contract basis and consultants can advise on fixing processes or systems to avoid data quality problems in the first place. Most data quality tools offer a series of tools for improving data, which may include some or all of the following:
Data profiling - initially assessing the data to understand its current state, often including value distributions
Data standardization - a business rules engine that ensures that data conforms to standards
Geocoding - for name and address data. Corrects data to U.S. and Worldwide geographic standards
Matching or Linking - a way to compare data so that similar, but slightly different records can be aligned. Matching may use "fuzzy logic" to find duplicates in the data (a minimal illustration follows this list). It often recognizes that "Bob" and "Bbo" may be the same individual. It might be able to manage "householding", or finding links between spouses at the same address, for example. Finally, it often can build a "best of breed" record, taking the best components from multiple data sources and building a single super-record.
Monitoring - keeping track of data quality over time and reporting variations in the quality of data. Software can also auto-correct the variations based on pre-defined business rules.
Batch and Real time - Once the data is initially cleansed (batch), companies often want to build the processes into enterprise applications to keep it clean.
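As an illustration of the fuzzy-matching idea mentioned above, the following sketch uses Python's standard difflib module to flag likely duplicate names. It is a simplified, hypothetical example, not how any particular commercial tool works; the 0.8 similarity threshold is an arbitrary assumption.

    import difflib

    def likely_duplicates(names, threshold=0.8):
        # Illustrative sketch only: real matching tools use far richer rules
        # (phonetic codes, address parsing, householding, survivorship, etc.).
        pairs = []
        for i, a in enumerate(names):
            for b in names[i + 1:]:
                ratio = difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()
                if ratio >= threshold:
                    pairs.append((a, b, round(ratio, 2)))
        return pairs

    print(likely_duplicates(["Bob Smith", "Bbo Smith", "Alice Jones"]))
    # [('Bob Smith', 'Bbo Smith', 0.89)]

A production matcher would typically combine several such similarity signals per field and weight them, rather than relying on a single string ratio.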
There are several well-known authors and self-styled experts, with Larry English perhaps the most popular guru. In addition, IQ International - the International Association for Information and Data Quality was established in 2004 to provide a focal point for professionals and researchers in this field.
ISO 8000 is an international standard for data quality.
Data quality assurance
Data quality assurance is the process of data profiling to discover inconsistencies and other anomalies in the data, as well as performing data cleansing activities (e.g. removing outliers, missing data interpolation) to improve the data quality.
These activities can be undertaken as part of data warehousing or as part of the database administration of an existing piece of application software.
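As a concrete illustration of the profiling step described above, the short sketch below uses the pandas library on a made-up toy table (the column names and values are invented for the example); commercial profiling tools report far more, such as patterns, key candidates and cross-column rules.

    import pandas as pd

    # Toy table with typical quality problems: inconsistent casing,
    # a missing value, and a suspicious outlier.
    df = pd.DataFrame({
        "customer_id": [1, 2, 3, 4],
        "state": ["CA", "ca", None, "TX"],
        "age": [34, None, 29, 131],
    })

    print(df.isnull().sum())                       # missing values per column
    print(df["state"].value_counts(dropna=False))  # value distribution
    print(df["age"].describe())                    # summary statistics to spot outliers

The output of such a profile (null counts, value distributions, ranges) is what drives the subsequent cleansing decisions, e.g. standardizing "ca" to "CA" or reviewing the age of 131.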
Data quality control
Data quality control is the process of controlling the usage of data for an application or a process. This process is performed both before and after a Data Quality Assurance (QA) process, which consists of discovery of data inconsistency and correction.
Before:
Restricts inputs
After QA process the following statistics are gathered to guide the Quality Control (QC) process:
Severity of inconsistency
Incompleteness
Accuracy
Precision
Missing / Unknown
The Data QC process uses the information from the QA process to decide whether to use the data for analysis or in an application or business process. General example: if a Data QC process finds that the data contains too many errors or inconsistencies, then it prevents that data from being used for its intended process, which could otherwise cause disruption. Specific example: providing invalid measurements from several sensors to the automatic pilot feature on an aircraft could cause it to crash. Thus, establishing a QC process provides data usage protection.
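As a simple illustration, such a QC gate can be sketched as a function that accepts or rejects a batch based on the error rate reported by QA. This is a hypothetical sketch; the 2% limit below is an arbitrary, assumed threshold.

    def qc_gate(total_records, error_records, max_error_rate=0.02):
        # Hypothetical gate: the 2% limit is an assumed, organization-specific threshold.
        error_rate = error_records / total_records
        if error_rate > max_error_rate:
            return False, f"rejected: error rate {error_rate:.1%} exceeds {max_error_rate:.0%}"
        return True, f"accepted: error rate {error_rate:.1%}"

    print(qc_gate(10_000, 150))   # (True, 'accepted: error rate 1.5%')
    print(qc_gate(10_000, 450))   # (False, 'rejected: error rate 4.5% exceeds 2%')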
Optimum use of data quality
Data quality (DQ) is a niche area required for the integrity of data management, covering gaps in data issues. It is one of the key functions that aid data governance by monitoring data to find exceptions undiscovered by current data management operations. Data quality checks may be defined at the attribute level to allow full control over remediation steps.
DQ checks and business rules may easily overlap if an organization is not attentive to its DQ scope. Business teams should understand the DQ scope thoroughly in order to avoid overlap. Data quality checks are redundant if business logic covers the same functionality and fulfills the same purpose as DQ. The DQ scope of an organization should be defined in the DQ strategy and well implemented. Some data quality checks may be translated into business rules after repeated instances of exceptions in the past.
Below are a few areas of data flows that may need perennial DQ checks:
Completeness and precision DQ checks on all data may be performed at the point of entry for each mandatory attribute from each source system (a minimal sketch of such checks is given after this list). Some attribute values are created well after the initial creation of the transaction; in such cases, administering these checks becomes tricky and should be done immediately after the defined event of that attribute's source and after the transaction's other core attribute conditions are met.
All data having attributes referring to Reference Data in the organization may be validated against the set of well-defined valid values of Reference Data to discover new or discrepant values through the validity DQ check. Results may be used to update Reference Data administered under Master Data Management (MDM).
All data sourced from a third party to organization's internal teams may undergo accuracy (DQ) check against the third party data. These DQ check results are valuable when administered on data that made multiple hops after the point of entry of that data but before that data becomes authorized or stored for enterprise intelligence.
All data columns that refer to Master Data may be validated for its consistency check. A DQ check administered on the data at the point of entry discovers new data for the MDM process, but a DQ check administered after the point of entry discovers the failure (not exceptions) of consistency.
As data transforms, multiple timestamps and the positions of those timestamps are captured and may be compared against each other, and against their allowed leeway, to validate their value, decay and operational significance against a defined SLA (service level agreement). This timeliness DQ check can be utilized to decrease the data value decay rate and optimize the policies of the data movement timeline.
In an organization, complex logic is usually segregated into simpler logic across multiple processes. Reasonableness DQ checks on such complex logic, yielding a logical result within a specific range of values or static interrelationships (aggregated business rules), may be validated to discover complicated but crucial business processes, outliers of the data, its drift from BAU (business as usual) expectations, and possible exceptions eventually resulting in data issues. This check may be a simple generic aggregation rule covering a large chunk of data, or it can be complicated logic on a group of attributes of a transaction pertaining to the core business of the organization. This DQ check requires a high degree of business knowledge and acumen. Discovery of reasonableness issues may aid policy and strategy changes by either business or data governance or both.
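A minimal sketch of the completeness and validity checks described in the first two items of the list above is given below. The attribute names and the reference-data set are invented for the example, and a production implementation would normally live inside a data quality or MDM tool rather than ad-hoc code.

    VALID_STATES = {"CA", "NY", "TX"}       # assumed reference data, e.g. maintained under MDM
    MANDATORY = ("customer_id", "state")    # assumed mandatory attributes checked at point of entry

    def dq_exceptions(record):
        # Illustrative sketch of a completeness check and a validity check.
        issues = []
        for attr in MANDATORY:
            if record.get(attr) in (None, ""):
                issues.append(f"completeness: {attr} is missing")
        state = record.get("state")
        if state is not None and state not in VALID_STATES:
            issues.append(f"validity: state '{state}' not in reference data")
        return issues

    print(dq_exceptions({"customer_id": 42, "state": "QQ"}))
    # ["validity: state 'QQ' not in reference data"]

Exceptions collected this way would feed the remediation steps and, as noted above, newly discovered values may be fed back to update the reference data.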
Conformity checks and integrity checks need not be covered in all business needs; they are strictly at the discretion of the database architecture.
There are many places in the data movement where DQ checks may not be required. For instance, a DQ check for completeness and precision on not-null columns is redundant for data sourced from a database. Similarly, data should be validated for its accuracy with respect to time when the data is stitched across disparate sources. However, that is a business rule and should not be in the DQ scope.
Regretfully, from a software development perspective, DQ is often seen as a nonfunctional requirement, and as such, key data quality checks/processes are not factored into the final software solution. Within healthcare, wearable technologies or body area networks generate large volumes of data. The level of detail required to ensure data quality is extremely high and is often underestimated. This is also true for the vast majority of mHealth apps, EHRs and other health-related software solutions. However, some open source tools exist that examine data quality. The primary reason for this stems from the extra cost involved in adding a higher degree of rigor within the software architecture.
Health data security and privacy
The use of mobile devices in health, or mHealth, creates new challenges to health data security and privacy, in ways that directly affect data quality. mHealth is an increasingly important strategy for delivery of health services in low- and middle-income countries. Mobile phones and tablets are used for collection, reporting, and analysis of data in near real time. However, these mobile devices are commonly used for personal activities, as well, leaving them more vulnerable to security risks that could lead to data breaches. Without proper security safeguards, this personal use could jeopardize the quality, security, and confidentiality of health data.
Data quality in public health
Data quality has become a major focus of public health programs in recent years, especially as demand for accountability increases. Work towards ambitious goals related to the fight against diseases such as AIDS, tuberculosis, and malaria must be predicated on strong monitoring and evaluation systems that produce quality data related to program implementation. These programs, and program auditors, increasingly seek tools to standardize and streamline the process of determining the quality of data, verify the quality of reported data, and assess the underlying data management and reporting systems for indicators. An example is WHO and MEASURE Evaluation's Data Quality Review Tool. WHO, the Global Fund, GAVI, and MEASURE Evaluation have collaborated to produce a harmonized approach to data quality assurance across different diseases and programs.
Open data quality
There are a number of scientific works devoted to the analysis of data quality in open data sources, such as Wikipedia, Wikidata, DBpedia and others. In the case of Wikipedia, quality analysis may relate to the whole article. Modeling of quality there is carried out by means of various methods, some of which use machine learning algorithms, including Random Forest, Support Vector Machine, and others. Methods for assessing data quality in Wikidata, DBpedia and other LOD sources differ.
Professional associations
IQ International—the International Association for Information and Data Quality
IQ International is a not-for-profit, vendor neutral, professional association formed in 2004, dedicated to building the information and data quality profession.
ECCMA (Electronic Commerce Code Management Association)
The Electronic Commerce Code Management Association (ECCMA) is a member-based, international not-for-profit association committed to improving data quality through the implementation of international standards. ECCMA is the current project leader for the development of ISO 8000 and ISO 22745, which are the international standards for data quality and the exchange of material and service master data, respectively. ECCMA provides a platform for collaboration amongst subject experts on data quality and data governance around the world to build and maintain global, open standard dictionaries that are used to unambiguously label information. The existence of these dictionaries of labels allows information to be passed from one computer system to another without losing meaning.
See also
Data validation
Record linkage
Information quality
Master data management
Data governance
Database normalization
Data visualization
Data Analysis
Clinical data management
References
Further reading
Baamann, Katharina, "Data Quality Aspects of Revenue Assurance", Article
Eckerson, W. (2002) "Data Warehousing Special Report: Data quality and the bottom line", Article
Ivanov, K. (1972) "Quality-control of information: On the concept of accuracy of information in data banks and in management information systems". The University of Stockholm and The Royal Institute of Technology. Doctoral dissertation.
Hansen, M. (1991) Zero Defect Data, MIT. Masters thesis
Kahn, B., Strong, D., Wang, R. (2002) "Information Quality Benchmarks: Product and Service Performance," Communications of the ACM, April 2002. pp. 184–192. Article
Price, R. and Shanks, G. (2004) A Semiotic Information Quality Framework, Proc. IFIP International Conference on Decision Support Systems (DSS2004): Decision Support in an Uncertain and Complex World, Prato. Article
Redman, T. C. (2008) Data Driven: Profiting From Our Most Important Business Asset
Wand, Y. and Wang, R. (1996) "Anchoring Data Quality Dimensions in Ontological Foundations," Communications of the ACM, November 1996. pp. 86–95. Article
Wang, R., Kon, H. & Madnick, S. (1993), Data Quality Requirements Analysis and Modelling, Ninth International Conference of Data Engineering, Vienna, Austria. Article
Fournel Michel, Accroitre la qualité et la valeur des données de vos clients, éditions Publibook, 2007. .
Daniel F., Casati F., Palpanas T., Chayka O., Cappiello C. (2008) "Enabling Better Decisions through Quality-aware Reports", International Conference on Information Quality (ICIQ), MIT. Article
Jack E. Olson (2003), "Data Quality: The Accuracy dimension", Morgan Kaufmann Publishers
Woodall P., Oberhofer M., and Borek A. (2014), "A Classification of Data Quality Assessment and Improvement Methods". International Journal of Information Quality 3 (4), 298–321. doi:10.1504/ijiq.2014.068656.
Woodall, P., Borek, A., and Parlikad, A. (2013), "Data Quality Assessment: The Hybrid Approach." Information & Management 50 (7), 369–382.
External links
Data quality course, from the Global Health Learning Center
Information science |
6241959 | https://en.wikipedia.org/wiki/Nanocomputer | Nanocomputer | Nanocomputer refers to a computer smaller than the microcomputer, which is smaller than the minicomputer.
Microelectronic components that are at the core of all modern electronic devices employ semiconductor transistors. The term nanocomputer is increasingly used to refer to general computing devices of size comparable to a credit card. Modern Single-Board Computers such as the Raspberry Pi and Gumstix would fall under this classification. Arguably, Smartphones and Tablets would also be classified as nanocomputers.
Future computers with features smaller than 10 nanometers
Die shrink has been more or less continuous since around 1970. A few years later, the 6 μm process allowed the making of desktop computers, known as microcomputers. Moore's Law in the next 40 years brought features 1/100th the size, or ten thousand times as many transistors per square millimeter, putting smartphones in every pocket. Eventually computers will be developed with fundamental parts that are no bigger than a few nanometers.
Nanocomputers might be built in several ways, using mechanical, electronic, biochemical, or quantum nanotechnology. There used to be a consensus among hardware developers that it was unlikely that nanocomputers would be made of semiconductor transistors, as these seemed to perform significantly less well when shrunk to sizes under 100 nanometers. Nevertheless, developers reduced microprocessor features to 22 nm in April 2012. Moreover, Intel's 5 nanometer technology outlook predicts 5 nm feature size by 2022. The International Technology Roadmap for Semiconductors in the 2010s gave an industrial consensus on feature scaling following Moore's Law. A silicon-silicon bond length is 235.2 pm, which means that a 5 nm-wide transistor would be about 21 silicon atoms wide.
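As a rough check of that figure, using only the bond length quoted above: 5 nm = 5,000 pm, and 5,000 pm ÷ 235.2 pm per Si–Si bond ≈ 21 bond lengths, i.e. a chain of roughly 21–22 silicon atoms spanning the 5 nm feature width.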
See also
Nanotechnology
Quantum computer
Starseed launcher - Interstellar nanoprobes proposal
References
External links
A spray-on computer is way to do IT
Future Nanocomputer Technologies - Diagram of possible technologies (electronic, organic, mechanical, quantum).
Classes of computers
Nanoelectronics |
57829 | https://en.wikipedia.org/wiki/Keystroke%20logging | Keystroke logging | Keystroke logging, often referred to as keylogging or keyboard capturing, is the action of recording (logging) the keys struck on a keyboard, typically covertly, so that a person using the keyboard is unaware that their actions are being monitored. Data can then be retrieved by the person operating the logging program. A keystroke recorder or keylogger can be either software or hardware.
While the programs themselves are legal, with many designed to allow employers to oversee the use of their computers, keyloggers are most often used for stealing passwords and other confidential information.
Keylogging can also be used to study keystroke dynamics or human-computer interaction. Numerous keylogging methods exist, ranging from hardware and software-based approaches to acoustic cryptanalysis.
Application of keylogger
Software-based keyloggers
A software-based keylogger is a computer program designed to record any input from the keyboard. Keyloggers are used in IT organizations to troubleshoot technical problems with computers and business networks. Families and businesspeople use keyloggers legally to monitor network usage without their users' direct knowledge. Microsoft publicly stated that Windows 10 has a built-in keylogger in its final version "to improve typing and writing services". However, malicious individuals can use keyloggers on public computers to steal passwords or credit card information. Most keyloggers are not stopped by HTTPS encryption because that only protects data in transit between computers; software-based keyloggers run on the affected user's computer, reading keyboard inputs directly as the user types.
From a technical perspective, there are several categories:
Hypervisor-based: The keylogger can theoretically reside in a malware hypervisor running underneath the operating system, which thus remains untouched. It effectively becomes a virtual machine. Blue Pill is a conceptual example.
Kernel-based: A program on the machine obtains root access to hide in the OS and intercepts keystrokes that pass through the kernel. This method is difficult both to write and to combat. Such keyloggers reside at the kernel level, which makes them difficult to detect, especially for user-mode applications that do not have root access. They are frequently implemented as rootkits that subvert the operating system kernel to gain unauthorized access to the hardware. This makes them very powerful. A keylogger using this method can act as a keyboard device driver, for example, and thus gain access to any information typed on the keyboard as it goes to the operating system.
API-based: These keyloggers hook keyboard APIs inside a running application. The keylogger registers keystroke events as if it were a normal piece of the application rather than malware. The keylogger receives an event each time the user presses or releases a key, and simply records it.
Windows APIs such as GetAsyncKeyState(), GetForegroundWindow(), etc. are used to poll the state of the keyboard or to subscribe to keyboard events; a minimal illustration of the polling approach appears after this list. A more recent example simply polls the BIOS for pre-boot authentication PINs that have not been cleared from memory.
Form grabbing based: Form grabbing-based keyloggers log Web form submissions by recording the form data on submit events. This happens when the user completes a form and submits it, usually by clicking a button or pressing enter. This type of keylogger records form data before it is passed over the Internet.
Javascript-based: A malicious script tag is injected into a targeted web page, and listens for key events such as onKeyUp(). Scripts can be injected via a variety of methods, including cross-site scripting, man-in-the-browser, man-in-the-middle, or a compromise of the remote website.
Memory-injection-based: Memory Injection (MitB)-based keyloggers perform their logging function by altering the memory tables associated with the browser and other system functions. By patching the memory tables or injecting directly into memory, this technique can be used by malware authors to bypass Windows UAC (User Account Control). The Zeus and SpyEye trojans use this method exclusively. Some software keyloggers additionally allow the locally recorded data to be retrieved from a remote location. Remote communication may be achieved when one of these methods is used:
Data is uploaded to a website, database or an FTP server.
Data is periodically emailed to a pre-defined email address.
Data is wirelessly transmitted employing an attached hardware system.
The software enables a remote login to the local machine from the Internet or the local network, for data logs stored on the target machine.
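As an illustration of the polling mechanism described under the API-based category above, the following minimal sketch (Python with ctypes, Windows only) repeatedly calls GetAsyncKeyState to observe the state of a single key. It records nothing and is not taken from any real keylogger, but it shows why this approach requires nothing more than ordinary user-mode API calls; actual API-based keyloggers iterate over all virtual-key codes and log state transitions.

```python
# Minimal illustration of polling keyboard state with the Windows API.
# Any unprivileged user-mode process can make this call, which is what makes
# API-based keyloggers easy to write; this sketch only watches the Shift key.
import ctypes
import time

user32 = ctypes.windll.user32
VK_SHIFT = 0x10                    # virtual-key code for the Shift key

for _ in range(50):                # poll roughly every 100 ms for about 5 seconds
    if user32.GetAsyncKeyState(VK_SHIFT) & 0x8000:   # high bit: key currently down
        print("Shift is currently held down")
    time.sleep(0.1)
```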
Keystroke logging in writing process research
Since 2006, keystroke logging has been an established research method for the study of writing processes. Different programs have been developed to collect online process data of writing activities, including Inputlog, Scriptlog, Translog and GGXLog.
Keystroke logging is used legitimately as a suitable research instrument in several writing contexts. These include studies on cognitive writing processes, which include
descriptions of writing strategies; the writing development of children (with and without writing difficulties),
spelling,
first and second language writing, and
specialist skill areas such as translation and subtitling.
Keystroke logging can be used to research writing, specifically. It can also be integrated into educational domains for second language learning, programming skills, and typing skills.
Related features
Software keyloggers may be augmented with features that capture user information without relying on keyboard key presses as the sole input. Some of these features include:
Clipboard logging. Anything that has been copied to the clipboard can be captured by the program.
Screen logging. Screenshots are taken to capture graphics-based information. Applications with screen logging abilities may take screenshots of the whole screen, of just one application, or even just around the mouse cursor. They may take these screenshots periodically or in response to user behaviors (for example, when a user clicks the mouse). Screen logging can be used to capture data inputted with an on-screen keyboard.
Programmatically capturing the text in a control. The Microsoft Windows API allows programs to request the text 'value' in some controls. This means that some passwords may be captured, even if they are hidden behind password masks (usually asterisks).
The recording of every program/folder/window opened including a screenshot of every website visited.
The recording of search engine queries, instant messenger conversations, FTP downloads and other Internet-based activities (including the bandwidth used).
Hardware-based keyloggers
Hardware-based keyloggers do not depend upon any software being installed as they exist at a hardware level in a computer system.
Firmware-based: BIOS-level firmware that handles keyboard events can be modified to record these events as they are processed. Physical and/or root-level access is required to the machine, and the software loaded into the BIOS needs to be created for the specific hardware that it will be running on.
Keyboard hardware: Hardware keyloggers are used for keystroke logging utilizing a hardware circuit that is attached somewhere in between the computer keyboard and the computer, typically inline with the keyboard's cable connector. There are also USB connector-based hardware keyloggers, as well as ones for laptop computers (the Mini-PCI card plugs into the expansion slot of a laptop). More stealthy implementations can be installed or built into standard keyboards so that no device is visible on the external cable. Both types log all keyboard activity to their internal memory, which can be subsequently accessed, for example, by typing in a secret key sequence. Hardware keyloggers do not require any software to be installed on a target user's computer, so they do not interfere with the computer's operation and are less likely to be detected by software running on it. However, their physical presence may be detected if, for example, one is installed outside the case as an inline device between the computer and the keyboard. Some of these implementations can be controlled and monitored remotely using a wireless communication standard.
Wireless keyboard and mouse sniffers: These passive sniffers collect packets of data being transferred from a wireless keyboard and its receiver. As encryption may be used to secure the wireless communications between the two devices, this may need to be cracked beforehand if the transmissions are to be read. In some cases, this enables an attacker to type arbitrary commands into a victim's computer.
Keyboard overlays: Criminals have been known to use keyboard overlays on ATMs to capture people's PINs. Each keypress is registered by the keyboard of the ATM as well as the criminal's keypad that is placed over it. The device is designed to look like an integrated part of the machine so that bank customers are unaware of its presence.
Acoustic keyloggers: Acoustic cryptanalysis can be used to monitor the sound created by someone typing on a computer. Each key on the keyboard makes a subtly different acoustic signature when struck. It is then possible to identify which keystroke signature relates to which keyboard character via statistical methods such as frequency analysis. The repetition frequency of similar acoustic keystroke signatures, the timings between different keyboard strokes and other context information such as the probable language in which the user is writing are used in this analysis to map sounds to letters. A fairly long recording (1000 or more keystrokes) is required so that a large enough sample is collected.
Electromagnetic emissions: It is possible to capture the electromagnetic emissions of a wired keyboard from up to 20 metres away, without being physically wired to it. In 2009, Swiss researchers tested 11 different USB, PS/2 and laptop keyboards in a semi-anechoic chamber and found them all vulnerable, primarily because of the prohibitive cost of adding shielding during manufacture. The researchers used a wide-band receiver to tune into the specific frequency of the emissions radiated from the keyboards.
Optical surveillance: Optical surveillance, while not a keylogger in the classical sense, is nonetheless an approach that can be used to capture passwords or PINs. A strategically placed camera, such as a hidden surveillance camera at an ATM, can allow a criminal to watch a PIN or password being entered.
Physical evidence: For a keypad that is used only to enter a security code, the keys which are in actual use will have evidence of use from many fingerprints. A passcode of four digits, if the four digits in question are known, is reduced from 10,000 possibilities to just 24 possibilities (10⁴ versus 4!, the factorial of 4). These could then be used on separate occasions for a manual "brute force attack".
Smartphone sensors: Researchers have demonstrated that it is possible to capture the keystrokes of nearby computer keyboards using only the commodity accelerometer found in smartphones. The attack is made possible by placing a smartphone near a keyboard on the same desk. The smartphone's accelerometer can then detect the vibrations created by typing on the keyboard and then translate this raw accelerometer signal into readable sentences with as much as 80 percent accuracy. The technique involves working through probability by detecting pairs of keystrokes, rather than individual keys. It models "keyboard events" in pairs and then works out whether the pair of keys pressed is on the left or the right side of the keyboard and whether they are close together or far apart on the QWERTY keyboard. Once it has worked this out, it compares the results to a preloaded dictionary where each word has been broken down in the same way; a toy sketch of this dictionary-matching step appears after this list. Similar techniques have also been shown to be effective at capturing keystrokes on touchscreen keyboards, in some cases in combination with the gyroscope or the ambient-light sensor.
Body keyloggers: Body keyloggers track and analyze body movements to determine which keys were pressed. The attacker needs to be familiar with the key layout of the tracked keyboard to correlate between body movements and key positions. Tracking audible signals of the user interface (e.g. a sound the device produces to inform the user that a keystroke was logged) may reduce the complexity of the body keylogging algorithms, as it marks the moment at which a key was pressed.
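The dictionary-matching step described for smartphone-sensor attacks can be illustrated with a toy sketch. The code below reduces words to the left/right halves of consecutive QWERTY key pairs and matches an observed feature sequence against a small word list; in a real attack the features are inferred from accelerometer vibrations and also include near/far distances, so the word list, the feature encoding and the function names here are illustrative assumptions rather than a reconstruction of any published attack.

```python
# Toy illustration of pair-wise dictionary matching: words are reduced to the
# left/right keyboard halves of consecutive key pairs, and an observed feature
# sequence is matched against a preloaded word list. Real attacks derive these
# features from accelerometer data rather than from the typed letters.
LEFT_HALF = set("qwertasdfgzxcvb")     # letters on the left half of a QWERTY layout

def pair_features(word):
    """Encode a word as the left/right sides of each consecutive key pair."""
    sides = ["L" if ch in LEFT_HALF else "R" for ch in word.lower()]
    return [a + b for a, b in zip(sides, sides[1:])]

word_list = ["secret", "purple", "monkey", "laptop"]

# Pretend this feature sequence was recovered from keyboard vibrations.
observed = pair_features("secret")

matches = [w for w in word_list if pair_features(w) == observed]
print(matches)    # candidate words whose pair pattern fits the observation
```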
History
In the mid-1970s, the Soviet Union developed and deployed a hardware keylogger targeting typewriters. Termed the "selectric bug", it measured the movements of the print head of IBM Selectric typewriters via subtle influences on the regional magnetic field caused by the rotation and movements of the print head. An early keylogger was written by Perry Kivolowitz and posted to the Usenet newsgroup net.unix-wizards, net.sources on November 17, 1983. The posting seems to be a motivating factor in restricting access to /dev/kmem on Unix systems. The user-mode program operated by locating and dumping character lists (clists) as they were assembled in the Unix kernel.
In the 1970s, spies installed keystroke loggers in the US Embassy and Consulate buildings in Moscow.
They installed the bugs in Selectric II and Selectric III electric typewriters.
Soviet embassies used manual typewriters, rather than electric typewriters, for classified information—apparently because they are immune to such bugs.
As of 2013, Russian special services still use typewriters.
Cracking
Writing simple software applications for keylogging can be trivial, and like any nefarious computer program, can be distributed as a trojan horse or as part of a virus. What is not trivial for an attacker, however, is installing a covert keystroke logger without getting caught and downloading data that has been logged without being traced. An attacker that manually connects to a host machine to download logged keystrokes risks being traced. A trojan that sends keylogged data to a fixed e-mail address or IP address risks exposing the attacker.
Trojans
Researchers Adam Young and Moti Yung discussed several methods of sending keystroke logging. They presented a deniable password snatching attack in which the keystroke logging trojan is installed using a virus or worm. An attacker who is caught with the virus or worm can claim to be a victim. The cryptotrojan asymmetrically encrypts the pilfered login/password pairs using the public key of the trojan author and covertly broadcasts the resulting ciphertext. They mentioned that the ciphertext can be steganographically encoded and posted to a public bulletin board such as Usenet.
Use by police
In 2000, the FBI used FlashCrest iSpy to obtain the PGP passphrase of Nicodemo Scarfo, Jr., son of mob boss Nicodemo Scarfo.
Also in 2000, the FBI lured two suspected Russian cybercriminals to the US in an elaborate ruse, and captured their usernames and passwords with a keylogger that was covertly installed on a machine that they used to access their computers in Russia. The FBI then used these credentials to gain access to the suspects' computers in Russia to obtain evidence to prosecute them.
Countermeasures
The effectiveness of countermeasures varies because keyloggers use a variety of techniques to capture data and the countermeasure needs to be effective against the particular data capture technique. In the case of Windows 10 keylogging by Microsoft, changing certain privacy settings may disable it. An on-screen keyboard will be effective against hardware keyloggers; transparency will defeat some—but not all—screen loggers. An anti-spyware application that can only disable hook-based keyloggers will be ineffective against kernel-based keyloggers.
Keylogger program authors may be able to update their program's code to adapt to countermeasures that have proven effective against it.
Anti-keyloggers
An anti-keylogger is a piece of software specifically designed to detect keyloggers on a computer, typically comparing all files in the computer against a database of keyloggers, looking for similarities which might indicate the presence of a hidden keylogger. As anti-keyloggers have been designed specifically to detect keyloggers, they have the potential to be more effective than conventional antivirus software; some antivirus software does not consider keyloggers to be malware, as under some circumstances a keylogger can be considered a legitimate piece of software.
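A signature-based check of the kind described above amounts to hashing files and comparing the digests against a database of known keylogger samples. The sketch below shows that comparison; the hash value and the scanned directory are placeholders, and real anti-keyloggers combine such signatures with heuristics and behavioural detection.

```python
# Minimal sketch of signature-based scanning: hash each file and compare the
# digest against a set of known keylogger signatures. The signature below is a
# placeholder, not a real sample hash.
import hashlib
from pathlib import Path

KNOWN_KEYLOGGER_SHA256 = {"0" * 64}     # placeholder signature database

def sha256_of(path):
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(directory):
    for path in Path(directory).rglob("*"):
        if path.is_file() and sha256_of(path) in KNOWN_KEYLOGGER_SHA256:
            print(f"possible keylogger: {path}")

scan(".")     # scan the current working directory
```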
Live CD/USB
Rebooting the computer using a Live CD or write-protected Live USB is a possible countermeasure against software keyloggers if the CD is clean of malware and the operating system contained on it is secured and fully patched so that it cannot be infected as soon as it is started. Booting a different operating system does not impact the use of a hardware or BIOS based keylogger.
Anti-spyware / Anti-virus programs
Many anti-spyware applications can detect some software based keyloggers and quarantine, disable, or remove them. However, because many keylogging programs are legitimate pieces of software under some circumstances, anti-spyware often neglects to label keylogging programs as spyware or a virus. These applications can detect software-based keyloggers based on patterns in executable code, heuristics and keylogger behaviors (such as the use of hooks and certain APIs).
No software-based anti-spyware application can be 100% effective against all keyloggers. Software-based anti-spyware cannot defeat non-software keyloggers (for example, hardware keyloggers attached to keyboards will always receive keystrokes before any software-based anti-spyware application).
The particular technique that the anti-spyware application uses will influence its potential effectiveness against software keyloggers. As a general rule, anti-spyware applications with higher privileges will defeat keyloggers with lower privileges. For example, a hook-based anti-spyware application cannot defeat a kernel-based keylogger (as the keylogger will receive the keystroke messages before the anti-spyware application), but it could potentially defeat hook- and API-based keyloggers.
Network monitors
Network monitors (also known as reverse-firewalls) can be used to alert the user whenever an application attempts to make a network connection. This gives the user the chance to prevent the keylogger from "phoning home" with their typed information.
Automatic form filler programs
Automatic form-filling programs may prevent keylogging by removing the requirement for a user to type personal details and passwords using the keyboard. Form fillers are primarily designed for Web browsers to fill in checkout pages and log users into their accounts. Once the user's account and credit card information has been entered into the program, it will be automatically entered into forms without ever using the keyboard or clipboard, thereby reducing the possibility that private data is being recorded. However, someone with physical access to the machine may still be able to install software that can intercept this information elsewhere in the operating system or while in transit on the network. (Transport Layer Security (TLS) reduces the risk that data in transit may be intercepted by network sniffers and proxy tools.)
One-time passwords (OTP)
Using one-time passwords may prevent unauthorized access to an account which has had its login details exposed to an attacker via a keylogger, as each password is invalidated as soon as it is used. This solution may be useful for someone using a public computer. However, an attacker who has remote control over such a computer can simply wait for the victim to enter their credentials before performing unauthorized transactions on their behalf while their session is active.
Security tokens
Use of smart cards or other security tokens may improve security against replay attacks in the face of a successful keylogging attack, as accessing protected information would require both the (hardware) security token as well as the appropriate password/passphrase. Knowing the keystrokes, mouse actions, display, clipboard, etc. used on one computer will not subsequently help an attacker gain access to the protected resource. Some security tokens work as a type of hardware-assisted one-time password system, and others implement a cryptographic challenge–response authentication, which can improve security in a manner conceptually similar to one time passwords. Smartcard readers and their associated keypads for PIN entry may be vulnerable to keystroke logging through a so-called supply chain attack where an attacker substitutes the card reader/PIN entry hardware for one which records the user's PIN.
On-screen keyboards
Most on-screen keyboards (such as the on-screen keyboard that comes with Windows XP) send normal keyboard event messages to the external target program to type text. Software key loggers can log these typed characters sent from one program to another.
Keystroke interference software
Keystroke interference software is also available.
These programs attempt to trick keyloggers by introducing random keystrokes, although this simply results in the keylogger recording more information than it needs to. An attacker has the task of extracting the keystrokes of interest—the security of this mechanism, specifically how well it stands up to cryptanalysis, is unclear.
Speech recognition
Similar to on-screen keyboards, speech-to-text conversion software can also be used against keyloggers, since there are no typing or mouse movements involved. The weakest point of using voice-recognition software may be how the software sends the recognized text to target software after the user's speech has been processed.
Handwriting recognition and mouse gestures
Many PDAs and, more recently, tablet PCs can already convert pen (also called stylus) movements on their touchscreens to computer-understandable text. Mouse gestures use this principle by using mouse movements instead of a stylus. Mouse gesture programs convert these strokes to user-definable actions, such as typing text. Similarly, graphics tablets and light pens can be used to input these gestures; however, these are becoming less common.
The same potential weakness of speech recognition applies to this technique as well.
Macro expanders/recorders
With the help of many programs, a seemingly meaningless text can be expanded to a meaningful text and most of the time context-sensitively, e.g. "en.wikipedia.org" can be expanded when a web browser window has the focus. The biggest weakness of this technique is that these programs send their keystrokes directly to the target program. However, this can be overcome by using the 'alternating' technique described below, i.e. sending mouse clicks to non-responsive areas of the target program, sending meaningless keys, sending another mouse click to the target area (e.g. password field) and switching back-and-forth.
Deceptive typing
Alternating between typing the login credentials and typing characters somewhere else in the focus window can cause a keylogger to record more information than it needs to, but this could be easily filtered out by an attacker. Similarly, a user can move their cursor using the mouse while typing, causing the logged keystrokes to be in the wrong order e.g., by typing a password beginning with the last letter and then using the mouse to move the cursor for each subsequent letter. Lastly, someone can also use context menus to remove, cut, copy, and paste parts of the typed text without using the keyboard. An attacker who can capture only parts of a password will have a larger key space to attack if they choose to execute a brute-force attack.
Another very similar technique uses the fact that any selected text portion is replaced by the next key typed. e.g., if the password is "secret", one could type "s", then some dummy keys "asdf". These dummy characters could then be selected with the mouse, and the next character from the password "e" typed, which replaces the dummy characters "asdf".
These techniques assume incorrectly that keystroke logging software cannot directly monitor the clipboard, the selected text in a form, or take a screenshot every time a keystroke or mouse click occurs. They may, however, be effective against some hardware keyloggers.
See also
Anti-keylogger
Black-bag cryptanalysis
Computer surveillance
Digital footprint
Hardware keylogger
Reverse connection
Session replay
Spyware
Trojan horse
Virtual keyboard
Web tracking
References
External links
Cryptographic attacks
Spyware
Surveillance
Cybercrime
Security breaches |
4469365 | https://en.wikipedia.org/wiki/Transistor%20count | Transistor count | The transistor count is the number of transistors in an electronic device. It typically refers to the number of MOSFETs (metal-oxide-semiconductor field-effect transistors, or MOS transistors) on an integrated circuit (IC) chip, as all modern ICs use MOSFETs. It is the most common measure of IC complexity (although the majority of transistors in modern microprocessors are contained in the cache memories, which consist mostly of the same memory cell circuits replicated many times). The rate at which MOS transistor counts have increased generally follows Moore's law, which observed that the transistor count doubles approximately every two years.
As of 2021, the largest transistor count in a commercially available microprocessor is 57 billion MOSFETs, in Apple's ARM-based M1 Max system on a chip, which is fabricated using TSMC's 5 nm semiconductor manufacturing process. As of the same year, the highest transistor count GPU is AMD's Instinct MI250(X), built on TSMC's N6 process and totalling 59 billion MOSFETs across two dies. Until the introduction of Cerebras's Wafer Scale Engine 2, the highest transistor count in any IC chip was Samsung's 1 terabyte eUFS (3D-stacked) V-NAND flash memory chip, with 2 trillion floating-gate MOSFETs (4 bits per transistor). As of 2021, the highest transistor count in any IC chip is the Wafer Scale Engine 2, a deep learning engine by Cerebras that uses a special design to route around any non-functional core on the device; it has 2.6 trillion MOSFETs, manufactured using TSMC's FinFET process.
In terms of computer systems that consist of numerous integrated circuits, the supercomputer with the highest transistor count is the Chinese-designed Sunway TaihuLight, which has for all CPUs/nodes combined "about 400 trillion transistors in the processing part of the hardware" and "the DRAM includes about 12 quadrillion transistors, and that's about 97 percent of all the transistors." To compare, the smallest computer, dwarfed by a grain of rice, has on the order of 100,000 transistors. Early experimental solid-state computers had as few as 130 transistors, but used large amounts of diode logic. The first carbon nanotube computer has 178 transistors and is a 1-bit one-instruction set computer, while a later one is 16-bit (though its instruction set is 32-bit RISC-V).
In terms of the total number of transistors in existence, it has been estimated that a total of 13 sextillion (1.3 × 10²²) MOSFETs have been manufactured worldwide between 1960 and 2018. MOSFETs account for at least 99.9% of all transistors, the majority of which have been used for NAND flash memory manufactured during the early 21st century. This makes the MOSFET the most widely manufactured device in history.
Transistor count
Among the earliest products to use transistors were portable transistor radios, introduced in 1954, which typically used 4 to 8 transistors, often advertising the number on the radio's case. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, limiting the transistor counts and restricting their usage to a number of specialised applications.
The MOSFET (MOS transistor), invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959, was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses. The MOSFET made it possible to build high-density integrated circuits (ICs), enabling Moore's law and very large-scale integration. Atalla first proposed the concept of the MOS integrated circuit (MOS IC) chip in 1960, followed by Kahng in 1961, both noting that the MOSFET's ease of fabrication made it useful for integrated circuits. The earliest experimental MOS IC to be demonstrated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA Laboratories in 1962. Further large-scale integration was made possible with an improvement in MOSFET semiconductor device fabrication, the CMOS process, developed by Chih-Tang Sah and Frank Wanlass at Fairchild Semiconductor in 1963.
As the chip fabrication industry moves to newer processes, the number of transistors per unit area keeps rising.
The transistor count and transistor density are often reported as technical achievements.
Microprocessors
A microprocessor incorporates the functions of a computer's central processing unit on a single integrated circuit. It is a multi-purpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output.
The development of MOS integrated circuit technology in the 1960s led to the development of the first microprocessors. The 20-bit MP944, developed by Garrett AiResearch for the U.S. Navy's F-14 Tomcat fighter in 1970, is considered by its designer Ray Holt to be the first microprocessor. It was a multi-chip microprocessor, fabricated on six MOS chips. However, it was classified by the Navy until 1998. The 4-bit Intel 4004, released in 1971, was the first single-chip microprocessor. It was made possible with an improvement in MOSFET design, MOS silicon-gate technology (SGT), developed in 1968 at Fairchild Semiconductor by Federico Faggin, who went on to use MOS SGT technology to develop the 4004 with Marcian Hoff, Stanley Mazor and Masatoshi Shima at Intel.
Virtually all chips with more than a million transistors include substantial memory, usually cache memory in one, two or more levels, and in modern microprocessors, where large caches have become the norm, these caches account for most of the transistors. The level 1 caches of the Pentium Pro die accounted for over 14% of its transistors, while the much larger L2 cache was on a separate die in the same package, so it is not included in the transistor count. Later chips included more levels, with L2 or even L3 on-chip. The last DEC Alpha chip made devotes 90% of its transistors to cache.
Intel's i960CA had a small 1 KB cache of about 50,000 transistors, not a big part of that chip, but by itself it would have been very large in early microprocessors. In the ARM 3 chip, with 4 KB of cache, the cache was over 63% of the chip, while in the Intel 80486 the larger cache is only about a third of the chip because the rest of the chip is more complex. So cache memories are the largest factor, except in early chips with small caches or even earlier chips with no cache at all; there, the inherent complexity of the logic, such as the number of instructions, is the dominant factor, more than, say, the memory represented by the chip's registers.
GPUs
A graphics processing unit (GPU) is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the building of images in a frame buffer intended for output to a display.
The designer refers to the technology company that designs the logic of the integrated circuit chip (such as Nvidia and AMD). The manufacturer refers to the semiconductor company that fabricates the chip using its semiconductor manufacturing process at a foundry (such as TSMC and Samsung Semiconductor). The transistor count in a chip is dependent on a manufacturer's fabrication process, with smaller semiconductor nodes typically enabling higher transistor density and thus higher transistor counts.
The random-access memory (RAM) that comes with GPUs (such as VRAM, SGRAM or HBM) greatly increases the total transistor count, with the memory typically accounting for the majority of transistors in a graphics card. For example, Nvidia's Tesla P100 has 15 billion FinFETs (16 nm) in the GPU in addition to 16 GB of HBM2 memory, totaling about 150 billion MOSFETs on the graphics card. The following table does not include the memory. For memory transistor counts, see the Memory section below.
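The rough figure of 150 billion MOSFETs for the Tesla P100 card can be reproduced by counting one transistor per DRAM bit (the 1T1C cell described in the Memory section below) on top of the GPU die itself; the short estimate below ignores the DRAM's peripheral circuitry, so it is an approximation rather than a vendor figure.

```python
# Rough estimate of total MOSFETs on a graphics card: GPU logic transistors plus
# roughly one transistor per DRAM bit (1T1C cells), ignoring peripheral circuitry.
gpu_transistors = 15e9               # Tesla P100 GPU die
hbm2_bits = 16 * 1024**3 * 8         # 16 GB of HBM2, in bits
dram_transistors = hbm2_bits         # about one access transistor per bit

total = gpu_transistors + dram_transistors
print(f"about {total / 1e9:.0f} billion MOSFETs on the card")   # about 152 billion
```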
FPGA
A field-programmable gate array (FPGA) is an integrated circuit designed to be configured by a customer or a designer after manufacturing.
Memory
Semiconductor memory is an electronic data storage device, often used as computer memory, implemented on integrated circuits. Nearly all semiconductor memory since the 1970s have used MOSFETs (MOS transistors), replacing earlier bipolar junction transistors. There are two major types of semiconductor memory, random-access memory (RAM) and non-volatile memory (NVM). In turn, there are two major RAM types, dynamic random-access memory (DRAM) and static random-access memory (SRAM), as well as two major NVM types, flash memory and read-only memory (ROM).
Typical CMOS SRAM consists of six transistors per cell. For DRAM, a 1T1C structure, meaning one transistor and one capacitor per cell, is common; whether the capacitor is charged or not stores the 1 or 0. For flash memory, the data is stored in a floating gate, and the resistance of the transistor is sensed to interpret the data stored. Depending on how finely the resistance levels can be separated, one transistor can store up to 3 bits, meaning eight distinct resistance levels per transistor; however, the finer the scale, the higher the cost in repeatability and therefore reliability. Typically, low-grade 2-bit MLC flash is used for flash drives, so a 16 GB flash drive contains roughly 64 billion transistors.
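The 16 GB figure above is straightforward arithmetic: capacity in bits divided by the bits stored per floating-gate transistor. A quick check, using the decimal gigabytes that flash vendors quote:

```python
# Reproduce the estimate that a 16 GB drive built from 2-bit MLC flash contains
# roughly 64 billion floating-gate transistors.
capacity_bits = 16e9 * 8      # 16 GB in bits (decimal gigabytes, as vendors quote)
bits_per_cell = 2             # MLC flash stores two bits per floating-gate transistor

print(f"about {capacity_bits / bits_per_cell / 1e9:.0f} billion transistors")
# about 64 billion transistors
```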
For SRAM chips, six-transistor cells (six transistors per bit) were the standard. DRAM chips during the early 1970s had three-transistor cells (three transistors per bit), before single-transistor cells (one transistor per bit) became standard from the era of 4 Kb DRAM in the mid-1970s onwards. In single-level flash memory, each cell contains one floating-gate MOSFET (one transistor per bit), whereas multi-level flash contains 2, 3 or 4 bits per transistor.
Flash memory chips are commonly stacked up in layers, up to 128-layer in production, and 136-layer managed, and available in end-user devices up to 69-layer from manufacturers.
Transistor computers
Before transistors were invented, relays were used in commercial tabulating machines and experimental early computers. The world's first working programmable, fully automatic digital computer, the 1941 Z3 22-bit word length computer, had 2,600 relays, and operated at a clock frequency of about 4–5 Hz. The 1940 Complex Number Computer had fewer than 500 relays, but it was not fully programmable. The earliest practical computers used vacuum tubes and solid-state diode logic. ENIAC had 18,000 vacuum tubes, 7,200 crystal diodes, and 1,500 relays, with many of the vacuum tubes containing two triode elements.
The second generation of computers were transistor computers that featured boards filled with discrete transistors, solid-state diodes and magnetic memory cores. The experimental 1953 48-bit Transistor Computer, developed at the University of Manchester, is widely believed to be the first transistor computer to come into operation anywhere in the world (the prototype had 92 point-contact transistors and 550 diodes). A later version, the 1955 machine, had a total of 250 junction transistors and 1,300 point-contact diodes. The machine also used a small number of tubes in its clock generator, so it was not fully transistorized. The ETL Mark III, developed at the Electrotechnical Laboratory in 1956, may have been the first transistor-based electronic computer using the stored program method. It had about "130 point-contact transistors and about 1,800 germanium diodes were used for logic elements, and these were housed on 300 plug-in packages which could be slipped in and out." The 1958 decimal architecture IBM 7070 was the first transistor computer to be fully programmable. It had about 30,000 alloy-junction germanium transistors and 22,000 germanium diodes, on approximately 14,000 Standard Modular System (SMS) cards. The 1959 MOBIDIC, short for "MOBIle DIgital Computer", at 12,000 pounds (6.0 short tons) mounted in the trailer of a semi-trailer truck, was a transistorized computer for battlefield data.
The third generation of computers used integrated circuits (ICs). The 1962 15-bit Apollo Guidance Computer used "about 4,000 "Type-G" (3-input NOR gate) circuits" for about 12,000 transistors plus 32,000 resistors.
The IBM System/360, introduced 1964, used discrete transistors in hybrid circuit packs. The 1965 12-bit PDP-8 CPU had 1409 discrete transistors and over 10,000 diodes, on many cards. Later versions, starting with the 1968 PDP-8/I, used integrated circuits. The PDP-8 was later reimplemented as a microprocessor as the Intersil 6100, see below.
The next generation of computers were the microcomputers, starting with the 1971 Intel 4004, which used MOS transistors. These were used in home computers or personal computers (PCs).
This list includes early transistorized computers (second generation) and IC-based computers (third generation) from the 1950s and 1960s.
Logic functions
Transistor count for generic logic functions is based on static CMOS implementation.
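In static CMOS, every input of a NAND or NOR gate needs one nMOS and one pMOS transistor, an inverter needs two, and AND/OR gates add an output inverter to a NAND/NOR. The helper below derives counts from those rules; it is an illustrative sketch and is not drawn from the article's own table.

```python
# Transistor counts for simple gates in a static CMOS implementation:
# each NAND/NOR input needs one nMOS and one pMOS transistor, an inverter needs
# two, and AND/OR are a NAND/NOR followed by a two-transistor inverter.
def static_cmos_transistors(gate, inputs=2):
    if gate == "NOT":
        return 2
    if gate in ("NAND", "NOR"):
        return 2 * inputs
    if gate in ("AND", "OR"):
        return 2 * inputs + 2
    raise ValueError(f"no rule for gate type {gate}")

for gate in ("NOT", "NAND", "NOR", "AND", "OR"):
    print(gate, static_cmos_transistors(gate))
# NOT 2, NAND 4, NOR 4, AND 6, OR 6 (two-input gates)
```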
Parallel systems
Historically, each processing element in earlier parallel systems—like all CPUs of that time—was a serial computer built out of multiple chips. As transistor counts per chip increased, each processing element could be built out of fewer chips, and later each multi-core processor chip could contain more processing elements.
Goodyear MPP: (1983?) 8 pixel processors per chip, 3,000 to 8,000 transistors per chip.
Brunel University Scape (single-chip array-processing element): (1983) 256 pixel processors per chip, 120,000 to 140,000 transistors per chip.
Cell Broadband Engine: (2006) with 9 cores per chip, had 234 million transistors per chip.
Other devices
Transistor density
The transistor density is the number of transistors that are fabricated per unit area, typically measured in terms of the number of transistors per square millimeter (mm²). The transistor density usually correlates with the gate length of a semiconductor node (also known as a semiconductor manufacturing process), typically measured in nanometers (nm). As of 2021, the semiconductor node with the highest transistor density is TSMC's 5 nanometer node, with 171.3 million transistors per square millimeter.
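Density ties die area directly to transistor count; at the 5 nm figure quoted above, a die of about 100 mm² (an illustrative size, not a specific product) holds on the order of 17 billion transistors:

```python
# Transistor count implied by a given density and die area.
density_per_mm2 = 171.3e6     # TSMC 5 nm node, transistors per square millimetre
die_area_mm2 = 100            # illustrative die size, not a specific product

print(f"{density_per_mm2 * die_area_mm2 / 1e9:.1f} billion transistors")
# 17.1 billion transistors
```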
MOSFET nodes
See also
Gate count, an alternate metric
Dennard scaling
Electronics industry
Integrated circuit
List of best-selling electronic devices
List of semiconductor scale examples
MOSFET
Semiconductor
Semiconductor device
Semiconductor device fabrication
Semiconductor industry
Transistor
Cerebras Systems
Notes
References
External links
Transistor counts of Intel processors
Evolution of FPGA Architecture
Integrated circuits
MOSFETs
Count |
580856 | https://en.wikipedia.org/wiki/Steel%20Battalion | Steel Battalion | is a video game created by Capcom for the Xbox console where the player controls a "Vertical Tank"—a bipedal, heavily armed mecha. To control the tank and play the game requires the use of a large controller (Mega-Jockey-9000) made specially for Steel Battalion. The controller consists of 44 input points—mainly buttons—but also uses 2 joysticks, a throttle handle, a radio channel dial, 5 switches, an eject button, and 3 foot pedals. Only limited quantities were made available. These quickly sold out, making the game a collector's piece. It has since been re-released in limited quantities worldwide, with blue controller buttons distinguishing it from the first edition with green buttons.
Gameplay
At the beginning of every mission, the player must 'start up' the machine and operating system; this is handled through a series of switches and buttons dedicated to this purpose. If a corner is turned too fast, the machine will tumble over. If the player's machine overheats, its operating system must be reset. The game even simulates window wipers in case of mud hitting the monitor. If the player does not eject when prompted, the player's in-game character will "die", and the game will delete its own saved data, prompting the player to start over from the beginning.
Vertical tanks (VTs) are the vehicles piloted in the series. Essentially bipedal walking weapons platforms, VTs are classed by their developmental generation and sub-categorised by their combat role. Primary combat roles are standard combat, assault, support, scout, and fast attack. Vertical tanks are divided into three weight classes: light, medium and heavy. As the player progresses, new generations of VTs become available. This allows a newer, more advanced operating system, startup sequence, and combat functions, as well as a wider cockpit view and layout. New generation VTs also handle better and can provide better firepower over previous generations.
Development
Steel Battalion was developed by Capcom Production Studio 4 in collaboration with former Human Entertainment designers that would go on to form Nude Maker. Producer Atsushi Inaba stated at the Game Developers Conference in 2005 that the Steel Battalion was a "product-focused project" in which the team initially focused on creating a new peripheral and software designed to go with it. Inaba's superiors were skeptical about putting such a game on the market. The number of staff working on the project grew according to the team's experience with making new hardware. The earliest build of the game was created for the PlayStation 2. However, when the Xbox became available, the development team switched to it because of the system's greater power. Online play was taken out of consideration close to the development's start due to being too ambitious. While the game and its special controller received critical acclaim, the project turned little profit. Inaba said that Steel Battalion was developed to show "what can be done in the game industry that cannot be done in others".
Reception
The game received "favorable" reviews according to the review aggregation website Metacritic. In Japan, Famitsu gave it a score of 35 out of 40. It was nominated for GameSpot's 2002 "Best Graphics (Artistic)" and "Best Game No One Played" awards among Xbox games, but also the publication's "Most Disappointing Game on Xbox" prize.
Steel Battalion was the fifth best-selling game during its week of release in Japan at about 15,092 copies. Inaba concluded that the game ultimately broke even in terms of units shipped and units sold.
A reviewer on IGN wrote "where MechAssault and Robotech wouldn't let us into the cockpit, Steel Battalion won't let us out" and joked the US$200 cost was for the controller while the game disc was free.
Sequels
A sequel called Steel Battalion: Line of Contact was released in 2004, and also used the game's unique controller. The third installment called Steel Battalion: Heavy Armor was released in June 2012. This installment uses the Kinect motion sensor control rather than the original controller.
Notes
References
External links
2002 video games
Capcom franchises
Capcom games
Microsoft games
Military science fiction video games
Nude Maker games
Video games about mecha
Video games developed in Japan
Xbox games
Xbox-only games |
4363230 | https://en.wikipedia.org/wiki/Bannari%20Amman%20Institute%20of%20Technology | Bannari Amman Institute of Technology | The Bannari Amman Institute of Technology (Autonomous) is an engineering college located in Sathyamangalam, Erode, Tamil Nadu, India. It was founded by the Bannari Amman Group in 1996 and is affiliated to Anna University. The institute offers 21 undergraduate, 10 postgraduate programmes in Engineering, Technology and Management studies. All the departments of Engineering and Technology are recognized by Anna University, Chennai to offer Ph.D. programmes. The institution is ISO 9001:2000 certified for its quality education, and most of the eligible courses are accredited by National Board of Accreditation (NBA), New Delhi and National Assessment and Accreditation Council (NAAC) with "A+" Grade. The institute received the best Engineering College Award from Indian Society for Technical Education in the year 2009. The institute was also awarded the silver medal for Best Overall Industry-Linked Engineering College from AICTE-CII National Survey on Industry-Linked Engineering Institutes in 2012.
Programmes offered
Undergraduate Programmes
Bachelor of Engineering in
Aeronautical Engineering
Agriculture Engineering
Automobile Engineering
Biomedical Engineering
Civil Engineering
Computer Science & Engineering
Electrical & Electronics Engineering
Electronics & Communication Engineering
Electronics & Instrumentation Engineering
Information Science & Engineering
Mechanical Engineering
Mechatronics
Bachelor of Technology in
Artificial Intelligence and Data Science
Artificial Intelligence and Machine Learning
Biotechnology
Computer Science and Business Systems
Computer Technology
Food Technology
Fashion Technology
Information Technology
Textile Technology
Postgraduate Programmes
Master of Engineering in
Communication Systems
Computer Science & Engineering
Industrial Automation & Robotics
Industrial Safety & Engineering
Software Engineering
Structural Engineering
Master of Technology in
Biotechnology
Master of Business Administration
Ph.D. / M.S. (by research) Programmes
Aeronautical Engineering
Agriculture Engineering
Automobile Engineering
Biotechnology
Civil Engineering
Computer Science & Engineering
Electrical & Electronics Engineering
Electronics & Communication Engineering
Electronics & Instrumentation Engineering
Fashion Technology
Information Technology
Mechanical Engineering
Mechatronics
Textile Technology
Physics
Chemistry
Admissions
Admission is through the Tamil Nadu Engineering Admission (TNEA) ranking, based on 12th standard exam results and facilitated by the Directorate of Technical Education (DoTE). ME/M.Tech admissions are based on ranking in the TANCET examination conducted by Anna University. The counselling code of the institution is 2702. Management seats are available after the final round of counselling.
Infrastructure
Library
The five-storeyed, air-conditioned and computerized library is well stocked. Its resources include 83,000 volumes, 400 national and international journals, 6,500 CD-ROMs, a digital library with 6,000 e-journals, and 274 NPTEL and 166 NITTTR video courses. BIT is an institutional member of the British Council Library, Chennai, DELNET, New Delhi and the INDEST Consortium, New Delhi.
Hostels
The institute has four hostels for male students and five hostels for female students. All hostels are fully furnished, and single, double and four-occupancy rooms are available.
Hostel Details:
Gents Hostel - 3931 inmates
Ladies Hostel - 2191 inmates
Other Facilities in Hostel: Dining Halls, Mini Cine Theatres, Indoor Courts for Shuttle & Table Tennis.
Sports
All necessary sports facilities with state-of-the-art technology are available on the campus. The existing sports facilities for students include a 400 m standard athletic track with 8 lanes, a long jump and triple jump pit, and sectors for shot put, discus and javelin throws. In addition, there are standard beds for the high jump and pole vault events, a football field, two kho-kho courts, a hockey field with kerb, a 65 m radius cricket field with two net practice pitches and one portable net, two volleyball courts, two ball badminton courts, one handball court and two kabaddi courts. The total area of the BIT play field is 574,580 sq. ft.
Auditorium
A fully air-conditioned indoor Vedanayagam auditorium with a capacity to seat 750 students.
Town Hall
A fully air-conditioned main auditorium with a capacity to seat 2,500 students.
Achievements
First prize in AICTE Clean and Smart Campus Award 2020 under the category of 'Indian Knowledge Systems'
BIT becomes the first autonomous engineering college in India to attain International Accreditation from IET – UK
School of Management Studies ranked 2nd among all the Private Engineering Colleges offering MBA in Tamil Nadu by Business Today 2020
Prestigious IMC Ramakrishna Bajaj National Quality Award 2019 in the Education Category from the IMC Chamber of Commerce and Industry.
BIT is awarded 'Clean & Smart Campus Award 2019' for the best practices carried out in the institute premises by AICTE.
BIT has been recognized with the National award under the category 'Top Performing Engineering College - Research and Development' by Society for Engineering Education Enrichment, Noida.
BIT was ranked 98 among engineering colleges in India by the National Institutional Ranking Framework (NIRF) in 2019 and overall rank 151–200. It was ranked 42 by Outlook India in 2019.
Tamil Nadu Government Environmental Awards – 2017 - Environmental Protection and Management Award 2017
5S Excellence Award 2014 for Strong Commitment from CII - Southern Region.
References
External links
https://www.annauniv.edu/
https://www.aicte-india.org/
Social Media
Facebook: https://www.facebook.com/bitsathyindia
Twitter : https://twitter.com/Bitsathyindia
Instagram : https://instagram.com/lifeatbit
YouTube : https://www.youtube.com/bitsathyindia
LinkedIn : https://in.linkedin.com/school/bitsathyindia/
Engineering colleges in Tamil Nadu
Colleges affiliated to Anna University
Universities and colleges in Erode district
Educational institutions established in 1996
1996 establishments in Tamil Nadu |
58150431 | https://en.wikipedia.org/wiki/Foreshadow | Foreshadow | Foreshadow, known as L1 Terminal Fault (L1TF) by Intel, is a vulnerability that affects modern microprocessors that was first discovered by two independent teams of researchers in January 2018, but was first disclosed to the public on 14 August 2018. The vulnerability is a speculative execution attack on Intel processors that may result in the disclosure of sensitive information stored in personal computers and third-party clouds. There are two versions: the first version (original/Foreshadow) () targets data from SGX enclaves; and the second version (next-generation/Foreshadow-NG) () targets virtual machines (VMs), hypervisors (VMM), operating systems (OS) kernel memory, and System Management Mode (SMM) memory. A listing of affected Intel hardware has been posted.
Foreshadow is similar to the Spectre security vulnerabilities discovered earlier to affect Intel and AMD chips, and the Meltdown vulnerability that also affected Intel. However, AMD products, according to AMD, are not affected by the Foreshadow security flaws. According to one expert, "[Foreshadow] lets malicious software break into secure areas that even the Spectre and Meltdown flaws couldn't crack". Nonetheless, one of the variants of Foreshadow goes beyond Intel chips with SGX technology, and affects "all [Intel] Core processors built over the last seven years".
Foreshadow may be very difficult to exploit. As of 15 August 2018, there seems to be no evidence of any serious hacking involving the Foreshadow vulnerabilities. Nevertheless, applying software patches may help alleviate some concern, although the balance between security and performance may be a worthy consideration. Companies performing cloud computing may see a significant decrease in their overall computing power; individuals, however, may not likely see any performance impact, according to researchers. The real fix, according to Intel, is by replacing today's processors. Intel further states, "These changes begin with our next-generation Intel Xeon Scalable processors (code-named Cascade Lake), as well as new client processors expected to launch later this year [2018]."
On 16 August 2018, researchers presented technical details of the Foreshadow security vulnerabilities in a seminar, and publication, entitled "Foreshadow: Extracting the Keys to the Intel SGX Kingdom with Transient Out-of-Order Execution" at a USENIX security conference.
History
Two groups of researchers discovered the security vulnerabilities independently: a Belgian team (including Raoul Strackx, Jo Van Bulck, Frank Piessens) from imec-DistriNet, KU Leuven reported it to Intel on 3 January 2018; a second team from Technion – Israel Institute of Technology (Marina Minkin, Mark Silberstein), University of Adelaide (Yuval Yarom), and University of Michigan (Ofir Weisse, Daniel Genkin, Baris Kasikci, Thomas F. Wenisch) reported it on 23 January 2018. The vulnerabilities were first disclosed to the public on 14 August 2018.
Mechanism
The Foreshadow vulnerability is a speculative execution attack on Intel processors that may result in the disclosure of sensitive information stored in personal computers and third-party clouds. There are two versions: the first version (original/Foreshadow, CVE-2018-3615 [attacks SGX]) targets data from SGX enclaves; and the second version (next-generation/Foreshadow-NG, CVE-2018-3620 [attacks the OS kernel and SMM mode] and CVE-2018-3646 [attacks virtual machines]) targets virtual machines (VMs), hypervisors (VMM), operating systems (OS) kernel memory, and System Management Mode (SMM) memory. Intel considers the entire class of speculative execution side channel vulnerabilities as "L1 Terminal Fault" (L1TF).
For Foreshadow, the sensitive data of interest is the encrypted data in an SGX enclave. Usually, an attempt to read enclave memory from outside the enclave is made, speculative execution is permitted to modify the cache based on the data that was read, and then the processor is allowed to block the speculation when it detects that the protected-enclave memory is involved and reading is not permitted. However, "... if the sensitive data is in level 1 cache, speculative execution can use it before the processor determines that there's no permission to use it." The Foreshadow attacks are stealthy, and leave few traces of the attack event afterwards in a computer's logs.
On 16 August 2018, researchers presented technical details of the Foreshadow security vulnerabilities in a seminar, and publication, at a USENIX security conference.
Impact
Foreshadow is similar to the Spectre security vulnerabilities discovered earlier to affect Intel and AMD chips, and the Meltdown vulnerability that affected Intel. AMD products, according to AMD, are not affected by the Foreshadow security flaws. According to one expert, "[Foreshadow] lets malicious software break into secure areas that even the Spectre and Meltdown flaws couldn't crack". Nonetheless, one of the variants of Foreshadow goes beyond Intel chips with SGX technology, and affects "all [Intel] Core processors built over the last seven years".
Intel notes that the Foreshadow flaws could produce the following:
Malicious applications, which may be able to infer data in the operating system memory, or data from other applications.
A malicious guest virtual machine (VM) may infer data in the VM's memory, or data in the memory of other guest VMs.
Malicious software running outside of SMM may infer data in SMM memory.
Malicious software running outside of an Intel SGX enclave or within an enclave may infer data from within another Intel SGX enclave.
According to one of the discoverers of the computer flaws, the SGX security hole can lead to a "complete collapse of the SGX ecosystem".
A partial listing of affected Intel hardware has been posted, and is described below. (Note: a more detailed - and updated - listing of affected products is on the official Intel website.)
Intel Core i3/i5/i7/M processor (45 nm and 32 nm)
2nd/3rd/4th/5th/6th/7th/8th generation Intel Core processors
Intel Core X-series processor family for Intel X99 and X299 platforms
Intel Xeon processor 3400/3600/5500/5600/6500/7500 series
Intel Xeon Processor E3 v1/v2/v3/v4/v5/v6 family
Intel Xeon Processor E5 v1/v2/v3/v4 family
Intel Xeon Processor E7 v1/v2/v3/v4 family
Intel Xeon Processor Scalable family
Intel Xeon Processor D (1500, 2100)
Foreshadow may be very difficult to exploit, and there seems to be no evidence to date (15 August 2018) of any serious hacking involving the Foreshadow vulnerabilities.
Mitigation
Applying software patches may help alleviate some concern(s), although the balance between security and performance may be a worthy consideration. Companies performing cloud computing may see a significant decrease in their overall computing power; individuals, however, may not likely see any performance impact, according to researchers.
The real fix, according to Intel, is by replacing today's processors. Intel further states, "These changes begin with our next-generation Intel Xeon Scalable processors (code-named Cascade Lake), as well as new client processors expected to launch later this year [2018]."
See also
BlueKeep (security vulnerability)
Hardware security bug
Microarchitectural Data Sampling
TLBleed, similar security vulnerability
Transient execution CPU vulnerabilities
References
Further reading
Foreshadow – Technical details (USENIX; FSA)
External links
Speculative execution security vulnerabilities
Hardware bugs
Side-channel attacks
X86 architecture
X86 memory management
2018 in computing |
33387949 | https://en.wikipedia.org/wiki/Complex%20and%20Adaptive%20Systems%20Laboratory | Complex and Adaptive Systems Laboratory | Complex and Adaptive Systems Laboratory (CASL) is an interdisciplinary research institute in University College Dublin. It is formed around four research clusters. The institute houses research groups from a number of Schools within UCD, notably computer science.
It is located in the Belfield Business Park at the northern end of the main UCD campus. The institute is involved in all aspects of research into complex systems from atomistic models, through to societal modelling.
Natural Computing and Optimisation
Led by Dr. Mike O'Neill, this cluster studies computational systems inspired by the Natural World, including complex, physical, social and biological systems, and optimisation and model induction methods in their broader sense. The main facets of the cluster are: nature-inspired problem solving, understanding natural systems, exploiting natural processes as computational machines, and developing the next generation of optimisation and model induction methods. Groups in this cluster develop and apply methods to a broad range of problem domains including Finance, Computer Science, Design, Architecture, Music, Sound Synthesis, Bioinformatics, Engineering and Telecommunications.
Networks and Data Analysis
The Networks and Data Analysis cluster is concerned with the analysis of complex data from and about networks. These networks may be social networks (in the broadest sense), biological networks, sensor networks or communications networks. The defining characteristic of the research in this cluster is the significance of the network structure in the data. The research concerns the discovery of interesting structure in the data and the fusion of data from different sources.
Security and Trust
The Security and Trust cluster seeks to combine fundamental mathematics, computer science and engineering, with practical software engineering expertise and knowledge of human behaviour, to study problems in the areas of security and trust. Research topics include cryptography, security, privacy, trust, voting issues, information security, network coding and network information theory, watermarking, steganography, error correction, modulation, signal processing.
Simulation Science and Extreme Events
This cluster aims to study and link the broad common underpinning causes of extreme weather, market crashes, social fads, and global epidemics using simulation science as the tool of discovery.
Research Training
A number of structured PhD Programmes provide post-graduate education throughout the Institute:
Simulation Science
Bioinformatics and Systems Biology
Complex Systems and Computational Social Sciences
Mathematics, Coding, Cryptography and Information Security
Innovation Policy for the Smart Economy
Computational Infection Biology.
Computer science departments
Research institutes in the Republic of Ireland |
1411089 | https://en.wikipedia.org/wiki/Mavis%20Beacon%20Teaches%20Typing | Mavis Beacon Teaches Typing | Mavis Beacon Teaches Typing is an application software program designed to teach touch typing.
According to Vice, Mavis Beacon Teaches Typing is not a game but rather a "system for teaching you how to type without looking at the keyboard".
History
The typing program was initially released in late 1987 by The Software Toolworks and has been published regularly ever since. The first version written for MS-DOS was created by Norm Worthington, Walt Bilofsky, and Mike Duffy. Editions of Mavis Beacon are currently published by Encore Software (hybrid Mac and Windows) and Software MacKiev (macOS only) and are available throughout the retail sales world. An early version supported both QWERTY and the alternative Dvorak Simplified Keyboard layout. Later versions supported only QWERTY until the 2011 Ultimate Mac Edition from Software MacKiev which returned full Dvorak keyboard lessons to the product. Earlier versions were made for Apple II, Commodore 64, Atari 8-bit family (version 1 only), Apple IIGS, Atari ST, Mac OS, Microsoft Windows, Palm OS (version 16), and Amiga. The current Windows and Mac versions are published under the Broderbund trademark by both Encore and Software MacKiev.
Features
The program includes a number of speed tests and constantly tracks the user's words-per-minute typing speed. It also includes a number of typing games, some of which have been included since the first release. (The 2011 Ultimate Mac Edition for macOS, published by Software MacKiev, also includes two-player competitive typing network games, integration with iTunes, Dvorak keyboard support, practice typing song lyrics, RSS news feeds and classic novels.) A certificate of achievement can be printed by the user upon the completion of tests.
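The program's exact scoring method is not documented here, but as an illustration, a common convention for typing speed counts one "word" as five typed characters; the following Python sketch computes words per minute under that assumption and is not Mavis Beacon's actual code.

# Illustrative only, not Mavis Beacon's actual scoring: a common
# convention counts one "word" as five typed characters.
def words_per_minute(chars_typed: int, elapsed_seconds: float) -> float:
    if elapsed_seconds <= 0:
        raise ValueError("elapsed time must be positive")
    words = chars_typed / 5
    minutes = elapsed_seconds / 60
    return words / minutes

print(round(words_per_minute(chars_typed=425, elapsed_seconds=120), 1))  # 42.5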
Name
Mavis Beacon is not a real person. The original photo of Mavis Beacon was of Caribbean-born model Renee L'Esperance. She was introduced to Les Crane, the former talk-show host, while he was shopping at Saks Fifth Avenue in Beverly Hills. Crane, who was then a partner in The Software Toolworks, devised the sobriquet.
Mavis Beacon's first name was taken from Mavis Staples, lead vocalist for the Staple Singers. The surname derives from beacon, as in a light to guide the way.
Reception
Mavis Beacon Teaches Typing
A favorable review in 1987 by Peter Lewis, technology writer for The New York Times, gave the program an early boost. Compute! favorably reviewed the program in 1989, stating that children, adults, and experienced typists would find it useful, and citing its support of Dvorak training. The Washington Post felt the product "conceals the typing drills rather nicely behind a game".
Mavis Beacon Teaches Typing II
Paul Tyrrell for Amiga Format wrote that the program was well researched, well written, and easy to use. Nick Veitch for CU Amiga felt the product was much more interesting than other educational multimedia products.
Mavis Beacon Teaches Typing Version 5
Superkids described it as a "well-polished program".
Mavis Beacon Teaches Typing For Kids
Metzo Magic appreciated that the game had only few Americanised words, which increased the game's appeal in areas that use British spelling.
Mavis Beacon Teaches Typing Version 9
The New York Times noted that by 1999, although the product wasn't the "flashiest" option for players, it remained an effective typing program.
Sales
By 1999, the series had sold over six million copies.
On April 21, 2000, two products reached the Top Selling Educational Software list: Mavis Beacon Teaches Typing 10.0 (4th) and Mavis Beacon Teaches Typing 5.0 (8th).
See also
Typequick
References
External links
Mavis Beacon Teaches Typing by Encore (Windows and Mac editions)
Mavis Beacon Teaches Typing by Software MacKiev (Mac OS X edition)
1987 video games
Children's educational video games
Typing video games
Apple II games
Classic Mac OS games
Amiga games
Windows games
Apple IIGS games
Atari ST games
Typing software
Video games developed in the United States
The Software Toolworks games |
37525240 | https://en.wikipedia.org/wiki/Code42 | Code42 | Code42 is an American cybersecurity software company based in Minneapolis specializing in insider risk management. It is the maker of the cloud-native data protection products Incydr and CrashPlan. Code42’s Incydr is a SaaS data-loss protection product designed to help enterprise security teams detect insider risks to data that could lead to data leaks, data loss and insider-threat breaches, and respond to them appropriately. Code42’s CrashPlan for Small Business is cloud data backup and recovery software.
History
Code42 was founded as an IT consulting company in 2001, by Matthew Dornquast, Brian Bispala, and Mitch Coopet. The company's name honors Douglas Adams, who authored Hitchhiker’s Guide to the Galaxy and had died that year. In the book, the number 42 is the "answer to the ultimate question of life, the universe and everything".
Some of Code42's first projects included a redesign of Sun Country Airlines’ website in 2002, a project for the retailer Target Corporation, and the ticket booking engine for Midwest Airlines. Income from the IT services business was used to fund product ideas for six years.
In 2006, the company planned to create a Facebook-like desktop application, but the project became too large and impractical. Code42 focused on the online storage element of the application, creating CrashPlan in 2007.
In June 2011, Code42 acquired a Minneapolis-based mobile development company, Recursive Awesome LLC, to support its software on mobile devices.
In 2012, Code42 raised $52.5 million in funding. The funding was the first distribution from a $100 million pool established in 2011 by Accel Partners to fund Big Data companies.
In mid 2015, former Eloqua CEO Joe Payne succeeded co-founder Matthew Dornquast as CEO. The company raised an additional $85 million in funding in October 2015.
On August 22, 2017, Code42 announced they were shutting down CrashPlan for Home, effective in October 2018. They were not accepting new subscriptions but would maintain existing subscriptions until the end of their existing subscription period, at which point the backups would be purged. The Home plans had been replaced by CrashPlan for Small Business, which is business-focused, although still possible to use for private purposes. Backups to friends/family are not supported in the new product; the company explained: "As we shift our business strategy to focus exclusively on enterprise and small business segments, you have two great options to continue getting the best backup solution."
In September 2020, Code42 launched its Incydr data risk detection and response product, a SaaS data protection tool for enterprises. Incydr allows security teams to effectively mitigate file exposure and exfiltration risks without disrupting legitimate work and collaboration. Incydr guards intellectual property, source code and trade secrets. Incydr is Code42’s flagship product.
Also in September 2020, Code42 leaders Joe Payne, Jadee Hanson, and Mark Wojtasiak, co-authored and published the book Inside Jobs: Why Insider Risk is the Biggest Cyber Threat You Can’t Ignore. The book explores the problem of insider risk, what drives it, why they believe traditional methods of protecting company data are inadequate and what security leaders can do to keep their data secure.
Business
As of April 2011, 80% of Code42 Software’s revenue came from business customers. Most of the remainder came from consumers and a small portion from service provider partners. It was reported in 2012 that Code42 had been profitable each year since it was founded. It grew from $1.4 million in revenue in 2008 to $11.46 million in 2010 and $18.5 million in 2011. In 2020, Code42's SaaS business was $100 million annually. As of 2012, the company had backed up 100 petabytes of data and processed 100 billion files a day.
Products and services
Code42 is the maker of the Incydr data loss detection and response product. It allows security teams to mitigate file exposure and exfiltration risks without disrupting collaboration. Incydr comes in two plans: Basic and Advanced.
Incydr displays information about what data is relevant, including how, when and where that data is moving, and who is moving it. It monitors the creation, deletion, modification and movement of all files, whether the activity is within a company’s security protocols or not. Even though Incydr monitors all file activity, it distinguishes between acceptable team collaboration and file sharing and events that represent risks to businesses.
Code42 also is the maker of cloud backup and recovery software CrashPlan for Small Business. CrashPlan backs up data to remote servers or hard drives. It is available on Mac, Windows and Linux. As of 2018, backup to other computers is no longer supported.
Initial backups may take several hours via LAN or days over the internet, depending on the amount of data and bandwidth available, but afterwards, continuous and incremental backups are conducted without user intervention.
Around 2012, there was a paid option for seed loading, in which a hard drive was sent to the user so that a faster local backup could be performed to the drive and it could be shipped back to Code42 for the initial backup. However, this Seeded Backup service was no longer offered in 2016; neither was the corresponding Restore-to-Door service, which allowed a hard drive containing extensive restore data from backups to be shipped to the user faster than an over-the-Internet download.
With CrashPlan, data is encrypted, password-protected and stored in a proprietary format. There is also an option for a more secure private key. In 2012, four out of five corporate users had CrashPlan PROe back up to private servers instead of Code42's data centers, and the software had an option to create a private on-site backup server.
In 2013, Code42 developed, released and marketed a file sharing service called SharePlan. According to the Star Tribune, it competed with Dropbox, but SharePlan used a PIN to access files and track users.
In October 2014, a revision of the software added features for regulatory compliance like Sarbanes-Oxley and options for a private, public or hybrid cloud deployment. It had a single login with Crashplan using a feature called the "Code42 EDGE Platform", which was improved in December 2014 with two-factor authentication features. Shareplan was discontinued in August 2015.
In a comparative review published in 2015 in The Wall Street Journal, Geoffrey Fowler observed CrashPlan was his favorite out of the four services evaluated. He observed it lacked "fine print", whereas some of the other services charged additional fees for basic features or weren't really unlimited. PC Magazine in 2017 gave CrashPlan 4.5 out of 5 stars and awarded it Editor's Choice. The review praised it for its user interface, local backup options, and security features, but said its mobile and explorer-based features were "limited."
A 2012 product review on MacWorld gave CrashPlan a rating of 4.5 out of 5, and Gartner, in 2012, gave the enterprise version, CrashPlan PROe, an "excellent" rating. All Things Digital praised CrashPlan for its operating system support and configuration options. Also in 2012, Ars Technica said CrashPlan had better features and pricing options than its competitors.
See also
Comparison of online backup services
References
External links
code42.com Official website
crashplan.com Official website
Companies based in Minneapolis
Software companies based in Minneapolis
Software companies based in Minnesota
Backup software
Web hosting
Classic Mac OS software
File hosting for Linux
File hosting for macOS
File hosting for Windows
Software companies of the United States |
12790696 | https://en.wikipedia.org/wiki/MacJournal | MacJournal | MacJournal is journaling and blogging software originally developed for Mac OS X. It is published by Dan Schimpf Software. MacJournal offers only basic text formatting and limited page layout features. MacJournal's audience includes diarists, bloggers and podcasters.
MacJournal supports online blog tools including LiveJournal, Blogger, Movable Type and WordPress. It also contains powerful searching capabilities, allows keeping multiple nested journals, and includes password protection, AES-256 encryption and Palm (PDA) syncing.
MacJournal was written by Dan Schimpf and was awarded Best Mac OS X Student Product at the 2002 Apple Design Awards. It was initially distributed as Freeware, then made Shareware. In 2004 the project was purchased by Mariner Software, and Schimpf was hired to continue development.
In 2012, MacJournal was given an Editors' Choice Award by Macworld.
In early 2019, development and distribution of MacJournal was reverted from Mariner Software back to its original developer, Dan Schimpf. In March 2019, Dan Schimpf Software released version 7.0.0 of MacJournal as freeware. As of May 2019, Schimpf is continuing active development of MacJournal, releasing occasional beta updates.
External links
ExpertReviews.co.uk review of MacJournal 4 (2006)
Macworld review of MacJournal 4 (2006)
Macworld review of MacJournal 6 (2012)
References
Blog software
MacOS Internet software
Blog client software |
33107966 | https://en.wikipedia.org/wiki/Carahsoft | Carahsoft | Carahsoft, founded in 2004, is a privately held business located in Reston, VA that sells IT hardware, software and consulting services to federal, state and local governments, and educational institutions.
Business model
Carahsoft sells IT hardware, software and consulting services related to data analysis and storage, cyber defense and security, business intelligence, and other corporate and government functions.
Author Mark Amtower categorized Carahsoft as a “boutique reseller” because the company “sells a limited number of products, usually those that address a specific need in the market.” The company supports more than 3,000 prime contractors, value-added re-sellers, system integrators and other channel partners.
Contracts
U.S. Department of Defense
The U.S. Department of Defense is one of Carahsoft’s largest customers.
2020
On April 2, 2020, Naval Information Warfare Systems Command contracted Carahsoft for a variety of BlackBerry services. On May 22, 2020, the U.S. Air Force awarded Carahsoft an $81 million contract to help the Space Command and Control Division within Space & Missile Systems Center (Los Angeles Air Force Base) create and implement software development and information technology operations. On July 16, 2020, Carahsoft was awarded roughly $29.8 million to work at Fort Belvoir, moving an Army logistics modernization program to the cloud. On July 27, 2020, the U.S. Army awarded Carahsoft a $16 million contract to support the Army Enterprise Systems Integration Program and Global Combat Support System. On August 31, the DoD included Carahsoft in a 10-year, $13 billion firm-fixed-price contract with 30 other tech companies to supply off-the-shelf enterprise infrastructure software and maintenance to the U.S. Army, Department of Defense and all federal agencies.
2019
The U.S. Navy included Carahsoft on a 10-year, $975 million blanket purchase agreement to provide SAP software products, a five-year $69.1 million BPA to provide Symantec software licenses, and a four-year $440 million BPA to provide McAfee hardware, software and services. Carahsoft was awarded a basic ordering agreement from the U.S. Army to support a transition to cloud computing environments, for an estimated cost of $247.7 million. The DoD also included Carahsoft on an $820,450,000 BPA to supply information technology asset management software, software maintenance support, information technology professional services, and related services to the DoD, intelligence community and U.S. Coast Guard.
2018
The DoD awarded an estimated $131,000,866 in contracts to Carahsoft as of June 2018.
2017
The DoD awarded an estimated $270,475,338 in contracts to Carahsoft in 2017.
2016
The DoD awarded an estimated $80,075,312 in contracts to Carahsoft in 2016.
U.S. General Services Administration
2019
Carahsoft was one of 11 teams selected by the U.S. GSA and National Geospatial-Intelligence Agency (NGA) as part of a blanket purchase agreement (BPA) to provide geospatial earth observation data, products and services. Carahsoft and Grant Thornton were given multi-million dollar task orders as part of a blanket purchase agreement related to NewPay, a U.S. General Services Administration initiative to modernize federal payroll IT and services.
2018
Carahsoft was one of two teams selected by the U.S. GSA as part of a 10-year, $2.5 billion blanket purchase agreement to provide Software-as-a-Service (SaaS) applications for payroll, work schedule and leave management.
Overcharging allegations and settlement
In 2010 a lawsuit was filed against Carahsoft and VMware for allegedly overcharging government customers. The firms denied the allegations. To avoid protracted litigation, they settled the case with the United States Department of Justice for $75.5 million.
References
Technology companies of the United States
Companies based in Reston, Virginia
Technology companies established in 2004
American companies established in 2004
2004 establishments in Virginia |
2211767 | https://en.wikipedia.org/wiki/Rayat%20Institute%20of%20Engineering%20%26%20Information%20Technology | Rayat Institute of Engineering & Information Technology | Rayat Institute of Engineering & Information Technology (RIEIT) is a private college affiliated to Punjab Technical University, offering engineering courses at the undergraduate and graduate levels leading to B.Tech and M.Tech degrees; it also hosts doctoral study centres. RIEIT is part of the Rayat Bahra Group, which also includes Rayat Bahra University and Bahra University.
Overview
Rayat Institute of Engineering and Information Technology was established in 2001 and offers five B.Tech. Programmes in the areas of Computer Science & Engineering, Information Technology, Electronics and Communication Engineering, Mechanical Engineering and Electrical Engineering. Subsequently, M.Tech programmes in Computer Science and Engineering, Electronics and Communication Engineering and Mechanical Engineering were added, all affiliated to the Punjab Technical University. It occupies an area of .
RIEIT also houses a polytechnic college offering diploma courses in engineering. On the campus, a B.Pharmacy programme and M.Pharma programmes in different specialisations (viz. Pharmaceutics, Pharmacology and Pharmaceutical Chemistry) are also offered, along with a college of law and a college of education.
Location
The Institute is situated within the Rayat Technology Centre of Excellence, Ropar, and is located 6 km from Ropar city on Chandigarh-Ropar-Jalandhar Highway.
Campus
The aesthetic environmentally friendly campus extends over , close to the Shivalik hills. The campus is divided into zones for hostels, Main College Buildings, Administration Block, Residential Complex, etc.
The campus of the Rayat Institute of Engineering and Information Technology is regarded as the largest among privately funded engineering colleges in North India.
There are eight Rayat campuses in India.
Admissions
Admissions to the Bachelor of Engineering courses are made through:
I. K. Gujral Punjab Technical University (for students from Punjab, through PTU central counselling); candidates admitted through this counselling receive certain advantages
the Joint Entrance Examination (Main) score (formerly the All India Engineering Entrance Examination)
Admission is also open to students from other states and to international students:
seats are reserved for students from other states, admitted through the Joint Entrance Examination (Main)
NRI students are given particular attention, drawing on the group's experience and connections
Academics
Bachelor of Engineering courses (four years):
Computer Science and Engineering
Electrical Engineering
Electronics and Communication Engineering
Information Technology
Mechanical Engineering
Master of Engineering courses (two years):
Computer Science and Engineering
Electronics and Communication Engineering
Mechanical Engineering
Other departments
Applied science
Department of Career Development and Placements
The Training and Placement Department is staffed with full-time placement professionals. RIEIT supports students with an employment placement service, though success in securing positions is determined by merit. Past placement performance supports the expectation of continued good results.
Minerva
Minerva is a national-level annual cultural and technical symposium organized by students of the Rayat Institute of Engineering and Information Technology. Considered one of the biggest college festivals in North India, it attracts students from colleges across the region. Events include quizzes, workshops, game shows, informal events, a fashion show, a dance show and a rock night.
Alumni Association
The Rayat Old Student Association (RIEITos) is a non-profit, non-political organization of graduates and former faculty members (collectively referred to as the alumni) of the RIEIT Ropar campus.
See also
Rayat Bahra University, Greater Mohali
References
External links
All India Council for Technical Education
Engineering colleges in Punjab, India
2001 establishments in Punjab, India
Educational institutions established in 2001 |
26051183 | https://en.wikipedia.org/wiki/Automation%20Anywhere | Automation Anywhere | Automation Anywhere is an American global software company that develops robotic process automation (RPA) software.
Founded in 2003, the company is headquartered in San Jose, California.
History
Automation Anywhere was originally founded as Tethys Solutions, LLC in San Jose, by Mihir Shukla, Neeti Mehta Shukla, Ankur Kothari and Rushabh Parmani. The company rebranded itself as Automation Anywhere, Inc. in 2010.
As of early 2021, the company reported more than 2,800 current customers in 90+ countries. Customers cited in 2020 included Accenture, Boston Scientific, Cisco, Cognizant, Comcast, Dell EMC, Deloitte, Hitachi, IBM, Juniper Networks, LinkedIn, MasterCard, PricewaterhouseCoopers, Quest Diagnostics, Siemens, Stanley Black & Decker, Symantec, Tesco, Unilever, Volkswagen, Whirlpool, and the World Bank, as well as the World Health Organization.
The company's 2,100+ partner relationships include collaborations with Microsoft, Google, and Amazon Web Services to advance intelligent automation, and with Salesforce, to help its customers automate their front office business processes. In early 2021, the company had 110+ customers who are also partners.
Between 2018 and 2019, Automation Anywhere received a total of $840 million in Series A and Series B investments at a post-money valuation of $6.9 billion. In 2018 the company announced a total of Series A investments of $550 million from General Atlantic, Goldman Sachs, NEA, World Innovation Lab, SoftBank Investment Advisers, and Workday Ventures. In late 2019, a Series B round, led by Salesforce Ventures, raised $290 million.
In 2019, the company acquired Klevops, a privately owned company based in Paris that works in the finance, banking and telecommunications industries.
In 2021, Automation Anywhere reported a total of 2.8 million bots deployed since it began.
In December 2021, Automation Anywhere announced it intends to acquire process discovery startup FortressIQ.
References
External links
Computer companies of the United States
Software testing tools
Automation software
Software companies based in California
Web scraping
Software companies of the United States
2003 establishments in the United States
2003 establishments in California
Software companies established in 2003
Companies established in 2003 |
48167307 | https://en.wikipedia.org/wiki/IPSW | IPSW | IPSW (iPhone Software) is a file format used to install iOS, iPadOS, tvOS, HomePod, and, most recently, macOS firmware for devices equipped with Apple silicon. All Apple devices share the same IPSW file format for iOS firmware and its derivatives, allowing users to flash their devices through Finder or iTunes on macOS or Windows, respectively. Users can flash Apple silicon Macs through Apple Configurator 2.
Structure
The .ipsw file itself is a compressed archive file (a renamed Zip archive) containing at least three Apple Disk Image files: one containing the root file system of the OS and two RAM disks used for restore and update. tvOS, audioOS and macOS also include a disk image for the recovery environment (recoveryOS).
The file also holds the kernel caches, and a "Firmware" folder which contains iBoot, LLB (Low-Level Bootloader), iBSS (iBoot Single Stage), iBEC (iBoot Epoch Change), the Secure Enclave Processor firmware, the Device Tree, Firmware Images (Apple logo, battery images, Recovery mode screen and more), baseband firmware files in .bbfw format (renamed zip file), and other firmware files.
There are two more files named "BuildManifest.plist" and "Restore.plist", both property lists that contain compatibility information and SHA-256 hashes for different components.
BuildManifest.plist is sent to Apple's TSS server and checked in order to obtain SHSH blobs before every restore. Without SHSH blobs, the device will refuse to restore, thus making downgrades very difficult to achieve.
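Because the container is a renamed Zip archive and BuildManifest.plist is an ordinary property list, the layout described above can be inspected with standard tools. The following Python sketch is illustrative only: the filename firmware.ipsw is a placeholder, and the manifest keys shown (ProductVersion, SupportedProductTypes) are ones commonly found in these manifests rather than a documented API.

import plistlib
import zipfile

# Minimal sketch: list the contents of an .ipsw (a renamed Zip archive)
# and read a few fields from its BuildManifest.plist.
def inspect_ipsw(path: str = "firmware.ipsw") -> None:
    with zipfile.ZipFile(path) as archive:
        for name in archive.namelist():
            print(name)
        with archive.open("BuildManifest.plist") as fh:
            manifest = plistlib.load(fh)
    print("ProductVersion:", manifest.get("ProductVersion"))
    print("SupportedProductTypes:", manifest.get("SupportedProductTypes"))

if __name__ == "__main__":
    inspect_ipsw()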
Security and rooting
The archive is not password-protected, but iBoot, LLB, iBEC, iBSS, iBootData and the Secure Enclave Processor firmware images inside it are encrypted with AES. Until iOS 10, all the firmware files (including the root file system and Restore and Update ramdisks) were encrypted. While Apple does not release these keys, they can be extracted using different iBoot or bootloader exploits, such as limera1n (created by George Hotz, more commonly known as geohot). Since then, many tools were created for the decryption and modification of the root file system.
Government data access
After the 2015 San Bernardino attack, the FBI recovered the shooter's iPhone 5C, which belonged to the San Bernardino County Department of Public Health. The FBI recovered iCloud backups from one and a half months before the shooting, and wanted to access encrypted files on the device. The U.S. government ordered Apple to produce an IPSW file that would allow investigators to brute force the passcode of the iPhone. The order used the All Writs Act, originally created by the Judiciary Act of 1789, to demand the firmware, in the same way as other smartphone manufacturers have been ordered to comply.
Tim Cook responded on the company's webpage, outlining a need for encryption and arguing that if they produced a backdoor for one device, it would inevitably be used to compromise the privacy of other iPhone users.
References
External links
iPSW at Apple Support
iPSW at File Extensions
iPSW at The iPhone Wiki
BASEBAND files at The iPhone Wiki
IOS
Computer file formats
Archive formats |
6816862 | https://en.wikipedia.org/wiki/Ethernet%20over%20USB | Ethernet over USB | Ethernet over USB refers to using USB as an Ethernet network. It also refers to an Ethernet device that is connected over USB (instead of, e.g., PCI or PCIe).
Protocols
There are numerous protocols for Ethernet-style networking over USB. These protocols allow application-independent exchange of data with USB devices, in contrast to specialized protocols such as video or MTP. Even though USB is not a physical Ethernet, the networking stacks of all major operating systems are set up to transport IEEE 802.3 frames without needing a particular underlying transport.
The main industry protocols are (in chronological order): Remote NDIS (RNDIS, a Microsoft vendor protocol), Ethernet Control Model (ECM), Ethernet Emulation Model (EEM), and Network Control Model (NCM). The latter three are part of the larger Communications Device Class (CDC) group of protocols of the USB Implementers Forum (USB-IF). They are available for download from the USB-IF (see below). The RNDIS specification is available from Microsoft's web site. Regarding de facto standards, some standards, such as ECM, specify use of USB resources that early systems did not have. However, minor modifications of the standard subsets make practical implementations possible on such platforms. Remarkably, even some of the most modern platforms need minor accommodations and therefore support for these subsets is still needed.
Of these protocols, ECM could be classified as the simplest: frames are simply sent and received without modification, one at a time. This was a satisfactory strategy for USB 1.1 systems (current when the protocol was issued) with 64-byte packets, but not for USB 2.0 systems, which use 512-byte packets.
One significant problem is that Ethernet frames are about 1500 bytes in size, which is about 3 USB 2.0 packets or 23 USB 1.1 packets. In the USB system, each frame is sent as a transfer: a series of maximum-length packets terminated by a short packet or a special ZLP (zero-length packet). After this there is bus latency, during which nothing is sent until another transfer can be initiated. This reduces bus occupancy, meaning that nothing is sent for considerable fractions of bus time. Such a gap once every 23 packets, as on USB 1.1, is not noticeable, but a gap every three packets, as on USB 2.0, is very costly to throughput.
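As a quick check of the arithmetic above, the following Python sketch counts how many bus packets one roughly 1500-byte frame occupies at each speed, treating the transfer as a run of maximum-length packets followed by one terminating short packet.

# Back-of-the-envelope check of the frame/packet arithmetic above.
FRAME_BYTES = 1500

for name, max_packet in (("USB 1.1 (full speed)", 64), ("USB 2.0 (high speed)", 512)):
    full_packets, remainder = divmod(FRAME_BYTES, max_packet)
    print(f"{name}: {full_packets} full packets + a short packet of {remainder} bytes")
# USB 1.1 (full speed): 23 full packets + a short packet of 28 bytes
# USB 2.0 (high speed): 2 full packets + a short packet of 476 bytes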
As USB has become faster, there is now demand for sending large amounts of data, either to be stored on the device or to be relayed over wireless links (see 3GPP Long Term Evolution).
These new devices are still much less powerful than desktop PCs, so the issue of careful data handling arises: maximizing use of DMA resources on the device and minimizing (or eliminating) copying of data (zero-copy). The NCM protocol has elaborate provisions for this. See the link below for careful protocol comparisons.
Linux-specific driver
The USB-eth module in Linux makes the computer running it a variation of an Ethernet device that uses USB as the physical medium. It creates a Linux network interface, which can be assigned an IP address and otherwise treated the same as a true Ethernet interface. Any application that works over a real Ethernet interface will work over a USB-Ethernet interface without modification, because it cannot distinguish the USB emulation from real Ethernet hardware.
On Linux hosts, the corresponding Ethernet-over-USB kernel module is called usbnet. The Bahia Network Driver is a usbnet-style driver available for Win32 hosts.
The approach allows devices with very limited communications hardware to operate over IP networks. The Linux kernel for the iPAQ uses this communications strategy exclusively, since the iPAQ hardware has neither an accessible legacy (RS-232/RS-422) serial port nor a dedicated network interface.
See also
RNDIS
References
External links
CDC Subclass Specification for Ethernet Emulation Model Devices 1.0
Computer networking
Serial buses
USB |
18160288 | https://en.wikipedia.org/wiki/Kainos | Kainos | Kainos Group plc (commonly referred to simply as Kainos or Kainos Software) is a software company headquartered in Belfast, Northern Ireland that develops information technology for businesses and organisations. It is listed on the London Stock Exchange and is a constituent of the FTSE 250 Index.
Name
The word Kainos comes from ancient Greek, meaning "new" or "fresh".
History
1986 to 2009
Kainos was founded as a joint venture between Fujitsu and The Queen's University of Belfast business incubation unit (QUBIS Ltd) on 14 April 1986. In January 1987, the company began trading in the QUBIS building on Malone Road, Belfast and Kainos founder, Frank Graham, was appointed managing director.
A spin-off company of Kainos was established in 1994 called Lagan Technologies, which grew to become a software supplier to local government in the UK and the US. It was eventually acquired by U.S.-based Kana Software in 2010.
By 1997, due to the expansion of the company, Kainos relocated and opened offices in Mount Charles, Belfast, and opened its main headquarters in Upper Crescent, Belfast.
After several years of business, in which Fujitsu was its main customer, the company sold most of its stake to ACT Venture capital in 2000. ACT Venture capital later divested its shareholding in Kainos.
In September 2004, Kainos entered into a partnership agreement with Mediasurface, the UK's largest content management provider. The following year, Kainos entered into another partnership agreement in 2005 with TIBCO, a provider of infrastructure software.
In 2007, a wholly owned subsidiary of Kainos called SpeechStorm was established, providing automated products for call centres, including touch tone, SMS, speech and visual IVR. That same year, management in Kainos bought out Fujitsu's remaining 20% shares.
2010 to present
Kainos expanded its operations and opened a research office in Silicon Valley on 18 October 2010.
On 19 November 2014, Kainos announced its plans to expand the Evolve Electronic Medical Records platform into Ireland.
On 10 July 2015, Kainos was admitted to the main market of the London Stock Exchange, trading as Kainos Group plc.
As part of an extension of its US operation, in June 2020, Kainos announced plans to create 133 jobs in Indianapolis. The plans included expenditure of £650,000 to build new facilities in the area, marking the first venture into the US Midwest.
Divisions
Digital Services
Kainos is a provider of digital services to government departments and agencies.
WorkSmart
Kainos has been a Workday implementation partner since 2011, providing management, integration, support and testing services for the Workday SaaS product.
Evolve
Kainos provides digital services to healthcare providers such as hospitals and community care organisations. Developed with NHS clinicians and managers, the Evolve EMR is the main product in Kainos's suite of healthcare products.
See also
References
External links
Official website
Software companies of the United Kingdom
Companies established in 1986
1986 establishments in Northern Ireland
Companies of Northern Ireland
Companies based in Belfast
Brands of Northern Ireland |
27739 | https://en.wikipedia.org/wiki/Shogi | Shogi | , also known as Japanese chess (the game of generals), is a two-player strategy board game that is the Japanese variant of chess. It is the most popular chess variant in Japan. Shōgi means general's (shō ) board game (gi ).
Shogi was the earliest chess variant to allow captured pieces to be returned to the board by the capturing player. This drop rule is speculated to have been invented in the 15th century and possibly connected to the practice of 15th century mercenaries switching loyalties when captured instead of being killed.
The earliest predecessor of the game, chaturanga, originated in India in the sixth century, and the game was likely transmitted to Japan via China or Korea sometime after the Nara period. Shogi in its present form was played as early as the 16th century, while a direct ancestor without the drop rule was recorded from 1210 in a historical document Nichūreki, which is an edited copy of Shōchūreki and Kaichūreki from the late Heian period (c. 1120).
Equipment
Two players face each other across a board composed of rectangles in a grid of 9 ranks (rows) by 9 files (columns), yielding an 81-square board. In Japanese they are called Sente (first player) and Gote (second player), but in English they are conventionally referred to as Black and White, with Black being the first player.
The board is nearly always rectangular, and the rectangles are undifferentiated by marking or color. Pairs of dots mark the players' promotion zones.
Each player has a set of 20 flat wedge-shaped pentagonal pieces of slightly different sizes. Except for the kings, opposing pieces are undifferentiated by marking or color. Pieces face forward by having the pointed side of each piece oriented toward the opponent's side – this shows who controls the piece during play. The pieces from largest (most important) to smallest (least important) are:
1 king
1 rook
1 bishop
2 gold generals
2 silver generals
2 knights
2 lances
9 pawns
Several of these names were chosen to correspond to their rough equivalents in international chess, and not as literal translations of the Japanese names.
Each piece has its name written on its surface in the form of two kanji (Chinese characters used in Japanese), usually in black ink. On the reverse side of each piece, other than the king and gold general, are one or two other characters, in amateur sets often in a different color (usually red); this side is turned face up during play to indicate that the piece has been promoted.
Following is a table of the pieces with their Japanese representations and English equivalents. The abbreviations are used for game notation and often when referring to the pieces in speech in Japanese.
English speakers sometimes refer to promoted bishops as horses and promoted rooks as dragons, after their Japanese names, and generally use the Japanese term tokin for promoted pawns. Silver generals and gold generals are commonly referred to simply as silvers and golds.
The characters inscribed on the reverse sides of the pieces to indicate promotion may be in red ink, and are usually cursive. The characters on the backs of the pieces that promote to gold generals are cursive variants of 'gold', becoming more cursive (more abbreviated) as the value of the original piece decreases. These cursive forms have these equivalents in print: for promoted silver, for promoted knight, for promoted lance, and for promoted pawn (tokin). Another typographic convention has abbreviated versions of the original values, with a reduced number of strokes: for a promoted knight , for a promoted lance , and the as above for a promoted silver, but (a hiragana symbol for the syllable "to") for tokin.
The suggestion that the Japanese characters have deterred Western players from learning shogi has led to "Westernized" or "international" pieces which use iconic symbols instead of characters. Most players soon learn to recognize the characters, however, partially because the traditional pieces are already iconic by size, with more powerful pieces being larger. As a result, Westernized pieces have never become popular. Bilingual pieces with both Japanese characters and English captions have been developed as have pieces with animal cartoons.
Setup and gameplay
Each player sets up friendly pieces facing forward (toward the opponent).
In the rank nearest the player:
the king is placed in the center file;
the two gold generals are placed in files adjacent to the king;
the two silver generals are placed adjacent to each gold general;
the two knights are placed adjacent to each silver general;
the two lances are placed in the corners, adjacent to each knight.
That is, the first rank is
{| class="wikitable"
|- style="text-align:center;"
| L || N || S || G || K || G || S || N || L
|}
or
{| class="wikitable"
|- style="text-align:center;"
|||||||||||||||||
|}
In the second rank, each player places:
the bishop in the same file as the left knight;
the rook in the same file as the right knight.
In the third rank, the nine pawns are placed one per file.
A furigoma 振り駒 'piece toss' is used to decide who moves first. One of the players tosses five pawns. If the number of tokins (promoted pawns, と) facing up is higher than the number of unpromoted pawns (歩), then the player who tossed the pawns plays gote 後手 'white' (that is, getting the second move).
After the piece toss furigoma, the game proceeds. If multiple games are played, then players alternate turns for who goes first in subsequent games. (The terms "Black" and "White" are used to differentiate sides although there is no difference in the color of the pieces.) For each turn, a player may either move a piece that is currently on the board (and potentially promote it, capture an opposing piece, or both) or else drop a piece that has been previously captured onto a square of the board. These options are explained below.
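As an illustration, the piece toss described above can be simulated in a few lines of Python; this sketch assumes a fair 50/50 toss for each pawn and ignores pieces landing on their sides, which in practice are simply not counted.

import random

# Minimal sketch of furigoma: five pawns are tossed, and if more land
# tokin (promoted) side up than unpromoted side up, the tossing player
# takes gote (White); otherwise the tosser takes sente (Black).
def furigoma() -> str:
    tokins = sum(random.choice((0, 1)) for _ in range(5))  # 1 = tokin side up
    unpromoted = 5 - tokins
    return "gote (White)" if tokins > unpromoted else "sente (Black)"

print(furigoma())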
Rules
Objective
The usual goal of a game is for one player to checkmate the other player's king, winning the game.
Movement
Most shogi pieces can move only to an adjacent square. A few may move across the board, and one jumps over intervening pieces.
The lance, bishop, and rook are ranging pieces: They can move any number of squares along a straight line limited only by intervening pieces and the edge of the board. If an opposing piece intervenes, it may be captured by removing it from the board and replacing it with the moving piece. If a friendly piece intervenes, the moving piece must stop short of that square; if the friendly piece is adjacent, the moving piece may not move in that direction at all.
A king (玉/王) moves one square in any direction, orthogonal or diagonal.
A rook (飛) moves any number of squares in an orthogonal direction.
A bishop (角) moves any number of squares in a diagonal direction. Because they cannot move orthogonally, the players' unpromoted bishops can reach only half the squares of the board, unless one is captured and then dropped.
A gold general (金) moves one square orthogonally, or one square diagonally forward, giving it six possible destinations. It cannot move diagonally backwards.
A silver general (銀) moves one square diagonally, or one square straight forward, giving it five possible destinations. Because an unpromoted silver can retreat more easily than a promoted one, it is common to leave a silver unpromoted at the far side of the board. (See Promotion).
A knight (桂) jumps at an angle intermediate to orthogonal and diagonal, amounting to one square straight forward plus one square diagonally forward, in a single move. Thus the knight has two possible forward destinations. Unlike international chess knights, shogi knights cannot move to the sides or in a backwards direction. The knight is the only piece that ignores intervening pieces on the way to its destination. It is not blocked from moving if the square in front of it is occupied, but neither can it capture a piece on that square. It is often useful to leave a knight unpromoted at the far side of the board. A knight must promote, however, if it reaches either of the two furthest ranks. (See Promotion.)
A lance (香) moves just like the rook except it cannot move backwards or to the sides. It is often useful to leave a lance unpromoted at the far side of the board. A lance must promote, however, if it reaches the furthest rank. (See Promotion.)
A pawn (歩) moves one square straight forward. It cannot retreat. Unlike international chess pawns, shogi pawns capture the same as they move. A pawn must promote if it arrives at the furthest rank. (See Promotion.) In practice, however, a pawn is usually promoted whenever possible. There are two restrictions on where a pawn may be dropped. (See Drops.)
All pieces but the knight move either horizontally, vertically, or diagonally. These directions cannot be combined in a single move; one direction must be chosen.
Every piece blocks the movement of all other non-jumping pieces through the square it occupies.
If a piece occupies a legal destination for an opposing piece, it may be captured by removing it from the board and replacing it with the opposing piece. The capturing piece may not continue beyond that square on that turn. Shogi pieces capture the same as they move.
Normally, when moving a piece, a player snaps it to the board with the ends of the fingers of the same hand. This makes a sudden sound effect, bringing the piece to the attention of the opponent. This is also true for capturing and dropping pieces. On a traditional shogi-ban, the pitch of the snap is deeper, delivering a subtler effect.
Promotion
A player's promotion zone consists of the furthest one-third of the board – the three ranks occupied by the opponent's pieces at setup. The zone is typically delineated on shogi boards by two inscribed dots. When a piece is moved, if part of the piece's path lies within the promotion zone (that is, if the piece moves into, out of, or wholly within the zone; but not if it is dropped into the zone – see Drops), then the player has the option to promote the piece at the end of the turn. Promotion is indicated by turning the piece over after it moves, revealing the character of the promoted piece.
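The promotion test just described can be sketched in a few lines of Python. The rank numbering below is an assumption for illustration: ranks run 1 to 9, with ranks 1 to 3 as Black's promotion zone and ranks 7 to 9 as White's; a moved piece may promote if either end of its move lies inside the mover's zone.

# Minimal sketch of the promotion-zone test: a piece moved on the board
# may promote if its move starts or ends inside the mover's zone.
# Assumes ranks 1..9, with ranks 1-3 as Black's zone and 7-9 as White's.
def may_promote(player: str, from_rank: int, to_rank: int) -> bool:
    zone = range(1, 4) if player == "black" else range(7, 10)
    return from_rank in zone or to_rank in zone

print(may_promote("black", from_rank=4, to_rank=3))  # True: moves into the zone
print(may_promote("black", from_rank=3, to_rank=4))  # True: moves out of the zone
print(may_promote("white", from_rank=5, to_rank=6))  # False: entirely outside it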
If a pawn or lance is moved to the furthest rank, or a knight is moved to either of the two furthest ranks, that piece must promote (otherwise, it would have no legal move on subsequent turns). A silver general is never required to promote, and it is often advantageous to keep a silver general unpromoted. (It is easier, for example, to extract an unpromoted silver from behind enemy lines; whereas a promoted silver, with only one line of retreat, can be easily blocked.) A rook, bishop, or pawn is almost always promoted, unless there is a problem due to "mate with a dropped pawn".
Promoting a piece changes the way it moves. The various pieces promote as follows:
A silver general, knight, lance, or pawn has its normal power of movement replaced by that of a gold general.
A rook or bishop keeps its original movement and gains the power to move one square in any direction (like a king). For a promoted bishop, this means it is able to reach any square on the board, given enough moves.
A king or a gold general does not promote; nor can a piece that is already promoted.
When captured, a piece loses its promoted status. Otherwise promotion is permanent.
A promoted rook ("dragon king", 龍王 ryūō; alternate forms: 龍, 竜) moves as a rook and as a king. It is also called a dragon.
A promoted bishop ("dragon horse", 龍馬 ryūma; alternate form: 馬) moves as a bishop and as a king. It is also known as a horse.
A promoted silver (成銀 narigin; alternate forms: 全, cursive 金), a promoted knight (成桂 narikei; alternate forms: 圭, 今, cursive 金), a promoted lance (成香 narikyō; alternate forms: 杏, 仝, cursive 金) and a promoted pawn (と金 tokin; alternate forms: と, 个) all move the same way as a gold general. The promoted pawn is often called by its Japanese name tokin, even by non-Japanese players.
Drops
Captured pieces are retained in hand and can be brought back into play under the capturing player's control. The Japanese term for piece(s) in hand is either 持ち駒 mochigoma or 手駒 tegoma. On any turn, instead of moving a piece on the board, a player may select a piece in hand and place it – unpromoted side up and facing the opposing side – on any empty square. The piece is then one of that player's active pieces on the board and can be moved accordingly. This is called dropping the piece, or simply, a drop. A drop counts as a complete move.
A drop cannot capture a piece, nor does dropping within the promotion zone result in immediate promotion. Capture and/or promotion may occur normally, however, on subsequent moves of the piece.
Restrictions. There are three restrictions on dropping pieces; the last two of these apply only to pawns.
Piece with No Moves ( ikidokorononaikoma): Pawns, lances and knights may not be dropped onto the last (9th) rank, and knights may not be dropped onto the penultimate (8th) rank; this is because such dropped pieces would have no legal moves on subsequent turns (as they can only move in the forward direction).
Two Pawns ( nifu): A pawn may not be dropped onto a file (column) containing another unpromoted pawn of the same player (promoted pawns do not count).
Drop Pawn Mate ( uchifuzume): A pawn may not be dropped to give an immediate checkmate. (This rule applies only to pawns, drops and checkmates − to clarify, a player may deliver an immediate checkmate by dropping a non-pawn piece, a player may checkmate a king with a pawn that is already on the board, and a pawn may be dropped to give an immediate check as long as it does not also result in checkmate.)
A corollary of the second restriction is that a player with an unpromoted pawn on every file is unable to drop a pawn anywhere. For this reason, it is common to sacrifice a pawn in order to gain flexibility for drops.
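The file-based nature of the two-pawn restriction makes it easy to express in code. The following Python sketch is illustrative only; the board representation (a dict mapping (file, rank) squares to (owner, piece, promoted) tuples) is a hypothetical one chosen for brevity, not a standard shogi data structure.

# Minimal sketch of the two-pawn (nifu) check: dropping a pawn on a file
# is illegal if that file already holds one of the player's own
# unpromoted pawns. Promoted pawns (tokin) do not count.
def nifu(board: dict, player: str, file: int) -> bool:
    return any(
        owner == player and piece == "pawn" and not promoted
        for (f, _rank), (owner, piece, promoted) in board.items()
        if f == file
    )

board = {(2, 7): ("black", "pawn", False), (2, 4): ("white", "pawn", False)}
print(nifu(board, "black", 2))  # True: a second black pawn on file 2 would be illegal
print(nifu(board, "black", 5))  # False: file 5 has no black pawn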
Captured pieces are typically kept on a wooden stand (駒台 komadai) which is traditionally placed so that its bottom-left corner aligns with the bottom-right corner of the board from the perspective of each player. It is not permissible to hide pieces from full view.
It is common for players to swap bishops, which oppose each other across the board, early in the game. This leaves each player with a bishop in hand to be dropped later. The ability for drops in shogi gives the game tactical richness and complexity. The fact that no piece ever goes entirely out of play accounts for the rarity of draws.
Check
When a player's move threatens to capture the opposing king on the next turn, the move is said to give check to the king and the king is said to be in check. If a player's king is in check, that player's responding move must remove the check if possible. Ways to remove a check include moving the king away from the threat, capturing the threatening piece, or placing another interposing piece between the king and the threatening piece.
To announce check in Japanese, one can say ōte (). However, this is an influence of international chess and is not required, even as a courtesy. Announcing a check vocally is unheard of in serious play.
End of the game
The usual way for shogi games to end is for one side to checkmate the other side's king, after which the losing player will be given the opportunity to admit defeat. Unlike western chess or xiangqi, checkmate is almost always the result in shogi since pieces never retire from play which gives the players a sufficient number of pieces to deliver checkmate. That said, there are three other possible ways for a game to end: repetition ( sennichite), impasse ( jishōgi), and an illegal move (反則手). The first two – repetition and impasse – are particularly uncommon. Illegal moves are also uncommon in professional games although this may not be true with amateur players (especially beginners).
Unlike western chess, there is no tradition of offering a mutual draw by agreement.
Checkmate
If the king is in check and there is no possible move which could protect the king, the move is said to checkmate (tsumi 詰み) the king. Checkmate effectively means that the opponent wins the game as the player would have no remaining legal moves. (See also: tsumeshogi, hisshi.)
Resignation
The losing player will usually resign when the situation is thought to be hopeless and may declare the resignation at any time during their turn. Although a player may resign just after they are checkmated, playing up to the checkmate point rarely occurs in practice as players normally resign as soon as a loss is deemed inevitable – such as when a tsume (forced mate sequence) is realized by the losing player. Similarly, if a player were to lose in an Entering King situation (see section below) by having less than 24 points (or by any of the other Impasse rules used by amateurs), then the player will usually resign before that point.
In traditional tournament play, a formal resignation is required – that is, a checkmate is not a sufficient condition for winning. The resignation is indicated by bowing and/or saying 'I lost' (負けました makemashita) and/or placing the right hand over the piece stands. Placing the hand over the piece stand is a vestige of an older practice of gently dropping one's pieces in hand over the board in order to indicate resignation. In western practice, a handshake may be used.
Illegal move
In professional and serious (tournament) amateur games, a player who makes an illegal move loses immediately. The loss stands even if play continued and the move was discovered later in the game. However, if neither the opponent nor a third party points out the illegal move and the opponent later resigns, the resignation stands as the result.
Illegal moves include:
Violating the Two Pawns (nifu) restriction (See §Drops above.)
Violating the Drop Pawn Mate (uchifuzume) restriction
Dropping or moving a piece to position where it cannot move (such as dropping a knight to an opponent's last two ranks, etc.)
Dropping a piece with its promoted value
Playing out of turn, e.g. making more than one move or white moving first instead of moving second.
Making perpetual check four times (cf. sennichite)
Leaving one's king in check, or moving one's king into check
Moving a piece contrary to how its movements are defined (for example, moving a gold like a silver, or moving an unpromoted bishop off its legal diagonal)
In friendly amateur games, this rule is sometimes relaxed, and the player may be able to take back the illegal move and replay a new legal move.
In particular, the Two Pawn violation is the most common illegal move played by professional players. The Two Pawn violation played by Takahiro Toyokawa (against Kōsuke Tamura) in the 2004 NHK Cup is infamous since it was broadcast on television. On the 109th move, Toyokawa (playing as Black) dropped a pawn to the 29 square while he already had a pawn in play on the board on the 23 square and, thus, lost the game.
Repetition (draw)
If the same game position occurs four times with the same player to move and the same pieces in hand for each player, then the game ends in a repetition draw (千日手 sennichite, lit. "moves for a thousand days"), as long as the positions are not due to perpetual check. Perpetual check (連続王手の千日手) is an illegal move (see above), which ends the game in a loss in tournament play.
In professional shogi, a repetition draw outcome is not a final result, as draws essentially do not count: a victor must be decided through wins. In the case of a repetition draw, professional shogi players have to immediately play a subsequent game (or as many games as necessary) with sides reversed in order to obtain a true win outcome. (That is, the player who was White becomes Black, and vice versa.) Also, depending on the tournament, professional players play the subsequent game in the remainder of the allowed game time.
Thus, aiming for a repetition draw may be a possible professional strategy for the White player in order to play the second replay game as Black, which has a slight statistical advantage and/or greater initiative. For instance, Bishop Exchange Fourth File Rook is a passive strategy for White with the goal of a repetition draw (as it requires two tempo losses – swinging the rook and trading the bishops) while it is a very aggressive strategy if played by Black.
Repetition draws are rare in professional shogi occurring in about 1–2% of games and even rarer in amateur games. In professional shogi, repetition draws usually occur in the opening as certain positions are reached that are theoretically disadvantaged for both sides (reciprocal zugzwang). In amateur shogi, repetition draws tend to occur in the middle or endgame as a result of player errors.
Impasse
The game reaches an Impasse or Deadlock (持将棋 jishōgi) if both kings have advanced into their respective promotion zones – a situation known as 相入玉 (ai-nyū gyoku "double entering kings") – and neither player can hope to mate the other or to gain any further material. An Impasse can result in either a win or a draw. If an Impasse happens, the winner is decided as follows: each player agrees to an Impasse, then each rook or bishop, promoted or not, scores 5 points for the owning player, and all other pieces except kings score 1 point each. A player scoring fewer than 24 points loses. (Note that in the start position, both players have 27 points each.) If neither player has fewer than 24, the game is no contest — a draw. In professional shogi, an Impasse result is always a draw since a player that cannot obtain the 24 points will simply resign. Jishōgi is considered an outcome in its own right rather than no contest, but there is no practical difference. As an Impasse needs to be agreed on for the rule to be invoked, a player may refuse to do so and attempt to win the game in future moves. If that happens, there is no official rule about the verdict of the game.
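The point count used to resolve an Impasse is simple arithmetic, and a small sketch makes the rule explicit; the piece encoding below is an assumption for illustration only.

```python
# Impasse (jishogi) scoring: each rook or bishop is worth 5 points, the king 0,
# and every other piece 1 point, whether promoted or not, on the board or in hand.
MAJOR_PIECES = {"R", "B"}

def impasse_points(pieces):
    """`pieces` lists the base letters of one player's pieces, e.g. ["K", "R", "P", ...]."""
    total = 0
    for piece in pieces:
        if piece == "K":
            continue                        # the king itself scores nothing
        total += 5 if piece in MAJOR_PIECES else 1
    return total

def impasse_result(black_pieces, white_pieces):
    """Apply the 24-point rule once both players have agreed to the Impasse."""
    if impasse_points(black_pieces) < 24:
        return "White wins"
    if impasse_points(white_pieces) < 24:
        return "Black wins"
    return "draw (no contest)"              # both sides hold 24 points or more

# All 54 points stay in play for the whole game, so at most one side can fall below 24.
```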
However, in amateur shogi, there are different practices most of which force a win resolution to the Impasse in order to avoid a draw result.
The first draw by Impasse occurred in 1731 in a bishop handicap game between the seventh Lifetime Meijin, , and his brother, Sōkei Ōhashi.
Entering King
As a practical matter, when an opponent's king has entered a player's own territory especially with supporting defending pieces, the opponent's king is often very difficult to mate given the forward attacking nature of most shogi pieces. This state is referred to as entering king (入玉 nyū gyoku). If both players' kings are in entering king states, the game becomes more likely to result in an impasse.
In the adjacent diagram example, although White's king is in a strong Bear-in-the-hole castle, Black's king has entered White's territory making it very difficult to mate. Therefore, this position favors Black.
An example of Entering King occurred in the fourth game of the 60th Ōi title match between Masayuki Toyoshima and Kazuki Kimura, held on August 20–21, 2019. After being unsuccessful in attacking Kimura and also in defending his own king within his camp, Toyoshima (playing as White) moved his king away from Kimura's attacking pieces by fleeing up the second file, ultimately entering his king into Kimura's camp by move 150. Although Toyoshima had achieved Entering King, he still had only 23 points, one point shy of the required 24 points for an Impasse draw, while Kimura (Black) had 31 points. Toyoshima then spent the next 134 moves trying to bring his point total, which fluctuated between 17 and 23, up to the necessary 24. By the 231st move, the game had reached a Double Entering Kings state, and by move 285 Kimura had successfully kept Toyoshima's point total at bay. Here, Toyoshima, with 20 points (to Kimura's 34), resigned. Incidentally, this game broke the record for the longest game in a title match.
Amateur resolutions
For amateur games, there are various guidelines with little standardization. Fairbairn reports a practice from the 1980s (treated as a rule by the now-defunct Shogi Association for The West) in which the dispute is resolved by each player moving all of their pieces into the promotion zone, after which the game ends and the points are tallied.
Another resolution is the 27-Point rule (27点法) used in some amateur tournaments. One version is simply that the player who has 27 or more points wins the Impasse. Another version is a 27-Point Declaration rule. For instance, the Declaration rule on the online shogi site 81Dojo is that the player who wants to claim an Impasse win must (i) declare the intention to win via Impasse, (ii) have the king in the enemy camp (that player's promotion zone), (iii) have at least 10 other pieces in the promotion zone, (iv) not be in check, (v) have time remaining, and (vi) have 28 or more points if Black or 27 or more points if White. If all of these conditions are met, the declarer wins the game regardless of whether the opponent objects.
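The declaration version of the rule is essentially a six-item checklist, which the sketch below strings together. The field names are invented for illustration and do not correspond to the 81Dojo server or any real API.

```python
from dataclasses import dataclass

@dataclass
class DeclarationState:
    """Hypothetical snapshot of the declaring player's situation."""
    is_black: bool
    king_in_enemy_camp: bool        # condition (ii)
    pieces_in_promotion_zone: int   # condition (iii): pieces other than the king
    in_check: bool                  # condition (iv)
    time_remaining: float           # condition (v), seconds left on the clock
    points: int                     # condition (vi), counted as in the Impasse rule

def declaration_wins(state: DeclarationState) -> bool:
    """Return True if declaring an Impasse win (condition (i)) would succeed."""
    required = 28 if state.is_black else 27
    return (state.king_in_enemy_camp
            and state.pieces_in_promotion_zone >= 10
            and not state.in_check
            and state.time_remaining > 0
            and state.points >= required)
```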
Yet another resolution to the Impasse is the so-called Try Rule (トライルール torairūru). In this case, after both kings have entered their respective promotion zones, the first player to move their king onto the opponent's king's starting square (51 for Black, 59 for White) wins. As an example, the popular 将棋ウォーズ (Shogi Wars) app by HEROZ Inc. used the Try Rule up until 2014. (The app now uses a variant of the 27-Point Declaration rule, although it differs from the variant used on the 81Dojo site.) The idea of the Try Rule was taken from rugby football (see Try (rugby)).
Draws in tournaments
In professional tournaments, the rules typically require drawn games to be replayed with sides reversed, possibly with reduced time limits. Draws are rare compared to chess and xiangqi, occurring at a rate of only 1–2% even in amateur games.
The 1982 Meijin title match between Makoto Nakahara and Hifumi Katoh was unusual in this regard with an impasse draw in the first (Double Fortress) game on April 13–14 (only the fifth draw in the then 40-year history of the tournament). This game (with Katoh as Black) lasted for 223 moves with 114 minutes spent pondering a single move. One of the reasons for the length of this game was that White (Nakahara) was very close to falling below the minimum of 24 points required for a draw. Thus, the end of the endgame was strategically about trying to keep White's points above the 24-point threshold. In this match, sennichite occurred in the sixth and eighth games. Thus, this best-of-seven match lasted eight games and took over three months to finish; Black did not lose a single game and the eventual victor was Katoh at 4–3.
Time control
Professional games are timed as in international chess, but professional shogi players are almost never expected to keep time in their games. Instead a timekeeper is assigned, typically an apprentice professional. Time limits are much longer than in international chess (9 hours a side plus extra time in the prestigious Meijin title match), and in addition byōyomi (literally "second counting") is employed. This means that when the ordinary time has run out, the player will from that point on have a certain amount of time to complete every move (a byōyomi period), typically upwards of one minute. The final ten seconds are counted down, and if the time expires the player to move loses the game immediately. Amateurs often play with electronic clocks that beep out the final ten seconds of a byōyomi period, with a prolonged beep for the last five.
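Byōyomi can be thought of as a per-move countdown that only begins once the main time is exhausted. The following simplified sketch models that logic; it is an illustrative assumption rather than a description of any tournament clock.

```python
class ByoyomiClock:
    """Simplified model: ordinary main time first, then a fixed byoyomi period per move."""

    def __init__(self, main_time, byoyomi):
        self.main_time = main_time   # seconds of ordinary thinking time
        self.byoyomi = byoyomi       # seconds allowed per move once main time is gone

    def spend(self, seconds):
        """Charge `seconds` of thought for one move; return False if the player forfeits."""
        if self.main_time > 0:
            from_main = min(self.main_time, seconds)
            self.main_time -= from_main
            seconds -= from_main
        # Whatever thinking time remains must fit within a single byoyomi period;
        # the period resets every move and unused seconds are never carried over.
        return seconds <= self.byoyomi

# Example: nine hours of main time with a one-minute byoyomi period.
clock = ByoyomiClock(main_time=9 * 3600, byoyomi=60)
print(clock.spend(30 * 60))   # True: a 30-minute think while main time remains
```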
Player rank and handicaps
Amateur players are ranked from 15 kyū to 1 kyū and then from 1 dan to 8 dan. Amateur 8 dan was previously awarded only as an honorary rank to famous people. While it is now possible to earn amateur 8 dan through playing strength (by winning the amateur Ryūō tournament three times), this has yet to be achieved.
Professional players operate with their own scale, from 6 kyū to 3 dan for pro-aspiring players and professional 4 dan to 9 dan for formal professional players. Amateur and professional ranks are offset (with amateur 4 dan being equivalent to professional 6 kyū).
Handicaps
Shogi has a handicap system (like go) in which games between players of disparate strengths are adjusted so that the stronger player is put in a more disadvantageous position in order to compensate for the difference in playing levels. In a handicap game, one or more of White's pieces are removed from the setup, and instead White plays first.
Notation
There are two common systems used to notate piece movements in shogi game records. One is used in Japanese-language texts, while a second was created for western players, in English, by George Hodges and Glyndon Townhill. The latter system was updated by Hosking to be closer to the Japanese standard (two numerals). Other systems are used to notate shogi board positions. Unlike chess, the origin (the 11 square) is at the top right of a printed position rather than the bottom left.
In western piece movement notation, the format is the piece initial followed by the type of movement and finally the file and rank where the piece moved to. The piece initials are K (King), R (Rook), B (Bishop), G (Gold), S (Silver), N (Knight), L (Lance), and P (Pawn). Simple movement is indicated with -, captures with x, and piece drops with *. The files are indicated with numerals 1–9. The older Hodges standard used letters a–i for ranks, and the newer Hosking standard also uses numerals 1–9 for the ranks. Thus, Rx24 indicates 'rook captures on 24'. Promoted pieces are notated with + prefixed to the piece initial (e.g. +Rx24). Piece promotion is also indicated with + (e.g. S-21+) while unpromotion is indicated with = (e.g. S-21=). Piece ambiguity is resolved by notating which square a piece is moving from (e.g. N65-53+ means 'knight from 65 moves to 53 and promotes,' which distinguishes it from N45-53+).
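To make the western notation concrete, the short sketch below assembles a move string from its components following the Hosking convention (numerals for both file and rank). The function and its parameters are hypothetical.

```python
def western_move(piece, dest_file, dest_rank, *, capture=False, drop=False,
                 promoted=False, promotes=None, origin=None):
    """Build a western-style shogi move string in the Hosking convention.

    promotes: True appends '+', False appends '=' (promotion declined),
              None means promotion is not an option for this move.
    origin:   (file, rank) of the moving piece, given only when needed to
              resolve ambiguity between identical pieces.
    """
    text = ("+" if promoted else "") + piece
    if origin is not None:
        text += f"{origin[0]}{origin[1]}"
    text += "*" if drop else ("x" if capture else "-")
    text += f"{dest_file}{dest_rank}"
    if promotes is True:
        text += "+"
    elif promotes is False:
        text += "="
    return text

print(western_move("R", 2, 4, capture=True))                    # Rx24
print(western_move("P", 2, 4, drop=True))                       # P*24
print(western_move("S", 2, 1, promotes=True))                   # S-21+
print(western_move("N", 5, 3, origin=(6, 5), promotes=True))    # N65-53+
```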
The Japanese notation system uses Japanese characters for pieces and promotion indication and uses Japanese numerals instead of letters for ranks. Movement type aside from drops is not indicated, and the conventions for resolving ambiguity are quite different from the western system. As examples, the western Rx24 would be ２四飛 in Japanese notation, +Rx24 would be ２四龍, S-21+ would be ２一銀成, S-21= would be ２一銀不成, and N65-53+ would be ５三桂左成, showing that the leftmost knight jumped (implicitly from the 65 square), which distinguishes it from ５三桂右成, in which the rightmost knight jumped.
Although not strictly part of the move notation, game results are indicated in Japanese newspapers, websites, etc. with wins indicated by a white circle and losses by a black circle.
Strategy and tactics
Shogi is similar to chess but has a much larger game tree complexity because of the use of drops, greater number of pieces, and larger board size. In comparison, shogi games average about 140 (half-)moves per game (or 70 chess move-pairs) whereas chess games average about 80 moves per game (or 40 chess move-pairs) and minishogi averages about 40 moves per game (or 20 chess move-pairs).
Like chess, however, the game can be divided into the opening, middle game and endgame, each requiring a different strategy. The opening consists of arranging one's defenses usually in a castle and positioning for attack; the mid game consists of attempting to break through the opposing defenses while maintaining one's own; and the endgame starts when one side's defenses have been compromised.
In the adjacent diagram, Black has chosen a Ranging Rook position (specifically Fourth File Rook), in which the rook has been moved leftward away from its starting square. Additionally, Black is using a Silver Crown castle, a fortification built from one silver and two golds with the king moved inside; the silver crown name comes from the silver sitting directly above the king's head on the 27 square as if it were a crown. In the diagram, White has chosen a Static Rook position, in which the rook remains on its starting square. This particular Static Rook position is a type of Counter-Ranging Rook position known as Bear-in-the-hole Static Rook, which uses a Bear-in-the-hole castle. In this fortification the king is moved all the way into the corner of the board on the 11 square, as if it were a badger in a hole, with a silver on the 22 square closing up the hole and reinforcing golds on the 31 and 32 squares. This board position required 33 moves (or 12 move pairs as counted in western chess) to construct.
Etiquette
Shogi players are expected to follow etiquette in addition to the explicitly stated rules. Commonly accepted etiquette includes the following:
greetings to the opponent both before and after the game
avoiding disruptive actions both during the game and after, for instance:
not changing a move once it has been made on the board
withdrawing gracefully, without disruptive behavior such as scattering pieces across the board out of frustration
announcing one's resignation
Shogi piece sets may contain two types of king pieces, 王将 (king) and 玉将 (jewel). In this case, the higher-classed player, by either social standing or shogi rank, may take the king piece. For example, in titleholder system games, the current titleholder takes the king piece as the higher-ranked player.
The higher-ranked (or older) player also sits facing the door of the room and is the person who takes the pieces out of the piece box.
Shogi does not have a touch-move rule as in western chess tournament play or chu shogi. However, in professional games, a piece is considered to be moved when the piece has been let go of. In both amateur and professional play, any piece may be touched in order to adjust its centralization within its square (to look tidy).
Taking back moves (待った matta) in professional games is prohibited. However, in friendly amateur games in Japan, it is often permitted.
Professional players are required to follow several ritualistic etiquette prescriptions such as kneeling exactly 15 centimeters from the shogi board, sitting in the formal seiza position, etc.
Game setup
Traditionally, the order in which the pieces are placed on the board is prescribed. There are two commonly used orders, the Ōhashi order 大橋流 and the Itō order 伊藤流. In both, pieces with multiples (generals, knights, lances) are placed from left to right, and placement begins in the following order (both full sequences are also written out in the sketch after these lists):
king
gold generals
silver generals
knights
In the Itō order, the player then places:
5. pawns (left to right starting from the leftmost file)
6. lances
7. bishop
8. rook
In the Ōhashi order, the player then places:
5. lances
6. bishop
7. rook
8. pawns (starting from center file, then alternating left to right one file at a time)
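As a rough illustration, the sketch below writes both setup sequences out as plain data so they can be compared or replayed step by step; the exact wording of each step and the left/right alternation of the Ōhashi pawns are assumptions based on the description above.

```python
# Both traditional setup sequences as flat lists of placement steps. Pieces with
# multiples are placed left piece first; file 9 is the leftmost from Black's side.
COMMON_START = ["king", "left gold", "right gold", "left silver", "right silver",
                "left knight", "right knight"]

ITO_ORDER = COMMON_START + (
    [f"pawn on file {f}" for f in range(9, 0, -1)] +      # pawns, leftmost file first
    ["left lance", "right lance", "bishop", "rook"]
)

OHASHI_ORDER = COMMON_START + (
    ["left lance", "right lance", "bishop", "rook"] +
    # pawns from the centre file outward, alternating left then right
    [f"pawn on file {f}" for f in (5, 6, 4, 7, 3, 8, 2, 9, 1)]
)

for step, piece in enumerate(OHASHI_ORDER, start=1):
    print(step, piece)
```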
Furigoma
In amateur tournaments, the higher-ranked player or defending champion performs the piece toss. In professional games, the furigoma is done on behalf of the higher-ranked player or titleholder by the timekeeper, who kneels by the side of the higher-ranked player and tosses the pawn pieces onto a silk cloth. In friendly amateur games, a player will ask the opponent to toss the pawns out of politeness. Otherwise, the person who tosses the pawns can be determined by rock–paper–scissors.
History
From The Chess Variant Pages:
It is not clear when chess was brought to Japan. The earliest generally accepted mention of shogi is (1058–1064) by Fujiwara Akihira. The oldest archaeological evidence is a group of 16 shogi pieces excavated from the grounds of Kōfuku-ji in Nara Prefecture. As it was physically associated with a wooden tablet written on in the sixth year of Tenki (1058), the pieces are thought to date from that period. These simple pieces were cut from a writing plaque in the same five-sided shape as modern pieces, with the names of the pieces written on them.
The dictionary of common folk culture, (c. 1210–1221), a collection based on the two works and , describes two forms of shogi, large (dai) shogi and small (shō) shogi. These are now called Heian shogi (or Heian small shogi) and Heian dai shogi. Heian small shogi is the version on which modern shogi is based, but the Nichūreki states that one wins if one's opponent is reduced to a single king, indicating that drops had not yet been introduced. According to Kōji Shimizu, chief researcher at the Archaeological Institute of Kashihara, Nara Prefecture, the names of the Heian shogi pieces keep those of chaturanga (general, elephant, horse, chariot and soldier), and add to them the five treasures of Buddhism (jade, gold, silver, katsura tree, and incense).
Around the 13th century the game of dai shogi developed, created by increasing the number of pieces in Heian shogi, as was sho shogi, which added the rook, bishop, and drunken elephant from dai shogi to Heian shogi. The drunken elephant steps one square in any direction except directly backward, and promotes to the prince, which acts as a second king and must also be captured along with the original king for the other player to win. Around the 15th century, the rules of dai shogi were simplified, creating the game of chu shogi. Chu shogi, like its parent dai shogi, contains many distinct pieces, such as the queen (identical with Western chess) and the lion (which moves like a king, but twice per turn, potentially being able to capture twice, among other idiosyncrasies). The popularity of dai shogi soon waned in favour of chu shogi, until it stopped being played commonly. Chu shogi rivalled sho shogi in popularity until the introduction of drops in the latter, upon which standard shogi became ascendant, although chu shogi was still commonly played until about World War II, especially in Kyoto.
It is thought that the rules of standard shogi were fixed in the 16th century, when the drunken elephant was removed from the set of pieces present in sho shogi. There is no clear record of when drops were introduced, however.
In the Edo period, shogi variants were greatly expanded: tenjiku shogi, dai dai shogi, maka dai dai shogi, tai shogi, and taikyoku shogi were all invented. It is thought that these were played to only a very limited extent, however. Both standard shogi and Go were promoted by the Tokugawa shogunate. In 1612, the shogunate passed a law giving endowments to top shogi players (). During the reign of the eighth shōgun, Tokugawa Yoshimune, castle shogi tournaments were held once a year on the 17th day of Kannazuki, corresponding to November 17, which is Shogi Day on the modern calendar.
The title of meijin became hereditary in the Ōhashi and Itō families until the fall of the shogunate, when it came to be passed by recommendation. Today the title is used for the winner of the Meijin-sen competition, the first modern title match. From around 1899, newspapers began to publish records of shogi matches, and high-ranking players formed alliances with the aim of having their games published. In 1909, a players' organization was formed, and in 1924, another was formed; the latter was an early incarnation of the modern Japan Shogi Association (JSA), and 1924 is considered by the JSA to be the date it was founded.
In 1935, meijin Kinjirō Sekine stepped down, and the rank of meijin came to be awarded to the winner of a . became the first Meijin under this system in 1937. This was the start of the shogi title matches (see titleholder system). After the war other tournaments were promoted to title matches, culminating with the in 1988 for the modern line-up of seven. About 200 professional shogi players compete. Each year, the title holder defends the title against a challenger chosen from knockout or round matches.
After the Second World War, SCAP (the Allied occupation government, led mainly by the US) tried to eliminate all "feudal" factors from Japanese society, and shogi was included in the list of items that might be banned, along with bushido (the samurai philosophy) and other things. SCAP's reason for targeting shogi was its exceptional character among board games: the use of captured pieces, which SCAP insisted could suggest the idea of prisoner abuse. But Kozo Masuda, then one of the top professional shogi players, when summoned to SCAP headquarters for an investigation, criticized this understanding and argued that it is not shogi but western chess that potentially contains the idea of prisoner abuse, because captured pieces are simply killed, whereas shogi is more democratic in giving prisoners the chance to get back into the game. Masuda also said that chess contradicts the western ideal of gender equality because the king shields itself behind the queen and runs away. Masuda's argument is said to have eventually led to shogi being removed from the list of items to be banned.
Tournament play
There are two organizations for professional shogi players in Japan: the JSA and the LPSA. The JSA is the primary organization for both men's and women's professional shogi, while the LPSA is a group of women professionals who broke away from the JSA in 2007 to establish their own independent organization. Both organize tournaments for their members and have agreed to cooperate with each other to promote shogi through events and other activities. Top professional players are fairly well paid from tournament earnings. In 2016, the highest tournament earners were Yoshiharu Habu and Akira Watanabe, who earned ¥91,500,000 and ¥73,900,000 respectively. (The tenth-highest earner, Kouichi Fukaura, won ¥18,490,000.)
The JSA recognizes two categories of shogi professionals: regular professionals (kishi) and women's professionals. Sometimes kishi are addressed as , a term from Go used to distinguish kishi from other classes of players. JSA professional ranks and female professional ranks are not equivalent, and each has its own promotion criteria and ranking system. In 2006, the JSA officially granted women "professional status". This is not equivalent, however, to the more traditional way of gaining professional status, i.e., being promoted from the Shoreikai, the leagues of strong amateur players aspiring to become professionals; rather, it is a separate system designed especially for female professionals. Qualified amateurs, regardless of gender, may apply for the Shoreikai system, and all those who successfully "graduate" are granted kishi status; however, no woman has yet accomplished this feat (the highest women have reached is the Shoreikai 3-dan league, by Kana Satomi and Tomoka Nishiyama), so kishi is de facto only used to refer to male shogi professionals.
The JSA is the only body which can organize tournaments for professionals, e.g., the eight major tournaments in the titleholder system and other professional tournaments. In 1996, Yoshiharu Habu became the only kishi to hold seven major titles at the same time. For female professionals, both the JSA and LPSA organize tournaments, either jointly or separately. Tournaments for amateurs may be organized by the JSA and LPSA as well as local clubs, newspapers, private corporations, educational institutions or municipal governments for cities or prefectures under the guidance of the JSA or LPSA.
Since the 1990s, shogi has grown in popularity outside Japan, particularly in the People's Republic of China, and especially in Shanghai. The January 2006 edition of stated that there were 120,000 shogi players in Shanghai. The spread of the game to countries where Chinese characters are not in common use, however, has been slower.
In Europe
In Europe, there are currently over 1,200 active players.
Computer shogi
Shogi has the highest game complexity of all popular chess variants. Computers have steadily improved in playing shogi since the 1970s. In 2007, champion Yoshiharu Habu estimated the strength of the 2006 world computer shogi champion Bonanza at the level of two-dan shoreikai.
The JSA prohibits its professionals from playing computers in public without prior permission, in order to promote shogi and to monetize computer–human events.
On October 12, 2010, after some 35 years of development, a computer finally beat a professional player, when the top ranked female champion Ichiyo Shimizu was beaten by the Akara2010 system in a game lasting just over 6 hours.
On July 24, 2011, computer shogi programs Bonanza and Akara crushed the amateur team of Kosaku and Shinoda in two games. The allotted time for the amateurs was one hour and then three minutes per move. The allotted time for the computer was 25 minutes and then 10 seconds per move.
On April 20, 2013, GPS Shogi defeated 8-dan professional shogi player Hiroyuki Miura in a 102-move game which lasted over 8 hours.
On December 13, 2015, the highest rated player on Shogi Club 24 was computer program Ponanza, rated 3455.
On April 10, 2016, Ponanza defeated Takayuki Yamasaki, 8-dan in 85 moves. Takayuki used 7 hours 9 minutes.
In October 2017, DeepMind claimed that its program AlphaZero, after a full nine hours of training, defeated elmo in a 100-game match, winning 90, losing 8, and drawing two.
From a computational complexity point of view, generalized shogi is EXPTIME-complete.
Video games
Hundreds of video games were released exclusively in Japan for several consoles.
Culture
According to professional player Yoshiharu Habu, in Japan shogi is viewed as not merely a game as entertainment or a mind sport but is instead an art that is a part of traditional Japanese culture along with haiku, tanka, noh, ikebana, and the Japanese tea ceremony. Its elevated status was established by the iemoto system supported by the historical shogunate.
The backwards uma (shogi horse symbol) is often featured on merchandise (such as large decorative shogi piece sculptures, keychains, and other keepsakes) sold in Tendō. It also serves as a symbol of good luck. (Cf. Rabbit's foot.) There are multiple theories on its origin. One is that uma (うま) spelled backwards in the Japanese syllabary is まう mau (舞う), which means "to dance", and dancing horses are considered a good-luck omen.
In popular culture
In the manga and anime series Naruto, shogi plays an essential part in Shikamaru Nara's character development. He often plays it with his sensei, Asuma Sarutobi, apparently always beating him. When Asuma is fatally injured in battle, he reminds Shikamaru that the shogi king must always be protected, and draws a parallel between the king in shogi and the children who would grow up to take care of the Hidden Leaf (Konoha) in the future, as well as his yet-unborn daughter, Mirai, whom he wanted Shikamaru to guide.
Shogi has been a central plot point in the manga and anime Shion no Ō, the manga and anime March Comes in Like a Lion, and the manga and television drama 81diver.
In the manga and anime Durarara!!, the information broker Izaya Orihara plays a twisted version of chess, go and shogi, where he mixes all three games into one as a representation of the battles in Ikebukuro.
In the video game Persona 5, the Star confidant, a girl named Hifumi Togo, is a high-school shogi player looking to break into the professional ranks. The player character gains knowledge stat points when spending time with her, supposedly from learning to play shogi, and the abilities learned from ranking up the confidant take their names from Japanese shogi terms.
In the light novel, manga, and anime The Ryuo's Work is Never Done!, protagonist Yaichi Kuzuryū is a prodigy shogi player who won the title of Ryūō at the age of 16. He is approached by Ai Hinatsuru, a 9-year-old girl who begs him to make her his disciple. Astonished by Ai's potential, Yaichi agrees to become her master, and the two then brave themselves together in the world of shogi with their friends and rivals.
In the anime Asobi Asobase, Hanako's butler Maeda tells her that shogi is a sport in which players fire beams from their backsides, because he does not know the rules and so cannot teach her how to actually play. He follows this by demonstrating the "sport" and destroying the roof with a laser beam fired from behind.
In episode 24 of the anime That Time I Got Reincarnated as a Slime, the Storm Dragon Veldora Tempest creates a house rule under which the King General can be promoted to Emperor to avoid checkmate.
In the manga and anime Hikaru no Go, the character Tetsuo Kaga is the strongest shogi player in Haze Middle School and president of their shogi club. He originally was forced to play go by his father, but switched to shogi. This is due to Akira Toya letting him win in order to prevent Tetsuo from being thrown out of his house. While mostly devoted to shogi, he plays go occasionally and is a strong amateur player.
See also
Mind sport
Shogi tactics
Shogi strategy
Shogi variants
Chu shogi
Dai shogi
Dōbutsu shōgi
Tsumeshogi
Chess variants
Crazyhouse
Computer shogi
List of world championships in mind sports
Nine men's morris
Janggi
Xiangqi
Notes
References
Bibliography
SHOGI Magazine (70 issues, January 1976 – November 1987) by The Shogi Association (edited by George Hodges)
External links
Shogi Shack
Reijer Grimbergen's Shogi Page
Shogi.Net
Shogi Hub portal for current information about the shogi world (tournaments, news, etc.)
Shogi-L shogi mailing list
Ricoh Shogi Page
Japanese–English shogi glossary
Hans Geuns' Basic Shogi Vocabulary
International Shogi Magazine
Rules
Shogi Harbour: Level 1 Shogi Course by women's professional player Karolina Styczyńska
40 shogi lessons on YouTube by HIDETCHI
An Introduction to Shogi for Chess Players
Shogi by Hans Bodlaender and Fergus Duniho, The Chess Variant Pages
Rules and Manners of Shogi by Tomohide Kawasaki (a.k.a. HIDETCHI)
FESA - Shogi official playing rules
Shogi, the Japanese Chess by Jean-Louis Cazaux
Shogi and Dobutsu-Animal shogi rules to download by Filip Marek
Online play
81Dojo English-language shogi play online
Lishogi
Shogi Dojo 24 shogi server in Japan
Shogi Wars
Shogi Quest
PlayOK shogi
GoldToken online turn-based shogi
World Shogi League international online tournament associated with 81Dojo and the Japan Shogi Association
HamShogi handicap shogi against the computer, instructions
boardspace.net real time play against human or (weak) computer players.
Online tools
将棋DB2 shogi game record database
Kyokumenpedia game record databases as move decision tree with user-generated wiki annotations (associated with 81Dojo)
Shogi Playground record or play through games, mate problems, board positions
Create Shogi Diagram on the Web
Abstract strategy games
Japanese games
Traditional board games
Games related to chaturanga |