When to use this option?
- When the project file size is large (over 100MB)
- When there are modeling issues (gaps in the enclosure, overlapping elements, or model conditions that are causing slow processing)
- When the model has multiple extraneous elements that are not required for energy analysis, e.g. topography, furniture, plumbing, mechanical equipment, excessive views/detailing/presentation or rendering views, etc.

- Insert the project as a link into a new file/template
- Ensure the link is non-Room-Bounding (so it will not be analyzed)
- Create levels in this new project to match the linked model
- Set the Site Location in the Manage tab of Revit
- Copy essential elements from the link (walls, floors, roofs and glazing), using the Tab key on the keyboard to select; a scripted version of this step is sketched below
- Hide the linked model or use worksets to turn it off
- Analyze the host model with Guardian Glass for BIM with these copied elements only

While there still may be modeling issues, this simplifies a project and allows examination of essential elements to isolate areas that require attention. Copy/Monitor may be used to synchronize with the original file if there are any changes. Create an Energy Model in Revit and, if successful, copy internal walls to add spatial detail and analyze Heat Gains on the internal spaces.
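For teams that prefer to script the copy step, the sketch below shows one way to do it with the Revit API from a pyRevit or RevitPythonShell console. It is a minimal illustration, not part of the Guardian Glass for BIM workflow: the category list, the assumption of a single loaded link, and the variable names are ours.

```python
# Minimal sketch: copy envelope elements from the first loaded link into the
# host model. Assumes a pyRevit/RevitPythonShell environment where `__revit__`
# is available; the categories below are illustrative.
from System.Collections.Generic import List
from Autodesk.Revit.DB import (
    BuiltInCategory, CopyPasteOptions, ElementId, ElementTransformUtils,
    FilteredElementCollector, RevitLinkInstance, Transaction
)

doc = __revit__.ActiveUIDocument.Document  # host model

# First loaded link instance and its document.
link = FilteredElementCollector(doc).OfClass(RevitLinkInstance).FirstElement()
link_doc = link.GetLinkDocument()

# Essential envelope categories only: walls, floors, roofs and glazing.
categories = [
    BuiltInCategory.OST_Walls,
    BuiltInCategory.OST_Floors,
    BuiltInCategory.OST_Roofs,
    BuiltInCategory.OST_Windows,
    BuiltInCategory.OST_CurtainWallPanels,
]

ids = List[ElementId]()
for cat in categories:
    for eid in (FilteredElementCollector(link_doc)
                .OfCategory(cat)
                .WhereElementIsNotElementType()
                .ToElementIds()):
        ids.Add(eid)

# Copy into the host model, honoring the link's placement transform.
t = Transaction(doc, "Copy envelope from link")
t.Start()
ElementTransformUtils.CopyElements(link_doc, ids, doc,
                                   link.GetTotalTransform(), CopyPasteOptions())
t.Commit()
```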
https://guardian-support.fenestrapro.com/knowledgebase/12-0-3-simplify-an-advanced-detailed-model/
Context: As serious as the current health and economic crisis is, COVID-19 may just be the harbinger of future crises, because antimicrobial resistance (AMR), one of the greatest challenges of the 21st century, seems to be ignored.
Relevance: GS-III: Science and Technology, GS-II: Social Justice (Health related issues)
Mains Questions: How can engaging the health, agricultural, trade and environment sectors help in tackling the problem of Antimicrobial resistance (AMR)? (10 marks)
Dimensions of the Article:
- What is Antimicrobial Resistance (AMR)?
- What is Multi-drug resistance?
- The seriousness exposed by Covid-19
- Concerns regarding AMR
- Concerns regarding AMR in India
- Steps taken in India regarding AMR
- What can be done to fight against AMR?

What is Antimicrobial Resistance (AMR)?
- Antimicrobial resistance (AMR) is the ability of microorganisms such as bacteria, fungi, viruses and parasites to remain unaffected by, or survive, antimicrobial drugs such as antibiotics, antivirals and antimalarials.
- AMR occurs when microorganisms exposed to antimicrobial drugs develop resistance, so that standard treatments become ineffective, leading to the persistence and spread of infections.
- Microorganisms that develop antimicrobial resistance are sometimes referred to as “superbugs”.
- The misuse of antimicrobials in medicine and their inappropriate use in agriculture are among the major causes of the spread of antimicrobial resistance.
- Contamination around pharmaceutical manufacturing sites, where untreated waste releases large amounts of active antimicrobials into the environment, also spreads AMR.

What is Multi-drug resistance?
- Multiple drug resistance (MDR), multidrug resistance or multi-resistance is antimicrobial resistance shown by a species of microorganism to multiple antimicrobial drugs.
- The types most threatening to public health are MDR bacteria that resist multiple antibiotics; other types include MDR viruses, fungi and parasites (resistant to multiple antiviral, antifungal and antiparasitic drugs of a wide chemical variety).
- Recognizing different degrees of MDR, the terms extensively drug-resistant (XDR) and pandrug-resistant (PDR) have been introduced.

Basis of Antimicrobial Resistance
- Some bacteria are intrinsically resistant due to the presence of resistance genes and therefore survive exposure to antibiotics.
- Bacteria can also acquire resistance through the sharing and transfer of resistance genes present in the rest of the population, or through genetic mutations that help the bacteria survive antibiotic exposure.

The seriousness exposed by Covid-19
- Since January 2020, there have been over three million deaths globally on account of COVID-19, starkly exposing the vulnerabilities of health systems to infectious diseases, even in the richest countries.
- The speed of COVID-19’s spread across international borders has underscored the need for cross-national cooperation around surveillance, monitoring and disease notification — the key activities that underpin our ability to minimise the impact of acute public health events and maintain global health security.
- Antimicrobial resistance (AMR), the phenomenon by which bacteria and fungi evolve and become resistant to presently available medical treatment, is one of the greatest challenges of the 21st century and is already responsible for up to 7,00,000 deaths a year.
- Unless urgent measures are taken to address this threat, we could soon face an unprecedented health and economic crisis of 10 million annual deaths and costs of up to $100 trillion by 2050, according to the WHO.

Concerns regarding AMR
- Medical procedures such as organ transplantation, cancer chemotherapy, diabetes management and major surgery (for example, caesarean sections or hip replacements) become very risky due to AMR.
- AMR increases the cost of healthcare, with lengthier stays in hospitals, additional tests and the use of more expensive drugs.
- No new classes of antibiotics have made it to the market in the last three decades, largely on account of inadequate incentives for their development and production.
- Without urgent action, we are heading towards a future without antibiotics, in which bacteria become completely resistant to treatment and common infections and minor injuries could once again kill (referred to as the antibiotic apocalypse).
- It is putting the gains of the Millennium Development Goals at risk and endangers achievement of the Sustainable Development Goals.

Concerns regarding AMR in India
- India, with its combination of a large population, rising incomes that facilitate the purchase of antibiotics, a high burden of infectious diseases and easy over-the-counter access to antibiotics, is an important locus for the generation of resistance genes.
- The multi-drug resistance determinant New Delhi Metallo-beta-lactamase-1 (NDM-1) emerged from this region to spread globally – Africa, Europe and other parts of Asia have also been affected by multi-drug resistant typhoid originating from South Asia.
- In India, over 56,000 newborn deaths each year due to sepsis are caused by organisms that are resistant to first-line antibiotics.

Steps taken in India regarding AMR
- India has undertaken many activities, such as Mission Indradhanush — to address low vaccination coverage — and has strengthened micro-planning and additional mechanisms to improve monitoring and accountability.
- The Ministry of Health & Family Welfare (MoHFW) identified AMR as one of the top 10 priorities for the ministry’s collaborative work with the World Health Organisation (WHO).
- India has also launched the National Action Plan on Antimicrobial Resistance (2017-2021).

What can be done to fight against AMR?
- We need sustained investments and global coordination to detect and combat new resistant strains on an ongoing basis.
- Issuing standard treatment guidelines that empower providers to stand up to inappropriate demand, as well as providing point-of-care diagnostics to aid clinical decision-making, will help prevent unnecessary over-exposure to antimicrobials.
- Efforts to control the prescription of antimicrobials should be accompanied by efforts to educate consumers as well, in order to prevent overexposure to antimicrobial drugs.
- It is critical to ensure that all those who need an antimicrobial have access to it – hence, in addition to developing new antimicrobials, infection-control measures should be introduced to reduce dependence on antibiotic use.
- Measures to track the spread of resistance in microbes – such as surveillance to identify these organisms – should be implemented, and their implementation needs to be expanded beyond hospitals to encompass livestock, wastewater and farm run-offs.
https://www.legacyias.com/editorials-opinions-analyses-for-upsc-30-april-2021/
Antibiotic Resistance Can Spread Through The Air, Scientists Warn. Originally published on Science Alert. Antibiotic-resistant bacteria are one of the biggest issues we face in the coming decades, but there's one type of spread that isn't getting enough attention, says a new study – antibiotic resistance genes are spreading through the air. If that sounds terrifying to you, you're not alone. Bacteria's antibiotic resistance genes aren't just inherited through reproduction – in the case of bacteria that's asexual reproduction, where one parent cell becomes two daughter cells, also known as vertical gene transfer. Unlike humans, bacteria can also spread their genes through something called horizontal gene transfer, where bacteria will replicate and then gift genes to other bacteria through a needle-like mechanism called a pilus. But bacteria don't even need to be alive to pass their genes on horizontally, because once they die they release their entire insides into their environment – leaving little DNA packages around for other bacteria that happen to pass by. The act of a bacterium literally reaching out, picking up DNA from its environment and hauling it back into itself with its pilus was recently captured on camera for the first time. To make matters worse, both dead and living bacteria can easily become airborne, moving to new locations and spreading their genes further afield. An international collaboration led by researchers from Peking University in Beijing wanted to take a survey of just how prevalent and varied these airborne genes are. The bad news: they're everywhere.
https://www.infections.nz/home/2018/8/6/antibiotic-resistance-can-spread-through-the-air-scientists-warn
Scientists from the Carol Yu Centre for Infection at the University of Hong Kong examined Escherichia coli bacteria responsible for causing human urinary tract infections (UTIs) and bacteria in faecal samples from humans and food-producing animals. They found that an identical gene for antibiotic resistance was present in all the samples in similar proportions and locations, suggesting that the gene is likely to be transferred between bacteria residing in different hosts. The gene, called aacC2, encodes resistance to the commonly-used antibiotic gentamicin and was found in approx. 80% of human and animal samples. The gene was also found on sections of DNA that are known to swap between different bacterial populations. Both these factors, combined with the identical gene sequences, led the researchers to suggest that aacC2 can transfer between separate populations of bacteria that colonise different species. Dr Pak-Leung Ho, who led the study, says that the ability of antibiotic resistance genes to transfer between humans and animals could make the problem harder to control. "These resistance genes may possibly spread to the human gut via the food chain, through direct contact with animals or by exposure to contaminated water sources. When the resistance genes end up in bacteria that cause infections in humans, the diseases will be more difficult to treat," he said. According to Dr Ho, there is currently a lack of quantitative data on the human risk of exposure to antibiotic-resistant bacteria from animal sources. "Health authorities need to closely monitor the transmission of resistance between food-producing animals and humans and assess how such transfers are affecting the effectiveness of human use of antibiotics," he said. "With the international trading of meats and food animals, antibiotic resistance in one geographic area can easily become global," he explained. Source: ScienceDaily
https://www.poultryworld.net/Home/General/2010/5/Identical-human-and-animal-gene-for-antibiotic-resistance-WP007484W/
Welcome to this lecture on the emergence and spread of antibiotic-resistant bacteria. This lecture explains how antibiotic-resistant bacteria can increase in numbers when exposed to an antibiotic, and how they can spread between people, animals, in society, and across the world. Antibiotic resistance is a natural process. Mutations and other changes in the bacterial DNA happen continuously, and by chance, some bacteria will change so that they're able to survive antibiotic exposure.

Here you see a typical bacterial population. The majority of the bacteria are sensitive to antibiotics, but there may be one or a few antibiotic-resistant bacteria present. If the bacteria are now treated with an antibiotic, most will be killed or at least stop growing. But the resistant bacteria will survive.

As long as the antibiotic is there, the resistant bacteria have no competition and can increase in number. The end result is a bacterial population consisting of mainly resistant bacteria.

If we then look at what happens inside the human body when we have a bacterial infection, we see a similar pattern. Any resistant bacterium that is present will survive antibiotic treatment and can start to increase in number. This can ultimately lead to treatment failure, although this depends on many factors, such as how well your own immune system can take care of the infection.

At the same time, an antibiotic treatment will also have an effect on the bacteria that normally live in the body – the so-called normal flora, which have nothing to do with the infection in question. Instead, these bacteria often have beneficial functions for humans. They can, for example, degrade food or protect against pathogens. Resistance can also develop in the normal flora. And similar to at the site of infection, the antibiotic will allow these resistant bacteria to increase in numbers. This could become a problem later in life if the resistance spreads to bacteria that can cause disease, or if bacteria from the normal flora cause an infection. Once the resistant bacteria increase in number, the risk of spread increases.

This infographic by the European Centre for Disease Prevention and Control, ECDC, explains how resistant bacteria can spread. Points 1 to 3 describe the role of animal farming. Animals may be given antibiotics to treat or prevent disease, or to make them grow faster. And they can, therefore, carry antibiotic-resistant bacteria. Vegetables may be contaminated with antibiotic-resistant bacteria from animal manure used as fertilisers. Antibiotic-resistant bacteria can then spread to humans through food and direct contact with animals. Trade with animals, meat, vegetables, and other goods can further disseminate the resistant bacteria across the world.

Point 4 describes spread within the community. Humans sometimes receive antibiotics prescribed to treat infections, or sometimes self-medicate with antibiotics, particularly in settings where antibiotics are easily available. However, bacteria develop resistance to antibiotics as a natural adaptive reaction. Antibiotic-resistant bacteria can then spread from the treated patient to other persons.

Points 5 and 6 describe spread in health care settings. Humans may receive antibiotics in hospitals, and then carry antibiotic-resistant bacteria. These can spread to other patients via unclean hands or contaminated objects. Patients who may be carrying antibiotic-resistant bacteria will ultimately be sent home and can spread these resistant bacteria to other persons.

Points 7 and 8 highlight the role of international travel. Travellers requiring hospital care while visiting a country with a high prevalence of antibiotic resistance may return with antibiotic-resistant bacteria. Even when not in contact with health care, travellers often become carriers and import resistant bacteria acquired during travel.

So, in conclusion, antibiotic resistance will eventually develop to any antibiotic. The more antibiotics we use, the more likely it is that resistance is maintained among the bacteria. The resistant bacteria can then spread between people, animals, and in the environment via multiple routes. It is unclear exactly how much each of these routes contributes to this process, but all of them play a role in the worldwide emergence and spread of antibiotic-resistant bacteria.

Emergence and spread of antibiotic resistance
Listen to Maria Pränting explaining the mechanisms behind antibiotic resistance and spread of resistant bacteria.

Emergence of resistance
The picture below illustrates the mechanisms through which bacteria can become resistant to antibiotics:
1. alteration of the target site for the antibiotic
2. production of enzymes that inactivate the antibiotic
3. alterations in the cell membrane resulting in decreased permeability and thus decreased uptake of the antibiotic
4. removal of the antibiotic using active transportation of the antibiotic out of the bacteria through so-called efflux pumps
5. use of alternative pathways, which compensate for the action inhibited by the antibiotic
Figure 1. Resistance strategies in bacteria. Courtesy of Dr. E. Gullberg.

Spread of resistant bacteria
Resistant bacteria spread via many routes. Poor hygiene, poor sanitation and poor infection control are three interconnected key factors contributing to the spread of resistant bacteria in health care facilities as well as in the community. Bacteria know no boundaries, and international travelling and trade help disseminate resistant bacteria across the world. This contributes to the complexity of the antibiotic resistance problem and underpins the fact that it is a global issue. Here follows an overview with descriptions of some of the ways resistant bacteria can spread.

Within health-care facilities
Health care facilities are hot spots for resistant bacteria, since many sick people are in close vicinity of each other and antibiotic usage is high, resulting in selection and spread of resistant strains. Poor hygiene practices may facilitate the spread of resistant bacteria via the hands or clothes of doctors, nurses and other health care staff, as well as via patients or visitors. Other risk factors include crowded wards, few isolation rooms, and improper cleaning of the facilities and instruments that are used in patient care.

Between people in the community
Bacteria can spread from one person to another through direct contact between people. Transmission can also occur indirectly, for example when someone coughs. If a person contaminates a surface (such as a doorknob) with bacteria, these bacteria can be transferred to another person who touches the same surface. Good hand hygiene is important to limit the spread of pathogens and the risk of becoming a carrier of resistant bacteria. Still, even with good hygiene practices, bacteria are a normal part of our surroundings that we will be continuously exposed to.

International travel
International travellers help spread resistant bacteria across the world. On any given day, several million people will catch a flight, and if someone carries a resistant bacterium they will bring it along. Many studies have demonstrated that a large proportion of international travellers acquire resistant bacteria during visits to areas with a high prevalence of resistant bacteria. In some studies, more than 70% of people travelling to certain geographical areas were colonized with multidrug-resistant ESBL-producing bacteria upon return. The risk is even higher for hospitalized patients, who are exposed to additional risk factors such as surgery and antibiotic therapy. Several hospital outbreaks have originated from patients transferred from another hospital with a higher prevalence of resistant bacteria.

From animals to humans and from humans to animals
Bacteria can spread from animals to humans, but also the other way around. Many people come into close contact with animals in their daily life, as we keep them as pets in our homes or raise animals for food. Resistant bacteria are common in livestock, and there are several examples of how farmers and their families have become colonized with the same resistant bacteria as their animals. Likewise, livestock veterinarians are at risk of carrying livestock-associated resistant bacteria. The bacteria may then spread further in society. Resistant bacteria are also found in wildlife and migratory birds, but this probably has a limited impact on the increasing rates of resistance in humans.

Food
In many animal farms, antibiotics are used in large quantities to prevent and treat infections as well as for growth promotion, and therefore many farm animals have become colonized with antibiotic-resistant bacteria. During slaughter or when processing the meat, these bacteria can potentially be transferred to the product. Furthermore, fruits and vegetables can become contaminated with animal feces directly from the animals or via contaminated water that is used for irrigation of the crops. Eating food contaminated with bacteria may directly cause an infection, such as diarrhea caused by salmonella, campylobacter and E. coli. Resistant bacterial strains, or genes encoding resistance, may also be transferred to the normal flora of the consumer without causing an infection. The resistant bacteria can potentially cause infections later on and spread to other people. Resistant bacteria are frequently detected in chicken and meat. However, the impact this has on human health is currently not known and may differ in different parts of the world. Some studies demonstrate similarities between the antibiotic-resistance genes found in meat and those found in human pathogens, while other studies have not seen this connection. More research is needed to determine the scale of the problem. Proper cooking and handling of food help to decrease the spread of infections as well as resistant bacteria.

Water
Bacteria can spread via drinking water or water supplies that are used for irrigation, washing cooking utensils or for hygienic purposes. There are many ways resistant bacteria can end up in the water; the release of untreated waste from animals and humans is one important source. Resistant bacteria have been found in many water sources, such as drinking wells, rivers and effluents from wastewater treatment plants. Several bacterial diseases can spread via contaminated water, including typhoid fever and cholera.

Find out more
Go to Downloads at the bottom of this step to access factsheets about the spread of antibiotic resistance by ECDC, CDC and WHO. There is also a downloadable PDF containing the slides from the above lecture with Maria Pränting.
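The selection dynamics described at the start of the lecture are easy to reproduce numerically. Below is a toy simulation; the logistic model, rates and population sizes are our own illustrative assumptions, not figures from the lecture. Under antibiotic exposure the sensitive majority is killed off while the initially rare resistant cells grow to dominate the population.

```python
# Toy simulation of selection under antibiotic exposure: sensitive cells die
# while the rare resistant cells keep growing. All parameters are illustrative.
def simulate(hours=24, dt=0.1, antibiotic=True):
    sensitive, resistant = 1e6, 10.0   # resistant cells start as a tiny minority
    growth = 0.8                       # per-hour growth rate for both strains
    kill = 2.0 if antibiotic else 0.0  # extra per-hour death rate, sensitive only
    capacity = 1e9                     # shared carrying capacity
    for _ in range(int(hours / dt)):
        crowding = 1.0 - (sensitive + resistant) / capacity
        sensitive += dt * sensitive * (growth * crowding - kill)
        resistant += dt * resistant * growth * crowding
        sensitive = max(sensitive, 0.0)
    return sensitive, resistant

for ab in (False, True):
    s, r = simulate(antibiotic=ab)
    print("antibiotic=%5s  sensitive=%.2e  resistant=%.2e  resistant share=%.1f%%"
          % (ab, s, r, 100 * r / (s + r)))
```

Without the antibiotic the resistant strain stays a negligible minority; with it, the population that remains is almost entirely resistant, which is exactly the selection effect the lecture describes.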
https://www.futurelearn.com/courses/antibiotic-resistance/0/steps/19878
Within the next four years, the focus of the LRA INFECTIONS research will be on the spread of antimicrobial resistant microbes in an increasingly urbanised society. The highly interdisciplinary and collaborative research agenda of the 18 participating Leibniz institutes and three external partners enables the development of long-term synergies to advance current knowledge, contribute to the development of countermeasures and provide policy recommendations. Altogether, six projects, the so-called Interdisciplinary Project Teams (IPT1-6), which will be worked on over the next four years through the combined and multidisciplinary expertise of the project partners, span natural, agricultural and urban areas. Among other things, the influence of water, different fly species and containment measures in hospitals on the spread of pathogens will be investigated. The focus of this funding period is a virtual transect – an imaginary line along which measurements or samples are taken at various points. This transect includes agricultural, natural and urban habitats along which the projects are aligned (see figure).

Our central objective is to understand the spread of antimicrobial resistance (AMR) between livestock, the environment and humans. Within the Leibniz Research Alliance INFECTIONS, IPT1 serves as a central hub for bacteriological, molecular biological and spectroscopic identification of AMR bacteria and for bioinformatics analyses of large-scale genomic sequencing data. Metagenomic sequencing of DNA from samples collected within the other IPTs will provide cultivation-independent information on the diversity and abundance of AMR microorganisms. We will combine short-read and long-read DNA sequencing, specific bioinformatics pipelines and public databases to identify antibiotic resistance genes and mutations (a minimal sketch of one such step follows below). Subsequently, specific AMR genes of interest (e.g. common resistance genes) will be traced across a larger number of samples by using quantitative PCR. Dilution to extinction in selective cultivation media will be used to isolate AMR pathogenic bacteria from environmental samples. Bacterial isolates will be tested for their antibiotic susceptibilities, and their cell responses to antibiotics will be characterized by using Raman spectroscopy. Further, the genomes of selected bacterial isolates will be sequenced to track the sources of environmental AMR, such as from flies (IPT4), hygiene studies (IPT5) and open waters (IPT6).
Participating Institutes: DSMZ, IPHT, IGB, IOER, IZW

Antimicrobial resistance (AMR) is becoming an increasingly urgent concern in our world. As more and more microbes develop resistance to antimicrobials, infectious diseases which were commonly treatable become more difficult to deal with. As part of Interdisciplinary Project Team 2 (IPT2), we are primarily focused on microbes which may cause infections of the lung and are well known to develop resistance to antimicrobials. These infections do not usually occur in healthy people, but can occur in those having problems with the immune system or those who suffer from lung conditions, such as cystic fibrosis. Often these microbes do not cause infections alone, but form highly-structured polymicrobial communities, known as “biofilms”. In view of the high cell densities and species diversity, it is not surprising that physical and social interactions within mixed biofilms may lead to cooperation or competition between the cells, which in turn may result in remodelling or even evolution of the members of the biofilm community, including the development of antimicrobial resistance, to name just one possible outcome of microbial evolution. It is therefore extremely important to understand how the microorganisms evolve and adapt to new environments in mixed biofilm-forming consortia, given the rapid development of antimicrobial resistance and the consequences for the treatment of infectious diseases. Within the framework of the current project, we intend to examine the interactions between pulmonary opportunists and arising antimicrobial resistance in mixed biofilms, placing a special focus on bacterial and fungal pathogens affecting cystic fibrosis patients, such as Stenotrophomonas maltophilia and Candida albicans. In order to mimic the conditions of the human respiratory system and achieve the complexity of the in vivo situation in an in vitro model, we will investigate the biofilm communities in physiologically relevant cell culture systems at the air-liquid interface.
Participating Institutes: FZB, HKI, DPZ, DSMZ, LIV, IPHT, ISAS

The rising use of antibiotics in health and agriculture is driving the increase in antimicrobial resistance (AMR). The increasing use of antibiotics in the health and livestock sectors, along with ineffective stewardship, has already led to alarming rates of antimicrobial resistant pathogens – a situation with severe consequences for society. Experts estimate that by 2050 ten million people will die each year from common infections that will then no longer be treatable. The central objective of this research project is to support optimal patterns of antibiotic use worldwide. The focus of this project is on two areas in particular: one, reducing the excessive use of antibiotics, and two, promoting equal access to quality antibiotics. The excessive use of antibiotics stems from two sources: a lack of stewardship, standards and controls of medicines, and a lack of diagnostic tools that allow distinguishing between bacterial and viral infections – information essential for effective treatment. Hence, this project will assess the role of better diagnostic tools, more information and transparency in curbing antimicrobial resistance. It will assess the cost-effectiveness of a variety of measures and interventions at the micro and macro level and provide concrete policy recommendations.
Participating Institutes: BNITM, GIGA, FZB, IfW

The project focuses on vector ecology. Vectors like filth flies are able to transport and disseminate various pathogens, including bacteria, fungi, viruses and parasites. The common house fly (Musca domestica) can transport bacteria over distances of 5-7 km. It breeds in and feeds on decaying organic matter and animal feces. Due to this coprophagic behaviour, bacteria may be ingested and can later be transmitted onto human food by regurgitation and defecation. The frequent preventive use of antibiotics on farms and in livestock production leads to the emergence of antimicrobial resistant bacteria. These can be disseminated by flies from farm environments to urbanized areas, causing severe nosocomial infections among citizens. To understand the potential of flies to spread bacteria over different landscape types and from farm environments to urbanized areas, a mark-release-recapture experiment will be conducted. For this purpose, adults of M. domestica are captured, marked with different colours of a permanent dye and released again. The recapture rate will be determined using specialized dipteran traps. The results will help understand the flight range and the habitat binding of the muscid. Strategies to prevent and inhibit the spread of antimicrobial-resistant bacteria by flies will be elucidated. In a further step, the persistence and multiplication of these bacteria in the flies will be investigated in a laboratory setting. The three larval stages of the muscid fly develop in manure and animal feces, where they may ingest antimicrobial resistant bacteria excreted by the livestock animals. Of special interest is whether the ingested bacteria survive metamorphosis. This would indicate that they can remain in the puparium after histolysis or persist in the newly emerged imagos. Altogether, this project will help understand how antimicrobial resistant bacteria survive and persist during the life cycle of flies and how adult flies transport and disseminate them to urbanized areas.
Participating Institutes: ZALF, DSMZ, ATB, IOER

Within Interdisciplinary Project Team 5 (IPT5), the ATB and partners work intensively together to determine the mechanisms of antimicrobial resistance (AMR) transmission in animal husbandry and to evaluate possible intervention measures. We compare AMR occurrence in typical conventionally grown pigs in a standard and an improved hygienic environment (using insecticides, flytraps, increased disinfection and dusting). In addition, the antibacterial activities of different feed additives will be compared to the basic diet based on microbial colonization in the gut of piglets during the early fattening period. In close cooperation with other INFECTIONS partners, the survival and propagation of AMR microbes in pig faeces, dust and flies under all conditions will be analyzed. Analyses of AMR abundance based on bacteriological methods, PCR and cultivation-independent DNA sequencing will be conducted. With support from other IPTs, the findings will guide interventions and potential mitigation strategies to minimize AMR spread in commercial animal husbandry and to reduce potential contamination of the environment (e.g. of surface water, when slurry is used as organic fertilizer on agricultural fields).
Participating Institutes: ATB, DSMZ, ZALF, HKI, IPHT, IOER, VRC2

Antimicrobial resistance (AMR) has been declared by the World Health Organization (WHO) as one of the top 10 global public health threats at present and in the near future. According to the WHO, AMR will have a disastrous impact within a generation unless an urgent and effective global mitigation plan is finally implemented. In a recent study, data from 204 countries suggest a burden of 1.27 million deaths associated with AMR bacteria in 2019. One of the challenges of monitoring AMR organisms and antimicrobial resistance genes (ARGs) is their ubiquitous distribution, i.e. they are generally found in humans, animals, plants, microbes and the environment, such as water, soil and air. In particular, water plays a key role as a vector and reservoir for AMR transmission across urban and rural areas. Our preliminary results suggest that the abundance of AMR in urban waters and sediments is substantially higher compared to rural water bodies. In addition, rural lake sediments and water derived from farmland are observed to be a major source of environmental AMR due to the frequent use of antibiotics in cattle raising. The approach defined for Interdisciplinary Project Team 6 (IPT6) seeks to provide further understanding of AMR dispersion from WWTPs to the environment, as well as to characterize the humanization of urban water microbiomes. In particular, the use of long-read sequencing in this project will allow recovery of mobile genetic elements (MGEs) such as plasmids and genomic islands (GIs), and thus a higher resolution in the metagenome assembly. Also, risk assessment will be performed in order to evaluate the environmental exposure to ARGs. The results from this project will allow us to directly link the frequency and diversity of AMR and the carrying bacteria to other findings and to team up with institutions of the Leibniz Research Alliance (LRA) INFECTIONS, providing a broad overview of AMR profiles in different waters and potential disease vectors of concern in the area of Berlin-Brandenburg.
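Identifying resistance genes from sequencing data, as in IPT1, commonly involves aligning reads or contigs against a reference database such as CARD and filtering the hits. The sketch below shows that filtering step only; the file name, the BLAST/DIAMOND tabular (outfmt 6) column layout and the thresholds are illustrative assumptions, not details of the IPT1 pipeline.

```python
# Minimal sketch of one common AMR metagenomics step: filter BLAST/DIAMOND
# tabular hits against a resistance-gene database and count them per gene.
# File name and cutoffs are assumptions for illustration.
import csv
from collections import Counter

MIN_IDENTITY = 80.0   # percent identity cutoff (illustrative)
MIN_ALN_LEN = 100     # minimum alignment length in bp (illustrative)

def count_arg_hits(path):
    """Count putative resistance-gene hits per reference gene."""
    hits = Counter()
    with open(path) as fh:
        for row in csv.reader(fh, delimiter="\t"):
            # outfmt 6 columns: qseqid sseqid pident length mismatch gapopen
            #                   qstart qend sstart send evalue bitscore
            gene, identity, length = row[1], float(row[2]), int(row[3])
            if identity >= MIN_IDENTITY and length >= MIN_ALN_LEN:
                hits[gene] += 1
    return hits

if __name__ == "__main__":
    for gene, n in count_arg_hits("sample_vs_card.tsv").most_common(10):
        print(gene, n)
```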
https://leibnizinfections.de/en/research/current-projects
Antibiotic resistance in bacteria is a growing threat to global health. Many of the genes responsible for resistance are carried on mobile genetic elements which can be transferred laterally between strains and species. The most important of these are conjugative and mobilisable elements, including plasmids and integrative and conjugative elements (ICEs). Haemophilus influenzae is an important human pathogen, which was first identified as carrying antibiotic resistance genes in the 1970s. Much of this resistance is encoded by ICEHin1056, which is present in H. influenzae strains worldwide. The aims of this study were to describe features of the biology of ICEHin1056, with particular reference to the genetic site and control mechanisms responsible for instigating conjugative transfer. The origin of transfer has been localised to a sequence on ICEHin1056, and an environmental stressor initiating conjugative transfer, oxidative stress, has been identified. In addition, detailed phylogenetic analysis has demonstrated ICEHin1056 to be part of a much larger family of mobile genetic elements, widely distributed in proteobacteria and carrying accessory genes responsible for survival in adverse environments, virulence and antibiotic resistance. The ICEs in the family have conserved homology of gene content and synteny of gene arrangement over deep evolutionary time, challenging the accepted paradigm of modular mosaicism of mobile genetic elements. A key event in increasing dissemination of the ICE, acquisition of a phage-type integrase gene, has also been identified. The findings presented provide significant insight into the behaviour of ICEs and may in future allow predictions about the spread of virulence factors and antibiotic resistance genes, with important implications for human and animal health.
https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.572683
A new gene that makes bacteria highly resistant to a last-resort class of antibiotics has been found in people and pigs in China - including in samples of bacteria with epidemic potential, researchers said on Wednesday. The discovery was described as "alarming" by scientists, who called for urgent restrictions on the use of polymyxins - a class of antibiotics that includes the drug colistin and is widely used in livestock farming. "All use of polymyxins must be minimized as soon as possible and all unnecessary use stopped," said Laura Piddock, a professor of microbiology at Britain's Birmingham University who was asked to comment on the finding. Researchers led by Hua Liu from the South China Agricultural University, who published their work in the Lancet Infectious Diseases journal, found the gene, called mcr-1, on plasmids - mobile DNA that can be easily copied and transferred between different bacteria. This suggests "an alarming potential" for it to spread and diversify between bacterial populations, they said. The team already has evidence of the gene being transferred between common bacteria such as E. coli, which causes urinary tract and many other types of infection, and Klebsiella pneumoniae, which causes pneumonia and other infections. This suggests "the progression from extensive drug resistance to pandrug resistance is inevitable," they said. "(And) although currently confined to China, mcr-1 is likely to emulate other resistance genes ... and spread worldwide." The discovery of the spreading mcr-1 resistance gene echoes news from 2010 of another so-called "superbug" gene, NDM-1, which emerged in India and rapidly spread around the world. Piddock and others said global surveillance for mcr-1 resistance is now essential to try to prevent the spread of polymyxin-resistant bacteria. China is one of the world's largest users and producers of colistin for agriculture and veterinary use. Worldwide demand for the antibiotic in agriculture is expected to reach almost 12,000 tonnes per year by the end of 2015, rising to 16,500 tonnes by 2021, according to a 2015 report by the QYResearch Medical Research Center. In Europe, 80 percent of polymyxin sales - mainly colistin - are in Spain, Germany and Italy, according to the European Medicines Agency's Surveillance of Veterinary Antimicrobial Consumption (ESVAC) report. For the China study, researchers collected bacteria samples from pigs at slaughter across four provinces, and from pork and chicken sold in 30 open markets and 27 supermarkets in Guangzhou between 2011 and 2014. They also analyzed bacteria from patients with infections at two hospitals in Guangdong and Zhejiang. They found a high prevalence of the mcr-1 gene in E. coli samples from animals and raw meat. Worryingly, the proportion of positive samples increased from year to year, they said, and mcr-1 was also found in 16 E. coli and K. pneumoniae samples from 1,322 hospitalized patients. David Paterson and Patrick Harris from Australia's University of Queensland, writing a commentary in the same journal, said the links between agricultural use of colistin, colistin resistance in slaughtered animals, colistin resistance in food, and colistin resistance in humans were now complete. "One of the few solutions to uncoupling these connections is limitation or cessation of colistin use in agriculture," they said. "Failure to do so will create a public health problem of major dimensions."
http://www.thanhniennews.com/health/alarming-new-superbug-gene-found-in-animals-and-people-in-china-53904.html
Metagenomic analysis software reveals new causes of superbug emergence

Researchers from ITMO University and the Center of Physical and Chemical Medicine have developed an algorithm capable of tracking the spread of antibiotic resistance genes in gut microbiota DNA, and revealed additional evidence of resistance gene transfer between bacterial species. The method can not only contribute to the development of effective therapy schemes, but also curb the spread of superbugs. The results of the research were published in Bioinformatics.

In recent years, the spread of antibiotic resistance has become a global health care problem. As a consequence of excessive antibiotic use in medicine and agriculture, gut microbiota accumulate antibiotic resistance genes in their DNA, or metagenomes. On the one hand, these genes help the normal flora to survive. However, recent studies show that gut microbiota are capable of sharing resistance genes with pathogens, thus making them resistant to available therapies. In this light, studying the spread of resistance genes is especially important.

Programmers from ITMO University, with colleagues from the Research Center of Physical and Chemical Medicine, developed an algorithm called MetaCherchant that makes it possible to explore the environment of a drug resistance gene and see how it changes depending on the bacterial species. "We created a tool that enables scientists to have a closer look at the difference between gene surroundings in two or more samples of microbiota. We can analyze microbiota samples collected from different people or from the same person at different times, for example, before and after antibiotic treatment," says Vladimir Ulyantsev, associate professor of the Computer Technologies Department at ITMO University. "Based on the obtained data, we can suggest how a particular resistance gene could spread from one microbial species to another."

Studies of the antibiotic resistance gene environment are primarily important for designing effective antimicrobial treatment schemes. "Using MetaCherchant, we can analyze how microbiota contributes to the spread of resistance to a particular antibiotic class. Looking forward, it is possible to predict the antibiotics to which pathogens are most likely to spread resistance. On the other hand, we can also find drugs with low resistance risk. This, in turn, will help us adjust and tune specific therapies. This is the question of the next couple of years," says Evgenii Olekhnovich, lead author and researcher at the Center of Physical and Chemical Medicine.

Potential applications of the algorithm are not limited to gut microbiota gene analysis, since the program can also be used to study genome samples from soil, water or sewage. "We can evaluate the spread of resistance within a single bacterial community, such as gut microbiota, as well as between different communities. This allows us, for example, to identify global pathways of antibiotic resistance spread through the environment," says Evgenii Olekhnovich. "The problem of resistance is complex and requires a complex approach, where our tool can be really useful."
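The idea of examining a gene's "environment" can be illustrated with a toy k-mer graph. The sketch below is our own simplified illustration of the general approach, not MetaCherchant's actual algorithm or data structures: it builds an adjacency graph of k-mers from reads, then collects the k-mers reachable within a few steps of a target gene's own k-mers.

```python
# Toy illustration of graph-based "gene environment" analysis: build a k-mer
# graph from reads, then walk outward from the target gene's k-mers to
# collect its local genomic context. Reads, gene and k are made up.
from collections import defaultdict

K = 5  # toy k-mer size; real tools use k around 31

def kmers(seq, k=K):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def build_graph(reads, k=K):
    """Adjacency between consecutive k-mers observed in the reads."""
    graph = defaultdict(set)
    for read in reads:
        ks = kmers(read, k)
        for a, b in zip(ks, ks[1:]):
            graph[a].add(b)
            graph[b].add(a)  # undirected, for neighbourhood exploration
    return graph

def environment(gene, graph, radius=3, k=K):
    """k-mers reachable within `radius` steps of the gene's own k-mers."""
    frontier = set(kmers(gene, k)) & set(graph)
    seen = set(frontier)
    for _ in range(radius):
        frontier = {n for km in frontier for n in graph[km]} - seen
        seen |= frontier
    return seen

reads = ["ACGTACGTTGCA", "GTTGCAAACCTG", "AACCTGGGTTAA"]
gene = "ACGTTGCAAA"  # pretend this is (part of) a resistance gene
print(sorted(environment(gene, build_graph(reads))))
```

Comparing such neighbourhoods between two samples (for example, before and after antibiotic treatment) is what lets one hypothesize which species a resistance gene has moved between.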
https://phys.org/news/2017-11-metagenomic-analysis-software-reveals-superbug.html
Infectious Diseases & Endocrinology 2019: Antimicrobial resistance - A global public health challenge

Antimicrobial resistance (AMR or AR) is a microbe's ability to resist the effects of medication that could once successfully treat the microbe. The term antibiotic resistance (AR or ABR) is a subset of AMR, as it only applies to bacteria that become antibiotic resistant. Resistant microbes are harder to treat, requiring alternative medicines or higher doses of antimicrobials. These approaches can be costlier, more toxic or both. Multi-antimicrobial resistant microbes are called multi-drug resistant (MDR). Those considered to be extensively drug resistant (XDR) or totally drug resistant (TDR) are sometimes referred to as "superbugs." Resistance occurs by one of three mechanisms: natural resistance in some forms of bacteria, genetic mutation, or acquisition of resistance by one species from another. Resistance may develop in all classes of microbes: fungi develop resistance to antifungals, viruses to antivirals, protozoa to antiprotozoal drugs, and bacteria to antibiotics. Due to random mutations, resistance can appear spontaneously. Extended use of antimicrobials, however, tends to select for such mutations, which may make antimicrobials ineffective.

Preventive steps involve the use of antibiotics only when appropriate, thus preventing the abuse of antibiotics or antimicrobials. Where appropriate, narrow-spectrum antibiotics are favored over broad-spectrum antibiotics, since successful and precise targeting of specific species is less likely to cause resistance and side effects. Education regarding proper use is important for people who are taking these drugs at home. Health care providers can reduce the spread of resistant infections by using good sanitation and hygiene, including hand washing and disinfecting between patients, and should promote the same among patients, visitors, and family members.

Increasing drug resistance is caused mainly by the use of antimicrobials in humans and other animals, and the spread of resistant strains between the two. Increasing resistance has also been associated with the dumping of inadequately treated effluents by the pharmaceutical industry, particularly in countries where bulk drugs are made. Antibiotics increase selective pressure in populations of bacteria, causing the death of susceptible bacteria; this increases the proportion of resistant bacteria, which continue to grow. Resistant bacteria can have a growth advantage even at very low levels of antibiotics, and grow faster than vulnerable bacteria. Resistance to antimicrobials is growing globally due to greater access to antibiotic drugs in developing countries. According to available figures, 700,000 to several million deaths occur annually. In the U.S., at least 2.8 million people get infected with antibiotic-resistant bacteria each year, resulting in at least 35,000 deaths. There are public calls for collective global action to address the threat, which include proposals for international treaties on antimicrobial resistance. The worldwide extent of antibiotic resistance is not completely known, but the burden is greater in developing countries with weaker healthcare systems.

The WHO describes antimicrobial resistance as the resistance of a microorganism to an antimicrobial medication that could once cure an infection caused by that microorganism. A person cannot become antibiotic resistant: resistance is a feature of the microbe, not of the microbially infected human or other organism. Antibiotic resistance is a subset of antimicrobial resistance. This more specific resistance is associated with pathogenic bacteria and is therefore broken down into two additional subsets, microbiological and clinical. Microbiologically linked resistance is the most common and arises from mutated or inherited genes which enable the bacteria to resist the mechanism associated with certain antibiotics. Clinical resistance is demonstrated by the failure of therapeutic methods in which bacteria usually susceptible to a medication survive the treatment. In both cases of acquired resistance, the bacteria can pass on the genetic basis of resistance through conjugation, transduction, or transformation.

The primary cause of antimicrobial resistance is the overuse of antimicrobials. This results in microbes either developing protection against the drugs used to treat them, or in strains of microbes with natural antimicrobial resistance becoming far more prevalent than those that are easily vanquished with medication. While antimicrobial resistance occurs naturally over time, the use of antimicrobial agents has led to an increasing prevalence of antimicrobial resistance in a variety of settings, both within the healthcare industry and outside of it. Resistance to antimicrobials can evolve naturally as the evolutionary response to continued antimicrobial exposure. Natural selection means organisms which can adapt to their environment survive and continue to produce offspring. As a result, the types of microorganisms that can persist while these antimicrobial agents continue to strike will inevitably become more abundant in the ecosystem, while those lacking this tolerance will disappear. Over the course of time, most of the strains of bacteria and pathogens present will be of the form immune to the antimicrobial agent used to treat them, rendering this agent ineffective against most microbes. With the increased use of antimicrobial agents, this natural process is accelerating.

Over the past couple of decades, antimicrobial resistance has become one of the most common global public health problems, not only in developed countries but also in developing countries. In daily clinical practice, antibiotics are commonly prescribed for respiratory tract infections, many genitourinary tract infections, acute or chronic gastroenteritis and other gastrointestinal symptoms, and in traumatized patients to prevent secondary infections. Antibiotics are commonly used to prevent and control bacterial infection to reduce mortality and morbidity, but resistance to them has become a major public health challenge in the 21st century. After the achievement of the Millennium Development Goals, antibiotic resistance will be one of the major considerations in setting the Sustainable Development Goals, as the scenario is more endangering and life-threatening than currently anticipated. A complex interaction between genetic and pathogenic properties and environmental and host factors underlies the development of antimicrobial resistance. Several of these factors are modifiable, including inappropriate antibiotic prescribing, patients' illiteracy, the unauthorized sale of antibiotics, inadequate supervision by drug monitoring agencies and the non-human use of antibiotics, such as in animal production. Various studies report that many pathogens have shown high resistance to several commonly used antimicrobials, which is truly alarming. So, judicious strategies should be planned to prevent and combat antimicrobial resistance and make the globe livable for the next generation.
https://www.scitechnol.com/abstract/infectious-diseases-endocrinology-2019-antimicrobial-resistancea-global-public-health-challenge-11271.html
Objective’s analysis is a comprehensive, professional and independent evaluation of the fund, designed to assist investors in the process of investment decision-making. The analysis covers all aspects of the fund and reviews its various characteristics. Each investor can be assisted by the analysis and use it to substantiate an investment decision. To perform the analysis, Objective’s team obtains a large number of materials from the fund, including, inter alia, questionnaires, presentations, private placement memorandums (PPMs), quarterly reports and administrator reports, and uses external information systems for market and returns analyses, etc. For investors electing to make an investment, Objective continues to monitor the fund and issues an annual analysis that reviews its condition. Moreover, Objective initiates meetings with the fund’s representatives, both prior to the investment and throughout the fund’s life cycle.

Key Tools Used to Prepare an Analysis
- Assessment and Analysis Model
- Investment Mapping Questionnaire
- Content Gathering and Access to the Data Room
- Meeting with the Fund’s Advisory Committee
- Extensive Database
- Broad Professional Knowledge and Practical Experience

Analysis Outline
The analysis structure described below is a permanent format to which Objective adds layers and makes adjustments according to the characteristics of each fund.

Analysis throughout the Investment Life Cycle
For those investors who decide to make the investment, Objective will continue to regularly monitor the fund and produce an annual report based on the enhanced pre-investment analysis and the changes that actually occurred in the fund compared to planning. The annual report contains, among others, changes in the fund’s structure, a report on investments actually made and their impact on the fund’s return, an update on market changes that occurred during the period, and the realization of risks, if any risks occurred and new risks were added. The annual report allows the investor to obtain a complete and reliable picture and remain updated on his investments.

Alternative Investment Consulting
Objective provides advisory services tailored to the investor’s needs, including attendance at Investment Committee meetings and presentation of opinions, reviews, recommendations and reports. Objective offers several consulting services in the scope of alternative investments:
- Customized Consultation – Objective’s consultation service includes meetings with the investor and analysis of the investor’s alternative investment portfolio, in which an overview is provided, inter alia, of the alternative investment portfolio, along with personally tailored recommendations.
- Presentation of Alternative Investment Fund – Objective will provide an overview of a particular analysis, a comprehensive explanation and advice on the fund’s suitability to the investor’s needs. Objective’s wide-ranging analysis of the alternative investment will enable the investor to decide whether to include it in the general investment portfolio and to what extent.
- Alternative Investment Knowledge-Base – A broad overview of all current funds raising capital in the market, focusing on different categories (hedge funds, real estate funds, infrastructure funds, credit funds and others) according to the investor’s requirements. With this information, the investor can obtain a broad picture of those funds that are relevant to him, and if he is interested, he can initiate an analysis of the fund he wishes to explore. This service enables the investor to make better-reasoned decisions.

External Operational Control
The internal process of investment operations for alternative funds is carried out manually, as opposed to tradable investments, for which data are recorded in the bank’s systems based on regular reports from the stock exchange. The investor also carries out the operational aspect of the investment according to instructions and reports provided by the fund’s administrator (e.g., the number of units held by the investor, money at call, holdings value). Some investors do not maintain control of the activities of their Operations Department and their bank, which operate under heavy loads and face difficulties in maintaining control of all aspects of the investment. Objective’s controls are secondary controls of the alternative investment activities of the investor’s Operations Department and the bank. Objective provides a quarterly service of control of the appropriateness of the alternative investment portfolio data, which includes identification of failures and monitoring activities in all aspects: security definition, financial movements, revaluation of assets, returns and management fees, and verifying the correctness of the data in the funds’ reports.

Types of Controls:
- Monitoring and overseeing the revaluation of the invested funds.
- Appropriate controls to ensure that periodic reports and regular updates, which are used as a basis for revaluation of the investment funds, are complete, accurate and up-to-date.

Our control services will focus on the following:
- Definitions of a Security – verification of the correctness of asset classification.
- Financial Movements – evaluation of the correctness of execution of financial orders.
- Revaluation of Assets – comparison between the actual results of the revaluation and the funds’ statements.
- Returns – verification of the calculation of fund returns.
- Management Fees – control of management fees in accordance with valid agreements.

The control is based on the following data: email correspondence, quarterly reports and historical data regarding alternative investment funds from the investment house and the operating bank.
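To make the last two controls concrete, here is a toy sketch of the kind of arithmetic such a reconciliation involves. The NAVs, the reported figures, the 1.5% p.a. fee and the day-count convention are all invented for illustration; this is not Objective's methodology.

```python
# Toy reconciliation of a reported quarterly return and management-fee accrual.
# All figures and the 1.5% p.a. fee are hypothetical.
def check_return(nav_start, nav_end, reported_return, tol=1e-4):
    computed = nav_end / nav_start - 1.0
    return abs(computed - reported_return) <= tol, computed

def check_mgmt_fee(avg_nav, reported_fee, annual_rate=0.015, days=91, tol=0.01):
    # Simple actual/365 accrual on average NAV over the quarter.
    expected = avg_nav * annual_rate * days / 365.0
    return abs(expected - reported_fee) <= tol * expected, expected

ok_r, r = check_return(nav_start=102.40, nav_end=105.10, reported_return=0.0264)
ok_f, f = check_mgmt_fee(avg_nav=103.75, reported_fee=0.39)
print("return check: %s (computed %.4f)" % ("PASS" if ok_r else "FAIL", r))
print("fee check:    %s (expected %.4f)" % ("PASS" if ok_f else "FAIL", f))
```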
https://objective.finance/en/services/
By Rohan Chinchwadkar Managing a complex investment portfolio can be challenging for individual investors, especially if financial planning is not done in a systematic way. Many times, investors focus too much on specific questions like ‘which stock to pick’ or ‘when to buy/sell’ and end up with a portfolio which does not satisfy important financial needs. Before creating an investment plan, two critical arrangements have to be made: cash reserve (to cover living expenses for a few months in case of an emergency like job layoff) and insurance (life, health and general insurance). Once cash reserve and insurance are in place, the investor can design and manage the investment portfolio by following a simple four-step portfolio management process. Policy statement The first step in the portfolio management process involves the construction of a policy statement. The policy statement specifies how much and which types of risk the investor is willing to take. The aim is to understand and articulate investment goals and constraints as accurately as possible. The first thing a good financial planner will do is make you think about your short-term, medium-term and long-term financial needs so that you can construct a clear policy statement. This is an extremely important step of the process since it ensures that the constructed portfolio will be customised to suit your needs. A well-defined policy statement also allows you to set a benchmark for portfolio evaluation in the future. You can measure the success of your investment strategy only if you are clear about what you need. Investment goals are usually expressed in terms of risk (which the investor is willing to take) and return (which the investor expects). Goals should be SMART: specific, measurable, achievable, realistic and time-bound. “To make a lot of money” is not a well-defined goal. It is important to note that a careful analysis of the investor’s risk preferences should precede any discussion of return objectives. Investment constraints can be of different forms: liquidity needs (how quickly you might need to sell assets and generate cash), time-bound (need return over a specific time period), tax concerns (capital gains, income tax), personal preferences (common example: no investments in alcohol or tobacco companies) and unique needs arising from your personal situation (husband-wife are both pilots so they should avoid buying stocks of airline companies). Investment strategy The second step involves assessing the external financial and economic conditions and developing a point of view about the future. This assessment along with the investor’s needs (defined by the policy statement) will jointly determine the investment strategy. Since market conditions undergo significant changes over a period of time, they need to be monitored and appropriate changes have to be made in the portfolio to reflect future expectations. This step also helps in setting realistic investment goals and return expectations. Portfolio construction The third step is to construct the portfolio by implementing the investment strategy and deciding how to allocate capital across geographies, asset classes (like equity, debt, real estate and gold) and securities (stocks, bonds). The main objective of portfolio construction is to meet investor needs by taking the minimum possible risk. Different approaches can be used by investors and portfolio managers to construct portfolios. 
Traditional finance theory suggests that investors can build an optimal portfolio by focusing only on the risk and return characteristics of various securities. This approach usually recommends a highly diversified portfolio because it assumes that markets are efficient and that it is difficult for investors to select "winner" stocks. A more practical approach, especially for emerging markets which have significant market inefficiencies, involves a three-step process to select securities: macroeconomic analysis, industry analysis and company analysis (along with stock valuation). Investors who do not want to engage in selection of securities themselves can consider mutual funds and exchange-traded funds.

Continuous monitoring and evaluation

Once a portfolio is constructed, it is critical to continuously monitor investor needs and market conditions so that appropriate changes can be made to the policy statement and investment strategy, whenever necessary. It is also important to evaluate portfolio performance on a risk-adjusted basis and compare it with a suitable market benchmark. Investors should remember that this is an ongoing process and they should revisit all the steps at regular intervals (perhaps an annual review) to ensure that the portfolio gets realigned when investor needs and market expectations change.
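Returning to the portfolio construction step above: the article stops short of the arithmetic, so here is a minimal illustrative sketch (the numbers are assumptions, not recommendations) of why allocating across imperfectly correlated asset classes pushes portfolio volatility below the weighted average of the assets' volatilities.

```python
import math

# Illustrative portfolio risk/return arithmetic. Expected return is the
# weighted average of asset returns; volatility depends on correlations,
# which is where diversification enters. All inputs are hypothetical.

weights = {"equity": 0.60, "debt": 0.30, "gold": 0.10}
exp_ret = {"equity": 0.11, "debt": 0.06, "gold": 0.07}
vol     = {"equity": 0.18, "debt": 0.05, "gold": 0.15}
corr = {("equity", "debt"): 0.10, ("equity", "gold"): -0.10, ("debt", "gold"): 0.05}

def correlation(a, b):
    return 1.0 if a == b else corr.get((a, b), corr.get((b, a)))

portfolio_return = sum(weights[a] * exp_ret[a] for a in weights)
portfolio_var = sum(weights[a] * weights[b] * vol[a] * vol[b] * correlation(a, b)
                    for a in weights for b in weights)

print(f"expected return: {portfolio_return:.2%}")   # 9.10%
print(f"volatility: {math.sqrt(portfolio_var):.2%}")  # ~11.0%, below the 13.8%
                                                      # weighted-average vol
```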
https://stevenfresco.com/construct-manage-investment-portfolio-4-simple-steps/
- A version of this paper can be found here or here.
- Want to read our summaries of academic finance papers? Check out our Academic Research Insight category.

What are the Research Questions?

With the current market conditions and the wild ride we've all been on, we've pivoted our attention to focus on supplying academic research on responding to a crisis. This article investigates the appropriate tactical adjustments investors should consider when making changes to their portfolio holdings following large losses in wealth during a crisis.

What are the Academic Insights?

In a crisis the typical response by investors is a flight to safety. However, in the zero-sum game of trading equities, for every dollar chasing safety there is an equal number of dollars moving aggressively to take on risk. Thus, in a crisis only a subset of investors can flee to safety. By considering how the simple economics of supply and demand play out in a situation where both risk and risk aversion have increased significantly, the authors developed a very simple model calibrated to capture the stylized facts of the recent crisis, and within that model they considered how investors should trade among themselves as conditions change. The authors observe the following:

- When extremely high- and low-risk-tolerance investors do not make up a big portion of investors, the appropriate tactical response for most investors in a crisis can actually be rather small. Specifically, in the base case of the analysis, with no differences in investor expectations, the authors found that for 80% of investors the appropriate adjustment involves less than 4% turnover. Only investors who are extremely risk-averse or risk-tolerant will find it appropriate to make significant changes to their allocations.
- Once you introduce grounds for more divergent investor expectations in the crisis scenario, or if some investors follow a target-weight allocation policy that induces extra trading "on autopilot", turnover will naturally be higher. Also, if there is much greater heterogeneity among investors in their risk tolerances, there will be higher demand for trading between very risk-tolerant and very risk-intolerant investors, again increasing turnover.

Why does it matter?

This study not only highlights that minimal adjustment is the appropriate action for most investors during a crisis, but also underscores the importance of the supply-and-demand principle in asset allocation. Any tactical portfolio adjustments that investors wish to make in response to changed market conditions take place in a market where the laws of supply and demand govern, and tactical responses must be developed with that in mind. In a crisis, prices and risk premiums must adjust so that, as a rough approximation, one can say that the "average investor" will not want to trade. The trades that any particular investor will want to make depend on how that investor's risk preferences and other characteristics compare with those of the average investor.

The Most Important Chart from the Paper:

[chart not reproduced]

Abstract

With respect to the recent financial crisis, the authors argue that the appropriate adjustments to portfolio allocations in response to the market dislocation are determined by equilibrium considerations (supply must equal demand) and depend on individual investors' characteristics relative to societal averages.
Using a simple model that captures the magnitude of the recent crisis, the authors show that the optimal tactical adjustments for most portfolios require a turnover of less than 10%.
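Not from the paper itself, but as a concrete illustration of the turnover figures quoted above, here is a minimal sketch. It assumes the common convention that turnover is half the sum of absolute weight changes; the paper's exact definition may differ.

```python
# Minimal sketch of how "turnover" from a tactical shift can be measured.
# Convention assumed here: turnover = 0.5 * sum(|new weight - old weight|).

def turnover(old, new):
    assets = set(old) | set(new)
    return 0.5 * sum(abs(new.get(a, 0.0) - old.get(a, 0.0)) for a in assets)

strategic = {"equity": 0.70, "bonds": 0.30}
tactical  = {"equity": 0.66, "bonds": 0.34}   # a modest de-risking in a crisis

print(f"turnover: {turnover(strategic, tactical):.1%}")  # 4.0%
```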
https://alphaarchitect.com/2020/05/26/tactically-adjusting-everything-in-a-financial-crisis-bad-idea/
This website is intended for discussion purposes only. Any information relating to performance in these materials is illustrative, and no assurance is given that any indicative returns, performance, or results will be achieved, whether historical or hypothetical. This website does not constitute an offer to sell or a solicitation to buy interests in any fund managed by Ultra Blue Capital, LLC ("UBC"). None of the information provided should be considered impartial investment advice or advice given in a fiduciary capacity. The information contained in this website is qualified in its entirety by reference to disclosures made in the confidential private placement memorandum and related attachments and exhibits for the funds. These documents should be requested and carefully reviewed prior to making an investment decision. Neither the information nor any opinion expressed on this website constitutes an offer by UBC to buy or sell any securities or financial instruments, or to provide any investment advice or service. The services, securities and financial instruments described on this website may not be available to or suitable for you, and not all strategies are appropriate at all times. The value and income of any of the securities or financial instruments mentioned on this website can fall as well as rise, and an investor may get back less than he or she invested. Foreign-currency denominated securities and financial instruments are subject to fluctuations in exchange rates that could have a positive or adverse effect on the value, price or income of such securities and financial instruments. Past performance is not necessarily a guide to future performance. Independent advice should be sought in all cases. Except where otherwise indicated herein, the information provided herein is based on matters as they exist as of the date of preparation and not as of any future date, and will not be updated or otherwise revised to reflect information that subsequently becomes available or circumstances existing or changes occurring after the date hereof. Further, many factors may affect actual performance, including changes in market conditions and interest rates and changes in response to other economic, political, social, health or financial developments. Past performance is not indicative of future results. There is no guarantee that the investment objective of the fund will be achieved. Information and data included on this website are subject to change based on market and other conditions. Unless otherwise indicated, the information presented on this website is current as of the date of the website. An investment in a private investment fund is speculative and involves a high degree of risk. Opportunities for withdrawals or redemptions are restricted, so investors may not have access to capital when needed. There is no secondary market for interests in the private investment funds, and none is expected to develop. An investor should not invest unless prepared to lose all or a substantial portion of its investment. The fees and expenses charged in connection with this investment may be higher than the fees and expenses of other investment alternatives and may offset profits. This website may contain forward-looking projections or statements related to the future performance of a potential portfolio company.
These forward-looking statements can be identified by the use of forward-looking terminology such as "expect," "anticipate," "estimate," "forecast," "initiative," "objective," "plan," "goal," "project," "outlook," "priorities," "target," "intend," "evaluate," "pursue," "seek," "may," "would," "could," "should," "believe," "potential," "continue," or the negative of any of those words or similar expressions (a non-exhaustive list). Due to various risks and uncertainties, actual events or results or the actual performance of a fund and individual trades may differ materially from those reflected or contemplated in such forward-looking statements. Any such projections are subject to a number of risks and uncertainties beyond the control of UBC or any of its subsidiaries. Should one or more of these risks or uncertainties materialize, or should underlying assumptions be incorrect, actual results may vary materially from those projected. Investors are cautioned not to place reliance on forward-looking statements.
https://ultrabluecap.com/disclosures
In recent Client Question of the Month pieces we provided analyses on the advantages of portfolio diversification. In October, we highlighted the benefits of holding a global investment portfolio across equities, fixed income, commodities, and cash. In November, we further emphasized the value of holding both equities (stocks) and fixed income (bonds). To recap, diversified portfolios can lead to more consistent and less volatile results than a single asset class and potentially decrease the odds that an investor will need to make a withdrawal after a significant market decline. Consistency and downside protection are both critically important to long-term investment success. At Winthrop Wealth, portfolio diversification is a key component of our total net worth approach to financial planning and investment management.

This month we will continue our series on diversification and underscore the benefit of a global equity portfolio. We actively allocate equity exposures across different regions, countries, market caps, factors, and sectors based on our market outlook and investment process. The result of our process is a diversified portfolio tilted toward the areas we believe have the most value. Our approach avoids over-concentration in a single strategy or asset class that can move in and out of favor. Our view is that a single strategy (i.e. value or growth equities) may be appropriate for a portion of a portfolio, but not for a client's total net worth.

The following chart displays the rolling 90-day volatility of the MSCI All Country World Index (ACWI) and its top nine individual countries from 1995 to present. According to MSCI, the ACWI Index is designed to represent the performance of the full opportunity set of large- and mid-cap stocks across 23 developed and 26 emerging markets countries. Note on the chart that the volatility level of the MSCI ACWI (white line) is consistently lower than most of the individual countries that make up the index. The reason for the lower volatility is that the individual countries are not perfectly correlated, meaning they do not move in the same direction at the same time. In general, as the number of countries in the index increases, overall volatility decreases. Bottom Line: a global equity portfolio can increase diversification and lower overall volatility.

At Winthrop Wealth, we apply a total net worth approach to both comprehensive financial planning and investment management. Financial planning drives the investment strategy and provides a roadmap to each client's unique goals and objectives. The comprehensive financial plan defines cash flow needs, optimizes account structures, considers tax minimization strategies, and continuously evaluates financial risks as circumstances and/or goals change. The investment management process is designed to provide well-diversified portfolios constructed with a methodology based on prudent risk management, asset allocation, and security selection. As always, please contact us if you have any updates to your personal or financial circumstances.

DISCLOSURES: The economic forecasts set forth in this material may not develop as predicted and there can be no guarantee that strategies promoted will be successful. Content in this material is for general information only and not intended to provide specific advice or recommendations for any individual.
International investing involves special risks such as currency fluctuation and political instability and may not be suitable for all investors. These risks are often heightened for investments in emerging markets. The fast price swings in commodities and currencies will result in significant volatility in an investor’s holdings. All indexes mentioned are unmanaged indexes which cannot be invested into directly. Unmanaged index returns do not reflect fees, expenses, or sales charges. Index performance is not indicative of the performance of any investment. Past performance is no guarantee of future results. MSCI ACWI: Morgan Stanley Capital International All Country World Index (MSCI ACWI) is an index designed to capture large and mid cap representation across 23 developed markets and 23 emerging market countries. The index covers approximately 85% of the global investable equity opportunity set. Represented countries are: Developed Markets: Australia, Austria, Belgium, Canada, Denmark, Finland, France, Germany, Hong Kong, Ireland, Israel, Italy, Japan, Netherlands, New Zealand, Norway, Portugal, Singapore, Spain, Sweden, Switzerland, the United Kingdom, and the United States. Emerging Markets: Brazil, Chile, China, Colombia, Czech Republic, Egypt, Greece, Hungary, India, Indonesia, Korea, Malaysia, Mexico, Peru, Philippines, Poland, Russia, Qatar, South Africa, Taiwan, Thailand, Turkey, and United Arab Emirates. Financial planning is a tool intended to review your current financial situation, investment objectives and goals, and suggest potential planning ideas and concepts that may be of benefit. There is no guarantee that financial planning will help you reach your goals. Likewise, it is important to remember that no investment strategy assures success or protects against loss. Asset allocation does not ensure a profit or protect against loss. There is no guarantee that a diversified portfolio will enhance overall returns or outperform a non-diversified portfolio. Diversification does not protect against market risk. All investing involves risk which you should be prepared to bear.
https://www.winthropwealth.com/client-questions/january-2020-client-question-equity-diversification/
As the industry continues to implement CRM2, IFIC busts seven common misconceptions that advisors, and clients, have about the new reporting standards.

Myth #1: CRM2 applies mainly to mutual funds.

Fact: CRM2 applies to all securities dealers and portfolio managers registered with any Canadian securities commission. The securities commissions are encouraging firms to include non-securities products in client reporting, to the extent possible.

Myth #2: The 2015 statement changes take effect in July 2015.

Fact: These statement changes come into effect December 31, 2015.

Myth #3: Investors will begin receiving the two new annual reports as of July 15, 2016.

Fact: The rule comes into effect on July 15, 2016, at which point dealers have one year to begin sending these reports to their clients. In the majority of cases, investors will begin receiving reports early in 2017, because most firms are choosing to provide the information on a calendar-year basis.

Myth #4: The report on charges and compensation will tell investors how much their advisor is being paid.

Fact: The report on charges and compensation provides details about the money received by the dealer firm over the previous year to provide services to the investor. A portion of this money is paid as compensation to the investor's financial advisor. The report does not provide a breakdown of how much the advisor is paid and how much the dealer firm keeps. Each firm determines this amount differently, based on its business model and the split of responsibilities between the firm and the advisor. Services provided by the dealer firm may include administration, advice and investor protection.

Myth #5: The report on charges and compensation will tell investors the total cost of their investments.

Fact: CRM2 focuses only on the amount paid either directly or indirectly by an investor to the dealer firm. For mutual funds, it does not include the amount paid to the investment manager. For an understanding of the total cost of a mutual fund, investors can review the fund's MER, which can be found in the Fund Facts document for individual mutual funds, as well as in financial statements.

Myth #6: The new report on investment performance will provide benchmarks so that investors can evaluate their personal returns against a benchmark.

Fact: The report on investment performance will not provide benchmarks. The report focuses on the individual investor's personal rate of return, which cannot be compared to a benchmark. The personal rate of return is based on the individual investor's specific deposits into and withdrawals out of his account, as well as dividends and interest earned within the account and changes in the value of the securities held within the account. Since each investor has a different combination of deposits and withdrawals, each investor could have a different personal rate of return.

Myth #7: When investors receive their first reports on charges and compensation, they will be surprised to learn how much their dealers are being paid.

Fact: Investors already receive information about dealer compensation, through information presented in percentage terms in the Fund Facts document and in the simplified prospectus.
The only change under CRM2 is that these amounts will be provided in dollars and cents and at the account level, rather than just in percentage terms. The average MFDA account in 2014 was $44,000, while the average IIROC account was $71,000. For accounts consisting of funds with embedded commissions, the average dealer compensation is between 50 and 100 basis points, which is approximately in the range of $225 to $700 annually.
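Not part of the IFIC piece, but to make Myth #6 concrete, here is a minimal sketch (with hypothetical numbers) of how a money-weighted personal rate of return can be computed from an investor's own deposits and withdrawals, which is exactly why it cannot be compared to an index benchmark.

```python
# Minimal sketch of a money-weighted "personal rate of return": the annual
# rate r at which all of an investor's own cash flows, grown to the horizon,
# equal the ending account value. Numbers are illustrative only.

def personal_rate_of_return(flows, horizon, ending_value, lo=-0.99, hi=1.0):
    """flows: list of (time_in_years, amount); deposits positive,
    withdrawals negative. Solved by bisection."""
    def future_value(r):
        return sum(amt * (1 + r) ** (horizon - t) for t, amt in flows)
    for _ in range(100):
        mid = (lo + hi) / 2
        if future_value(mid) > ending_value:
            hi = mid          # rate too high: grown flows overshoot
        else:
            lo = mid
    return (lo + hi) / 2

# Investor deposits 10,000 at the start and 5,000 after one year;
# the account is worth 16,500 at the end of year two.
flows = [(0.0, 10_000), (1.0, 5_000)]
print(f"personal rate of return: "
      f"{personal_rate_of_return(flows, 2.0, 16_500):.2%}")  # ~5.85%
```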
https://www.advisor.ca/news/industry-news/7-myths-about-crm2/
ASSET ALLOCATION RANGES / HISTORICAL REFERENCE ALLOCATION

The allocation to each asset class will typically fall within these ranges (allocation table not reproduced).

WHAT DOES THIS MODEL DO?

This model seeks to generate a blend of capital growth and income. The historical reference allocation above shows how a portfolio has typically been constructed to achieve the stated risk and return figures. However, the risks and returns of different assets are not static over time and historical returns are not a guide to future returns. This model therefore uses Kleinwort Hambros' dynamic asset allocation to invest in a wider range of asset classes in response to changing market and economic conditions. The model's current indicative asset class ranges are detailed above, and may vary over time.

PERFORMANCE UPDATE

Over the quarter the portfolio rose in value by 3.0%, behind its ARC peer group, which returned 3.4%. This was largely attributable to growth strategies outperforming income over the period. The portfolio's equity allocation added the most value, benefiting from the strength of the Japanese market over the quarter, though the fixed income positions also added value. The strongest equity performance came from the Japanese market, though the US, Asian and emerging markets also added value. Income was out of favour in most equity markets, impacting our UK, Asian and emerging market positions, resulting in the modest shortfall relative to the benchmark. Funds of note were: Tokio Marine Japanese Equity Focus (+9.8%), Polar Capital Emerging Market Income (+3.6%), Schroder Asian Income (+4.9%), SPDR S&P US Dividend Aristocrats (+5.7%) and Aviva US Equity Income (+3.9%). Our fixed income investments delivered a modest return, driven by the government bond allocation, though our short duration positioning was a modest detractor. The credit exposure delivered a small positive return, but lagged on a relative basis. Over the quarter we removed the exposure to the Bloomberg Commodities benchmark, using the proceeds to add a new holding of Invesco Sterling Bond Fund to increase the overall yield of the portfolio.

CURRENT ASSET ALLOCATION / EQUITY ALLOCATION

(Allocation charts not reproduced.) Source: Kleinwort Hambros as at 31 Dec 2017. Actual weighting and investment allocations are subject to change on an ongoing basis and may not be exactly as shown.

TOP 10 HOLDINGS / PERFORMANCE CHART / PERFORMANCE / YIELD, FEES AND CHARGES

(Holdings and performance tables not reproduced.) Notes: model launch date 01/04/2012; performance net of underlying fund fees but gross of Kleinwort Hambros's annual management charge, platform fees and advisor charges; latest quarter of ARC performance data are based on ARC estimates.

Available platforms: Ascentric, AXA Elevate, Novia, Transact, Aviva, Nucleus, Standard Life.

Past performance should not be seen as an indication of future performance. Investments may be subject to market fluctuations and the price and value of investments and the income derived from them can go down as well as up. The tax benefits and liabilities will depend on individual circumstances and may change in the future.

Source: ARC, Morningstar, Bloomberg and Kleinwort Hambros as at 31 Dec 2017

Important Information – please read

This document has been designed for professional intermediaries only and is not intended for client use. The information in this publication is provided for information purposes only and does not take into account the investment objective, the financial situation or the individual needs of any particular person. It is not an offer to buy or sell any particular security or investment. This publication does not constitute advice.
All potential investors should seek and obtain advice specific to their circumstances from a qualified financial adviser before making investment decisions. This publication is intended to be used by the recipient only and may not be passed on or disclosed to any other persons and/or in any jurisdiction that would render the distribution illegal. It is the responsibility of any person in possession of this document to inform himself or herself of and to observe all applicable laws and regulations of the relevant jurisdictions. This document is in no way intended to be distributed in or into the United States of America nor directly or indirectly to any U.S. person.

Financial promotion

This document is a financial promotion.

Investment Performance

Past performance should not be seen as an indication of future performance. Investments may be subject to market fluctuations and the price and value of investments and the income derived from them can go down as well as up. Capital may be at risk and clients may not get back the amount invested. Changes in inflation, interest rates and the rate of exchange may have an adverse effect on the value, price and income of investments. The effects of charges and an investor's personal tax circumstances may reduce any returns. Tax treatment depends on an investor's individual circumstances and may be subject to change.

Tax, Accounting and Legal

Any services and investments may have tax consequences and it is important to bear in mind that the Kleinwort Hambros Group does not provide tax advice. The level of taxation depends on individual circumstances and such levels and basis of taxation can change. Professional tax advice should be sought in order to understand any applicable tax consequences. In addition, the material is not intended to provide, and should not be relied on for, accounting or legal purposes and independent advice should be sought where appropriate. Some products and services are not available in all Kleinwort Hambros Group entities. Their availability depends on local laws and tax regulations. In addition, they have to comply with the Societe Generale Group Tax Code of Conduct. Furthermore, accessing some of these products, services and solutions might be subject to other conditions, amongst which is eligibility.

Regulatory information

This document has been approved and issued in the United Kingdom by SG Kleinwort Hambros Bank Limited, which is authorised by the Prudential Regulation Authority and regulated by the Financial Conduct Authority and the Prudential Regulation Authority. The firm reference number is 119250. The company is incorporated in England and Wales under number 964058 and its registered address is 5th Floor, 8 St James's Square, London SW1Y 4JU.

Compensation

SG Kleinwort Hambros Bank Limited is covered by the Financial Services Compensation Scheme ("FSCS"). Clients may be entitled to compensation from the FSCS if we cannot meet our obligations. This depends on the type of business and the circumstances of the claim. For investment business, compensation may be available to eligible investors in respect of protected claims up to a maximum of £50,000 per claimant. For further information about the compensation provided by the FSCS (including the amounts covered and eligibility to claim), please contact a private banker, refer to the FSCS website, or call the FSCS on 020 7741 4100 or 0800 678 1100.
Please note only compensation-related queries should be directed to the FSCS.

General

Kleinwort Hambros is part of Societe Generale Private Banking, which is part of the wealth management arm of the Societe Generale Group. Societe Generale is a French bank authorised in France by the Autorité de Contrôle Prudentiel et de Résolution, located at 61, rue Taitbout, 75436 Paris Cedex 09, and under the prudential supervision of the European Central Bank. It is also authorised by the Prudential Regulation Authority and regulated by the Financial Conduct Authority and the Prudential Regulation Authority. Further information on the Kleinwort Hambros Group, including additional legal and regulatory details, can be found at:

Any unauthorised use, duplication, redistribution or disclosure in whole or in part is prohibited without the prior consent of Societe Generale. The key symbols, Societe Generale, Societe Generale Private Banking and Kleinwort Hambros are registered trademarks of Societe Generale. © Copyright the Societe Generale Group 2017. All rights reserved.
https://5y1.org/info/dividend-aristocrats-dividend-rate_1_bd7ee5.html
OTTAWA, CANADA -- June 5, 2009 -- Wi-LAN Inc. ("Wi-LAN" or the "Company") (TSX: WIN), a leading technology innovation and licensing company, today provided a litigation update pertaining to the claims filed on September 30, 2008 by Intel Corporation ("Intel") in the Northern District of California requesting a declaratory judgment that 18 of Wi-LAN's U.S. patents are invalid and have not been infringed. Wi-LAN brought a motion to dismiss or transfer this Intel lawsuit, which was argued before U.S. District Judge Ware on May 4, 2009. In a ruling received by Wi-LAN today, Judge Ware granted Wi-LAN's motion to transfer to the Eastern District of Texas all claims of Intel involving Wi-LAN's U.S. patent 6,549,759. Judge Ware denied Wi-LAN's motion to dismiss or transfer the claims relating to the other Wi-LAN patents named in Intel's action in the Northern District of California. Judge Ware also scheduled a case management conference for this lawsuit which will be held on June 22, 2009. About Wi-LAN Wi-LAN, founded in 1992, is a leading technology innovation and licensing company. Wi-LAN has licensed its intellectual property to over 180 companies worldwide. Inventions in our portfolio have been licensed by companies that manufacture or sell a wide range of communication and consumer electronics products including 3G cellular handsets, Wi-Fi-enabled laptops, Wi-Fi/DSL routers, xDSL infrastructure equipment, WiMAX base stations and digital televisions. Wi-LAN has a large and growing portfolio of more than 550 issued or pending patents. For more information: www.wilan.com. Forward-looking Information Certain statements in this release, other than statements of historical fact, may include forward-looking information that involves various risks and uncertainties that face the Company; such statements may contain such words as "may", "would", "could", "will", "intend", "plan", "anticipate", "believe", "estimate", "expect" and similar expressions, and may be based on management's current assumptions and expectations related to all aspects of the wireless and wireline telecommunications industries and the global economy. 
Risks and uncertainties that may face the Company include, but are not restricted to: licensing of the Company's patents can take an extremely long time and may be subject to variable cycles; the Company is currently reliant on licensees paying royalties under existing licensing agreements and additional licensing of its patent portfolio to generate future revenues and increased cash flows; the Company may be required to establish the enforceability of its patents in court in order to obtain material licensing revenues; changes in patent laws or in the interpretation or application of patent laws could materially adversely affect the Company; a court may determine that certain of the Company's patents are not infringed by certain standards or products or may disagree with management with respect to whether one or more of the Company's patents apply to certain standards or products, which could adversely affect the Company; the Company will need to acquire or develop new patents to continue and grow its business; fluctuations in foreign exchange rates impact and may continue to impact the Company's revenues and operating expenses, potentially adversely affecting financial results; the Company has made and may make acquisitions of technologies or businesses which could materially adversely affect the Company; the Company may require investment to translate its intellectual property position into sustainable profit in the market; the generation of future V-chip revenues and the likelihood of the Company signing additional V-chip licenses could be negatively impacted by changes in government regulation; the Company is dependent on its key officers and employees; the price of the Company's common shares is volatile and subject to market fluctuation; and the Company may be negatively affected by reduced consumer spending due to the uncertainty of economic and geopolitical conditions. These risks and uncertainties may cause actual results to differ from information contained in this release, when estimates and assumptions have been used to measure and report results. There can be no assurance that any statements of forward-looking information contained in this release will prove to be accurate. Actual results and future events could differ materially from those anticipated in such statements. These and all subsequent written and oral statements containing forward-looking information are based on the estimates and opinions of management on the dates they are made and expressly qualified in their entirety by this notice. Except as required by applicable laws, the Company assumes no obligation to update forward-looking statements should circumstances or management's estimates or opinions change. Readers are cautioned not to place undue reliance on any statements of forward looking information that speak only as of the date of this release. Additional information identifying risks and uncertainties relating to the Company's business are contained under the heading "Risk Factors" in Wi-LAN's current Annual Information Form and its other filings with the various Canadian securities regulators which are available online at www.sedar.com. This press release does not constitute an offer to sell or a solicitation of an offer to buy any securities in the United States All trademarks and brands mentioned in this release are the property of their respective owners. 
Contacts: Wi-LAN Inc. Tyler Burns, Director, Investor Relations & Communications O: 613-688-4330 C: 613-697-0367 [email protected]
http://www.wilan.com/news/news-releases/news-release-details/2009/Wi-LANLitigationUpdate1120946/default.aspx
It is challenging to systematically time financial markets, mostly because the uncertainty about expected asset returns is large (and especially so during periods of macroeconomic and market stress). As a result, investors are well advised to remain wary of strategies that rely on an elusive ability to accurately time financial markets. Instead, investors should focus on understanding and valuing macro uncertainty and its sources, measuring assets' exposures to this uncertainty, and aligning their strategic allocations with their tolerance for uncertainty and long-term macro conditions. They should also explore the potential benefits of hedging changes in macro uncertainty with low-cost solutions.

Tactical multi-asset allocation strategies are often premised upon the a priori benefits of rebalancing around changes in macroeconomic and market conditions. In a varying macro and market environment, so the argument goes, the equity risk premium (the expected excess equity return over bonds) also varies. Thus, timing these variations and realigning the relative equity-bond allocations according to a pre-defined rule could potentially add value compared with rebalancing to fixed strategic weights. However, the key question for investors is: in practice, do such tactical equity-bond reallocations actually deliver on their promise to systematically outperform their strategic counterpart, net of fees?

We propose to address this issue by backtesting the performance of two tactical (long-only) equity-bond strategies against a given strategic portfolio rebalancing to fixed equity-bond allocations. More precisely, this strategic portfolio allocates 70% to the US cap-weighted equity market and 30% to a 10-year duration US Treasury bond portfolio. The 70-30 equity-bond allocation was chosen to represent the average US public pension fund's relative allocation to risky assets (including public equity, private equity and private real estate). In this benchmark strategy, portfolio weights are rebalanced back to the 70-30 split quarterly.

As for the two tactical strategies, the first relies on a commonly used rule based on a relative long-term valuation measure - the equity market trend earnings-to-price ratio (inverse CAPE) relative to the bond market trend yield. The second relies on our measure of macro uncertainty and long-term equity risk premia. In each case, allocations are rebalanced quarterly. Regardless of the rule employed, our analysis indicates that the tactical strategies did not systematically outperform the strategic portfolio, even gross of fees.

Let's examine the performance of these two tactical strategies in sequence. Exhibit 1 portrays, from March 1966 to June 2022, the evolution of long-term (10-year horizon) expected US equity market and 10-year US constant maturity Treasury bond real returns, based on relative equity and bond valuations. Equity valuations are measured by the CAPE ratio developed by Robert J. Shiller. According to the Exhibit, the (annualised) long-term equity risk premium increased significantly from about 3% in March 1966 to 10% in June 1982, at the height of the oil crises. It then declined in a nearly continuous stretch, bottoming at about 1.5% in Q3 1999, consistent with the extremely high valuations preceding the TMT bubble burst. The risk premium increased again around the 2008 global financial crisis, reaching 5% in Q1 2009, as valuations bottomed.
Since then, it has averaged a lower level of about 2.6%, bottoming at about 1.7% during the pandemic. Accordingly, a long-term, risk-tolerant investor trusting these variations in risk premia would find it efficient (on an ex-ante basis) to deviate from the strategic 70-30 equity-bond allocation. More precisely, the optimal tactical allocation rule would dictate increasing (decreasing) the relative equity allocation as the risk premium increases (decreases). However, did these (ex-ante) deviations actually pay off in practice (ex-post)?

Exhibit 2 summarises the realised performance of this tactical strategy based on relative valuations from March 1966 to June 2022. In addition, it reports the performance of the strategic portfolio and of a broad US cap-weighted equity market portfolio. The principal conclusion is that the tactical strategy did not deliver on its alleged ability to profit from timing markets, even on a gross-of-fees basis. On average, the tactical strategy underperformed the strategic portfolio over the sample period by about 30bps (annualised, gross of fees). Moreover, the tactical strategy experienced higher volatility (13% compared to 12.7%). It underperformed during periods of macro and financial market stress, such as the 2008 global financial crisis (GFC) (-27% compared to -24.3%) and the Covid-19 pandemic (-13.5% compared to -12%). And as real (inflation-adjusted) interest rates increased in 2022, the year-to-date performance of the tactical strategy (as of June 2022) fared slightly worse (-20.8%) than the strategic portfolio (-20.6%). Yet these are precisely the times when such tactical strategies claim to deliver higher returns - that is, if they could indeed time markets accurately.

But perhaps the reason for the poor market timing is the valuations-based rule. Indeed, valuation measures taken alone tend to be unreliable predictors of forward-looking equity returns (and risk premia), sometimes even directionally. In our view, this is because they fail to explicitly account for the impact of dividend growth risk and discounting, and for how dividend growth risk and discount rates relate to macro uncertainty. Valuations-based measures proved particularly unreliable during the recent period of ultra-low real (inflation-adjusted) interest rates during the pandemic (extremely high valuations but subsequently high realised returns), and as real interest rates increased in 2022 (decreasing valuations and lower subsequent realised returns).

Would an alternative tactical rule have performed better at timing markets? To address this question, we consider a tactical rule based on our model-implied long-term return expectations. Exhibit 3 portrays the evolution of the Navega US long-term (10-year horizon) equity market and 10-year US constant maturity Treasury bond real returns from March 1966 to June 2022. As also shown in the Exhibit, these return expectations strongly depend on our measure of long-term macro uncertainty. For the most part, until the 2008 GFC, the evolution of our model-based measure of the equity risk premium and that of the valuations-based measure were qualitatively similar. Since the 2008 GFC, however, the two measures have differed markedly. By contrast to the valuations-based measure, following the GFC our model-implied equity risk premium remained persistently high until about 2016, driven by continued high levels of macro uncertainty.
After decreasing as uncertainty receded until about 2019, it rose back to its highest historically observed level (about 6.1%), driven by the pandemic-induced surge in macro uncertainty. Over the last two quarters, the equity risk premium has declined as macro uncertainty has receded (i.e. increased certainty about the continuation of the sluggish long-term real economic growth trend).

Exhibit 4 summarises the performance of this macro uncertainty-based tactical strategy. According to the Exhibit, the macro uncertainty-based tactical strategy performed better than the valuations-based strategy, both on average over the whole sample and during the selected times of macro and market stress. However, it still did not manage to outperform the strategic portfolio (even gross of fees). The reason is that the uncertainty-based tactical allocations do not deviate much from the strategic weights. Indeed, by contrast to the valuations-based tactical rule, our model-optimised portfolio explicitly takes into account investors' confidence in the inputs to the portfolio decision. These are the expected returns and risk measures themselves. This confidence depends both on a given investor's tolerance for uncertainty and on our measure of the statistical reliability (or "model uncertainty") of expected returns and risk and their modeling assumptions, given the information available to the investor (i.e. past observations of macro variables and asset returns). Given the acutely high level of uncertainty around times of macro and financial market stress (the 2008 GFC and Covid-19, for example), even an uncertainty-tolerant investor would doubt the ability to time markets under such conditions, and would place less confidence in the expected changes in risk premia implied by our own models. In turn, investors would actually find it efficient not to deviate much from the strategic portfolio.

In addition, our model-optimised portfolio explicitly incorporates equilibrium effects. Simply put, on balance, all investors' portfolio holdings must ultimately add up to the "market" portfolio of all assets held at their capitalisation weights. Thus, investors cannot all deviate away from this "market" portfolio in the same direction, as would be implied by common valuations-based tactical strategies! By contrast, our uncertainty-based tactical strategy implies that as some uncertainty-tolerant investors increase (decrease) their equity allocations away from the "market" portfolio, some uncertainty-averse investors correspondingly decrease (increase) theirs. In the context of the present analysis, none of the investors find it efficient to deviate significantly from their strategic holdings, and across investors, portfolios do add up to the cap-weighted "market".

The takeaway for investors is that chasing returns by attempting to time markets is a fool's errand. Instead, investors should focus on aligning long-term strategic allocations with their tolerance for macro uncertainty and long-term macro conditions, together with an understanding of the sources of this uncertainty and its impact on long-term asset returns. Investors should also explore the potential benefits of hedging changes in macro uncertainty with low-cost solutions. As we will show in subsequent notes, this exploration requires looking for investment opportunities beyond the cap-weighted stock market and bonds.

This document is for informational purposes only.
This document is intended exclusively for the person to whom it has been delivered, and may not be reproduced or redistributed to any other person without the prior written consent of Navega Strategies LLC ("Navega"). The information contained herein is based on Navega's proprietary research analytics of data obtained from third party statistical services, company reports or communications, publicly available information, or other sources, believed to be reliable. However, Navega has not verified this information, and we make no representations whatsoever as to its accuracy or completeness. Navega does not intend to provide investment advice through this document. This document is in no way an offer to sell or a solicitation of an offer to buy any securities. Investing in securities involves risk of loss, including a loss of principal, that clients should be prepared to bear. Past performance is not indicative of future results, which may vary materially. While this summary highlights important data, it does not purport to capture all dimensions of risk. The methodology used to aggregate and analyze data may be adjusted periodically. The results of previous analyses may differ as a result of those adjustments. Navega has made assumptions that it deems reasonable and used the best information available in producing any calculations herein. Statements that are nonfactual in nature, including opinions, projections and estimates, assume certain economic conditions and industry developments and constitute only current opinions that could be incorrect and are subject to change without notice. All information provided herein is as of the delivery date of the document (unless otherwise specified) and is subject to modification, change or supplement in the sole discretion of Navega without notice to you. This information is neither complete nor exact and is provided solely as reference material with respect to the services offered by Navega. Information throughout this document, whether stock quotes, charts, articles, or any other statement or statements regarding market or other financial information, is obtained from sources which we and our suppliers believe reliable, but we do not warrant or guarantee the timeliness or accuracy of this information. The information presented here has not been personalized, and is not based on the financial circumstances of the recipient. This information may not be applicable to your particular financial needs, and should not, by itself, be used to make determinations regarding the purchase or sale of securities, or other investment decisions. The model performance information presented is based on the application of Navega's factor analysis, backtested against actual historical data. "Backtesting" is a process of objectively simulating historical performance information by applying a set of rules backward in time. The results of the application of Navega's model do not reflect actual performance or actual historical data. Such models are prepared with the full benefit of hindsight, and it is not likely that similar results could be achieved in the future. The model portfolios were constructed by Navega with the benefit of hindsight to illustrate certain performance metrics. The performance shown was not actually achieved by any investor. The investments in these hypothetical portfolios were selected with the full benefit of hindsight, after performance over the period shown was known. It is not likely that similar results could be achieved in the future.
The hypothetical portfolios presented here are purely illustrative, and representative only of a small sample of possible scenarios. The projections shown do not represent actual performance, and are based on assumptions which may not occur. It is possible that the markets will perform better or worse than shown in the projections, that the actual results of an investor who invests in the manner these projections suggest will be better or worse than the projections, and that an investor may lose money by relying on these projections.
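Stepping back from the disclosures: to make the valuations-based rule discussed in the note concrete, here is a rough, hypothetical sketch of the generic signal (trend earnings-to-price minus a bond trend yield). It is purely illustrative and is not Navega's or Shiller's exact construction.

```python
# Rough sketch of a valuations-based equity risk premium signal: trend
# earnings-to-price (inverse CAPE) minus a bond trend yield. Illustrative
# only; the actual constructions in the note are more involved.

def inverse_cape(price, trailing_real_earnings):
    """Inverse CAPE: average of (inflation-adjusted) earnings over price."""
    avg_earnings = sum(trailing_real_earnings) / len(trailing_real_earnings)
    return avg_earnings / price

def equity_risk_premium(price, trailing_real_earnings, bond_trend_yield):
    return inverse_cape(price, trailing_real_earnings) - bond_trend_yield

# Hypothetical index at 4,000 with ten years of real earnings per "share":
earnings_10y = [150, 155, 160, 158, 165, 170, 175, 180, 185, 190]
erp = equity_risk_premium(4_000, earnings_10y, bond_trend_yield=0.015)
print(f"valuations-based equity risk premium: {erp:.2%}")  # 2.72%
# A tactical rule of the kind backtested above would scale the equity
# weight up (down) as this premium rises (falls) relative to its history.
```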
https://navegastrategies.com/research/the-futility-of-market-timing/
Aerojet Rocketdyne Holdings, Inc. Board of Directors Approves New $100 Million Share Repurchase Program

EL SEGUNDO, Calif., March 13, 2020 (GLOBE NEWSWIRE) -- Aerojet Rocketdyne Holdings, Inc. (NYSE:AJRD) (the "Company") today announced that its Board of Directors authorized and approved a new share repurchase program allowing the Company to repurchase its outstanding common stock with an aggregate market value of up to $100 million, from time to time, over a period of up to 18 months. Management has discretion as to whether any repurchases will be made under the share repurchase program. The timing of any share repurchases will be based on available liquidity, cash flows and general market conditions. The repurchase program may be executed through various methods, including open market purchases or privately negotiated transactions.

Forward-Looking Statements

This release contains certain "forward-looking statements" within the meaning of the United States Private Securities Litigation Reform Act of 1995. Such statements in this release and in subsequent discussions with the Company's management are based on management's current expectations and are subject to risks, uncertainty and changes in circumstances, which could cause actual results, performance or achievements to differ materially from anticipated results, performance or achievements. All statements contained herein and in subsequent discussions with the Company's management that are not clearly historical in nature are forward-looking, and the words "anticipate," "believe," "expect," "estimate," "plan," and similar expressions are generally intended to identify forward-looking statements. Actual events or the actual future results of the Company may differ materially from any forward-looking statement due to these and other risks, uncertainties and significant factors. The forward-looking statements speak only as of the date of this release. The Company expressly disclaims any obligation or undertaking to release publicly any updates or revisions to any forward-looking statement included in this release to reflect any changes in expectations with regard thereto or any changes in events, conditions, or circumstances on which any such statement is based.

About Aerojet Rocketdyne Holdings, Inc.

Aerojet Rocketdyne Holdings, Inc., headquartered in El Segundo, California, is an innovative technology-based manufacturer of aerospace and defense products and systems, with a real estate segment that includes activities related to the entitlement, sale, and leasing of the Company's excess real estate assets. More information can be obtained by visiting the Company's websites at www.rocket.com or www.aerojetrocketdyne.com.

Contact information: Investors: Kelly Anderson, investor relations 310-252-8155
https://www.teletrader.com/news/details/51517978
Here are some ways you can avoid the pitfalls of market volatility.

The importance of withdrawal rates

It's vital to know how much money you'll need in retirement. But it's just as important to know how much of your savings you can withdraw each year. Being prepared for your golden years will help you stay calm when markets are volatile. And rebalancing your portfolio from time to time helps reduce the risk that you will run out of money down the road, no matter how much the market fluctuates.

It's common for investors' goals to change over time. Initially, most investors want their portfolios to grow in value. Later on, they want to preserve the capital they've accumulated over the years. That's why it's important to determine how much of your savings you should withdraw annually once you retire. If you withdraw too much, you could outlive your savings. To prevent this from happening, you need to make sure that your savings grow while they're invested. This means that your portfolio needs to be properly diversified, as diversification mixes a wide variety of investments within a portfolio. A person who invests only in bonds will not achieve the same returns as someone whose portfolio is diversified.

A good example

Let's look at someone who invests all their savings in bonds. If this person withdraws 4% of the savings they invested each year, the odds are less than 50% that their savings will last more than 30 years. On the other hand, consider a person who invests 50% of their savings in bonds and 50% in stocks. At a 4% withdrawal rate, this investor likely won't have to worry about running out of money. The following table gives a few more examples:

[table not reproduced] Allocation and withdrawal rates are key to achieving goals: an appropriate asset allocation and prudent withdrawal rate may help you meet your retirement income and estate planning goals. For illustrative purposes only. The scenarios depicted are hypothetical in nature and are not intended to reflect the actual returns of any product managed by Sun Life Global Investments or any market or index.

The importance of rebalancing

A knowledgeable investor won't stick with the same asset mix throughout their lifetime. The mix will shift as time goes on. Rebalancing can help your portfolio stay in line with your goals. "Working with your financial advisor to rebalance your portfolio is part of sound portfolio monitoring," comments Chhad Aul, Chief Investment Officer and Head of Multi-Asset Solutions, SLGI Asset Management Inc. "Adjusting your investments as your risk tolerance and/or market conditions change can help you ensure that you have the appropriate portfolio to meet your investment objectives."

For example, people starting out in their career typically choose a more aggressive portfolio. But as they near retirement, they will need to adjust the mix so they are not overly exposed to risk if the stock market drops dramatically. A portfolio could lose a substantial portion of its value if its holdings take a tumble. A financial advisor can review your investment portfolio at least annually and help you make any adjustments that are needed, based on your needs as an investor and the state of the markets. In short, having a suitable asset mix and a prudent withdrawal rate can help you meet your retirement income goals. And it can keep your portfolio aligned with your level of comfort with risk.

Important information

Chart source: MFS Research. Data source: Journal of Financial Planning, September 2012.
Data for stock returns are monthly total returns to the S&P 500 Index, and bond returns are total monthly returns to high-grade corporate bonds. Both sets of returns data are from January 1926 through December 2009, as published in the Ibbotson SBBI 2010 Classic Yearbook from Morningstar. Inflation adjustments were calculated using annual values of the CPI-U, as published by the US Bureau of Labor Statistics at www.bls.gov. Updated January 2018 by Wade Pfau, Professor at The American College and Principal at McLean Asset Management as published on Forbes.com, latest data available.

Commissions, trailing commissions, management fees and expenses all may be associated with mutual fund investments. Investors should read the prospectus before investing. Mutual funds are not guaranteed, their values change frequently and past performance may not be repeated. The information contained in this document is provided for information purposes only and is not intended to represent specific individual financial investment, tax or legal advice nor does it constitute a specific offer to buy and/or sell securities. While the information contained in this document has been obtained from sources believed to be reliable, SLGI Asset Management Inc. cannot guarantee its accuracy, completeness or timeliness. Information in this document is subject to change without notice and SLGI Asset Management Inc. disclaims any responsibility to update it.

Sun Life Global Investments is a trade name of SLGI Asset Management Inc., Sun Life Assurance Company of Canada and Sun Life Financial Trust Inc. © SLGI Asset Management Inc. and its licensors, 2022. SLGI Asset Management Inc. is a member of the Sun Life group of companies. All rights reserved.
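The 4% scenarios above invite a quick experiment. Below is a minimal Monte Carlo sketch of the bonds-only versus 50/50 comparison. The return, volatility and inflation assumptions are illustrative guesses of ours, not parameters from the MFS/Journal of Financial Planning study cited above, so treat the output as directional only.

```python
import random

def savings_last(years=30, withdrawal=0.04, stock_w=0.5,
                 stock_mu=0.07, stock_sigma=0.17,
                 bond_mu=0.03, bond_sigma=0.06, inflation=0.025):
    """Simulate one retirement path; True if the money lasts `years` years.

    All return/volatility/inflation figures are assumed, illustrative values.
    """
    balance = 1.0        # normalized starting savings
    spend = withdrawal   # first-year withdrawal, inflation-adjusted each year
    for _ in range(years):
        balance -= spend
        if balance <= 0:
            return False
        # Blend a random stock-market year and a random bond-market year
        r = (stock_w * random.gauss(stock_mu, stock_sigma)
             + (1 - stock_w) * random.gauss(bond_mu, bond_sigma))
        balance *= 1 + r
        spend *= 1 + inflation
    return True

trials = 10_000
for mix in (0.0, 0.5):   # all bonds vs. 50/50 stocks/bonds
    ok = sum(savings_last(stock_w=mix) for _ in range(trials))
    print(f"stocks={mix:.0%}: success rate {ok / trials:.1%}")
```

Under these toy assumptions, the bonds-only portfolio fails far more often at a 4% withdrawal rate than the 50/50 mix, which is the qualitative point the article makes.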
https://www.sunlifeglobalinvestments.com/en/insights/investor-education/understanding-market-volatility/how-to-deal-with-volatility-and-aim-not-to-outlive-your-money/
MiFID II, or the second Markets in Financial Instruments Directive, came into force on 3 January 2018: this revised version was launched by the European Commission in order to strengthen the existing legal provisions of MiFID I and to make European markets more transparent, more efficient and safer for investors. Concretely, what does it change for you?

The MiFID rules were implemented into Luxembourg legislation earlier this year and mainly concern investment firms regulated by the PSF Law, including investment advisers, brokers in financial instruments, portfolio managers, distributors of shares in investment funds, credit institutions when providing investment services and activities, and investment firms and credit institutions when selling or advising clients in relation to structured deposits. They all now have new obligations when executing, receiving and transmitting orders, or when conducting portfolio management services. What is the impact of these new obligations on you as a private investor? To make it clear and simple, there are three key areas where you will notice significant changes: the suitability assessment, the reporting frequency and the client disclosures.

A more suitability-centric approach

An investment firm has always had to show suitability on certain products, looking at your objectives, time horizon (short or long term), risk appetite, knowledge of the financial markets and their products, and financial position (assets). Under MiFID II, the suitability tests are extended to all advice. In other words, an investment firm must provide you with a report for each piece of investment advice given, even if the advice is to do nothing. In this report, the investment firm must explain why it believes the advice is suitable, based on the information that you have provided. In addition, firms must document changes of circumstances in even more detail (changes in your family, changes in the type or the number of the assets that you own, and so on). An investment firm must also clearly indicate whether a suitability assessment is performed periodically with regard to your existing portfolio.

Increased report frequency

An investment firm has always had to provide you with regular reports such as transaction confirmations (e.g. contract notes) and periodic updates (e.g. valuation summaries and statements). Now, according to the new rules, some documents must be sent more often. For instance, financial statements will no longer be issued bi-annually, but quarterly. If your money is being managed on a discretionary basis, either by advisers or discretionary fund managers, new safeguards under MiFID II mean you must be notified if and when the value of your portfolio falls by 10%, and on each subsequent 10% fall, since the last quarterly statement.

Better transparency of costs and charges

An investment firm has always had to provide you with details of the total costs of the investment services and the financial instruments, and to inform you of any changes in a way that is clear, fair and not misleading. MiFID II stipulates extra requirements and clarifies a number of existing ones. An investment firm must now provide a total overview of all the expected costs BEFORE providing any services, make use of an illustration to give you insight into the cumulative effect of the costs on the return, and inform you about the manner in which the costs are charged.
The investment firm with which you have an ongoing relationship must provide you with information about all the total costs charged to you at least once a year. This information can be included in the regular periodical reports.

Know who you are before starting to invest

In short, the philosophy of MiFID II is: the more informed the investor, the better. Whether it works out that way remains to be seen. According to some specialists, the new rules could have an adverse effect on inexperienced investors: information overload and the 10% rule could have unintended consequences such as panic selling. This is why it is so important to determine your investor profile, preferably with a professional adviser. You need to know who you really are before starting to invest. Be good at money with a detailed investor profile: www.ing.lu/mifid
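The 10% depreciation rule described above is easy to mechanize. Here is a minimal sketch assuming daily valuations measured against the value in the last quarterly statement; the function name and data shapes are hypothetical, not part of MiFID II or any vendor API.

```python
def depreciation_alerts(statement_value, daily_values):
    """Return indices of days on which the portfolio has fallen another
    10% below the value reported in the last quarterly statement."""
    alerts, next_threshold = [], 0.10
    for day, value in enumerate(daily_values):
        drop = 1 - value / statement_value
        if drop >= next_threshold:
            alerts.append(day)
            next_threshold += 0.10   # each subsequent 10% fall triggers again
    return alerts

# Example: statement value 100, portfolio slides to 78 over a week
print(depreciation_alerts(100, [97, 94, 90, 85, 82, 79, 78]))  # -> [2, 5]
```

The first alert fires when the drop reaches 10% (day 2), the second when it reaches 20% (day 5), mirroring the "each subsequent 10% fall" wording above.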
http://mail.bifilmseason.lu/opinion/22916-mifid-ii-what-is-it-and-why-does-it-matter-for-you
The overall goal of this two-day workshop is to equip participants with the analytic skills and understanding to assess the risks and rewards inherent in Collateralized Loan Obligations (CLOs).

Key Learning Outcomes:
- Use a structured approach to evaluate the underlying assets, asset manager and transaction structures
- Understand the impact of key variables on risk assessment models
- Critique the structure to identify and assess the risks and protections afforded
- Appreciate the rationale for CLOs from the perspective of the issuer and the investor
- Highlight differences between CLO 1.0 and 2.0

Analytic Framework
The goals of this section are to highlight key elements of the CLO market and to establish a framework of analysis for CLOs.
Industry overview
- CLOs in the context of the high yield market
- Why CLOs? Motivations behind the deal
Analytic approach to credit evaluation
- Applying the analytic framework to CLOs: purpose, payback, risks and structure
- Exercise: examining differences in pre- and post-crisis CLOs

Risks to Repayment
The goal of this section is to consider issues related to the CLO assets and asset managers which could affect repayment of CLOs.
Underlying assets
- Defining the underlying assets
- Identifying the key variables which impact likelihood of default and recovery
- Assessing asset credit quality
- Default probability: the use of credit ratings, calculating WARF (weighted average rating factor)
- Recovery rate: examining asset security and adjustments to standard assumptions
- Impact of covenant-lite loans
- Importance of diversification
- Asset concentration and correlation
- Cash flow analysis: understanding portfolio credit quality under various stress scenarios
- Exercise: using the Portfolio Credit Model to understand sensitivity in portfolio credit risk
Asset manager
- Scope of manager's role
- Methodology for assessing CLO asset managers
- Key man risk
- Management replacement provisions

Structure
The goal of this section is to understand how the features of CLO transactions address repayment risk and provide returns to investors.
Credit enhancement
- The role of credit enhancement: loss allocation
- Sizing: how the level of credit enhancement is determined for the target ratings
Note profile
- Terms and conditions
- Ramp-up, reinvestment and amortization periods
- Class X: the role of excess spread
- Waterfall structures: protecting priority of payments
- Unravelling payment flows: sources, applications and redistribution of funds
- Expected and legal maturity; extension risk, optional redemption features
- Exercise: unravelling waterfall structures
Structural safeguards
- Target portfolio characteristics and managing to dynamic collateral quality tests
- Coverage tests (OC and IC): rationale, definitions and implications
- Events of default; liquidations
- Understanding sales, trading and reinvestment criteria and controls
- Counterparty risk
- Exercise: examining CLO coverage tests
- Exercise: evaluating structural features of CLOs
Legal framework
- Bankruptcy remoteness and non-consolidation
- Validity of transfer/perfection of security
- Regulatory issues: risk retention proposals, Foreign Account Tax Compliance Act

Market Conditions
- Relative value: comparing returns across asset classes
- Current market topics: CLOs going static, asset manager consolidation
- CLO market performance

Monitoring Performance
The goal of this section is to highlight the on-going evaluation of CLO programs.
- Timely and adequate reporting
- Tracking portfolio changes and performance
- Asset defaults, restructurings and recoveries
- Surveillance and rating changes: expectations vs. performance through the cycle

Who should attend
Investors, credit risk managers, issuers, regulators, bankers and other professionals who need to understand the key risks and features of CLOs.
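Since the outline above leans on WARF, here is a rough illustration of how a par-weighted average rating factor is computed. The rating-factor values shown are the commonly cited Moody's scale and should be verified against current published criteria; the portfolio itself is invented.

```python
# Illustrative rating factors (commonly cited Moody's values -- verify
# against current published criteria before relying on them).
RATING_FACTORS = {"Baa3": 610, "Ba1": 940, "Ba2": 1350, "Ba3": 1766,
                  "B1": 2220, "B2": 2720, "B3": 3490}

def warf(portfolio):
    """Par-weighted average rating factor for a list of (par, rating) pairs."""
    total_par = sum(par for par, _ in portfolio)
    return sum(par * RATING_FACTORS[r] for par, r in portfolio) / total_par

# Hypothetical pool: $10m of B1 loans, $5m of B2, $5m of Ba3
pool = [(10_000_000, "B1"), (5_000_000, "B2"), (5_000_000, "Ba3")]
print(round(warf(pool)))   # -> 2232, i.e. roughly between B1 and B2
```

A higher WARF means a lower average portfolio credit quality, which is why collateral quality tests typically cap it.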
https://coursalytics.com/courses/clo-credit-risk-fitch-learning
Relative safety versus equities is the cornerstone of fixed-income investing. But, beyond performance, do your core fixed-income investments adequately help dampen volatility and complement equities? Are they the best options if the market environment becomes more challenging going forward?

Alpha is a measure of the performance of a portfolio after adjusting for risk: the portfolio's return is compared against that of a benchmark, and the alpha is the risk-adjusted excess return of the portfolio over the benchmark.

1 Source: Wellington Management, as of 12/31/20. Firm assets include assets under management and non-discretionary assets.
2 Morningstar rankings are based on average total returns of all mutual funds in their peer group and do not take into account sales charges.

This communication is provided solely as general information about our products and services and should be considered general investment education. None of the information provided should be regarded as investment advice or an investment recommendation. Neither Hartford Funds nor its affiliates are undertaking to provide impartial investment advice to any individual investor or retirement plan sponsor. If you are an individual investor or retirement plan sponsor, contact your financial professional or other fiduciary unrelated to Hartford Funds about whether any given investment product or strategy may be appropriate for your circumstances.

Important Risks: Investing involves risk, including the possible loss of principal. Security prices fluctuate in value depending on general market and economic conditions and the prospects of individual companies. The Fund may allocate a portion of its assets to specialist portfolio managers, which may not work as intended. • Fixed income security risks include credit, liquidity, call, duration, event and interest-rate risk. As interest rates rise, bond prices generally fall. • The risks associated with mortgage-related and asset-backed securities as well as collateralized loan obligations (CLOs) include credit, interest-rate, prepayment, liquidity, default and extension risk. • The purchase of securities in the To-Be-Announced (TBA) market can result in additional price and counterparty risk. • Derivatives are generally more volatile and sensitive to changes in market or economic conditions than other securities; their risks include currency, leverage, liquidity, index, pricing, and counterparty risk. • Foreign investments may be more volatile and less liquid than US investments and are subject to the risk of currency fluctuations and adverse political and economic developments. These risks may be greater for investments in emerging markets. • Investments in high-yield (“junk”) bonds involve greater risk of price volatility, illiquidity, and default than higher-rated debt securities. • Obligations of US Government agencies are supported by varying degrees of credit but are generally not backed by the full faith and credit of the US Government. • Restricted securities may be more difficult to sell and price than other securities. • The Fund may have high portfolio turnover, which could increase its transaction costs and an investor’s tax liability.

Additional Information Regarding Bloomberg Barclays Indices
Source: Bloomberg Index Services Limited. BLOOMBERG® is a trademark and service mark of Bloomberg Finance L.P. and its affiliates (collectively “Bloomberg”).
BARCLAYS® is a trademark and service mark of Barclays Bank Plc (collectively with its affiliates, “Barclays”), used under license. Bloomberg or Bloomberg’s licensors, including Barclays, own all proprietary rights in the Bloomberg Barclays Indices. Neither Bloomberg nor Barclays approves or endorses this material, or guarantees the accuracy or completeness of any information herein, or makes any warranty, express or implied, as to the results to be obtained therefrom and, to the maximum extent allowed by law, neither shall have any liability or responsibility for injury or damages arising in connection therewith.

The views expressed herein are those of Wellington Management, are for informational purposes only, and are subject to change based on prevailing market, economic, and other conditions. The views expressed may not reflect the opinions of Hartford Funds or any other sub-adviser to our funds. They should not be construed as research or investment advice nor should they be considered an offer or solicitation to buy or sell any security. This information is current at the time of writing and may not be reproduced or distributed in whole or in part, for any purpose, without the express written consent of Wellington Management or Hartford Funds. WP484 223804
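Returning to the alpha definition near the top of this piece: one standard formalization is Jensen's alpha, which adjusts for the portfolio's sensitivity (beta) to a benchmark. The sketch below uses made-up per-period returns; it is a generic illustration, not Wellington's or Hartford's methodology.

```python
def jensens_alpha(portfolio_returns, benchmark_returns, risk_free=0.0):
    """Per-period Jensen's alpha: excess return after adjusting for
    the portfolio's beta to the benchmark."""
    n = len(portfolio_returns)
    rp = sum(portfolio_returns) / n
    rb = sum(benchmark_returns) / n
    # Beta = covariance(portfolio, benchmark) / variance(benchmark)
    cov = sum((p - rp) * (b - rb)
              for p, b in zip(portfolio_returns, benchmark_returns)) / n
    var = sum((b - rb) ** 2 for b in benchmark_returns) / n
    beta = cov / var
    return (rp - risk_free) - beta * (rb - risk_free)

# Illustrative monthly returns, not real fund data
port = [0.02, 0.01, -0.005, 0.015]
bench = [0.015, 0.008, -0.01, 0.012]
print(f"alpha per period: {jensens_alpha(port, bench):.4%}")
```

A positive alpha means the portfolio earned more than its benchmark exposure alone would predict; a negative alpha means less.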
https://www.hartfordfunds.com/insights/market-perspectives/fixed-income/core-fixed-income-and-the-pursuit-of-alpha.html
Myth 2: "Go to cash, avoid duration risk." Duration risk pertains to the sensitivity between the price of a bond and the change in interest rates. For example, bonds with longer maturities tend to be more sensitive to interest rate increases, so their prices are more likely to fall when interest rates go up. Rather than fearing bonds when interest rates are projected to rise, bonds, or bond funds, with shorter durations are an effective option.

Myth 3: "When interest rates are rising – don't just stand there – do something!" Unfortunately, the notion that doing nothing will hurt you is widespread, though not necessarily true. If interest rates are low and investors want downside protection in their portfolios, nothing has to be done if they already own short-term bond funds. Selling for the sake of selling doesn't make the decision to sell a correct one. While difficult, sometimes doing nothing is actually the best thing an investor can do.

We all know that interest rates can't stay this low forever, though the length of time they've been this low has been a surprise to most market prognosticators. The most important takeaway is knowing that bonds have a place in almost every investor's portfolio. Good quality, short-duration bonds can not only help to cushion a portfolio against drops in the stock market, they also provide the liquidity that investors need for ongoing or emergency cash-flow needs. Owning bonds may not be sexy, but boring is beautiful when markets are most volatile.

1 Aliaga-Diaz, Roger. "Rising Rates Don't Negate Benefits of Bonds." Vanguard, 30 Sept. 2021.

Please remember that past performance may not be indicative of future results. Different types of investments involve varying degrees of risk, and there can be no assurance that the future performance of any specific investment, investment strategy, or product (including the investments and/or investment strategies recommended or undertaken by S.F. Ehrlich Associates, Inc. (“SFEA”)), or any non-investment related content, made reference to directly or indirectly in this newsletter will be profitable, equal any corresponding indicated historical performance level(s), be suitable for your portfolio or individual situation, or prove successful. Due to various factors, including changing market conditions and/or applicable laws, the content may no longer be reflective of current opinions or positions. Moreover, you should not assume that any discussion or information contained in this newsletter serves as the receipt of, or as a substitute for, personalized investment advice from SFEA. To the extent that a reader has any questions regarding the applicability of any specific issue discussed above to his/her individual situation, he/she is encouraged to consult with the professional advisor of his/her choosing. SFEA is neither a law firm nor a certified public accounting firm and no portion of the newsletter content should be construed as legal or accounting advice. A copy of SFEA’s current written disclosure Brochure discussing our advisory services and fees is available upon request. If you are a SFEA client, please remember to contact SFEA, in writing, if there are any changes in your personal/financial situation or investment objectives for the purpose of reviewing, evaluating, or revising our previous recommendations and/or services.
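Myth 2's duration point can be made concrete with the standard first-order approximation: percentage price change is roughly minus duration times the yield change. A small sketch with illustrative durations follows; real bond pricing also involves convexity, which this ignores.

```python
def price_change(duration_years, rate_move_bp):
    """First-order (modified-duration) estimate of a bond's percentage
    price change for a yield move given in basis points."""
    return -duration_years * rate_move_bp / 10_000

# Effect of a 1-percentage-point (100 bp) rate rise on three bonds
for dur in (2, 7, 20):   # short, intermediate, long duration (illustrative)
    print(f"duration {dur:>2}y: {price_change(dur, 100):+.1%}")
```

The output (-2%, -7%, -20%) shows why short-duration bonds are the gentler way to stay invested when rates are expected to rise.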
https://www.sfehrlich.com/blog/benefit-owning-bonds-regardless-low-or-rising-interest-rates
Please read the following rules carefully before signing in. We strongly recommend reading the terms and conditions of the convention before registering on the website and becoming the Company's investor. The rules and general terms and conditions of cooperation between the company Cryptoinc Corp (the "Company") and the investor (hereinafter, the Investor) are prescribed in this section. This document signifies that both parties accept all regulations which are spelled out in the document and agree to abide by them. The document comes into force once the registration on the website of the Company is completed by the Investor.

In order to register on the website of the Company and to become the Investor, the person must be at least 18 years old at the moment of the registration. The user automatically receives the status of the Investor immediately after registration on the website and accepting all the terms of the agreement. If the user disagrees with any of the provisions of this agreement, or if they have any doubts on certain items, the registration should be terminated. All financial transactions carried out through the Company's website are confidential and are not disclosed to third parties. The Investor has the opportunity to carry out financial transactions and to use other services of the Company only after registration on the website.

The Company undertakes to use funds from investors for their intended purpose and to conduct real activity on the Forex market, mining and stocks. The Company guarantees the safety of the Investor's funds and undertakes to process deposits and withdrawals of profit in a timely manner. The Company is not responsible for any technical malfunctions of electronic payment systems. Financial transactions that are associated with deposits and withdrawals of funds to the account of electronic payment systems are irreversible and final. The Company shall not be personally liable for incorrectly executed transactions with monetary funds or for an incorrectly issued financial account.

The Company is responsible for maintaining the confidentiality of personal information that has been provided by the Investor. The Investor, while filling in the registration form, is personally responsible for the accuracy of the information provided. The Investor is obliged to review each transaction on their financial account. In case of detecting any inaccuracies or discrepancies, the Investor can seek help from support services. All services provided by the Company shall be used by the Investor only in order to conduct investment activities. The Investor consents to the processing of personal information in accordance with the provisions stated in the legislation.

The Company minimizes the risks that may arise during the conduct of activities in the currency market. In addition, there are risks which are related to the use of an Internet-based deal execution trading system, including risks connected with possible hardware or software failures. The www.cryptoinc.biz website and all the information and services hosted on it are the private property of the Company and are protected by copyright law. In the case of copyright infringement (harming the website, copying materials, and so on), the Investor will be prosecuted; their account will be blocked, along with funds that were in their personal account. For the duration of circumstances of force majeure (changes in legislation, natural disasters, military situation, etc.), the Company shall be entitled to suspend its activities indefinitely.
Such circumstances cannot be influenced by either the Company or the Investor. Circumstances of force majeure make it impossible to carry out transactions and financial operations in the standard mode. The applicable rules and the terms of the agreement can be reviewed by the project administration. The administration has the right to make changes and additions at any time. Additions and changes are published in this section and take effect immediately after their announcement. In order to be aware of possible changes, we encourage you to periodically review this section. Termination of the cooperation between the Company and the Investor may be initiated by either side. The Company has the right to unilaterally terminate the cooperation with the Investor in case of violation of the terms and conditions of the agreement. The Investor can terminate the agreement if they decide to cease their investment activities in the Company. Conflict resolution between the Company and the Investor shall be held in the format of negotiations or in accordance with applicable law.
https://cryptoinc.biz/?a=rules
By Deborah J. Rumsey. To obtain a measure of variation based on the five-number summary of a statistical sample, you can find what's called the interquartile range, or IQR.

To find the IQR of a box plot, you must identify the locations of Q1 and Q3; subtracting the values will give you the IQR. The range of values from Q1 to Q3 is called the inter-quartile range (IQR). Outliers live outside the inter-quartile range: by statistical definition, they lie more than 1.5 times the IQR below Q1 or above Q3.

Example Three - Quartiles and IQR. We cannot always use a small set of easy numbers such as the doll heights in example two. Here, we will calculate the same Q1, median and Q3, but using rules that are better suited to the large quantity of data that is more common in statistics.

Finding the Interquartile Range. We can find the interquartile range of a set of numbers. Imagine a teacher had set their mathematics class a test.
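A minimal sketch of the IQR and the 1.5 × IQR outlier fences described above, using Python's statistics module. Note that its default quantile interpolation ("exclusive") may differ slightly from hand methods taught in textbooks; the test scores below are invented.

```python
import statistics

def iqr_and_fences(data):
    """Quartiles, IQR and the 1.5*IQR outlier fences described above."""
    q1, q2, q3 = statistics.quantiles(data, n=4)   # Q1, median, Q3
    iqr = q3 - q1
    fences = (q1 - 1.5 * iqr, q3 + 1.5 * iqr)      # outlier boundaries
    return q1, q2, q3, iqr, fences

scores = [35, 37, 39, 40, 42, 45, 49, 51, 58, 71, 85]
q1, q2, q3, iqr, fences = iqr_and_fences(scores)
print(f"Q1={q1}, median={q2}, Q3={q3}, IQR={iqr}, fences={fences}")
# -> Q1=39, median=45, Q3=58, IQR=19, fences=(10.5, 86.5)
```

Any score below 10.5 or above 86.5 would be flagged as an outlier by the 1.5 × IQR rule.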
http://playhouseharlow.com/alberta/how-to-find-the-iqr-in-math.php
The box in the box plot displays the dataset's median, first and third quartiles, and the interquartile range. The line in the center of the box shows the median, the edges show the first and third quartiles, and the interquartile range is visualized by the width of the box.

Usage of side-by-side box plots: two datasets can be analyzed visually by placing two box plots side by side. This allows easy comparison of the median, first and third quartiles and the IQR of the datasets. In Python's Matplotlib library, if multiple datasets are specified in the function pyplot.boxplot(), then those datasets will be visualized as side-by-side box plots.

Box plot outliers: in a box plot, the data points that fall beyond the whiskers are called outliers. They are usually labeled with a dot or an asterisk.

Box plot whiskers: a box plot's whiskers are the lines that extend from the first or third quartile to the points farthest from the median. The upper whisker ends at the largest dataset number smaller than 1.5 IQR above the third quartile, and the lower whisker at the smallest dataset number larger than 1.5 IQR below the first quartile.

Boxplot in Matplotlib: in Python's Matplotlib library, the pyplot.boxplot() function takes a dataset as input and returns a box plot.

1. Boxplots are one of the most common ways to visualize a dataset. Like histograms, boxplots give you a sense of the central tendency and spread of the data. Take a look at the boxplot on this pag…
2. When making a box plot, the easiest place to start is the line that is inside the box. This line is the median of the dataset. Half of the data falls above that line and half falls below it. We …
3. Now that we've drawn the median, let's draw the edges of the box. The box extends to the first and third quartile of the dataset. This visually splits the data into fourths. One-quarter of the d…
4. The whiskers of a boxplot display information related to the spread of the dataset. There are many different ways to plot the whiskers of a boxplot. You might see some boxplots where the whiskers …
5. The final piece of a boxplot is the representation of outliers. An outlier is a point in the dataset that falls outside of the whiskers. Outliers are usually represented with a dot or an asterisk.
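A small, self-contained example of the side-by-side pyplot.boxplot() usage described above. The two datasets are synthetic, generated only for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
dataset_a = rng.normal(50, 10, 200)   # synthetic data for illustration
dataset_b = rng.normal(60, 20, 200)   # wider spread -> taller box, longer whiskers

fig, ax = plt.subplots()
# Passing a list of datasets draws them side by side
# (note: `labels=` was renamed `tick_labels=` in newer Matplotlib releases)
ax.boxplot([dataset_a, dataset_b], labels=["A", "B"])
ax.set_ylabel("value")
plt.show()
```

Dataset B's larger standard deviation shows up directly as a wider IQR box, which is exactly the comparison side-by-side box plots are meant to enable.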
https://external.codecademy.com/learn/learn-statistics-with-python/modules/boxplots
As an analyst, you can explore the relationship between variables both quantitatively and visually. However, only looking at quantitative indicators like correlation could be leaving out much of the bigger picture.

Numerical v/s Numerical

The Anscombe quartet, as shown below, is a classic example: the points in the datasets are such that the mean, variance and correlation are all the same for each of the 4 datasets. What gets confusing is that, though the regression line suggests a linear relationship, the relationship among variables for all the datasets is, in fact, not linear.

Categorical v/s Numerical

Let's now take a look at the relationship between a categorical and a numerical variable with the help of box plots. Here, we look at the relationship between revenue and Operating System (OS). Box plots are a quick and efficient way to visualize a relationship between a categorical and a numerical variable. The boxes represent the observations from the 25th percentile (Q1 - Quartile 1) to the 75th percentile (Q3 - Quartile 3). Together, the data between Q1 and Q3 constitute the IQR - Inter Quartile Range of the data, and the line in the middle of the box represents the median (Quartile 2 - Q2). The points beyond the IQR (below Q1 and above Q3) and within 1.5 times the IQR constitute the whiskers. All points beyond the whiskers qualify to be outliers. (Read this article to know more about outliers.)

From the above graph, it seems the Q1 (25th percentile observation) of iOS users is higher than the median (Q2) of both Android and Windows users. It can be inferred that users who have transacted an amount greater than $2600 (median of the iOS users) are more likely to be iOS users. The type of OS does seem to have a bearing on the amount of revenue generated per transaction.

Categorical v/s Categorical

Suppose we have a table for Gender by OS in the below format. Let's now take a look at the above table by visualizing the relationship between the categorical variables with the help of mosaic plots. The essential idea behind a mosaic plot is to recursively sub-divide a unit square into rectangular tiles such that the area of each tile is proportional to the cell frequency. The largest cell frequency in the above case is for Android users whose gender is male, represented by the larger rectangle shown in blue. The color range from red to colorless to blue is based on a chi-square statistical test. The darker shades indicate that the observed frequency exhibits a large deviation from the expected frequency as compared to lighter shades. In the above plot, the observed number of female Android users seems to be quite low compared to the expected number, while it is the opposite for female iOS users. It may be concluded that gender has a bearing on the type of users on iOS and Android, whereas it doesn't seem to have a bearing on Windows users.

Thus, visualizing variable relationships is not only desirable but also required to confirm the relationships between variables arrived at with quantitative indicators. When it comes to variable relationships, numbers simply may not be enough. Get the full picture by building a visual representation of the data.
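For the categorical-vs-categorical case, statsmodels ships a mosaic-plot helper. The sketch below uses invented gender-by-OS counts, not the article's actual table, and the column names are our own.

```python
import pandas as pd
import matplotlib.pyplot as plt
from statsmodels.graphics.mosaicplot import mosaic

# Hypothetical observations for a Gender-by-OS table (100 users)
df = pd.DataFrame({
    "os":     ["Android"] * 60 + ["iOS"] * 25 + ["Windows"] * 15,
    "gender": ["M"] * 45 + ["F"] * 15     # Android: 45 M, 15 F
            + ["M"] * 10 + ["F"] * 15     # iOS: 10 M, 15 F
            + ["M"] * 8 + ["F"] * 7,      # Windows: 8 M, 7 F
})

# Tile areas are proportional to cell frequencies
mosaic(df, ["os", "gender"])
plt.show()
```

With these made-up counts, the male-Android tile dominates and the female-iOS tile is disproportionately large, the kind of asymmetry the article reads off its chi-square shading.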
https://clevertap.com/blog/exploring-the-relationship-between-variables-visually/?utm_source=ref_article_applesoranges1
A measure of spread, sometimes also called a measure of dispersion, is used to describe the variability in a sample or population. It is usually used in conjunction with a measure of central tendency, such as the mean or median, to provide an overall description of a set of data.

Why is it important to measure the spread of data?

There are many reasons why the measure of the spread of data values is important, but one of the main reasons regards its relationship with measures of central tendency. A measure of spread gives us an idea of how well the mean, for example, represents the data. If the spread of values in the data set is large, the mean is not as representative of the data as if the spread of data is small. This is because a large spread indicates that there are probably large differences between individual scores. Additionally, in research, it is often seen as positive if there is little variation in each data group, as it indicates that the data within each group are similar. We will be looking at the range, quartiles, variance, absolute deviation and standard deviation.

Range

The range is the difference between the highest and lowest scores in a data set and is the simplest measure of spread. So we calculate range as:

Range = maximum value - minimum value

For example, let us consider the following data set:

| 23 | 56 | 45 | 65 | 59 | 55 | 62 | 54 | 85 | 25 |

The maximum value is 85 and the minimum value is 23. This results in a range of 62, which is 85 minus 23. Whilst using the range as a measure of spread is limited, it does set the boundaries of the scores. This can be useful if you are measuring a variable that has either a critical low or high threshold (or both) that should not be crossed. The range will instantly inform you whether at least one value broke these critical thresholds. In addition, the range can be used to detect any errors when entering data. For example, if you have recorded the age of school children in your study and your range is 7 to 123 years old, you know you have made a mistake!

Quartiles and Interquartile Range

Quartiles tell us about the spread of a data set by breaking the data set into quarters, just like the median breaks it in half. For example, consider the marks of the 100 students below, which have been ordered from the lowest to the highest scores, and the quartiles highlighted in red.
| Order | Score | Order | Score | Order | Score | Order | Score | Order | Score |
|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| 1st | 35 | 21st | 42 | 41st | 53 | 61st | 64 | 81st | 74 |
| 2nd | 37 | 22nd | 42 | 42nd | 53 | 62nd | 64 | 82nd | 74 |
| 3rd | 37 | 23rd | 44 | 43rd | 54 | 63rd | 65 | 83rd | 74 |
| 4th | 38 | 24th | 44 | 44th | 55 | 64th | 66 | 84th | 75 |
| 5th | 39 | 25th | 45 | 45th | 55 | 65th | 67 | 85th | 75 |
| 6th | 39 | 26th | 45 | 46th | 56 | 66th | 67 | 86th | 76 |
| 7th | 39 | 27th | 45 | 47th | 57 | 67th | 67 | 87th | 77 |
| 8th | 39 | 28th | 45 | 48th | 57 | 68th | 67 | 88th | 77 |
| 9th | 39 | 29th | 47 | 49th | 58 | 69th | 68 | 89th | 79 |
| 10th | 40 | 30th | 48 | 50th | 58 | 70th | 69 | 90th | 80 |
| 11th | 40 | 31st | 49 | 51st | 59 | 71st | 69 | 91st | 81 |
| 12th | 40 | 32nd | 49 | 52nd | 60 | 72nd | 69 | 92nd | 81 |
| 13th | 40 | 33rd | 49 | 53rd | 61 | 73rd | 70 | 93rd | 81 |
| 14th | 40 | 34th | 49 | 54th | 62 | 74th | 70 | 94th | 81 |
| 15th | 40 | 35th | 51 | 55th | 62 | 75th | 71 | 95th | 81 |
| 16th | 41 | 36th | 51 | 56th | 62 | 76th | 71 | 96th | 81 |
| 17th | 41 | 37th | 51 | 57th | 63 | 77th | 71 | 97th | 83 |
| 18th | 42 | 38th | 51 | 58th | 63 | 78th | 72 | 98th | 84 |
| 19th | 42 | 39th | 52 | 59th | 64 | 79th | 74 | 99th | 84 |
| 20th | 42 | 40th | 52 | 60th | 64 | 80th | 74 | 100th | 85 |

The first quartile (Q1) lies between the 25th and 26th student's marks, the second quartile (Q2) between the 50th and 51st student's marks, and the third quartile (Q3) between the 75th and 76th student's marks. Hence:

First quartile (Q1) = (45 + 45) ÷ 2 = 45
Second quartile (Q2) = (58 + 59) ÷ 2 = 58.5
Third quartile (Q3) = (71 + 71) ÷ 2 = 71

In the above example, we have an even number of scores (100 students, rather than an odd number, such as 99 students). This means that when we calculate the quartiles, we take the sum of the two scores around each quartile and then halve them (hence Q1 = (45 + 45) ÷ 2 = 45). However, if we had an odd number of scores (say, 99 students), we would only need to take one score for each quartile (that is, the 25th, 50th and 75th scores). You should recognize that the second quartile is also the median.

Quartiles are a useful measure of spread because they are much less affected by outliers or a skewed data set than the equivalent measures of mean and standard deviation. For this reason, quartiles are often reported along with the median as the best choice of measure of spread and central tendency, respectively, when dealing with skewed data and/or data with outliers.

A common way of expressing quartiles is as an interquartile range. The interquartile range describes the difference between the third quartile (Q3) and the first quartile (Q1), telling us about the range of the middle half of the scores in the distribution. Hence, for our 100 students:

Interquartile range = Q3 - Q1 = 71 - 45 = 26

However, it should be noted that in journals and other publications you will usually see the interquartile range reported as 45 to 71, rather than the calculated range.

A slight variation on this is the semi-interquartile range, which is half the interquartile range = ½ (Q3 - Q1). Hence, for our 100 students, this would be 26 ÷ 2 = 13.
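The averaging rule used above is easy to implement directly. A short sketch follows; note that textbooks and software packages differ on quartile conventions, so results from, e.g., numpy or R may differ slightly from this method.

```python
def quartiles_midpoint(scores):
    """Quartiles via the rule used above: with an even sample size,
    average the two observations straddling each quartile position;
    with an odd size, take the single straddled observation."""
    s, n = sorted(scores), len(scores)

    def q(p):
        pos = p * n                      # e.g. 25.0 for Q1 when n = 100
        if pos == int(pos):              # quartile falls between two scores
            k = int(pos)
            return (s[k - 1] + s[k]) / 2
        return s[int(pos)]               # odd case: take the single score
    return q(0.25), q(0.50), q(0.75)

# Small invented sample (even n), just to exercise the averaging rule
q1, q2, q3 = quartiles_midpoint([35, 37, 44, 45, 45, 58, 59, 71, 71, 84, 85, 85])
print(q1, q2, q3, "IQR =", q3 - q1)      # -> 44.5 58.5 77.5 IQR = 33.0
```

Applied to the 100 student marks above, this function reproduces Q1 = 45, Q2 = 58.5 and Q3 = 71, since the quartile positions fall exactly between the 25th/26th, 50th/51st and 75th/76th scores.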
https://muzic-ivan.info/this-measure-of-spread-is-affected-the-most-by-outliers/
variable (mean snowfall in meters).

Figure 3.1 Clustered bar chart comparing the mean snowfall of alpine forests between 2013 and 2015 in Mammoth, CA; Mount Baker, WA; and Alyeska, AK.

Notice that Figure 3.1 gives a clear depiction of the differences in the mean snowfall at the three localities. By adding error bars (standard deviations), the researcher is also able to illustrate the variance in each one of the groups of data. For instance, the snowfall in Alyeska, AK is less variable than the snowfall in Mount Baker, WA.

Figure 3.2 Clustered bar chart comparing the mean snowfall of alpine forests between 2013 and 2015 in Mount Baker, WA and Alyeska, AK. An improperly scaled axis exaggerates the differences between groups.

One of the most important considerations when displaying data with a bar graph is the scaling of the axes. Unfortunately, graphs built in the programs Excel and Numbers are often created with an improperly scaled y-axis. If the y-axis does not begin with zero, then the differences between groups appear exaggerated. By “zooming in” on this smaller set of y-axis values, the graph can be misleading. Take the previous example of snowfall measurements. It is clear that the average snowfall in Mount Baker, WA and Alyeska, AK are very similar. However, if the graph of these two localities is built with a modified y-axis as in Figure 3.2, with a minimum value set to 16.5, the differences appear dramatic, when in reality they are not.

Clumped Bar Charts

If this same researcher was interested in illustrating the trends in snowfall patterns over a 3-year period, a clumped bar chart would be useful. Figure 3.3 shows snowfall patterns within each locality over a 3-year period. By using a clumped bar chart, the researcher can demonstrate trends within each category of data. For example, we can see that the snowfall was exceptionally high in 2014 at the Mount Baker, WA location; however, the snowfall at the Mammoth, CA location was fairly stable over time.

Figure 3.3 Clumped bar chart comparing the mean snowfall of alpine forests by year (2013, 2014, and 2015) in Mammoth, CA; Mount Baker, WA; and Alyeska, AK.

Stacked Bar Charts

Next the researcher wants to illustrate differences in the timing of snowfall by month within each site. For this example, a stacked bar chart is helpful in illustrating the relative contributions of parts to the whole. Figure 3.4 shows the amount of snow that fell within the months of January, February, and March, 2015. Notice that in Mammoth, CA there was zero snowfall in the month of January.

Figure 3.4 Stacked bar chart comparing the mean snowfall of alpine forests by month (January, February, and March) for 2015 in Mammoth, CA; Mount Baker, WA; and Alyeska, AK.

Histograms

Histograms are another form of bar chart, used to display continuous categories, like a consecutive range of values for age. If your data are made up of quantitative variables, then consider constructing a histogram. The format is similar to that of a bar chart; however, the categories along the bottom are represented with a set range of values. Hence, both axes will be represented on a numerical scale. Also, the aesthetics are slightly different because there are no spaces between the bars. In a histogram, there will never be space between bars because the horizontal axis is representing continuous values (Figure 3.5). If a space does exist between bars, then it means that there are no values for that range.

Figure 3.5 Histogram of seal size.
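Before moving on to box plots, here is how a clustered bar chart with error bars like Figure 3.1 might be drawn in Matplotlib. The snowfall numbers below are placeholders, not the chapter's data; note the explicit zero baseline, which avoids the distortion the text warns about in Figure 3.2.

```python
import matplotlib.pyplot as plt

sites = ["Mammoth, CA", "Mount Baker, WA", "Alyeska, AK"]
mean_snowfall = [7.2, 17.5, 17.0]   # illustrative values, not the study's data
std_dev = [1.0, 3.5, 1.2]           # error bars show within-group variability

fig, ax = plt.subplots()
ax.bar(sites, mean_snowfall, yerr=std_dev, capsize=4)
ax.set_ylabel("Mean snowfall (m)")
ax.set_ylim(bottom=0)               # start the y-axis at zero to avoid exaggeration
plt.show()
```

Deleting the `set_ylim` line and letting the axis start near the smallest bar would reproduce exactly the misleading "zoomed-in" effect described above.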
Box Plots

The box plot (also called a box and whisker plot) is a convenient way to illustrate several key descriptive statistics from a dataset. Box plots show the median, as well as the distribution of the data through the use of quartiles, which divide ranked data into four equal groups, each consisting of a quarter of the data. Consider the following dataset: [sample values not reproduced in this extract]

The first step in developing a box plot for these data is to define the quartiles. Several methods are currently debated regarding how to define quartiles; the following example uses the simplest and most intuitive method. In the sample dataset above, the numbers must first be rearranged so that they are in order. Second, find the median, which is also defined as the second quartile (Q2). In the current example, there is an even number of data points, so the median is calculated as the average of the middle two numbers (Q2 = 24). If there were an odd number of points, the median would be excluded for the next step. Third, calculate the median of each half of the data (on either side of the median); these medians are the first and third quartiles (Q1 and Q3).

The box component of a box plot spans the first quartile to the third quartile, and is known as the interquartile range (IQR); the median is shown inside the box at the position of the second quartile, as illustrated in Figures 3.6 and 3.7.

Figure 3.6 Example box plot showing the median, first and third quartiles, as well as the whiskers.
Figure 3.7 Comparison of the box plot to the normal distribution of a sample population.

By showing the median, as well as the position of the first and third quartiles, box plots give information about the degree of dispersion, as well as the skewness of the data. Box plots often also have lines (the whiskers) extending from the box to represent the variability of the data outside of the upper and lower quartiles. The whiskers usually mark the minimum and maximum values for the dataset. However, if the dataset contains outliers, the whiskers will extend only up to a certain point, defined as Q1 − 1.5 × IQR or Q3 + 1.5 × IQR (Figure 3.7). Outliers will be depicted as points outside of the whiskers (Figure 3.8).

Figure 3.8 Sample box plot with an outlier.

The box plots on previous pages, Figures 3.6 and 3.7, have been drawn for illustrative purposes in a horizontal orientation, but are most often shown vertically, as in Figure 3.8. In Figure 3.8, descriptive information from two groups of data is depicted. Although the medians for the two groups are the same, the differences in the dispersion and skew of the data are apparent. While group B shows a normal distribution, group A shows a “positive skew,” with a tail that extends in the positive direction. The box plot for group A also shows the position of an outlier, whose value is beyond the range of the whiskers.

Generating box plots is straightforward in both SPSS and R and is included in this book's tutorials. However, generating box plots in Excel and Numbers is both lengthy and complex, and involves manipulating stacked bar charts. If you do not have access to SPSS or R, we recommend looking for a free, online box plot generator, which is an easy and quick solution for creating box plots of your data.

Tutorials

How to Make a Bar Chart in Excel

The following tutorial will walk you through the construction of a bar chart (also known as a column graph or bar plot) using Excel. The data involve the number of rows of snail radula. *Data taken from the research of Vanessa C.
Morales, Robert Candelaria, and Dr. Kathleen Weaver. Refer to Chapter 12 for tips and tools when using Excel. Excel offers two methods to construct a simplified bar chart with error bars. While the first method may be more challenging at first, the lessons learned will give you greater mastery and flexibility. Calculate the average and standard deviation of the radula from each population prior to beginning the tutorial.

Method 1
1. Arrange data in columns on the spreadsheet.
2. Click on an empty cell. Select Insert, Column, and select the first 2-D Column option. There are several types of bar graphs available. Use the one appropriate for the data you want to display.
3. A blank canvas will appear.
4. Right click on the blank canvas and choose the Select Data option.
5. Under Legend Entries, select Add. Note: Add each data point as a separate series so that the standard deviation bars can be entered separately.
6. Select the icon corresponding to the Series name subheading.
7. Select the first series title, then click on the icon to the right.
8. Select the icon corresponding to the Series values.
9. Select the first value, then click the icon on the right.
10. Click OK.
11. You will be directed to the original popup. Repeat steps 5-10 to input the remaining values.
12. After the second variable is added, you should be left with a graph that looks like the following. After the third variable:
13. Once all the variables have been added to the graph, click OK.
14. A very basic column graph will appear, similar to the one below.
15. As a default, Excel labels the x-axis as “1.” To delete this label, select label “1.” A box will appear. Then, press delete on your keyboard.
https://toc.123doc.net/document/2754666-3-bar-graphs-histograms-and-box-plots.htm
How to Describe the Variation of Data in R

A single number doesn’t tell you much about your data. Often it’s as important to know the spread of your data. You can use R to look at this spread using a number of different approaches. First, you can calculate either the variance or the standard deviation to summarize the spread in a single number. For that, you have the convenient functions var() for the variance and sd() for the standard deviation. For example, you calculate the standard deviation of the variable mpg in the data frame cars like this:

> sd(cars$mpg)
6.026948

Next to the mean and variation, you also can take a look at the quantiles. A quantile, or percentile, tells you how much of your data lies below a certain value. The 50 percent quantile, for example, is nothing but the median. Again, R has some convenient functions to help you with looking at the quantiles.

How to calculate the data range in R

The most-used quantiles are actually the 0 percent and 100 percent quantiles. You could just as easily call them the minimum and maximum, because that’s what they are. You can get both the min() and max() functions together using the range() function. This function conveniently gives you the range of the data. So, to know between which two values all the mileages are situated, you simply do the following:

> range(cars$mpg)
10.4 33.9

How to calculate data quartiles in R

The range still gives you only limited information. Often statisticians report the first and the third quartile next to the range and the median. These quartiles are, respectively, the 25 percent and 75 percent quantiles, which are the numbers for which one-fourth and three-fourths of the data is smaller. You get these numbers using the quantile() function, like this:

> quantile(cars$mpg)
    0%    25%    50%    75%   100%
10.400 15.425 19.200 22.800 33.900

The quartiles are not the same as the lower and upper hinge calculated in the five-number summary. The latter two are, respectively, the median of the lower and upper half of your data, and they differ slightly from the first and third quartiles. To get the five-number statistics, you use the fivenum() function.

How to get up to speed with the quantile function in R

The quantile() function can give you any quantile you want. For that, you use the probs argument. You give the probs (or probabilities) as a fractional number. For the 20 percent quantile, for example, you use 0.20 as an argument for the value. This argument also takes a vector as a value, so you can, for example, get the 5 percent and 95 percent quantiles like this:

> quantile(cars$mpg, probs=c(0.05, 0.95))
    5%    95%
11.995 31.300

The default value for the probs argument is a vector representing the minimum (0), the first quartile (0.25), the median (0.5), the third quartile (0.75), and the maximum (1). All of these functions have an argument na.rm that allows you to remove all NA values before calculating the respective statistic. If you don’t do this, any vector containing NA will have NA as a result. This works identically to the na.rm argument of the sum() function.
https://www.dummies.com/programming/r/how-to-describe-the-variation-of-data-in-r/
Calculated Formula questions present students with a question that requires them to make a calculation and respond with a numeric answer. The numbers in the question change with each user and are pulled from a range that you set. The correct answer is a specific value or a range of values. You may grant partial credit for answers falling within a range. Calculated Formula questions are graded automatically.

In this example, the numbers 6 and 9 are randomly generated from a range of values set by an instructor. An instructor created this question by typing the following question text: If a small glass can hold [x] ounces of water, and a large glass can hold [y] ounces of water, what is the total number of ounces in 4 large and 3 small glasses of water? When a student views the question, the variables [x] and [y] are replaced with values that are generated randomly from number ranges that an instructor specifies.

Before You Begin

The process for adding a calculated question to an assessment has three steps:
- Create the question and formula
- Define the values for the variables
- Confirm the variables and answers

This question type allows you to randomize the value of variables in an equation, making it useful when creating math drills or testing students seated closely together.

How to Create the Question and Formula

- Access a test, survey, or pool. To learn more, see Tests, Surveys, and Pools.
- On the action bar, point to Create Question and click Calculated Formula.
- In the Question Text box, type the information that will appear to students. The question text must contain at least one variable. Surround variables with square brackets. Variables are replaced by values when shown to students. Variables can be letters, digits (0-9), periods (.), underscores (_) and hyphens (-). Variables cannot contain the letters “e,” “i” and “pi” because they are reserved. Variable names must be unique, and you cannot reuse them. All other occurrences of the opening rectangular bracket (“[“) should be preceded by the back-slash (“\”).
- Type the Answer Formula. The formula is the mathematical expression used to find the correct answer. Choose operators from the buttons across the top of the Answer Formula box. In our example, the formula is 4y+3x. The answer formula tool is written by WIRIS. To learn more, see the WIRIS manual [PDF]. The formula is not visible to students; it is used by Blackboard to determine the correct answer to the question.
- In the Options area, leave the Answer Range at zero if the answer must be exact. If you will allow a range of answers, set the answer options to define the range of full-credit answers. You can also Allow Partial Credit for a range of answers, and select Units Required. To learn more, see About Setting Answer Options.
- Click Next to proceed.

Significant Figures in Calculated Formula Questions

Previously, when instructors created Calculated Formula questions, they could only choose to have the formula calculate to a number of decimal places. This update allows the formula to calculate to a number of significant figures. When creating a Calculated Formula question, the instructor will now see the option to select Significant Figures under Answer Set options. When the instructor presses Calculate, the randomly generated variables will now display significant figures. Note: Courses that are copied/restored that contain Calculated Formula questions will maintain the decimal values previously specified, but may be edited to change calculated answers to Significant Figures.
How to Define the Variables

The next page in the process defines the variables in the formula.
- In the Define Variables section, provide the Minimum Value and Maximum Value. When the question is presented to a student, Blackboard Learn replaces the variable with a value randomly selected from the range you defined. Optionally, select a decimal place using the Decimal Places drop-down list.
- In the Answer Set Options section, select the Decimal Places for Answer from the drop-down list. Students must provide the correct answer to this decimal place.
- Type the Number of Answer Sets. This determines the number of possible variations of the question that will be presented to students. You can also specify the number of decimal places and if the correct answer format is normal or exponential.
- Click Calculate to populate the answer sets.

How to Confirm the Variables and Answers

The last step in the process displays the answer sets in a table. Each answer set represents one of the possible variations of the question that can be presented to students.
- If needed, edit answer sets and click Calculate to update the list. Click Remove to the right of an answer set to delete it.
- Optionally, type feedback for correct and incorrect answers.
- Optionally, add question metadata. To learn more, see Question Metadata. You must enable the options for feedback and metadata on the Question Settings page for those options to appear in individual questions.
- Click Submit and Create Another -OR- Submit to add the question to the test.

About Setting Answer Options

Options for partial credit and units appear after you select the check box for Allow Partial Credit or Units Required. In the preceding example:
- An answer that is within plus or minus 4 is awarded 100% of the point total.
- An answer that is within the partial credit range of plus or minus 5 to 8 is awarded 50% of the point total.

The available options are:
- Answer Range: The range of answers that are awarded full credit. Select whether it is a Numeric range or a Percentage range. If the answer must be exact, type zero for the range.
- Allow Partial Credit: Allow partial credit on a less accurate range of answers. Set the Partial Credit Points Percentage to be awarded if the student's answer is within the partial credit range.
- Units Required: The unit of measurement must be provided in the student's answer. Type the Answer Units and Units Points Percentage to be awarded if the units are entered correctly.

Examples

The following two examples use variables in equations. You can see how the instructor crafted the question text and the resulting student view of the question.
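To make the mechanics concrete, here is a rough simulation of what the platform does with the example question: generate random variable values, apply the answer formula 4y+3x, and grade with a full-credit range and partial credit. This is an outside sketch, not Blackboard code; the ranges and function names are our own.

```python
import random

def make_question(x_range=(1, 10), y_range=(1, 12)):
    """Generate one variant of the glasses-of-water question above."""
    x = random.randint(*x_range)   # small-glass ounces (assumed range)
    y = random.randint(*y_range)   # large-glass ounces (assumed range)
    answer = 4 * y + 3 * x         # the instructor's answer formula
    text = (f"If a small glass holds {x} oz and a large glass holds {y} oz, "
            f"how many ounces are in 4 large and 3 small glasses?")
    return text, answer

def grade(response, answer, full_range=0, partial_range=None, partial_pct=0.5):
    """Full credit within full_range of the answer; optional partial credit."""
    err = abs(response - answer)
    if err <= full_range:
        return 1.0
    if partial_range is not None and err <= partial_range:
        return partial_pct
    return 0.0

text, ans = make_question()
print(text)
# Mirrors the example above: within +/-4 -> 100%, within +/-5 to 8 -> 50%
print(grade(ans + 5, ans, full_range=4, partial_range=8))   # -> 0.5
```

Each call to make_question() corresponds to one "answer set" in Blackboard's terminology: one concrete (x, y, answer) variation of the same underlying question.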
https://www.etskb-fac.cidde.pitt.edu/blackboard/calculated-formula-questions/
Slightly less than one-half of one-tenth, or 2.5%. Is this the answer to the riddle?

The percentage difference between two values is calculated by dividing the absolute value of the difference between the two numbers by the average of those two numbers. Multiplying the result by 100 will yield the solution in percent, rather than decimal form. Refer to the equation below for clarification. Percentage increase and decrease are calculated by computing the difference between two values and comparing that difference to the initial value. Mathematically, this involves using the absolute value of the difference between two values, and dividing the result by the initial value, essentially calculating how much the initial value has changed.

This percentage calculator is a tool that lets you do a simple calculation: what percent of X is Y? The tool is pretty straightforward. All you need to do is fill in two fields, and the third one will be calculated for you automatically. This method will allow you to answer the question of how to find a percentage of two numbers. Furthermore, our percentage calculator also allows you to perform calculations in the opposite way, i.e., how to find a percentage of a number. Try entering various values into the different fields and see how quick and easy-to-use this handy tool is. Is only knowing how to get a percentage of a number not enough for you? If you are looking for more extensive calculations, hit the advanced mode button under the calculator. (Source: www.omnicalculator.com)

Generally, the way to figure out any percentage is to multiply the number of items in question, or X, by the decimal form of the percent. To figure out the decimal form of a percent, simply move the decimal two places to the left. For example, the decimal form of 10 percent is 0.1. Then, to calculate what 10 percent of, say, 250 students is, simply multiply the number of students by 0.1.
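The two formulas described above translate directly into code: percent change is measured relative to the initial value, while percent difference is measured relative to the average of the two values.

```python
def percent_change(initial, final):
    """Increase/decrease relative to the initial value."""
    return (final - initial) / initial * 100

def percent_difference(a, b):
    """Absolute difference relative to the average of the two values."""
    return abs(a - b) / ((a + b) / 2) * 100

print(percent_change(200, 250))      # -> 25.0   (a 25% increase)
print(percent_difference(200, 250))  # -> 22.22... (note: different answer!)
print(0.10 * 250)                    # 10 percent of 250 students -> 25.0
```

The example makes the distinction visible: 200 to 250 is a 25% increase, but the percent difference between 200 and 250 is only about 22.2%, because the denominator is the average (225) rather than the starting value.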
https://www.futurestarr.com/blog/mathematics/what-percentage-of-30-is-10-or
What’s the correct math calculation for deciding whether or not to hazard a guess on the next Millionaire question? The payouts for the initial round are as follows: $25,000, $15,000, $10,000 … The contestant starts before the first question with a bank of zero, so a complete guess (4 choices translates to a 25% chance of randomly picking the correct answer) is always better than walking away before the first question (besides the fact that you can skip the first two questions if nothing else). After the first question, the contestant’s average bankroll will be ($25,000 + $15,000 + …) divided by the total number of initial round payouts. Without knowing the precise numbers, my guess is about $7,000. That would leave about $6,000 average for the next question’s payout, so the total estimated monetary outcome of a correct answer for question #2 would be about $13,000. A random guess with a 25% chance would yield $13,000 × 0.25 = about $3,250, versus $1,000 × 0.75 = $750 for a wrong answer, so it pays to randomly guess on question #2. But although $6,000 would be the average payout in general, for any particular contestant, it would be calculated by subtracting the actual dollar amount of the first round and then doing the averaging. Then factor in that a contestant may be 99.9% certain that a particular choice is wrong although unsure of the other choices. That would change the percentage for a random correct guess to 33.33%, and if the contestant is certain that two choices are wrong, then the random guess becomes a 50/50 outcome. Has anyone calculated the math on when to proceed on a random guess based on payout odds?
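The expected-value comparison in the post generalizes neatly, including the cases where the contestant can rule out one or two answers. A small sketch using the post's rough numbers ($13,000 total if right on question #2, a $1,000 floor if wrong):

```python
def guess_ev(prize_if_right, p_right, fallback_if_wrong):
    """Expected value of guessing; compare against walking away."""
    return p_right * prize_if_right + (1 - p_right) * fallback_if_wrong

# Figures from the post: ~$13,000 if right on question #2, $1,000 if wrong
for eliminated in range(4):           # answers the contestant can rule out
    p = 1 / (4 - eliminated)
    ev = guess_ev(13_000, p, 1_000)
    print(f"can rule out {eliminated}: P(right)={p:.0%}, EV=${ev:,.0f}")
```

With no eliminations the EV of guessing is $4,000 (the $3,250 and $750 legs combined); ruling out one answer lifts it to $5,000 and ruling out two to $7,000. The decision rule is simply: guess whenever that EV exceeds the amount you would keep by walking away.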
https://boards.straightdope.com/t/math-calculation-for-who-wants-to-be-a-millionaire-tv-gameshow/678943
Can I set a penalty on a TM (all TUs of the TM)? 6 comments
- Annelie: Yes, you can set penalties for text differences (wording, spelling) and/or for formatting differences (inline tags). You can set this up independently for each TM/termbase, but also in the default configuration to be applied to all. Independently: select the TM and click “Search settings”. Default configuration: “Settings” > “Translation Memories”.
- Mike Holland: Exactly how many percentage points are subtracted at each level of the setting?
- Stephan Böhmig: Your question goes straight to the point. I'd love to be able to tell you "2% less", but I am unable to: the mechanism that compares two segments in order to calculate a percentage is quite complex. It looks at word positions, segment lengths, languages, character differences, word/token differences, grammatical adjustments and much more. After comparison, the system obtains a list of differences (e.g. a comma not present in the other segment), each with information on "importance" and an individual percentage: "house" and "houses" will give a < 100% similarity. The system then combines all those differences into one single match percentage. The algorithm is driven by many parameters, among which are the settings described by Annelie. Some parameters are themselves calculated and others are preset by us. When you move the text or formatting differences settings up or down, the individual differences' percentages are adjusted. For example, setting the "text differences" penalty to "Highest" will change 99% to 98%, 98% to 96%, 97% to 94% and so on. "Lowest" will change all values below 99% to 99%. Once this is done, all the differences will be combined together using other parameters. It is therefore not possible to give a simple answer other than: try out which settings are best for you. ASIAN LANGUAGES: for Japanese, Chinese or Korean it is highly recommended to set the "text difference" penalty to "Highest". This will substantially reduce the amount of noise in results and lead to best performance.
- Richard Issac Delgado Jr.: What is a TM?
- Annelie: TM is the abbreviation for translation memory.
- Sales: Hello Richard, indeed, TM stands for Translation Memory.
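Purely as an illustration of the adjustment step Stephan describes (this is not Wordbee's actual algorithm; the scaling factor is inferred from his 99→98, 98→96, 97→94 example):

def adjust_match(match_pct: float, setting: str) -> float:
    # "Highest" appears to double each value's distance from 100%;
    # "Lowest" flattens everything below 99% up to 99%.
    if setting == "lowest":
        return min(max(match_pct, 99.0), 100.0)
    factor = {"normal": 1.0, "highest": 2.0}[setting]
    return max(0.0, 100.0 - factor * (100.0 - match_pct))

for pct in (99, 98, 97):
    print(pct, "->", adjust_match(pct, "highest"))  # 98, 96, 94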
https://wordbee.zendesk.com/hc/en-us/community/posts/207475398-Can-I-set-penalty-to-a-TM-
To improve prognostic classification, the MDS Risk Analysis Workshop developed the Myelodysplastic Syndrome International Prognostic Scoring System (IPSS). The IPSS was published in 1997 and updated in 2012. [17, 18] The revised IPSS (IPSS-R) score is calculated on the basis of five variables:
- Hemoglobin level
- Absolute neutrophil count
- Platelet count
- Percentage of bone marrow blasts
- Cytogenetic category
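For readers who want to see the arithmetic, here is a rough sketch of an IPSS-R style calculation. The cutoffs and point values below are my transcription of the published IPSS-R (Greenberg et al., 2012) and should be verified against the original paper; this is illustrative only, not for clinical use:

# Cytogenetic risk group -> points, per the published IPSS-R to the best
# of my knowledge -- verify before relying on any of these values.
CYTO_POINTS = {"very good": 0, "good": 1, "intermediate": 2, "poor": 3, "very poor": 4}

def ipss_r_score(hb_g_dl, anc_10e9_l, plt_10e9_l, blast_pct, cyto):
    score = CYTO_POINTS[cyto]
    score += 0 if blast_pct <= 2 else 1 if blast_pct < 5 else 2 if blast_pct <= 10 else 3
    score += 0 if hb_g_dl >= 10 else 1 if hb_g_dl >= 8 else 1.5
    score += 0 if plt_10e9_l >= 100 else 0.5 if plt_10e9_l >= 50 else 1
    score += 0 if anc_10e9_l >= 0.8 else 0.5
    return score

print(ipss_r_score(hb_g_dl=9.5, anc_10e9_l=0.6, plt_10e9_l=60, blast_pct=3, cyto="good"))
# 1 + 1 + 1 + 0.5 + 0.5 = 4.0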
https://www.medscape.com/answers/207347-62554/how-is-the-myelodysplastic-syndrome-international-prognostic-score-for-myelodysplastic-syndrome-mds-calculated
Debt-to-income (DTI) ratios are an important real estate math concept that you will need to know for your real estate licensing exam. When your buyers are getting a loan, one of the major factors in qualifying for the loan will be their debt-to-income ratios (DTI). Here’s what you need to know. There are two types of DTI ratios:
- The Front End Ratio is calculated as the percentage of income that will go towards the homeowner’s future PITI (Principal, Interest, Taxes, and Insurance). Essentially, the bank wants to check that your buyer’s mortgage payment isn’t going to eat up all of their income and that they will have enough left for other living expenses.
- The Back End Ratio is calculated as the percentage of income that goes towards all of their debts, including the future PITI, school loans, car payments, credit card payments, etc. Again, the bank wants to know that your buyer is not already so far in debt that they cannot afford their mortgage payment.
Most banks have set front-end and back-end DTI ratios that they use as lending guidelines. In the US, the typical ratios are 0.28 for the front end and 0.36 for the back end. This means that a buyer’s future PITI payment cannot exceed 28% of their monthly income, and their total debt (including the future PITI payment) cannot exceed 36% of their monthly income. On the real estate licensing exam, a common real estate math problem will be to calculate a buyer’s front-end or back-end ratio. Another common exam question will be to calculate the maximum PITI payment based on a buyer’s monthly income and current debt amounts, as in the worked examples and the sketch below.
Real Estate Math Example
Question: Doug has a monthly income of $3000. What is the maximum PITI that Doug can pay using the 0.28 front-end DTI ratio?
Answer: To get the maximum PITI, you multiply $3000 x 0.28 = $840/month. Doug can afford a maximum PITI of $840/month.
Question: Doug decides to buy a new car before he buys a house and incurs a $50/month debt. Can Doug still afford a $840/month mortgage payment?
Answer: First, we calculate Doug’s back-end DTI ratio, which is ($840 + $50) / $3000 = 0.30. Since this ratio is less than the typical 0.36 back-end DTI, Doug can still afford the $840/month PITI payment.
For more real estate math practice, check our 125 Real Estate Math Problems Solved Workbook and Solutions Manual. Bonus video explanations of all questions are also included!
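Here is the same arithmetic as a small Python sketch (my own helper names; the 0.28/0.36 ratios are the typical guidelines quoted above):

def max_piti(monthly_income: float, front_end: float = 0.28) -> float:
    # front-end check: PITI may not exceed 28% of monthly income
    return monthly_income * front_end

def passes_back_end(piti: float, other_debt: float, monthly_income: float,
                    back_end: float = 0.36) -> bool:
    # back-end check: PITI plus all other debt may not exceed 36% of income
    return (piti + other_debt) / monthly_income <= back_end

print(max_piti(3000))                  # 840.0
print(passes_back_end(840, 50, 3000))  # True: (840 + 50) / 3000 = ~0.30 <= 0.36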
https://ezrealestatemath.com/real-estate-math-concept-explained-debt-to-income-ratios/
We often get questions, and inevitable misunderstandings, regarding the difference between tax collections and tax burdens by state. A state's tax burden is calculated as the total amount its residents pay in taxes divided by the state's total income, i.e., taxes as a percentage of state income. The burden also reflects the economic incidence of taxes that are commonly shifted to out-of-state taxpayers. Tax collections are simply what a state collects in taxes (which is to say, not the same as revenue). The goal is to focus not on the tax collectors but on the taxpayers. That is, we answer the question: What percentage of their income are the residents of this state paying in state and local taxes? We are not trying to answer the question: How much money have state and local governments collected? The Census Bureau publishes the definitive comparative data answering that question. Here are some examples of the difference between collections (focusing on the tax collector) and burdens (focusing on the taxpayer). When Connecticut residents work in New York City and pay income tax there to both the state and the city, the Census Bureau will duly tally those amounts as New York tax collections, but we will count them as part of the tax burden of Connecticut’s residents. Here I’ll provide maps of Tax Foundation-calculated tax burdens by state and tax collections by state for comparison. As you can see, a few states move significantly in rank. Alaska and North Dakota have low tax burdens—50th and 33rd respectively—while having very high tax collections—1st and 6th highest in the country. This disparity is caused by their oil, which these states tax before it is sold to people outside the state. Click here to see the State and Local Tax Burden methodology and study.
https://taxfoundation.org/monday-maps-state-and-local-tax-burdens-vs-state-tax-collections/
Let us suppose that 36% of respondents in the poll decided to vote for candidate A and the margin of error for the survey published by the researchers is 3%. Then the survey shows that the ’true’ percentage of people decided to vote for the candidate is somewhere in the interval between 36 − 3 = 33% and 36 + 3 = 39%. It is as simple as that, as long as we believe there is a ’true’ percentage and we do not question the 3% error. If you ask an eligible voter on the street whom he or she would vote for, and ask again five minutes later, you may get an answer different from the first one. Not because the voter is a liar, but simply because he or she does not care. Despite that, the belief in the existence of a true percentage of decided voters is no objection to the practical use of a poll result. Finding the margin of error, and consequently the interval where this percentage lives despite the variable mood of the voters, is more important.

To explain the margin of error calculation we switch from percentages to proportions. That means the true percentage is calculated as 100p, where p is the true proportion of decided voters among all eligible ones. If the survey asks for the opinion of only n voters and just x of them say they choose candidate A, then x∕n is an estimate of p and 100x∕n is an estimate of the true percentage of decided voters. Technically, some rounding is involved to get a percentage without decimals.

The formula for the margin of error commonly assumes that voters are chosen for the poll randomly, their answers are independent from each other, the chance of a vote for A is p, and the answers are written down as a series of zeros and ones. Zero means the answer is ’no to A’ and one otherwise. Undecided voters are excluded from the poll. The assumptions can be made more realistic by using special sampling techniques, such as randomization, stratification, etc. Under the above assumptions, the answers follow the same model as independent draws from an urn containing black and white balls, with the ball returned after each draw. This model is described by the so-called binomial distribution with expectation np and variance np(1 − p). Due to the central limit theorem, if the sample size n is sufficiently large, then

z = (x∕n − p) / √(p(1 − p)∕n)

is approximately a standard normal random variable, and the probability of z occurring in the interval (−1.96, 1.96) is approximately 0.95. In other words, the interval

x∕n ± 1.96 √(p(1 − p)∕n)

covers the true proportion p with probability 0.95, or in about 95 cases out of 100, if you wish, and covers the true percentage of decided voters accordingly. The quantity

me = 1.96 √(p(1 − p)∕n)

is known as the margin of error. Since the proportion p is unknown, in calculations p is replaced by the proportion x∕n.

One question people often ask is: how many respondents must enter the poll in order that the margin of error me does not exceed, say, 3%? An approximate answer, based on the above considerations, is obtained as follows. If we knew p in advance, then the answer would follow from the formula for me:

n = ⌈1.96² p(1 − p) ∕ me²⌉,

where the brackets indicate that we must round up to the nearest integer larger than the bracketed fraction. For an unknown p we must consider the value for which p(1 − p) is the largest, which is p = 0.5. If p = 0.5 then p(1 − p) = 0.25, and that yields

n = ⌈0.9604 ∕ me²⌉.

Hence, if a survey company wants to claim a margin of error not exceeding 3% in the poll indicated above, then the investigators must get answers from at least 1068 randomly selected respondents. Margin of Error and Opinion Poll on Wikipedia.
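The reconstructed formulas above translate directly into code. A minimal Python sketch, using the 95% confidence factor z = 1.96:

import math

def margin_of_error(x: int, n: int, z: float = 1.96) -> float:
    # me = z * sqrt(p_hat * (1 - p_hat) / n), with p estimated by x / n
    p_hat = x / n
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

def sample_size(me: float, p: float = 0.5, z: float = 1.96) -> int:
    # worst case p = 0.5 maximizes p * (1 - p)
    return math.ceil(z**2 * p * (1 - p) / me**2)

print(sample_size(0.03))            # 1068, the figure given in the text
print(margin_of_error(360, 1000))   # ~0.0298, i.e. about 3% when 36% of 1000 choose A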
http://eminf.com/margin.html
Understanding Net Promoter Score
What is Net Promoter Score (NPS)? Net Promoter Score is a methodology used to calculate the answer to a single question: How likely are you to recommend brand/dealer/product to a friend or colleague?
Why use Net Promoter Score? The NPS methodology is a leading indicator of growth. When an organization strives to improve its NPS, it will tend to experience faster growth and increased profits, and will likely outperform others in the market.
Calculating Net Promoter Score: NPS is calculated by dividing respondents into 3 categories:
- Promoters (respond 9-10) are loyal enthusiasts fueling growth
- Passives (respond 7-8) are satisfied customers vulnerable to competition
- Detractors (respond 0-6) are unhappy customers impeding growth
You calculate NPS by subtracting the percentage of Detractors from the percentage of Promoters. NPS can range anywhere from –100 (all Detractors) to +100 (all Promoters).
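A minimal sketch of the NPS arithmetic described above (my own function name, not from the source):

def net_promoter_score(ratings: list[int]) -> float:
    # ratings are answers on the 0-10 recommendation scale
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return (promoters - detractors) / len(ratings) * 100

print(net_promoter_score([10, 9, 8, 7, 6, 3, 10]))  # (3 - 2) / 7 * 100 = ~14.3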
https://www.satisfyd.com/blog/customer-experience-blog/understanding-net-promoter-score/
Hybridization Experiment: In one of Mendel’s experiments with plants, 1064 offspring consisted of 787 plants with long stems. According to Mendel’s theory, 3/4 of the offspring plants should have long stems. Assuming that Mendel’s proportion of 3/4 is correct, find the probability of getting 787 or fewer plants with long stems among 1064 offspring plants. Based on the result, is 787 offspring plants with long stems significantly low? What does the result imply about Mendel’s claimed proportion of 3/4?
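A quick sketch of the requested calculation (my own code, using the problem's numbers; requires scipy):

from scipy.stats import binom, norm

n, p, x = 1064, 3/4, 787           # values from the problem

# Exact binomial probability of 787 or fewer long-stemmed plants:
print(binom.cdf(x, n, p))          # ~0.23

# Normal approximation with continuity correction, as usually taught:
mu = n * p                         # 798
sigma = (n * p * (1 - p)) ** 0.5   # ~14.12
print(norm.cdf((x + 0.5 - mu) / sigma))   # ~0.23

# 0.23 is well above 0.05, so 787 is not significantly low and does not
# contradict Mendel's claimed proportion of 3/4.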
https://www.bartleby.com/questions-and-answers/hybridization-experiment-in-one-of-mendels-experiments-with-plants-1064-offspring-consisted-of-787-p/b203626b-61d7-45d8-8a3b-6381444007d8
I am looking for a particular word that describes a question that is asked in order to expose ignorance or lack of knowledge. As with a rhetorical question, the questioner knows the answer, but suspects the person being addressed doesn't.

Teachers sometimes refer to this kind of question as a trap. From The Pragmatics of Mathematics Education by Tim Rowland: "One common perception is that the questions teachers ask their pupils are not searchlights focused to reveal truth, but traps set to expose ignorance." Rowland was quoted in Teacher-student Interaction by Alandeom Wanderlei de Oliveira. Seymour B. Sarason expresses the same notion in a different way in Letters to a Serious Education President: "They are both surprised and puzzled by my question, as if I am setting a trap to expose their ignorance." If you are trying to educate, instead of expose, the answerer, I would say Socratic.

Teachers and politicians sometimes call these "gotcha questions." Here's an excerpt from a discussion of gotcha questions in a Daily Caller article: The infamous "gotcha" question is something politicians always rail against. But what exactly defines a "gotcha"? "I suppose a gotcha question is one that's fundamentally unfair because it has a hidden or misleading premise," former Clinton White House adviser and CNN contributor Paul Begala told The Daily Caller. He provided this example: "Q: Which Yankee before Jeter had 3,000 hits? A: No one!" "A gotcha question is a knowledge question in which the moderator attempts to make the person … look stupid," offered infamous Republican political consultant Roger Stone. "I think it is more like saying to Donald Trump, you know: 'How many members of the U.S. House of Representatives [are there]?'"

A pointed question: one that cannot be answered with a vague generalization, but only precisely. BTW, "asking a rhetorical question" doesn't mean that you suspect the hearers don't already know the answer. It means you are making a statement (perhaps of something that is obvious) more emphatic by expressing it as a question, for example "Do you want to be free men or slaves?"

While I am unable to offer a noun, there are a couple of adjectival descriptions which typify questions designed to achieve certain ends and could prove useful: "tactical" and "calculated". Tactical, adjective: of, relating to, or constituting actions carefully planned to gain a specific military end; (of a person or their actions) showing adroit planning, aiming at an end beyond the immediate action. Synonyms: calculated, planned (see Google). Calculate, verb: 2. intend (an action) to have a particular effect.
"his last words were calculated to wound her" Or, “the question was calculated to expose his ignorance.” synonyms: intend, mean, aim, design; Google calculate test \ˈtest\ noun -MW 2,a : (2) something (as a series of questions or exercises) for measuring the skill, knowledge, intelligence, capacities, or aptitudes of an individual or group If (underlined) If you happen to be a troll, this question was a test of our gullibility; seeking the knowledge of if we're unbeknownst to your trollishness and how far you'd get away with it. I think the best option would be a disingenuous question. Brainstorming some more ideas: Trick question. A question designed to show someone up. Insincere, testing question. A question designed to catch someone out or show their ignorance. Malicious question. Uncomfortable question. I think the closest term to what you're looking for is a trick question, defined by Wiktionary as: A complex question, whose wording hinders the ability to answer it correctly. Basically, these are questions designed to make the person answering fail. For example: - When did Elvis Presley die? - Is that a trick question? The King's not dead! I saw it wasn't listed so it took me an hour of googling to find this specific word for you. Depending on your intention of use this is a word that captures a different but similar meaning to what you said you are trying to find. A shibboleth (/ˈʃɪbəlɛθ/ or /ˈʃɪbələθ/) is a word or custom whose variations in pronunciation or style can be used to differentiate members of ingroups from those of outgroups. Within the mindset of the ingroup, a connotation or value judgment of correct/incorrect or superior/inferior can be ascribed to the two variants. 'In A cleft stick' - "In a difficult situation, unable to choose between unfavourable options; in a dilemma. " Source; Wiktionary http://en.wiktionary.org/wiki/in_a_cleft_stick 'Caught between a rock and a hard place' - "Having the choice between two unpleasant or distasteful options; in a predicament or quandary." Source; Wiktionary http://en.wiktionary.org/wiki/between_a_rock_and_a_hard_place Just as those who are sent on and attempt to accomplish “a fool’s errand” are doomed to failure and ridicule, for the errand's goal is impossible to obtain; those who are asked and attempt to answer “a fool’s question” suffer similar fates, for "there are no answers to a fool’s question." I think a "trick question" usually means what you are asking about. While it is a colloquial phrase, it usually means a question which offers a choice of answers none of which is the correct one. It forces the person answering the question to pick one of the answers thereby exposing the fact that he does not know the true answer. Perhaps probing or fathoming one's depth probing question probing adjective 1 : that investigates something in a tentative way : that tests or tries out something experimentally a probing procedure 2 : that penetrates deeply in an exploratory way to the essence of something : keen and to the point : sharply analytical : searching a probing question a probing study Merriam-Webster Unabridged Dictionary My girlfriend uses the word interrogation every time I do this, and it seems to fit: synonyms: questioning, cross-questioning, cross-examination, quizzing. Wiles - The use of clever tricks by someone in order to get what they want or make someone behave in a particular manner. Artifice - The use of clever tricks to cheat somebody. 
"Challenge" You ask someone a question that you know the answer to in order to expose them as ignorant. You are challenging their competence. This is a yiddish expression that my mother uses whenever my brother asks her a question that he knows the answer to, just to see if she knows it. She always says, "I don't need you to farher me." A farher is an oral exam. Not an answer but related.
https://english.stackexchange.com/questions/237154/a-question-asked-in-order-to-expose-ignorance
Suppose a firm has the following cost: Output (units): 10 11 12 13 14 15 16 17 18 19; Total cost: $50 $52 $56 $62 $70 $80 $92 $106 $122 $140.
a. If the prevailing market price is $14 per unit, how much should the firm produce?
b. How much profit will it earn at that output rate?
c. If it increases output by 2 units, will it make more profit or less?

a. The total cost of the firm is given, and the marginal cost can be calculated by subtracting TC1 from TC2, and so on. The total revenue can be calculated by multiplying the output in units by the prevailing market price of $14. The profit can be calculated by subtracting the total cost from the total revenue of the firm. Thus, the total cost, revenue and profit can be calculated and tabulated as follows:

Output (units): 10, 11, 12, 13, 14, 15, 16, 17, 18, 19
Total cost:     $50, $52, $56, $62, $70, $80, $92, $106, $122, $140
Marginal cost:  —, $2, $4, $6, $8, $10, $12, $14, $16, $18
Total revenue:  $140, $154, $168, $182, $196, $210, $224, $238, $252, $266
Profit:         $90, $102, $112, $120, $126, $130, $132, $132, $130, $126

The firm has its profit-maximizing point where the marginal cost is equal to the marginal revenue and price. In this case, the price is given to be $14, which is also the marginal revenue of the firm. The marginal revenue equals the marginal cost at the output level of 17 units. So, the firm should produce 17 units of output.

b. The profit-maximizing level of output of the firm is calculated to be 17 units. The total cost of the firm at the profit ma...
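The same answer can be verified with a few lines of Python over the given table (my own sketch, not part of the original solution):

price = 14
output = list(range(10, 20))
total_cost = [50, 52, 56, 62, 70, 80, 92, 106, 122, 140]

# Marginal cost of each successive unit (the 11th through the 19th):
mc = [c1 - c0 for c0, c1 in zip(total_cost, total_cost[1:])]
# -> [2, 4, 6, 8, 10, 12, 14, 16, 18]

# (a) Produce every unit whose marginal cost does not exceed the $14 price:
q_star = max(q for q, cost in zip(output[1:], mc) if cost <= price)
print(q_star)                                     # 17 units

# (b) Profit at 17 units: 17 * $14 - $106 = $132
print(price * q_star - total_cost[output.index(q_star)])

# (c) At 19 units profit falls: 19 * $14 - $140 = $126, i.e. less profit
print(price * 19 - total_cost[output.index(19)])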
https://www.bartleby.com/questions-and-answers/suppose-a-firm-has-the-following-cost-output-units-10-11-12-13-14-15-16-17-18-19-total-cost-dollar50/027e526f-12e6-44ed-9001-6d1eb7d1f11c
Question.1 What is proposed from statistics for aggregates?
Answer: You can ask the system to propose the optimal characteristics for aggregation by using the BW statistics data (like query run time, etc.).
Question.2 Can you define aggregates on time-dependent navigational attributes?
Question.3 What are the proposal options available?
Answer: Proposal from BW statistics cube – history of BW statistics. Proposal from BW statistics (tables) – based on database tables RSDDSTAT and RSDDSTATAGGRDEF. Proposal from last navigation – suggestion based on the last entry of the above-mentioned tables for the current user.
Question.4 What is an aggregate hierarchy?
Answer: Aggregates that are built on other aggregates; the hierarchy is determined automatically. It is recommended that you create a few large base aggregates; smaller aggregates can then be built from them in the hierarchy.
Question.5 How does the change run affect aggregates?
Answer: Newly loaded master data is not active until the change run has been applied to the hierarchy.
Question.6 In the real world, when do you recommend an aggregate?
Answer: Look at the BW statistics: if a query spends more than 50% of its time in database access, and the ratio of records read from the database to records processed is large, then you create aggregates.
Question.7 What is a free characteristic?
Answer: The characteristics in this area are not displayed in the initial view of the query, but you can drill down and filter on them once you execute the query.
Question.8 What is the filter area?
Answer: The characteristics in this area are restricted and cannot be filtered or drilled down further.
Question.9 What is the 1ROWCOUNT key figure?
Answer: For ODS objects and InfoSets, the query designer automatically adds this key figure, "number of records", to count the records.
Question.10 What is a calculated key figure?
Answer: It is used to do complicated calculations on key figures, such as mathematical functions, percentage functions, etc. For example, you can have a calculated key figure to find the sales tax from a sale price.
Question.11 What is percentage variance (%)?
Answer: This is defined as parameter1 % parameter2; for example, by what percentage actual expenses exceed budgeted expenses.
Question.12 What is percentage share (%A)?
Answer: It is defined as parameter1 %A parameter2; it gives the percentage share of parameter1 in parameter2.
Question.13 What is percentage share of result (%CT)?
Answer: Defined as %CT parameter1; gives the result in percentage with respect to the result row.
Question.14 What is percentage share of overall result (%GT)?
Answer: Defined as %GT parameter1; similar to the previous one, but the percentage is the share of the overall result.
Question.15 What is the COUNT function?
Answer: COUNT(parameter) returns the value 1 if the parameter is not zero, else zero.
Question.16 What is the NDIV0 function?
Answer: NDIV0(parameter) returns 0 if the parameter results in a division by 0.
Question.17 What is a structure?
Answer: A combination of characteristics and key figures; for example, you can create a structure containing sales for products A, B and C as three different restricted key figures contained in one structure. If the structure is global, you can reuse it in other queries.
Question.18 What is a reusable structure?
Answer: These are query-level structures which can be used in any queries.
Question.19 How do you create a reusable structure from a local structure?
Answer: Right-click on the local structure, select "Save as", and enter a technical name and description.
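For readers unfamiliar with the BEx percentage operators in questions 11-13, here is a rough plain-Python illustration of their arithmetic (my own rendering; actual BEx syntax and edge-case behaviour differ):

def pct_variance(p1, p2):
    # parameter1 % parameter2: by what percent p1 exceeds p2
    return (p1 - p2) / p2 * 100

def pct_share(p1, p2):
    # parameter1 %A parameter2: p1 as a share of p2
    return p1 / p2 * 100

def pct_share_of_result(values):
    # %CT-style: each value's share of the result (total) row
    total = sum(values)
    return [v / total * 100 for v in values]

print(pct_variance(120, 100))             # 20.0: actual exceeds budget by 20%
print(pct_share(30, 120))                 # 25.0
print(pct_share_of_result([10, 30, 60]))  # [10.0, 30.0, 60.0]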
https://learn.bigclasses.com/sap-bw-interview-questions-and-answers-for-freshers
Assignment 4 (Short-Answer and Algebraic Questions): (The numbers in square brackets give the breakdown of the points for various parts of each question. To receive full credit, please explain your answers. Total of 60 points plus 10 points extra credit.)
- This question is based on the article, “Signs of a slowdown,” published by The Economist on June 6, 2015. The article discusses the trends in the value of the yen and their consequences between 2012 and 2015. The average annual rate of inflation during 2013 and 2014 was 1.55 percent in both the US and Japan. Assume that the yen is the home currency, so that the exchange rate is expressed in terms of US dollars per yen and the appreciation/depreciation is calculated as the percentage change of that exchange rate.
- This question is based on a speech, U.S. Economic Outlook and Monetary Policy, by the Fed Vice Chair Richard H. Clarida on May 21, 2020, at the New York Association for Business Economics. In that speech, Richard Clarida first lays out the economic outlook at the time and then discusses the Fed’s policy responses to the situation.
- China has a trade surplus, and the People’s Bank of China (PBC, China’s central bank) purchases all the excess foreign currency earnings of the country’s exporters. This policy is equivalent to bond purchases by the PBC through open market operations.
- The article and its chart show that the dollar-per-yen exchange rate at the end of 2012 was e0 = 1/87 $/¥ and by the end of 2014 had reached e1 = 1/120 $/¥. By what percentage did the yen depreciate vis-à-vis the dollar in nominal terms over those two years? By what percentage did the yen depreciate vis-à-vis the dollar in real terms over those two years?
- According to the article, what was the factor driving the depreciation of the yen against the dollar?
- The article points out that the main emerging markets except China and Hong Kong saw weaker exports in early 2015 compared to the situation during the same period in 2014. What were the causes of that sluggishness highlighted by the article?
- According to the article, in the situation prevailing in 2014 and early 2015, central banks were happy to see their currencies weaken. Why did this lead to exporting deflation to the rest of the world?
- Extra Points Question: The article states that “QE means that central banks are absorbing an awful lot of new government debt.” How does this help keep sovereign-bond yields low? What are the potential problems that this policy may cause for the world economy?
- In his speech, Clarida points out that while the economic situation seemed dire at the time, financial conditions had improved considerably after mid-March. What was his explanation for this contrast between economic and financial circumstances?
- In his speech, Clarida mentioned a number of policy measures that the Fed had taken to reduce the economic and financial impact of the COVID-19 pandemic. Which measures that he mentioned were conventional and which ones were unconventional?
- What is the impact of the PBC’s policy of foreign currency purchase on the country’s money supply?
- If all foreign conditions are exogenous and the aggregate real income, the price level, and the future conditions of the Chinese economy (including the expected exchange rate) can be taken as given, what is the consequence of the policy for the current interest rate and the spot exchange rate in China?
Note: The inverted scale used in the charts of the article is a fractional scale where the vertical axis label corresponds to the denominator of the fraction. So, a label of 120 corresponds to 1/120 dollars per yen.
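As a sketch of the arithmetic the exchange-rate bullet of question 1 calls for (my own code; the dollars-per-yen convention follows the assignment's instructions, and the answer itself is not given in the source):

# End-of-year rates, expressed in dollars per yen
e0 = 1 / 87     # end of 2012
e1 = 1 / 120    # end of 2014

nominal = (e1 - e0) / e0 * 100
print(round(nominal, 1))   # -27.5: the yen depreciated 27.5% against the dollar

# With the same 1.55% average inflation in the US and Japan over 2013-2014,
# the price-level ratio is unchanged, so real depreciation equals nominal:
pi_us = pi_jp = 0.0155
real = (e1 * (1 + pi_jp) ** 2 / (1 + pi_us) ** 2 - e0) / e0 * 100
print(round(real, 1))      # also -27.5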
https://elitewriters.net/what-is-the-impact-of-the-pbcs-policy-of-foreign-currency-purchase-on-the-countrys-money-supply/
Monetary policy is a term used to refer to the actions of central banks to achieve macroeconomic policy objectives such as price stability, full employment, and stable economic growth. The Federal Reserve's control over the federal funds rate gives it the ability to influence the general level of short-term market interest rates. The Fed has three main tools at its disposal to influence monetary policy: open-market operations, the discount rate, and reserve requirements. Monetary policy is the set of actions of a central bank, currency board or other regulatory committee that determine the size and rate of growth of the money supply, which in turn affects interest rates.

Be explicit and explain to the CFO how financial markets differ from markets for physical assets and why that difference matters to Jagdambay Exports. 2. Explain the relevance of money markets and capital markets for Jagdambay Exports. 3. Analyze Jagdambay Exports and advise how the CFO should consider the primary market and secondary market in the expected transaction.

Reagan was quoted saying, “Government is not the solution to our problems; government is the problem.” This statement opened up what was known then as Reaganomics. Reagan supported supply-side economics, the theory that lower taxes will boost the economy as businesses and individuals invest their money. Reaganomics was also called “trickle-down economics,” meaning the wealth of the upper class would “trickle down” to the middle and lower classes.

Reaganomics was one of the most serious attempts to change U.S. economic policy. It was based on the supply-side theories of an economist known as Arthur Laffer. There were three steps to Reaganomics: step one was to reduce the growth of government spending; step two was to reduce the marginal tax rates on income from both labor and capital; step three was to reduce regulation and to reduce inflation by controlling the growth of the money supply. It was a tax cut, a reduction in domestic spending, and a balanced budget (Schaller 33). This was called supply-side economics, or Reaganomics. He believed this would stimulate productive activity.

Some of the things that he set out to do from the beginning of his term turned out to be successes, but some of them were also failures. Some successes of Ronald Reagan’s presidency include his objective to restore the U.S. armed forces. Reagan had great success in his foreign and defense policies. He was also able to put an end to inflation with supply-side economics. And he was able to lower income taxes for all, especially for those in the higher bracket.

To fulfill the role of the economic leader, the president plans the nation's budget, makes tax proposals, and determines how to handle an economic crisis. An extraordinary example of an economic leader is President Ronald Reagan. Reagan shaped the fundamentals of America's economy with tax cuts, introducing Reaganomics, increasing military funds, reducing the social program budget, and recovering the economy from the stock market crash. Reaganomics, the economic policies introduced by President Ronald Reagan, focused money towards America's military. In healing the stock market, economic leader Ronald Reagan displayed how the economic leader protects the common good.

His stance is in opposition to the position of Richard Posner. And as we know, Richard Posner presents his overall disposition more in the stance of economic liberalism.
He has been very clear about his belief that the best economic decision is one in which the total earning capacity of the economy is maximized, even when that earning capacity is mainly held by a single individual. Posner would have strongly argued against the ruling, claiming that an increase in overall profits due to the proposed structural changes of Penn Station would provide a longer-term and greater total benefit to the economy (Leiter 1). Expanding on the benefit to the economy, he suggests that the increase in total earning capacity of the individual owner of Penn Station is a better economic investment than the retention of less profitable, albeit more historical, landmarks in the community (Leff, 1).

Fiscal policy is primarily an instrument in the hands of the government whereby it estimates its revenues and expenditures in the economy. This is a very important tool, as it defines the flow of money from different sources, indicating the level of activity in the economy. It also defines the broad policies of the government, indicating the outward flow of money into different sectors of the economy to maintain the overall health of the economy and fulfill the government's social goals. Apart from fiscal policy, every country has monetary policy at its disposal. This is primarily a tool of the central bank of a country, which uses different instruments to manage the macroeconomic variables of the country and to keep the economy stable, or to stabilize it in situations of fluctuation.

1.0 Introduction: An economic system is the basic arrangement made by a society to solve its economic problems. Basically, there are three types of economic systems: the command economic system, the market economic system and the mixed economic system. Each economic system comes with its own strengths and weaknesses (Sloman and Garratt, 2009). According to Investopedia (2010), a command economic system is one where the country’s government plans and controls all aspects of economic production. The government determines what goods to produce, how to produce them, at what price, and also how to distribute goods and services within the economic system.
https://www.ipl.org/essay/Economic-Policy-President-Reagans-Administration-F33EDDM428VV
This column by ACRU General Counsel and Senior Fellow for the Carleson Center for Public Policy (CCPP) Peter Ferrara was published June 27, 2012 on The American Spectator website. President Obama told the nation in his June 14 economic policy address in Cleveland that his economic policy plans for a second term would “create strong sustained growth;…pay down our long term debt; and most of all…generate good, middle-class jobs….” He then spent almost an hour describing policies that would do just the opposite. He did not begin the speech with much credibility on how to achieve those goals. He has been President for almost four years, and has done nothing to generate strong sustained growth, pay down our long term debt, and most of all generate good middle class jobs. And what he proposed in the speech, as all sides bemoaned, was just more of the same. But apparently he thinks we are too stupid to recognize that these are the same left-wing extremist policies that have failed us throughout his presidency, and, indeed, throughout world history. Certainly that seems to be true of his continued supporters. No. 1: Raise Tax Rates Under President Obama’s plan, on January 1 the top tax rates of virtually every major federal tax will increase sharply, as he has already enacted under current law. That is because the tax increases of Obamacare would go into effect, and the Bush tax cuts would expire, which Obama refuses to renew for singles making over $200,000 a year, and couples making over $250,000. The English translation of that target for the tax increases is the nation’s small businesses, job creators and investors. As a result, with the Bush tax cuts just expiring for these targeted taxpayers, the top 2 income tax rates would jump by nearly 20%, the capital gains tax rate would soar by nearly 60%, the tax rate on dividends would nearly triple, the Medicare payroll tax rate would skyrocket by 62% for the above disfavored taxpayers, and the top death tax rate would rise from the grave to 55%. That is all on top of the highest corporate tax rate in the industrialized world at nearly 40%, counting the federal corporate rate of 35% and state corporate rates on average. But under Obama, there is no relief in sight. Instead, Obama is pushing still more tax increases. Under his proposed Buffett rule, the capital gains tax rate would increase by 100%, and would be the fourth highest rate in the industrialized world. Many OECD countries, in fact, impose no capital gains tax at all because it is just another layer of taxation on capital income on top of the corporate and individual income taxes. All of this would leave American businesses uncompetitive in the global economy. How is this going to produce strong sustained growth and generate good middle class jobs? It is going to do just the opposite, as the multiple tax rate increases would only sharply reduce the incentive for productive activities, such as savings, investment, business expansion, business start-ups, and job creation. That will only encourage even more capital flight from America, and a continued capital strike by the capital that remains. But in his Cleveland speech, Obama argued that it was the Bush tax rate cuts that caused the recession somehow. He said, “We were told that huge tax cuts — especially for the wealthiest Americans — would lead to faster job growth….So how did this economic theory work out?” So let’s review how it did work out.
Bush cut the top income tax rate by 11.6%, from 39.6% to 35%, and the second highest rate by about 8%, from 36% to 33%. But he cut the lower rates by higher percentages, including slashing the bottom rate by 33%, from 15% to 10%. Then in 2003, he cut the tax rates on capital, reducing the capital gains tax rate by 25% from 20% to 15%, and the tax rate on corporate dividends to 15% as well. These tax rate cuts first quickly ended the 2001 recession, despite the contractionary economic impacts of 9/11, and the economy continued to grow for another 73 months. After the rate cuts were all fully implemented in 2003, the economy created 7.8 million new jobs over the next 4 years and the unemployment rate fell from over 6% to 4.4%. Real economic growth over the next 3 years doubled from the average for the prior 3 years, to 3.5%. In response to the rate cuts, business investment spending, which had declined for 9 straight quarters, reversed and increased 6.7% per quarter. That is where the jobs came from. Manufacturing output soared to its highest level in 20 years. The stock market revived, creating almost $7 trillion in new shareholder wealth. From 2003 to 2007, the S&P 500 almost doubled. Capital gains tax revenues had doubled by 2005, despite the 25% rate cut! There is no economic theory under which the tax rate cuts could cause recession. Even Keynesian economics considers tax rate cuts pro-growth. America cannot afford a President who is this confused and deluded. But in his Cleveland speech, Obama even opposed tax reform lowering rates in return for closing loopholes. He said it would be a tax increase on the middle class. But no serious tax reform proposal has ever involved a net tax increase on the middle class, because it would be dead before it even got out of the box. Indeed, Obama is so ideologically opposed to lower rates that, perversely, what he has done throughout his presidency is the opposite of tax reform. He has expanded the loopholes and increased rates. Those loopholes have included new and expanded welfare tax credits and corporate welfare like his green energy handouts. When his own Simpson-Bowles Commission recommended real tax reform closing loopholes in return for reducing rates, Obama only paid lip service to it, but didn’t lift a finger to advance the proposals. But higher tax rates with more loopholes reduce economic growth, jobs, and prosperity. The higher rates discourage critical job creating, pro-growth investment, and the loopholes distort markets and promote inefficiency and waste, which is a further drag on growth. Tax reform with lower rates and fewer loopholes, by sharp contrast, promotes powerful pro-growth incentives while reducing the inefficient drag of market distorting loopholes. That is why the bipartisan tax reform of 1986 under President Reagan, when America was under adult supervision, was so powerful in fueling the generation-long, 25-year Reagan boom from 1982 to 2007. Obama said in Cleveland, “Over the last three years, I’ve cut taxes for the typical working family by $3,600. I’ve cut taxes for small businesses 18 times.” But Obama’s “tax cuts” have almost all involved tax credits and other loopholes, not reductions in rates, which he is increasing at a historic pace. It is reductions in rates that promote economic growth and prosperity, because the rate determines how much the producer can keep out of what he produces.
Tax credits are really no different than welfare checks, particularly the refundable tax credits Obama has favored, which pay the beneficiary the full amount of the credit regardless of tax liability. But welfare does not promote economic growth and prosperity, Nancy Pelosi to the contrary notwithstanding. No. 2: Increase Regulatory Burdens A second component of Obama’s plan is a blizzard of increased regulatory costs and barriers. The chief rainmaker here is the EPA, which is effectively imposing through regulation the cap and trade legislation that even an overwhelmingly Democrat Congress refused to pass. That is just brewing up, but will effectively be another tax increase of trillions on the economy through higher electricity, gasoline, and other energy costs. Further EPA regulatory storms are forcing the shutdown of coal fired power plants all across the country, and preventing the construction of new ones, exactly the opposite of China. Interior and other regulatory authorities have set over 90% of available federal onshore and offshore jurisdictions off limits for oil and gas exploration and production. Obama’s regulatory minions have also refused to allow construction of the Keystone XL pipeline to bring Canadian oil to Gulf refineries. Another storm front is building through hundreds of new regulations in process under the Dodd-Frank legislation. Those added costs and barriers threaten the availability of business and consumer credit essential for economic recovery and new jobs. Further storm clouds arise from the Obamacare takeover of the entire health care sector, just starting to increase the costs of health insurance and care. The Obamacare employer mandate is already killing jobs before it even becomes effective, as potential employers know they will be required to buy the most expensive health coverage for each of their employees. These added regulatory costs are all effective additional tax increases on the economy. How are these soaring regulatory burdens going to produce strong sustained growth and generate good middle class jobs? Again, they are going to do just the opposite. But in his Cleveland speech, Obama just derided the idea of regulatory relief as a Romney GOP policy to “eliminate most regulations.” He characterized such relief as the “promise to roll back regulations on banks and polluters, on insurance companies and oil companies” and decried that “They’ll roll back regulations designed to protect consumers and workers.” Indeed, Obama argued once again that it was regulatory relief that caused the recession: “During the last decade, there was a specific theory in Washington about how to meet this challenge…. We were told that fewer regulations — especially for big financial institutions and corporations — would bring about widespread prosperity. [But] how did this economic theory work out?” So don’t expect any regulatory relief in any second Obama term. Instead expect even more draconian regulatory burdens, as the EPA’s implementation of effective cap and trade really ramps up, Obamacare is fully implemented (raising health costs still further, still another effective tax increase), and Obama’s promise to make “climate change” a top priority in a second term (what happened to the laserlike focus on jobs?) promises to skyrocket energy costs further (still another effective tax increase, if I need to repeat myself).
No. 3: Increase Government Spending In his Cleveland speech, President Obama continued to propound his fundamental economic theory that what drives economic recovery, jobs, and growth is increased government spending. That is why his 2013 budget proposes the highest government spending in world history, following an $800 billion, 27% increase in federal spending from 2008 to 2012, with a proposed 53% increase in annual federal spending from $3.8 trillion today to a record shattering $5.8 trillion by 2022. This President Obama budget proposes a very grand total of $47 trillion in spending over the next 10 years, another all-time world record. Is draining all of that money out of the private sector really going to create strong sustained growth, pay down our long term debt, and generate good, middle-class jobs? Or is it going to bring the chaos of Greece and Western Europe to America? Just as Obama avoids any real tax reform, his budget also fails to propose any significant entitlement reform. As a result, CBO projects that under current policies, with that Obama budget, federal spending soars to 30% of GDP by 2027, 40% by 2040, 50% by 2060, and 80% by 2080. That compares to the long term, postwar, stable, historical average of 20% of GDP that prevailed for 60 years from 1948 to 2008, under which America prospered as the strongest economy in world history. Obama’s Huge Government spending breakout from that stable, long term level is just the perfect Grecian formula for America, as it would undoubtedly create the same spending, deficit and debt crisis here that we see in Greece and Western Europe more generally. Indeed, in his Cleveland speech, Obama criticizes spending cuts, deriding Ryan’s proposed budget to restore federal spending to its long term, historical, postwar average of 20% of GDP as a plan “to strip down government to national security and a few other basic functions.” The Causes of the Financial Crisis and Obama’s Perpetual Recession Rather than tax cuts and deregulation causing the financial crisis, it was more nearly the opposite. It was Bill Clinton’s overregulation that forced financial institutions to abandon traditional mortgage lending standards, in the name of homeownership for minorities and the poor. Once those standards were demolished for lower incomes, they could not be maintained for higher income speculators. The government-sponsored enterprises Fannie Mae and Freddie Mac, with effective government guarantees, were able to pump trillions into the subprime housing bubble, and spread trillions in toxic mortgage securities based on non-traditional subprime mortgages throughout the global financial community. President Bush then supported a cheap dollar monetary policy following Keynesian doctrine that a cheap dollar boosts the economy by promoting exports. That just pumped up the housing bubble even further, and held back the economy as compared to the earlier Reagan boom built on anti-inflation, strong dollar policies that promote job-creating, wage-increasing investment. But in his Cleveland speech, Obama used the financial crisis as an excuse for his own failure to achieve a traditional recovery from the recession. He said, “Throughout history, it has typically taken countries up to 10 years to recover from financial crises of this magnitude.” Obama is telling us the standard of recovery he wants to be judged by, 10 years to get back on our feet, like during the Great Depression. But that is not based on the history of American recessions and recoveries.
That history is fully recounted at the website of the National Bureau of Economic Research (NBER). That history shows that since the Great Depression, and before this last recession, recessions in America have lasted an average of 10 months, with the longest previously being 16 months. But here we are 54 months after the last recession started in December 2007, and there has been no real recovery. Moreover, the American historical record is the deeper the recession the stronger the recovery. Based on that historical precedent, we should be in the third year of a booming economic recovery by now. Instead what Obama has produced is the worst economic recovery since the Great Depression, as I have recounted previously. Obama always wants to measure his performance from the trough, or worst point of the recession. But every recovery is always better than the worst point of the recession. Obama’s recovery is to be measured as compared to previous recoveries from prior recessions in the American economy. By that standard, Obama’s recovery has been pitiful, again the worst economic recovery since the Great Depression, especially as compared to the all-time record Reagan recovery. Indeed, if Obama’s perverse policies are not reversed, his soaring tax rate increases next year on top of his skyrocketing regulatory burdens and runaway federal spending, deficits, and debt will just throw America back into recession, before there was even any real recovery from the last recession. Then unemployment will soar back into double digits, the deficit will soar to new records over $2 trillion, and President Obama will have added more to the national debt than all prior U.S. Presidents combined, from George Washington to George Bush. The entire period will then look just like a historical reenactment of the 1930s. It should be no surprise that Obama, in modeling his Administration after FDR, is getting the same results as FDR. That is not fighting for the middle class, that is trashing the middle class.
https://theacru.org/2012/06/27/obamas_perverse_plan_for_permanent_recession/
This column by ACRU General Counsel and Senior Fellow for the Carleson Center for Welfare Reform (CCWR) Peter Ferrara was published June 16, 2013 on Forbes.com. Let’s proclaim the Good News: Government money is free. No, not just to the beneficiaries of government programs. To society as a whole. Meaning there is no economic cost to government spending whatsoever. The more the government spends, the richer we will all be. Let the Good Times roll. That is the foundational principle of Keynesian economics, which is heart and soul “Progressivism.” Every Paul Krugman column can just be replaced by the summary paragraph above. Money doesn’t grow on trees, you say? That outdated notion is where you went wrong. Today’s paper money IS trees. The liberal left wing that dominates the Democrat Party today so can’t stand cutting just 1% to 2% of federal spending, just from the growth in spending, as provided in the sequester, that they are saying that the sequester spending cuts involve “austerity,” which, they claim, is slowing economic growth, jobs, and the non-existent recovery. This is based on the failed, discredited, outdated economic doctrine of Keynesian economics, which holds that economic recovery and growth is restored by increasing government spending, deficits and debt. This supposedly increases “aggregate demand,” which supposedly stimulates increased production, and hiring, to meet that demand. You can see exactly this being said in the public debate today. President Obama’s senior economic advisor during his first term, Harvard University Professor of Economics Larry Summers, testified before the Senate Budget Committee on June 5 that during times of sluggish economic growth at least, “Borrowing to support spending, either by the government or the private sector, raises demand and therefore increases output and employment above the level they otherwise would have reached. Unlike in normal times, these gains will not be offset by reduced private spending because there is substantial excess capacity in the economy….” Summers added, “Consider the effect of the sequester in 2013. The sequester will impact the last 10 months of calendar year 2013. The CBO estimates that the sequester will, over this 10 month interval, reduce spending by $64 billion [out of total federal spending during this year of close to $3.7 trillion, or nearly 60 times as much]….However, we must also consider the sequester’s effect on GDP growth. The CBO estimates that the sequester will reduce the GDP growth rate in 2013 by 0.6 percentage points. This stifling of growth actually increases the debt/GDP ratio through two effects: First, by reducing the GDP growth rate, the sequester reduces the denominator of the debt/GDP ratio. Second, lower GDP during 2013 means lower tax revenue, which increases the deficit.” So the conclusion that Summers draws is, “[I]t would not be desirable to undertake further measures to rapidly reduce deficits in the short run. Excessively rapid fiscal consolidation in an economy that is still constrained by lack of demand, and where space for monetary policy action is limited, risks slowing economic expansion at best and halting recovery at worst. 
Indeed, there is no compelling macroeconomic case for the deficit reduction now being achieved through sequestration, as the adverse impacts of spending cuts on GDP more or less offset their direct impacts in reducing debt." Senate Budget Committee Chairwoman Patty Murray reflected this thinking in hearings the previous day, saying, "When the economy is struggling, government should act to make things better for the middle class and most vulnerable families by investing in jobs and economic growth [meaning increasing federal spending, deficits and debt] that not only boosts demand in the short term, but also helps lay down a strong foundation for long-term and broad-based growth for years to come….Right now, we're seeing how the indiscriminate and irresponsible cuts from sequestration are hurting the economy….And continuing on the path to austerity right now [meaning making minimal cuts just in the growth of federal spending] would weaken our economy and do serious damage to job creation and growth….In fact, the bipartisan Simpson-Bowles commission highlighted the importance of 'investing in [meaning increasing federal spending, deficits and debt for] education, infrastructure, and high-value research and development to help our economy grow, keep us globally competitive, and make it easier for businesses to create jobs.'" But all of this is transparently fallacious, and the marked decline in the ability of our nation's supposed "elites" to reason, or even to communicate clearly, is contributing to the rapid decline of America. The 2009 stimulus in particular included heavy increases in spending for infrastructure, to no good effect for jobs, wages, middle class incomes, or economic growth. The truth is that Keynesian economics is just plain silly, no matter how many academic eminences embrace it, because increasing government spending, deficits and debt does not increase "aggregate demand." That is because the money to finance that government spending, deficits and debt must come from somewhere. So if you increase federal spending, deficits and debt by close to a trillion dollars, as the Obama/Democrat so-called 2009 "stimulus" did, and finance that spending by borrowing an extra trillion dollars out of the private sector, the net increase in aggregate demand is no more than ZERO at best. [This is why the Wall Street Journal's Steve Moore has rightly called Keynesian economics "magic fairy dust economics."] In fact, the whole transaction is probably a net drag on the economy, growth, jobs and wages. (This reasoning still applies even when there is supposedly "substantial excess capacity in the economy.") That, in fact, is exactly what happened with the 2009 "stimulus," because as several of my recent columns have documented, the supposed recovery under President Obama from the 2008-2009 recession was the WORST recovery from any recession under any President since the Great Depression.
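Summers' debt-to-GDP argument, quoted above, is easier to evaluate when the arithmetic is laid out. Here is a minimal back-of-the-envelope sketch in Python: the $64 billion cut and the 0.6-point growth drag are the CBO figures cited in the testimony, while every other input (baseline GDP, debt, growth, deficit, and the revenue share) is an illustrative round-number assumption, not official data.

```python
# A back-of-the-envelope sketch of the debt-to-GDP arithmetic in Summers'
# testimony. The $64 billion cut and the 0.6-point growth drag are the CBO
# figures cited above; every other input is an illustrative assumption.

gdp = 16_500         # baseline 2013 GDP, $ billions (assumed)
debt = 12_000        # debt held by the public, $ billions (assumed)
growth = 0.025       # baseline growth rate without the sequester (assumed)
deficit = 700        # baseline deficit, $ billions (assumed)
cut = 64             # sequester spending reduction, $ billions (CBO, as quoted)
growth_drag = 0.006  # CBO's 0.6 percentage-point hit to 2013 growth
rev_share = 0.18     # federal revenue as a share of GDP (assumed)

# Without the sequester: GDP grows at the baseline rate, debt rises by the deficit.
gdp_no_seq = gdp * (1 + growth)
debt_no_seq = debt + deficit

# With the sequester: spending falls by $64 billion, but growth is 0.6 points
# lower, so revenue falls by rev_share times the lost output.
gdp_seq = gdp * (1 + growth - growth_drag)
lost_revenue = rev_share * (gdp_no_seq - gdp_seq)
debt_seq = debt + deficit - cut + lost_revenue

print(f"debt/GDP without sequester: {debt_no_seq / gdp_no_seq:.4f}")
print(f"debt/GDP with sequester:    {debt_seq / gdp_seq:.4f}")
```

Under these assumed inputs the ratio comes out marginally higher with the sequester than without it, which is the offset Summers describes; with different multiplier or revenue assumptions the comparison can easily flip, and that is precisely what the two sides of this debate dispute.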
It took far too long for the economy to grow back to the previous peak of GDP. We still have not recovered all the jobs lost during the recession, which is taking far longer than in any previous recovery since the Great Depression. Poverty has soared during Obama's entire Presidency, even though the recession actually ended four years ago, and middle class incomes have declined all that time as well. Economic growth during Obama's first term was the worst of any President since the Great Depression, worse even than during the one term of Jimmy Carter, or during the awful second term of George W. Bush. This pitiful, disastrous, inexcusable economic record is particularly egregious because the American historical record is that the worse the recession, the stronger the recovery, as the American economy always rebounded to its previous, long term, world leading economic growth trendline. As a matter of math, that means faster than normal growth until we return to the normal, stable, long term growth trendline. Professor Summers did much better during the 1990s serving under President Clinton, when Newt Gingrich was running the economy and his predecessor at Treasury, Robert Rubin, was protecting the dollar. Given his less well supervised, more recent performance under President Obama, Professor Summers should be retired, instead of pretending to tutor the Congress on economics. But there is an even more fundamental problem with the logic of Keynesian economics, besides the magic fairy dust problem. Demand can never be inadequate in a free market economy. Demand is insatiable, in fact. [Verify that with a talk with your teenage daughter, or son]. If the demand for any particular good or service is inadequate for the supply produced, the price will just fall for that good or service, until demand equals supply. Consumers can never consume more than is produced, no matter what policies the government follows to increase "aggregate demand," and they will never consume less, as the market price system will see to that. This is because what really drives economic growth, recovery, and prosperity is not "aggregate demand" (a fallacious, fabricated concept, in fact), but economic production and output. Rich nations, like rich people, are not rich because of their demand for goods and services, but because of their production. Those who produce more, consume more. Demand is not the problem. Production is. And what drives increased production, economic growth, and prosperity is the economic incentives for such production, growth and prosperity. The more fundamental reason why President Obama's "recovery" and economic record are so bad is that he has consistently followed anti-growth policies across the board, squelching incentives for increased production. On taxes, President Obama has now raised top marginal tax rates across the board for virtually every major federal tax. That includes personal income taxes, capital gains taxes, the tax on corporate dividends, and death taxes. The corporate income tax rate under President Obama is now the highest in the world, and the capital gains tax rate, counting state capital gains taxes, is close to the highest in the world. All these tax rate increases slash the incentives for productive activity, and are especially predatory toward capital investment, which is the foundation for increased jobs, wages and incomes in a capitalist economy. Which is why President Obama isn't getting any.
On regulation, President Obama has been busily increasing regulatory burdens and barriers across the board, which further reduces the incentives and even the opportunities for increased production. See, e.g., the Keystone pipeline, the out-of-control EPA picking and choosing which whole industries to assassinate, Obamacare, Dodd-Frank, etc., etc. On money, the Obama Administration has provided the political cover for, supported, and cheered on the wildest Fed monetary policy in U.S. history, debasing the currency and sharply undercutting confidence in the dollar, which has survived this onslaught only because it is competing with other fiat, paper money currencies that are even less stable. Bad currency murders economic growth, jobs, wages and prosperity, because investors do not want to invest where they are likely to be paid back in a debased currency worth less, and arbitrary monetary policies produce bubble and bust cycles that can bankrupt almost any enterprise. Today's Fed monetary abuses are fostering new bubbles, and laying the groundwork for an even worse crisis to come. And on spending, President Obama has pursued the wildest spree of federal spending, deficits and debt since World War II, when at least we were fighting, and defeating, Nazi Germany and Imperial Japan, which was ultimately very pro-growth. (Investors like to invest where their investments are less likely to be blown up, and they are less likely to be murdered.) Before President Obama, no American deficit had ever been anywhere near $1 trillion. Under President Obama, no deficit has ever been less than $1 trillion. And the national debt as a percent of GDP is roaring towards all-time, World War II records. Contrary to Keynesian economics, that runaway federal spending, deficits and debt just drains resources out of the more productive private sector, only contributing to further decline and stagnation. There is within this current economy the greatest boom in world history just straining its bonds to break out and restore the American Dream: traditional, American, world-leading economic growth and prosperity. All we have to do is set it free by reversing every one of President Obama's policies. That means individual and corporate tax reform, cutting rates, and closing loopholes, especially special-interest, crony-capitalist loopholes such as President Obama's green energy tax giveaways. Better if such reform is not revenue neutral, but a net tax cut. Such policies would be served particularly well by the proposed tax reforms of House Budget Committee Chairman Paul Ryan, hopefully to be implemented later this year by House Ways and Means Chairman Dave Camp. For individual income taxes, Ryan has proposed a 10% rate for families earning less than $100,000 a year, and a 25% rate for families earning over $100,000. For corporate taxes, Ryan has proposed reducing the world-leading 35% federal tax rate to 25%, which is close to the global average. Ryan also proposes to repeal and replace Obamacare, which would end the Obamacare tax increases. These lower rates would sharply increase incentives for productive activities, particularly the capital investment that is the foundation for growing jobs, wages and incomes. And eliminating the special interest tax loopholes and preferences would terminate the resulting misallocation of resources to less productive activities and enterprises. It means deregulation, instead of Obama's costly reregulation.
That would be served by repealing and replacing Obamacare, and Dodd-Frank and the EPA's delusional global warming crusade should join it on the ash heap of history as well. Does that mean returning to the same policies that caused the 2008 financial crisis? No: that crisis was caused by overregulation, starting with President Clinton's housing policies, adopted to create housing "fairness" (what was fair about the housing bubble and its inevitable collapse?), and by the same Federal Reserve monetary policy mismanagement that the Obama Administration has been cheerleading. Probably most important of all is fundamental reform of the Fed, and of monetary policy, to enforce a stable dollar. That would involve tying the dollar to stable, real world measures. Such a stable currency would draw capital investment to America from the world over, as investors would know their investment returns would arrive in a currency with the same real world value as when they first made their investment, and there would be no bubble and bust cycles to threaten the very survival of their investments. And the final component would be to restore control over federal spending, deficits and debt. The best plan for that would again be Paul Ryan's 2013 budget, which would actually balance the budget within 10 years and stabilize the national debt as a percent of GDP at manageable levels. Long forgotten have been Reagan's 1981 budget cuts, much vilified at the time, which, contrary to brain-dead Keynesian economics, left more resources in the productive private sector, and so contributed to Reagan's 25-year, generation-long economic boom. That boom restored traditional American economic growth and prosperity after the disastrous 1970s, while slaying the double-digit 1970s inflation, which all the Keynesian gurus of the time had said was impossible in the real world. Impossible for them. That leaves as the last question only what would be the appropriate punishment for those blind, persistent fools who continue to teach and promote Keynesian economics. I will leave that question for suggestions from the commenters. In conclusion, and in honor of Keynesian economics, and President Obama's economic policies and their results, we can now amend Orwell as follows:
https://theacru.org/2013/06/17/the_magic_fairy_dust_naivete_that_is_progressive_economics/
Although some characterize the trickle-down theory as an experiment originating in the 1980s under the Ronald Reagan presidency, the United States had actually used it before. The Harding, Coolidge and Kennedy administrations implemented supply-side tax policies before Reagan did. The first instance of supply-side economics being implemented came even before the trickle-down idea was fully articulated. After World War I, top income tax rates had risen from a modest 7 percent to 77 percent to help pay for the war. This high rate would fall into the prohibitive range of the Laffer Curve, according to the theory. The Harding and Coolidge administrations passed a series of tax cuts to reduce wealthy citizens' tax burden, which had ballooned. Although opponents argue that this kind of policy contributed to the Great Depression, Arthur Laffer points to the resulting increases in tax revenue, gross domestic product (GDP) and employment as evidence that the tax cuts worked by boosting production [source: Laffer]. But this policy soon faced sharp criticism. When the stock market crashed in 1929 and the U.S. economy sank into the Great Depression, the idea of giving tax breaks to the wealthy was an unpopular policy. People blamed Herbert Hoover, who had shown support for the tax policies of his predecessors. In 1932, voters replaced him with Franklin Roosevelt, who promised a New Deal that would help the economy from the bottom up. Keynesian economics took hold. Wealthy members of society who had enjoyed the low marginal tax rates of the 1920s would see a dramatic reversal in the next 20 years. During the Depression and World War II, the top marginal rate rose to more than 90 percent [source: Laffer]. Enter John F. Kennedy, who was sympathetic to the idea behind supply-side economics (recall his "rising tide" comment). He argued that lowering taxes increases tax revenue, creates jobs and increases profits [source: Nugent]. His tax cuts didn't pass until after he was assassinated, but Laffer argues that they had the positive effect on the economy that Kennedy had hoped for. Others say that the cuts hurt gross national product (GNP) growth and resulted in rising unemployment [source: Friedman]. Until Ronald Reagan was elected president, no other administration in the United States championed supply-side policies. In the late 1970s, economists like Laffer and Jude Wanniski were touting the advantages of increasing production through tax breaks for the wealthy. What they said convinced many people and fit into Reagan's economic philosophy. In 1981, Reagan passed his Economic Recovery Tax Act (ERTA), which cut all marginal tax rates dramatically (the top rate fell from 70 percent to 50 percent) [source: Laffer]. Since then, trickle-down theory has been tied closely to Reagan's policies, collectively named Reaganomics. Trickle-down economics remains highly controversial. More recently, George W. Bush faced harsh criticism for his tax cuts. Despite staunch political opposition to trickle-down policies, some maintain that the general consensus among economists today is that the theory works [source: Bartlett]. Nevertheless, you'll still find plenty of controversy surrounding trickle-down economics among politicians. Many, including Barack Obama, contend that it failed. Amid a hurting economy, Obama won the support of voters by promising to tax the wealthy and ease the tax burden on the lower-income bracket. So as of 2008, the tide of public opinion had certainly shifted away from supply-side thinking yet again.
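The Laffer Curve logic invoked throughout this history is easiest to see with a toy model. The sketch below, in Python, uses a deliberately stylized revenue function that assumes the taxable base shrinks linearly as the rate rises; the peak at 50 percent is an artifact of that assumption, not an empirical claim about where the prohibitive range actually begins.

```python
# A stylized Laffer curve, purely illustrative: revenue as a function of the
# tax rate, assuming the taxable base shrinks linearly as rates rise.
# Real-world behavioral responses are far messier; this just shows why the
# theory predicts a "prohibitive range" where higher rates yield less revenue.

def revenue(rate: float, base_at_zero: float = 100.0) -> float:
    """Tax revenue under an assumed linear shrinkage of the tax base."""
    base = base_at_zero * (1.0 - rate)  # base erodes as the rate nears 100%
    return rate * base

for pct in (7, 28, 50, 70, 77):         # rates mentioned in the article
    print(f"{pct:>3}% rate -> revenue index {revenue(pct / 100):.1f}")
```

Under this toy model, the 77 percent postwar rate raises less revenue than a 50 percent rate would, which is the sense in which a cut from the prohibitive range can pay for itself; whether any real-world rate actually sits in that range is exactly what Laffer's critics dispute.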
Time will tell if opinion will shift back again.
https://money.howstuffworks.com/trickle-down-economics4.htm
Because of the tumultuous economy whose effects people around the world are feeling, many are beginning to think deeply about what role the government should play. In this essay, I will use specific examples to examine how government intervention affects businesses. The examples that I will use to give validity to my argument will come primarily from an evaluation of policies that have been implemented over the years and the far-reaching results of the economic recession that we have experienced most recently (i.e., the automobile industry crisis and the housing market crisis). Smith's laws of the market are simple; under the right circumstances, the outcome of the market can be predictable. Why then have we experienced so much randomness in our economy? When Ronald Reagan took office, he had a plan in mind: to "cut income taxes from top to bottom, reduce the size of the federal government for the first time since the New Deal, and make the U.S. military Number One in the world" (Edwards). These initiatives were based on a speech that Reagan had delivered to the International Business Council of Chicago where he had outlined his goals: "strictly controlling the rate of government spending, reducing personal income tax rates, revising government regulations, establishing a stable monetary policy, and following a consistent national economic policy" (Edwards). Despite intense opposition from Democrats and complaints from big businesses that Reagan's tax cuts weren't doing enough for them, in August 1981, he signed the Economic Recovery Tax Act (ERTA) into law. Proponents believe that this act was the major reason for the unparalleled economic expansion. Businesses prospered from the plan; they produced nearly $20 trillion worth of goods and services between 1982 and 1987 (Edwards). In addition to big tax cuts, Reagan was quite focused on loosening government regulation of business. For example, the dismantling of the Civil Aeronautics Board (CAB) deregulated the airlines, and the breakup of the largest public utility, the American Telephone and Telegraph Company (AT&T), created more competition among businesses. As we can see, a central theme during Reagan's presidency was relaxing government regulation of business. Thus, government regulation should be minimized. Or should it? Many of his supporters are blind to the fact that Reagan's ideas about bureaucratic regulation are behind the economic mess of unstable markets in the United States and around the world. Reagan, known for being the country's "greatest modern champion of deregulation, perhaps…contributed more to today's unstable business climate than any other American. His long-standing campaign against the role of government in American life, a crusade he often stretched to...
Bibliography:
Amadeo, Kimberly. What Was the Stimulus Package? 12 October 2011. Accessed 18 October 2011.
Associated Press and Reuters. Obama selling auto bailout good news in Michigan. 30 July 2010. Accessed 15 October 2011.
Edwards, Lee. Golden Years. Accessed 16 October 2011.
Davis, Bob, Damian Paletta and Rebecca Smith. Amid Turmoil, U.S. Turns Away From Decades of Deregulation. 25 July 2008. Accessed 16 October 2011.
Festa, Paul. Probing IBM's Nazi Connection. 28 June 2001. Accessed 15 October 2011.
Niskanen, William A. Reaganomics. 2002. Accessed 17 October 2011.
Noll, Roger G. "Regulation After Reagan." The Cato Review of Business & Government. Accessed 18 October 2011.
Obama Administration Releases July Housing Scorecard. July Housing Scorecard. DC: The U.S. Department of Housing and Urban Development (HUD), 2011.
Stolberg, Sheryl Gay and Edmund L. Andrews. $275 Billion Plan Seeks to Address Housing Crisis. 18 February 2009. Accessed 18 October 2011.
Toplin, Robert Brent. Blame Ronald Reagan For Our Current Economic Crisis. 8 September 2008. Accessed 18 October 2011.
https://www.studymode.com/essays/Government-Regulation-1607788.html
By Aparna Mathur After months of listening to campaign speeches, TV ads and debates, Americans headed to the polls on Tuesday and chose another four years of Barack Obama. But a successful second term will not come easily. The partisan divide has yielded nothing but gridlock in Congress. And the electorate has grown tired of both Obama's New Economic Patriotism and Mitt Romney's Five-Point Plan. As America faces its biggest economic crisis since the Great Depression, what we really need instead is a Plan For America, with both parties working together rather than against each other. One advantage of all the political back and forth is that we have seen two competing visions for America. One involves a larger role for government — more spending, higher taxes to raise revenues and more regulation of the private sector. A second vision, truer to the principles of free markets and economic opportunity, involves less government spending and regulation and lower taxes. While both sides believe strongly in their ideals and want the best for the country and the American people, the electorate has not given a strong mandate for either. Going forward, the right policy prescription is likely to be a mixed bag of both sides. The task ahead for Obama is daunting. In August, the Congressional Budget Office reported that we have now had deficits of more than $1 trillion for four straight years. Net federal debt is set to reach 73 percent of GDP at the end of this year, its highest level in more than six decades. The economy is headed toward another recession in 2013 if Congress does nothing to stop the implementation of the toxic combination of spending cuts and tax hikes that are set to kick in at the end of this year. The key is Congress — the president alone can do nothing to stop this fiscal crisis from happening. There has to be a joint effort by the executive and the legislative bodies if we want the economy to recover. One of the major differences between the two parties regards whether the Bush tax cuts should be extended for all income classes, or just the middle- and low-income classes. The Democrats, under President Obama, have been unwilling to move forward with tax reform unless Republicans consent to tax increases for the wealthy. President Obama believes that raising rates at the top is important because "the rich should pay their fair share." While both sides agree that small businesses are important for the vitality of the nation, President Obama argues that only 3 percent of all small businesses would be affected by the tax hike. Romney, for his part, argued that he would be unwilling to compromise on a plan that would raise more revenue. He believes that raising taxes in a weak economy is a strategy that will harm hiring, investment and growth. While it is true that less than 5 percent of small businesses would face the hike, their income comprises more than 30 percent of all income, and the small businesses that are the job creators would be the worst hit. This intractable ideological divide is all the more curious since in August 2009, Obama himself warned against raising taxes in an economic downturn because it "would just suck up … more demand out of the economy and put business further in a hole." Likewise, Romney has repeatedly argued that he would not lower the levels of taxes paid by individuals at the top. Any marginal tax cut would be offset by cuts in tax expenditures for the wealthy.
Since both sides are in favor of large-scale tax reform to make the tax code simpler, fairer and more efficient, there is a lot that could be achieved in a bipartisan manner. The U.S. corporate tax rate, one of the highest in the developed world, is similarly dysfunctional. The high rate serves as a disincentive to capital investment and employment growth for many firms that can invest overseas in countries with lower tax rates. Indeed, despite having the highest rate, the U.S. collects among the lowest corporate tax revenues in the OECD. Recognizing this, both candidates proposed plans to lower the rate: from 35 percent to 28 percent for Obama, and to 25 percent under Romney. Romney also proposed a territorial system of taxation whereby firms would only be taxed in their country of operation. Perhaps some compromise on tax reform can begin with the corporate rate. The candidates also offered differing visions of the role and size of government. Romney favors less government and more targeted regulation, while Obama's record has favored sweeping regulation of the health care and financial industries. Such differences are reflected in government spending and outlays. Over the past four years, government spending has grown tremendously, driven in large part by specific government actions to stimulate the economy out of the recession. These spending increases are likely to continue for several years as the economy slowly recovers. But high spending does little to help the economy, since it signals to people that their tax rates are likely to go up in the future as the government attempts to raise more revenue to finance prior spending. Much research in economics has shown that high tax rates slow investment and hiring, leading to lower rates of economic growth. Combined with the economic uncertainty that businesses and consumers already face, this is clearly not the correct path forward if we want higher rates of GDP growth, investment and employment. Obama now faces a choice: to compromise and engage, or to take the win as a mandate for running on a narrow agenda of high taxes and high spending. For the sake of the country, let's hope it's the former. Aparna Mathur is a resident scholar in economic policy at the American Enterprise Institute. She has been a consultant to the World Bank and has taught economics at Georgetown University and University of Maryland.
http://latinosreadytovote.com/congress-is-the-key-to-a-revived-economy/
Ronald Reagan was the 40th U.S. president, serving from January 20, 1981, to January 20, 1989. His first task was to combat the worst recession since the Great Depression. To do so, Reagan promised the "Reagan Revolution." It focused on reducing government spending, taxes, and regulation. His philosophy was "Government is not the solution to our problem, government is the problem." Reagan was an advocate of laissez-faire economics. He believed the free market and capitalism would solve the nation's woes. His policies matched the "greed is good" mood of 1980s America. Reagan inherited an economy mired in stagflation, which combines a stagnating economy with double-digit inflation. To combat the recession, Reagan aggressively cut income taxes from 70 percent to 28 percent for the top tax bracket. He cut the corporate tax rate from 46 percent to 34 percent. He promised to slow the growth of government spending and to deregulate business industries. At the same time, he encouraged the Federal Reserve to combat inflation by reducing the money supply. In 1981, Congress cut the top income tax rate from 70 percent to 50 percent. It helped spur growth in gross domestic product for the next several years. The economy grew 4.6 percent in 1983, 7.3 percent in 1984, and 4.2 percent in 1985. Economic growth reduced unemployment for the next several years. In December 1981, it was 8.5 percent. The minimum wage was $3.35 an hour. In 1982, Congress passed the Job Training Partnership Act. It established job training programs for low-income people. The unemployment rate rose to 10.8 percent by December 1982. It fell to 8.3 percent in 1983, 7.3 percent in 1984, and 7 percent by December 1985. Reagan cut the top tax rate again, to 38.5 percent, in 1986. Growth was a healthy 3.5 percent by the end of 1986, but the unemployment rate was 6.6 percent. It was still higher than the natural rate of unemployment. Reagan cut the top rate again, to 28 percent. Growth bounced up to 4.2 percent in 1987 and unemployment fell to 5.7 percent. Growth leveled out at 3.7 percent in 1988 and unemployment fell to 5.3 percent. Reagan's economic policies are called Reaganomics. Reagan based his policies on the theory of supply-side economics. It says tax cuts encourage economic expansion enough to broaden the tax base over time. The increased revenue from a stronger economy is supposed to offset the initial revenue loss from the tax cuts. But according to the Laffer Curve, this only works if the initial tax rates are high enough. High taxes fall in the curve's "Prohibitive Range." Reagan's first tax cuts worked because tax rates were so high. The 1986 and 1987 tax cuts weren't as effective, because tax rates were already reasonable. Also, Reagan offset these tax cuts with tax increases elsewhere. He raised Social Security payroll taxes and some excise taxes. He also cut several deductions. Reagan cut the corporate tax rate from 46 percent to 40 percent in 1987, with a further drop to 34 percent the following year. But the effect of this break was unclear. Reagan changed the tax treatment of many new investments. The complexity meant that the overall results of his corporate tax changes couldn't be measured. Reagan is also credited for continuing to eliminate the Nixon-era price controls. They constrained the free-market equilibrium that would have prevented inflation. Reagan removed controls on oil and gas, cable television, and long-distance phone service. He further deregulated interstate bus service and ocean shipping. In 1982, Reagan deregulated banking. Congress passed the Garn-St. Germain Depository Institutions Act.
It removed restrictions on loan-to-value ratios for savings and loan banks. Reagan's budget cuts also reduced regulatory staff at the Federal Home Loan Bank Board. As a result, banks invested in risky real estate ventures. Reagan's deregulation and budget cuts contributed to the savings and loan crisis of 1989. The crisis ushered in the 1990 recession. Reagan did little to reduce regulations affecting health, safety, and the environment. In fact, he reduced those regulations at a slower pace than the Carter administration did. Reagan's enthusiasm for the free market did not extend to international trade. Instead, he raised import barriers. Reagan doubled the number of items that were subject to trade restraint from 12 percent in 1980 to 23 percent in 1988. Despite campaigning on a reduced government role, Reagan wasn't as successful at cutting spending as he was at tax cuts. During his first year, he cut domestic programs by $39 billion. But he increased defense spending to achieve "peace through strength" in his opposition to Communism and the Soviet Union. He was successful in ending the Cold War. That's when he uttered his famous quote, "Mr. Gorbachev, tear down this wall!" To accomplish these goals, Reagan wound up increasing the defense budget by 35 percent. Reagan did not reduce other government programs. He expanded Medicare. He increased the payroll tax to ensure the solvency of Social Security. Under Reagan, government spending increased 2.5 percent annually. Reagan's first budget was for fiscal year 1982. As the table below reveals, he incurred substantial deficits for each year of his presidency. As a result, the debt also increased each year. By the end of Reagan's two terms, the national debt had more than doubled.
Deficit (billions) | Debt (billions) | Deficit as % of GDP | Event
$79 | $998 | 2.4% | Reagan tax cut
$128 | $1,142 | 3.8% | Reagan's 1st budget
$185 | $1,572 | 4.5% | Increased defense spending
$221 | $2,125 | 4.8% | Tax cut
$155 | $2,602 | 2.9% | Fed raised rates
$153 | $2,857 | 2.7% | S&L crisis
Reagan captured the mood of voters when he said, "Inflation is as violent as a mugger, as frightening as an armed robber, and as deadly as a hit man." The inflation rate was 12.5 percent in 1980 and 8.9 percent in 1981. In 1982, inflation fell to 3.8 percent. Inflation remained below 5 percent for the remaining years of Reagan's presidency. But Reagan can't take credit for combating inflation. That goes to Federal Reserve Chairman Paul Volcker. He steadily raised the fed funds rate to 18 percent in 1980. High interest rates ended double-digit inflation, but they also triggered the recession. During his eight-year term, Reagan brought many well-known economists on board the Council of Economic Advisers. New Chairmen included Murray Weidenbaum, Martin Feldstein, and Beryl Sprinkel. The Council also included William Niskanen, Jerry Jordan, William Poole, Thomas Gale Moore, and Michael Mussa. Niskanen was one of the architects of Reaganomics. The staff included Nobel Prize winner and New York Times columnist Paul Krugman and Harvard professor Larry Summers. Summers later became President Obama's Director of the National Economic Council. Ronald Reagan was born on February 6, 1911. He studied economics and sociology at Eureka College in Illinois. He became a radio sports announcer, then an actor in 53 films. As president of the Screen Actors Guild, he became involved in rooting out Communism in the film industry. That led him to develop more conservative political views. He became a TV host and spokesman for conservatism. He was Governor of California from 1967 to 1975. In 1980, Reagan was nominated as the Republican presidential candidate.
George H.W. Bush was the nominee for vice president. Reagan beat Jimmy Carter to become the 40th president of the United States. Reagan's salary as president was $200,000. Reagan's net worth was estimated at $15 million at the time of his death in 2004.
http://jacara.info/president-ronald-reagan-s-economic-policies-3305568
Can Lessons from Past Presidents Help Obama Save the Economy? A new era has dawned in American politics, and with it a likely shift in economic policy. In the last three decades, both Democratic and Republican leaders have been blamed and lauded for their impact on the American economy. FindingDulcinea examines the policies of Jimmy Carter, Ronald Reagan, George H.W. Bush, Bill Clinton and George W. Bush to determine what lessons President-elect Obama may take from previous administrations. During President Carter's administration, OPEC oil supply shocks caused oil prices to soar. The spike in energy costs helped create one of the worst periods of peacetime inflation in American history. Center on Budget and Policy Priorities: Does cutting tax rates increase economic growth? Although the American economy was in dire straits when President Reagan took office, inflation was drastically lowered within the first year of his administration with the help of Paul Volcker, then Chairman of the Federal Reserve. The President instituted what the press dubbed "Reaganomics," which included tax cuts, domestic spending restraints and a rollback of government regulation. George Herbert Walker Bush followed Reagan's example of keeping taxes low, as evinced by his famous campaign statement, "Read my lips: No new taxes." But supply-side economics began to fail under Bush, prompting some to reject the idea that cutting taxes always increases economic growth. Like President Reagan, Clinton entered the White House in the midst of an economic slump and based his financial agenda on limiting domestic spending while investing in American workers through education, science and research. The rise of the Internet spurred growth during the Clinton era. Many have questioned his aid programs, however, including his encouragement of broad-scale lending by Fannie Mae and Freddie Mac, arguing that those initiatives gave rise to predatory lending. Despite the hopefulness of his proposal, President Bush will be leaving office in the midst of a significant financial crisis. Throughout his campaign, Sen. McCain aimed to distance himself from Bush's economic policy: associating McCain with George Bush was one of Obama's primary tactics for competing with the GOP candidate. As Obama prepares to take office, Americans will likely see two forces at work: Democrats pushing Obama's economic agenda in an attempt to keep the economy from worsening before he takes office, coupled with the constraints inherent in a severe economic recession.
http://www.findingdulcinea.com/news/politics/2008/November/Can-Lessons-from-Past-Presidents-Help-Obama-Save-the-Economy.html
In the preceding article "Progressive Policy for National Progress and Prosperity," I emphasized the need to break the Congressional gridlock by electing the "Progressives" in the Democratic Party. The following news articles reaffirm that recommendation. ——————————————————————————————– 1. Congress trying to have it both ways on spending Lawmakers lament rising deficits but fight for pet projects By Carl Hulse – New York Times – February 7, 2010 – Thank you. Washington – While Sen. Saxby Chambliss, R-Ga., said he was all for slowing federal spending, he has no appetite for the substantial cuts in farm programs proposed in President Barack Obama's new budget. Rep. Todd Akin, R-Mo., issued a news release simultaneously lamenting the deficit spending outlined in the new budget and protesting cuts in Pentagon projects important to his state. And Sen. Jeff Sessions, R-Ala., a fiscal conservative and a senior Republican on the Budget Committee, vowed to resist reductions in space program spending that would flow back home. The positions of these Republicans – and similar stances by dozens of other lawmakers of both parties – are a telling illustration of why it is so hard to control federal spending. Every federal program has a constituency, and even lawmakers who profess to be alarmed by rising deficits will go to the mat to preserve money that provides jobs and benefits to their constituents. "I am not a hypocrite," Sessions said in reconciling his fiscally conservative credentials with his outrage over the administration's proposal to essentially end the human space flight program and allow private enterprise to take on some of the load – an approach that Republicans typically favor. Sessions said money taken from NASA would not be saved but would instead be directed to other Obama administration priorities that he did not support. Others said that the annual tableau in which members of Congress criticize the spread of red ink even as they reassure voters back home of protection for popular subsidies and Pentagon projects exposed the high degree of cynicism and lack of conviction that colors the fight over congressional spending. "It shows that in Washington, you can be firm on your opinions; it is your principles you can be flexible on," said Rahm Emanuel, the White House chief of staff. The Republican juggling act on spending comes after a legislative proposal for an independent commission to study ways to cut the deficit stalled in the Senate, partly because some Republicans who had originally backed the idea balked. "There are not enough statesmen who will stand up and say, 'Cut it even when it is in my district,'" said Rep. Jeff Flake, R-Ariz., who has crusaded against spending by both parties on pet projects known as earmarks. It is not only Republicans who are trying to have it both ways. Conservative and moderate Democrats who have pushed against deficit spending also quickly protested the cuts in NASA, military and farm spending. ————————————————————————————————– 2. GOP hammers Obama over jobs Republicans oppose giving leftover bailout money to small banks By Phillip Elliott – Associated Press, February 7, 2010 – Thank you. Republicans sparred with President Barack Obama in their Saturday media addresses over proposals to create jobs, further evidence of the difficulty of bipartisan solutions to the nation's pressing problems.
Obama pushed Congress to use $30 billion that had been set aside to bail out Wall Street to start a new program that provides loans to small businesses, which the White House calls the engine for job growth. Republicans, meanwhile, taunted Obama with a familiar refrain: Where are the jobs the president promised in exchange for the billions of dollars already spent? The barb came a day after the government reported an unexpected decline in the unemployment rate, from 10 percent to 9.7 percent. It was the first drop in seven months but offered little consolation for the 8.4 million jobs that have vanished since the recession began. "Even though our economy is growing again, these are still tough times for America," Obama said. "Too many businesses are still shuttered. Too many families can't make ends meet. And while yesterday, we learned that the unemployment rate has dropped below 10 percent for the first time since summer, it is still unacceptably high – and too many Americans still can't find work." To help the recovery, Obama asked Congress to use leftover money from the Troubled Asset Relief Program, or TARP, to provide funds to small banks so they can make more loans to small businesses. Republicans have criticized the move, arguing any money left over from the bailout should be used to reduce the budget deficit. In the weekly GOP address, Rep. Jeb Hensarling of Texas chided Obama for proposing a 2011 budget last week that would increase spending, taxes and the national debt. "Americans are still asking, 'Where are the jobs?' but all they are getting from Washington is more spending, more taxes, more debt and more bailouts," Hensarling said. The Republican attack came even as key Democrats and Republicans in the Senate are working on a bipartisan jobs bill. The senators hope to unveil legislation as early as Monday. ————————————————————————————————– 3. Obama seeks boost in business lending Proposals draw fire from Democratic leader in House By Christine Simmons and Marcy Gordon – Associated Press – February 6, 2010 Seeking to create more jobs, President Barack Obama on Friday asked Congress to temporarily expand two lending programs for the owners of small businesses. But a Democratic House leader slammed the president's proposals, saying they're the wrong approach to creating jobs. Obama said Friday he wants to bolster the impact of the businesses that are the chief creators of new jobs in a struggling economy. Just hours before he spoke, the nation's jobless rate finally dipped below 10 percent – to a stubbornly high 9.7 percent – in the latest government figures. The president said he wants businesses to be able to refinance their commercial real estate loans under the Small Business Administration and he wants that government agency to increase loans used for lines of credit and capital. "The truth is, the economy can be growing like gangbusters for years on end and it's still not easy to run a small business," Obama said as he visited a heating and air conditioning company in a Maryland suburb of the capital. The White House said Obama's plan would temporarily raise the cap on Small Business Administration Express loans from the current maximum of $350,000 to $1 million. Obama's plan would also expand the SBA's program to support refinancing for owner-occupied commercial real-estate loans. But even the Democratic head of a House committee wasn't pleased about the plan to expand SBA lending. Rep. Nydia M.
Velazquez, D-N.Y., chair of the House Small Business Committee, said the SBA Express program has been criticized for underwriting loans that banks would have made without government backing and for carrying the highest default rate of any SBA program. "With loan defaults on the rise, we should not base our strategy on increasing the size of the least stable SBA lending program," Velazquez said. The initiative to refinance commercial real estate debt may dilute it and draw away too many resources, she said. ————————————————————————————————– Food for Thought – By Padmini Arhant February 10, 2010 It's clear from the listed articles that the priorities for the congressional conservatives on both sides are not the people, i.e., the working class, the middle class and the small businesses. If they were, they would not try to have it both ways, as suggested in the article. Evidently, the national interest is not the primary concern for the Congressional conservatives and moderates in both parties. They are preoccupied with faultfinding against President Barack Obama, instead of cooperating with the rest of the Congress in passing legislation, especially the health care and health insurance reform, where a staggering 46 million uninsured Americans reportedly suffer due to these lawmakers' unwillingness to pass the much-required legislation to heal every American. Notwithstanding the credit crunch experienced by the small businesses from the "bailed out" banks' reluctance to facilitate lending, the finance sector's default in containing the worsening real estate crisis in both residential and commercial markets again calls for immediate action through finance reform – conveniently rejected by the conservatives and moderates on both sides. These legislators, positioning themselves as "fiscal conservatives" while rebuking President Obama on a rising national debt to which they contribute with their ambitious pet projects over the "average" American plight, speak volumes about their lack of commitment to the people electing them to office. With respect to President Obama's strategy on SBA lending to the small businesses, the Democrat House Committee response is irrational and confirms the legislator's disconnect from reality. The President's justification on this issue is right on target. Since the bailed-out finance sector is back in the game with its "business as usual" motto and focused on self-promotion with multi-million dollar bonuses, culminating in their Washington representatives' successful blocking of the finance reform, the President's proposal is the only viable option to stimulate job growth in the most desperate segment of the economy – the small business. Besides, in the absence of the banking industry's long overdue lending activity, the investment risks in small business are blown out of proportion compared to the risk exposure in the multi-trillion dollar bailouts to the banks still withholding credit from their creditors-cum-taxpayers and consequently restraining the economic recovery. Time is running out for the conservatives on both sides, in correlation with patience among the suffering millions in the economy. If the Republican members are counting on their rebellious attitude towards the Democrat President and the Congress to win elections in November 2010, they are in for a serious disappointment, for the American electorate would not reward the party with a victory in the face of their deteriorating economic conditions resulting from the Republican members' blockade.
Somehow, if this were to happen, then it would be at democracy's peril. Perhaps it's something the American electorate ought to think about, because they are responsible for the stalemate in Washington. Having elected ideological representatives for whom the people seem irrelevant – transparent in their obstinacy on legislative matters – the people are the ones who can undo the wrongdoing by voting the redundant representations out of power this November or even sooner. Democracy is held hostage by the recalcitrant Congress members defying their constitutional responsibility to serve the people and the nation as elected officials. Washington's hue shines through in these issues. How can any President possibly achieve anything in such a hostile environment? You decide. Thank you. Padmini Arhant Financial Crisis Inquiry January 14, 2010 By Padmini Arhant Today, the Financial Crisis Inquiry Commission summoned the financial sector executives to investigate the activities that primarily contributed to the financial market's downward spiral and led the economy to the brink of collapse. The inquiry is a step in the right direction to convey a strong message that no one is above the law and democracy cannot be undermined. Although the executives are perceptive in self-defense, evading responsibility for the financial meltdown, the fact of the matter is that these financial moguls capitalized on the economic vulnerabilities during the Bush administration. Those vulnerabilities were generated from deregulation and a substantial prime rate reduction alluring average citizens with a political slogan that linked patriotism to home ownership. More concessions were offered by the Bush-Cheney Presidency through massive tax cuts for corporations, financial institutions and wealthy individuals, boosting the investment banks' portfolios and thereby driving them from equity markets to speculative trading. This created an enormous capital infusion, with investment banks competing with the commercial banks in the absence of the Glass-Steagall Act. AIG followed, collaborating in insurance deals on credit borrowings invested in derivatives and hedge funds, with risky assets as collateral and underlying value, further exacerbating the risk. When the bubble burst, so did their balance sheets. They went into disarray, with the majority of lead players burdened with toxic assets that transformed into dead-weight liabilities in the form of large risk exposure, eroding their capital and solvency, and consequently relying on the taxpayer bailout to salvage the financial market and the economy. Apart from the financial institutions, the architects behind the policies since the early nineties are equally responsible for the debacle. For instance, the former Federal Reserve Chairman Alan Greenspan, the former Treasury Secretary Henry Paulson, the current Treasury Secretary Timothy Geithner, and the present Federal Reserve Chairman Ben Bernanke, along with the financial team under the Obama administration, represent the convenient exchanges between Wall Street and Washington through the revolving door of Goldman Sachs, Merrill Lynch, Morgan Stanley and Lehman Brothers (prior to being acquired by Barclays)… to name a few. As found in other national issues such as health care, communication and energy, the prevalent culture between Washington and Wall Street is a huge conflict of interest, leaving the average taxpayers and consumers at the mercy of the "corporate owned government" enterprise.
Investigation is necessary to determine the cause of the status quo. However, it's important to have the financial sector pledge to revive the credit market through liquidity flow to small businesses and corporations. It would jumpstart the economy, since financing businesses and corporations positively impacts the job market. Meanwhile, the manufacturing sector could be resurrected broadly, producing the desired drastic contraction in unemployment. Simultaneously, the finance industry is required to stimulate the real estate and construction areas of the economy. Considering the dismal job growth, accompanied by the plummeting residential and commercial real estate values due to the sub-prime mortgage fiasco, the financial institutions should invigorate financing and refinancing options for homeowners and commercial estate holders by offering reasonable, incentivized programs that would allow the property owners to comply with the payments and retain the values respectively. This viable strategy would ease the burden on the lender and the borrower, leading to property value appreciation. President Obama's proposal to levy taxes against the financial institutions that have benefited from the taxpayer bailout is right on target. Not surprisingly, the financial industry is resisting the tax, estimated to yield $120 billion in revenue for the ailing economy. Taxpayers from the bottom up shared in the trillions of dollars of the finance industry bailout. Having stabilized their balance sheets from the massive injection of funds, the institutions are now challenging the government over the tax proposal by warning that any such levies in the form of fees and taxes would hurt the consumers, claiming that the customer will ultimately bear the charges through bank fee hikes. Alternatively, the banks are threatening to move jobs overseas upon any tax or fee imposition. Despite the pre-existing exorbitant fees and charges applied to banking transactions, the banks' retaliation to the tax proposal via potential fee increases or job exports is not only outrageous but also audacious. The financial sector being the economy's engine, credit flow across the spectrum is pertinent to a swift economic recovery, including the financial market gains. The financial institutions' lack of concern for ethics and their excessive greed triggered the financial market crisis, ultimately affecting the global economy. Therefore, there is an urgent requirement for aggressive financial reform to prevent history repeating itself in the near future. Thank you. Padmini Arhant Rescue Plan – Big 3 Auto Makers December 8, 2008 Accountability: The Federal Reserve and the Treasury department secured a $700 billion jackpot for the finance industry bailout. Major beneficiaries: the financial institutions, including investment banks, for bad decisions in the subprime mortgage debacle, with prominent mismanagement by hedge fund managers; and the insurance industry, for navigating uncharted waters in search of profit from risky ventures with no guaranteed returns. During the financial sector bailout bankrolled by the taxpayers, there were supposed to be preconditions for the bailout of the financial institutions. Besides oversight and warranted regulations, they included an immediate recovery plan for the housing market – predominantly stopgap measures on foreclosures. The latest news articles and reports confirm otherwise: foreclosures have hit record highs subsequent to the financial bailout.
As for other issues… the Treasury's role in easing the burden on financial institutions with liabilities in the form of bad loans and securities was among the reasons presented to secure the huge sum of $700 billion at that time. Whatever has happened to accountability? Where is the oversight? Why are foreclosures soaring nationwide despite the taxpayers' investment in the financial sector to cure symptoms of this nature in the housing market, which contributed to the current economic recession? It is apparent from the struggling and still volatile stock market that the financial sector has not met the requirements and honored the agreements with the taxpayer on all of the issues, ranging from reviving the housing market by temporarily freezing foreclosures and reassessing payment plan programs with defaulting homeowners, to providing liquidity to commercial sectors to jumpstart the economy, and, last but not least, transparency to the American public about their current lending practices. ——————————————————– As per http://moneynews.newsmax.com/streettalk/bailout_half_gone/2008/11/12/150364.html Street Talk – Thank you. Who Got Bailout Money So Far? Wednesday, November 12, 2008 9:09 AM "The Treasury Department's $700 billion bailout plan, also known as the Troubled Asset Relief Program (TARP), is one of the main U.S. tools to address the financial crisis. The Treasury Department on October 14 set aside $250 billion of the program to buy senior preferred shares and warrants in banks, thrifts and other financial institutions. Half that money was allocated to nine big banks, the Treasury Department has said. Another $38 billion has since been earmarked for regional or small banks, according to statements from individual banks. On Monday, the department announced its single-biggest TARP investment — $40 billion in American International Group — which the government said would not come from the $250 billion bank capital program. ———————————————- The TARP has so far committed the following funding:
AIG $40 billion
JPMorgan $25 billion
Citigroup $25 billion
Wells Fargo $25 billion
Bank of America $15 billion
Merrill Lynch $10 billion
Goldman Sachs $10 billion
Morgan Stanley $10 billion
PNC Financial Services $7.7 billion
Bank of New York Mellon $3 billion
State Street Corp $2 billion
Capital One Financial $3.55 billion
Fifth Third Bancorp $3.45 billion
Regions Financial $3.5 billion
SunTrust Banks $3.5 billion
BB&T Corp $3.1 billion
KeyCorp $2.5 billion
Comerica $2.25 billion
Marshall & Ilsley Corp $1.7 billion
Northern Trust Corp $1.5 billion
Huntington Bancshares $1.4 billion
Zions Bancorp $1.4 billion
First Horizon National $866 million
City National Corp $395 million
Valley National Bancorp $330 million
UCBH Holdings Inc $298 million
Umpqua Holdings Corp $214 million
Washington Federal $200 million
First Niagara Financial $186 million
HF Financial Corp $25 million
Bank of Commerce $17 million
TOTAL: $203.08 billion
—————————————- INSURANCE COMPANIES In addition to the TARP program's $40 billion capital injection into AIG, the Federal Reserve is providing the company with up to $112.5 billion in separate loans and funds for asset purchases. Aid to the huge insurance company came after counterparties and rating downgrades forced AIG to post large amounts of collateral for its credit derivatives positions. Some other insurers are interested in cash infusions, but must own a thrift or bank in order to qualify under the terms of Treasury's current capital injection program.
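As a quick arithmetic check on the allocation list quoted above, the individual commitments do sum to the stated total. A minimal script (amounts in billions of dollars, transcribed from the list):

```python
# Verify that the TARP commitments quoted above sum to the stated total
# of $203.08 billion. Amounts are in billions, transcribed from the list.

allocations = {
    "AIG": 40, "JPMorgan": 25, "Citigroup": 25, "Wells Fargo": 25,
    "Bank of America": 15, "Merrill Lynch": 10, "Goldman Sachs": 10,
    "Morgan Stanley": 10, "PNC Financial Services": 7.7,
    "Bank of New York Mellon": 3, "State Street Corp": 2,
    "Capital One Financial": 3.55, "Fifth Third Bancorp": 3.45,
    "Regions Financial": 3.5, "SunTrust Banks": 3.5, "BB&T Corp": 3.1,
    "KeyCorp": 2.5, "Comerica": 2.25, "Marshall & Ilsley Corp": 1.7,
    "Northern Trust Corp": 1.5, "Huntington Bancshares": 1.4,
    "Zions Bancorp": 1.4, "First Horizon National": 0.866,
    "City National Corp": 0.395, "Valley National Bancorp": 0.330,
    "UCBH Holdings Inc": 0.298, "Umpqua Holdings Corp": 0.214,
    "Washington Federal": 0.200, "First Niagara Financial": 0.186,
    "HF Financial Corp": 0.025, "Bank of Commerce": 0.017,
}

total = sum(allocations.values())
print(f"Sum of listed commitments: ${total:.2f} billion")  # prints 203.08
```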
——————————————-

BANKS, LENDERS

The TARP program set a November 14 deadline for smaller banks to apply for capital injection funds remaining in the pool of $250 billion. The deadline will be extended for non-publicly traded banks. The government's preferred shares will pay dividends of 5 percent annually for the first five years and 9 percent after that until the institution repurchases them. Participating banks must comply with Treasury restrictions on executive compensation, which limit tax deductibility of senior executive pay to $500,000. They require bonuses to be "clawed back" if earnings statements or gains are later proven to be materially inaccurate and prohibit "golden parachute" payments to senior executives.

—————————————–

OTHER COMPANIES

Struggling automakers General Motors Corp, Ford Motor Co and Chrysler LLC have requested tens of billions of dollars in Treasury aid under TARP. However, the Bush administration says the TARP program was designed by Congress to help the financial service sector, not the auto industry.

——————————————-

REMAINING TARP MONEY

The remaining $350 billion in TARP funding can be accessed only after the White House formally notifies Congress. U.S. House Financial Services Chairman Barney Frank has said that if the initial banks participating in the program do not use the money for lending, Congress could block authorization of the final funding."

——————————————————————————–

Reality Check:

Despite the sizeable cash infusion in the financial sector to revive the stagnant economy, the results confirm dismal performance in all quarters of the economy. It is imperative for the Treasury and the Federal Reserve, as the guarantors of the financial industry bailout, to provide a legitimate explanation to American taxpayers for the failure to achieve TARP's purpose before even contemplating securing the remaining and final $350 billion.

———————————————————————–

Auto Industry – Crisis

Meanwhile, to focus on the pending national issue concerning the American workforce in the auto industry's manufacturing sector, it is important to shed light on the nation's current unemployment status. As per the recent reports…

Source: http://www.free-press-release-center.info/pr00000000000000028189_us-unemployment-rate-touches-67-halfmillion-jobs-lost-in-november-employmentcrossing-revs-up-efforts.html – thank you.

US Unemployment Rate Touches 6.7%; Half-Million Jobs Lost in November; EmploymentCrossing Revs Up Efforts

Employers slashed 533,000 jobs in November, the most in 34 years, according to the latest US Bureau of Labor Statistics report. The mind-boggling job losses reported for the month are statistically the most since December 1974. The unemployment rate of 6.7% was the worst since 1993. It's only the fourth time in the past 58 years that payrolls have fallen by more than 500,000 in a month. EmploymentCrossing, the leading job board in the US, agrees that the current job market has been increasingly ruthless on employees, as widespread cuts attain a new high.

———————————————-

Analysis:

Obviously, it is a dire situation demanding immediate solutions to the burgeoning problems of the job market. A delayed response to the collapse of this major manufacturing sector will worsen a fragile economy already in recession. There have been various good proposals from all corners and disciplines presented so far for consideration by Congress.
Most proposals target aspects similar to the financial industry bailout: oversight, strict regulations and accountability. Others emphasize fuel-efficient and/or hybrid cars to deal with the looming energy and present environmental crises.

The union role in the auto industry has been unfairly targeted in the outcry against protecting manufacturing jobs. Without the unions' existence and support, Corporate America's outrageous trade practices toward the American workforce would be emboldened, with an adverse effect on middle- and low-income groups. Such a scenario would widen the pre-existing canyon between the haves and have-nots.

———————————————————————————-

Rescue Plan with Clear Solutions:

First and foremost, the incumbent administration's refusal to recognize the seriousness of the auto industry problem as an impending job market crisis is no revelation. It is appalling that the same administration instantaneously reached out to a financial industry notorious for the poor judgment that triggered the entire economic crisis, yet holds reservations about a sector seeking assistance with a pledge to comply with any and all legislative requirements to save manufacturing jobs. Moreover, unlike the financial sector, in this instance the taxpayer investment is secured with tangible assets upon default by the companies. Given the grim unemployment status, economic recession and gloomy retail forecasts for the holiday season, the auto industry jobs must be rescued at all costs.

Strategy:

As highlighted above, the purpose behind TARP for the financial industry was to facilitate liquidity in commercial lending to various other sectors of the economy. Referencing U.S. House Financial Services Chairman Barney Frank, "if the initial banks participating in the program do not use the money for lending, Congress could block authorization of the final funding." The comment is fair and valid. Due to the financial institutions' breach of the $700 billion agreement, the entire sum of $75 billion requested by all three major automakers should be approved and allocated as follows:

1. The unused $15 billion from the previously drawn $350 billion of the financial bailout is to be applied toward the $75 billion to protect auto industry jobs.
2. The remaining $60 billion is to be derived from the final $350 billion of the financial bailout package through an emergency resolution by Congress.
3. The recommendation to tap into the $25 billion energy bill to assist the automakers would derail any progress aimed at future clean energy programs. Therefore, the rescue package for the automakers is to be appropriated from the excessive financial bailout fund.

—————————————————————————–

Compliance by Automakers:

A. Besides the standard protocol of oversight, stringent regulations and executive competence, manufacturing hybrid models to satisfy energy-efficiency requirements is paramount to this deal.
B. Equally important are the recognition and improvement of labor laws and trade practices to benefit the American workforce, thereby increasing productivity and yielding the profits expected to pay off the rescue plan.
C. Transparency and commitment to achieving pledged goals are vital to averting future crises and maintaining credibility with lenders.
D. In addition, exorbitant remuneration, perks and bonuses for the CEOs of all three corporations will be eliminated from the package.
—————————————————-

Congressional Obligation:

It is incumbent on Congress, particularly members across the aisle, to address the serious challenge currently facing the nation's manufacturing sector by putting partisan politics aside and prioritizing the needs of American labor.

The nation is grappling with an economy saddled with –

- Multi-trillion-dollar debt
- Financial crisis
- Deteriorating housing market
- Unpredictable stock market

All of the above factors threaten the stability of every industry and the fabric of the economic infrastructure. Any more layoffs in any industry, particularly the manufacturing companies, will debilitate the economic recovery plans in process.

It is time to bid farewell to the party bickering, earmarks and pork barrel spending that have resulted in Washington gridlock on all matters of national interest. Legislators must act diligently and promptly by approving the entire $75 billion from the suggested source for all three companies to protect American jobs and the ailing economy. It is time for action, not procrastination.

Finally, people in a democracy elect representatives and entrust them with power to solve problems and safeguard their interests, so national interest must supersede all other interests.

Thank you.
https://www.padminiarhant.com/tag/bank-of-america/
Fox News anchor Jon Scott claimed the Bush tax cuts generated growth and substantially increased revenue. In fact, economists say the Bush tax cuts produced anemic growth at best while creating substantial budget deficits that persist to this day.

Fox's Jon Scott: Bush Tax Cuts "Did Generate Growth" And "Substantially" Increased Revenue

Fox Anchor Jon Scott: "All Those [Bush Tax] Cuts Did Generate Growth." While introducing a segment on the Bush tax cuts, Happening Now anchor Jon Scott claimed that the Bush tax cuts generated economic growth and that "revenue jumped substantially":

SCOTT: President Bush called for another round of tax cuts to jumpstart the economy. In 2003, President Bush signed the Jobs and Growth Tax Relief Act, saving taxpayers an estimated $350 billion. All those cuts did generate growth, and by the end of his second term, revenue jumped substantially from $2 trillion when he took office, reaching $2.5 trillion in 2008. [Fox News, Happening Now, 7/10/12]

Economists: Bush Tax Cuts Did Little To Boost The Economy

Former Reagan Economist Bruce Bartlett: "The 2001 Tax Cut Did Nothing To Stimulate The Economy." In a post on The New York Times' Economix blog, economist Bruce Bartlett -- a former adviser to Presidents Reagan and George H.W. Bush -- wrote that not only did the Bush tax cuts fail to stimulate the economy, but contrary to Scott's claim that revenue "jumped substantially" during the Bush years, it actually fell as a percentage of GDP:

The 2001 tax cut did nothing to stimulate the economy, yet Republicans pushed for additional tax cuts in 2002, 2003, 2004, 2006 and 2008. The economy continued to languish even as the Treasury hemorrhaged revenue, which fell to 17.5 percent of the gross domestic product in 2008 from 20.6 percent in 2000. Republicans abolished Paygo in 2002, and spending rose to 20.7 percent of G.D.P. in 2008 from 18.2 percent in 2001. According to the C.B.O., by the end of the Bush administration, legislated tax cuts reduced revenues and increased the national debt by $1.6 trillion. Slower-than-expected growth further reduced revenues by $1.4 trillion. [The New York Times, Economix, 6/12/12]

Bartlett: Bush Tax Cuts "Did Not" Increase Rate Of Economic Growth. Bartlett wrote in a previous Economix blog post of the lack of an effect on economic growth the Bush tax cuts had:

It would have been one thing if the Bush tax cuts had at least bought the country a higher rate of economic growth, even temporarily. They did not. Real G.D.P. growth peaked at just 3.6 percent in 2004 before fading rapidly. Even before the crisis hit, real G.D.P. was growing less than 2 percent a year. By contrast, after the 1982 and 1993 tax increases, growth was much more robust. Real G.D.P. rose 7.2 percent in 1984 and continued to rise at more than 3 percent a year for the balance of the 1980s. Real G.D.P. growth was 4.1 percent in 1994 despite widespread predictions by opponents of the 1993 tax increase that it would bring on another recession. Real growth averaged 4 percent for the balance of the 1990s. By contrast, real G.D.P. growth in the nonrecession years of the 2000s averaged just 2.7 percent a year -- barely above the postwar average.
[The New York Times, Economix, 7/26/11]

Center For American Progress: Under Bush Tax Policies, Jobs Growth Was "Anemic" And Economic Growth Was "Much Slower." A Center for American Progress study on the effect of the Bush tax cuts on jobs and gross domestic product (GDP) growth found that the tax cuts "didn't deliver what they promised":

The economy boasted 132 million jobs in June of 2001, the month that the first of the Bush tax cuts was signed into law. Three years later, in June of 2004, there were just 131.4 million jobs. The economy did not add a single new job during three years under the Bush tax cuts. The next three years were better than the first three as the private sector struggled back to its feet following the first Bush recession. By June of 2007, before the start of the Great Recession, total jobs had grown to 137.7 million. Overall, the six years following the Bush tax cuts saw a 4.8 percent increase in jobs. That's not nothing, but it's pretty anemic compared to job growth under President Bill Clinton. President Clinton, after raising taxes in 1993, oversaw an economy that went from 111 million jobs in August of that year (the month Clinton's budget plan passed, including the increase in taxes) to 129 million jobs six years later--an increase of 16.2 percent, and more than three times better than under the Bush tax cuts.

And the Bush tax cuts didn't just fail to stack up on jobs. Overall economic growth was much slower under the Bush administration's tax policies than under the Clinton administration's tax policies. Real gross domestic product grew by 26 percent in the six years after Clinton's tax increases. But real GDP grew by just 16 percent in the six years after the Bush tax cuts began. In fact, that six-year growth rate was low even by general historical standards. The average real GDP growth in any given six year period (from any quarter to the same quarter six years later) since World War II was 22 percent.

The report included these charts comparing jobs growth and real GDP growth between the Bush and Clinton administrations: [Center for American Progress, 7/29/10]

CBPP: Bush Economic Expansion "Was Sub-Par Overall." The Center on Budget and Policy Priorities found that the economic expansion between 2001-2007 was especially weak despite the Bush tax cuts:

Members of the [Bush] Administration routinely tout statistics regarding recent economic growth, then credit the President's tax cuts with what they portray as a stellar economic performance. But as a general rule, it is difficult or impossible to infer the effect of a given tax cut from looking at a few years of economic data, simply because so many factors other than tax policy influence the economy. What the data do show clearly is that, despite major tax cuts in 2001, 2002, 2003, 2004, and 2006, the economy's performance between 2001 and 2007 was far from stellar. Growth rates of GDP, investment, and other key economic indicators during the 2001-2007 expansion were below the average for other post-World War II economic expansions ... Growth in wages and salaries and non-residential investment was particularly slow relative to previous expansions, and, while the Administration boasts of its record on jobs, employment growth was weaker in the 2001-2007 period than in any previous post-World War II expansion.
CBPP's report included this chart comparing key economic indicators to past expansions: [Center on Budget and Policy Priorities, 5/9/08]

Economic Policy Institute: After Bush Tax Cuts Were Enacted, Economy Experienced "The Worst Economic Expansion Of The Post-War Era." An EPI policy memo released 10 years after the first Bush tax cuts were enacted found that they were a "poor stimulus" and "did not lead to faster economic growth":

Not only were the Bush-era tax cuts a poor stimulus coming out of the 2001 recession, they did not lead to faster economic growth during the economic expansion leading up to the Great Recession.

• Between the end of the 2001 recession (2001Q4) and the peak of that expansion (2007Q4), the U.S. economy experienced the worst economic expansion of the post-war era.
• Growth in investment, GDP, and employment all posted their worst performance of any post-war expansion.
• The tax cuts were supposed to encourage business investment, but nonresidential fixed investment increased a meager 2.1% annually--a third of the average increase and less than half that of the next poorest post-war increase in business investment on record.

[Economic Policy Institute, 6/1/11]

Economists Say Bush Tax Cuts Lowered Revenue And Contributed Substantially To Debt And Deficits

Bartlett: Bush Tax Cuts Caused Federal Revenue To Shrink. Economist Bruce Bartlett explained in a New York Times blog post that the Bush tax cuts were responsible for lowering federal revenues as a share of the economy:

In a previous post, I noted that federal taxes as a share of gross domestic product were at their lowest level in generations. The Congressional Budget Office expects revenue to be just 14.8 percent of G.D.P. this year; the last year it was lower was 1950, when revenue amounted to 14.4 percent of G.D.P. But revenue has been below 15 percent of G.D.P. since 2009, and the last time we had three years in a row when revenue as a share of G.D.P. was that low was 1941 to 1943. Revenue has averaged 18 percent of G.D.P. since 1970 and a little more than that in the postwar era. At a similar stage in previous business cycles, two years past the trough, revenue was considerably higher: 18 percent of G.D.P. in 1977 after the 1973-75 recession; 17.3 percent of G.D.P. in 1984 after the 1981-82 recession, and 17.5 percent of G.D.P. in 1993 after the 1990-91 recession. Revenue was markedly lower, however, at this point after the 2001 recession and was just 16.2 percent of G.D.P. in 2003. The reason, of course, is that taxes were cut in 2001, 2002, 2003, 2004 and 2006. [The New York Times, Economix, 7/26/11]

Revenue As A Share Of The Economy Fell During Bush Administration And Never Recovered. Data from the Federal Reserve Bank of St. Louis show that federal revenue as a percentage of GDP fell after the earliest Bush tax cuts, and never again reached the level it was at when President Bush took office: [Federal Reserve Bank of St. Louis, accessed 7/11/12]

CBPP: "Congressional Budget Office Data Show That The Tax Cuts Have Been The Single Largest Contributor" To Budget Deficits. A 2008 Center on Budget and Policy Priorities report on myths about tax cuts found that tax cuts were the "single largest contributor to the reemergence of substantial budget deficits in recent years":

Congressional Budget Office data show that the tax cuts have been the single largest contributor to the reemergence of substantial budget deficits in recent years.
Legislation enacted since 2001 added about $3.0 trillion to deficits between 2001 and 2007, with nearly half of this deterioration in the budget due to the tax cuts (about a third was due to increases in security spending, and about a sixth to increases in domestic spending). Yet the President and some Congressional leaders decline to acknowledge the tax cuts' role in the nation's budget problems, falling back instead on the discredited nostrum that tax cuts "pay for themselves."

The CBPP report included this graph based on CBO data: [Center on Budget and Policy Priorities, 5/9/08]

EPI: Bush Tax Cuts "Added $2.6 Trillion To The Public Debt Over 2001-10." In a September 26, 2011, article, Andrew Fieldhouse of the Economic Policy Institute (EPI) wrote:

A spending-cuts-only approach is regressive in that it forces the brunt of deficit reduction on the backs of poor and working families while ignoring a prime culprit of the budget deficit: the expensive, ineffective, and unfair Bush-era tax cuts. These top-heavy tax cuts added $2.6 trillion to the public debt over 2001-10 and will add $3.8 trillion to deficits over the next decade if fully continued. [Economic Policy Institute, 9/26/11]

And The Bush Tax Cuts Continue To Contribute To Deficits

CBPP: "Virtually The Entire Federal Budget Deficit Over The Next Ten Years" Is Due To Bush Policies, Economic Downturn. Center on Budget and Policy Priorities (CBPP) chief economist Chad Stone said in testimony before the Joint Economic Committee in June 2011:

Figure 1 focuses on the projected deficit going forward. It shows that the economic downturn, tax cuts enacted under President Bush, and the wars in Afghanistan and Iraq explain virtually the entire federal budget deficit over the next ten years. The economic downturn added about $300 billion, chiefly from the operation of the automatic stabilizers (declining revenue and increased outlays for unemployment insurance and other pro-cyclical spending) and associated interest costs. Both the financial-market measures enacted under President Bush and largely implemented under President Obama, such as the Troubled Asset Relief Program (TARP), and the Recovery Act tax cuts and increases in spending enacted under President Obama, were important drivers of the surge in deficits in 2009-11, but those measures will largely have phased out by the end of this year, leaving only associated interest costs in subsequent years.

The CBPP included the following graphic to highlight Stone's testimony: [Center on Budget and Policy Priorities, 6/21/11]

CBO: "Growing Debt" "Reflects An Imbalance Between Revenues And Spending That Predated The Recession." The Congressional Budget Office found that the growing debt-to-GDP ratio stems from both the recession and from the Bush administration's tax policies. From the CBO's latest Budget and Economic Outlook:
https://www.mediamatters.org/fox-news/fox-news-inflates-impact-bush-tax-cuts
The right-wing would have us believe that President Obama has accomplished little, when in reality, he has been one of the most active presidents in U.S. history. Like all presidents, Obama has made mistakes, but his ever-growing list of accomplishments would be the envy of most. The following list includes a few highlights of some of President Obama's initiatives, all fact-checked by Robert P. Watson, Ph.D., Lynn University. The list is current through the end of 2011.

In Part 1 we look at President Obama's economic, educational, energy & environmental accomplishments. In other parts we will explore accomplishments with the budget, consumer protections, disaster response, economics, education, energy & the environment, ethics, foreign policy and more. Be sure to check out Part 2, which includes highlights of President Obama's initiatives in areas related to foreign policy, Iraq and Afghanistan, the military and veterans, and national security.

ECONOMY

• Increased infrastructure spending (roads, bridges, power plants…) (2009-2010) * Note: Bush was the first president since Herbert Hoover to not make infrastructure a priority
• Authorized the US auto industry rescue plan and two GMAC rescue packages (2009)
• Authorized the housing rescue plan and new FHA residential housing guarantees (2009)
• Authorized a $789 billion economic stimulus plan (American Recovery and Reinvestment Act) (Feb 17, 2009) * Note: 1/3 in tax cuts for working-class families; 1/3 to states for infrastructure projects; 1/3 to states to prevent the layoff of police officers, teachers, etc. at risk of losing their jobs because of state budget shortfalls
• Instituted a new rule allowing the public to meet with federal housing insurers to refinance (in as quickly as one day) a mortgage if they are having trouble paying (2009)
• Signed the Serve America Act (April 21, 2009)
• Authorized a continuation of the US financial and banking rescue plans initiated at the end of the Bush Administration and authorized TARP funds to buy "toxic assets" from failing financial institutions (2009)
• Authorized the "Cash for Clunkers" program that stimulated auto sales and removed old, inefficient, polluting cars from the road (2009); due to the large response to the program, signed the Cash for Clunkers Extension (Aug 6, 2009)
• Convened a "jobs summit" to bring experts together to develop ideas for creating jobs (2009)
• Placed a 35% tariff on Chinese tires and a few other products such as pipes after China was found to be illegally "dumping" exports below cost (2009) * Note: Clinton, Bush I, and Reagan all refused to "get tough" on China's predatory trade practices; Bush II refused four times during his presidency
• Ordered the FDIC to beef up deposit insurance (2009)
• Authorized the federal government to make more loans available to small businesses and ordered lower rates for federal loans to small businesses (2009), and signed a temporary extension of the Small Business Act (March 20, 2009)
• Signed the Jobs for Main Street Act (Feb 24, 2010)
• Efforts to protect US firms' trademarks, patents, and copyrights; signed the Trademark Technical and Conforming Amendment Act (March 17, 2010)
• Secured authority to compensate certain furloughed federal employees (March 11, 2010)
• Signed the Hiring Incentives to Restore Employment Act (March 18, 2010)
• Signed the Airport and Airway Extension Act (April 30, 2010)
• Enhanced antitrust penalties and reforms (June 9, 2010)
• Signed the Small Business Investment Act (June 17, 2010)
• Homebuyer Assistance and Improvement Act (July 2, 2010)
• Extension of Unemployment Compensation (July 22, 2010)
• FAA Air Transportation Modernization and Safety Improvement Act (Aug 10, 2010)
• Signed the US Manufacturing Enhancement Act (Aug 11, 2010)
• Signed the Small Business Jobs Act (Sept 27, 2010)
• Signed the Worker, Homeowner, and Business Assistance Act (Nov 6, 2010)
• Extended unemployment benefits for one million workers and expanded coverage for some existing homeowners who are buying again (Nov 2010)
• Called on Congress to deliver a comprehensive "Jobs bill" (2010)
• Signed a bill to extend unemployment benefits set to expire (2010)
• Signed historic Wall Street reform bill (2010) * Note: Designed to reregulate and end abusive practices and promote consumer protections
• Signed the HIRE Act to stimulate the economic recovery (2010) * Note: The bill includes tax cuts for small businesses who hire someone unemployed for at least two months; small businesses can write off their investments in equipment this year; etc.
http://samuel-warde.com/2012/07/president-obamas-accomplishments-part-1/
# HOME STAR

HOME STAR (also spelled HOMESTAR), informally known as Cash for Caulkers, is a United States government program proposed in November 2009 to encourage economic growth by offering incentives to homeowners and retailers for improving the energy efficiency of existing homes.

## Background

In late 2009 there was a broad perception that the United States economy was beginning to recover from the late-2000s recession, that government spending authorized by the American Recovery and Reinvestment Act of 2009 had contributed to the recovery, and that the government should do more to encourage job growth and a faster recovery. In mid-November, former president Bill Clinton and John Doerr of Barack Obama's President's Economic Recovery Advisory Board proposed different versions of an economic stimulus program by which the government would offer tax incentives to encourage people to improve the energy efficiency of their homes. Doerr, in public speeches, called the proposal "cash for caulkers". Separately, U.S. Representative Peter Welch proposed a system of energy rebates to Rahm Emanuel, Obama's Chief of Staff. Obama, in turn, proposed the idea as part of a larger new stimulus program, in a speech at the Brookings Institution on December 8, 2009.

The stated goals of the proposed program are to reduce pollution, particularly greenhouse gases, by reducing household energy use; to save consumers money in the long term through lower power bills; and to stimulate American businesses through the money spent on appliances, materials, and installation. Improving the energy efficiency of "fixed infrastructure", which accounts for approximately 40% of all energy use in the United States, is considered the "low hanging fruit" of energy conservation - a step that achieves results relatively inexpensively and does not require any new technologies or changes to production or consumption methods.

The name "Homestar" is a reference to the popular Energy Star electronic device efficiency rating system, and the nickname "Cash for Caulkers" is a play on the earlier Cash for Clunkers automobile trade-in incentive.

## Structure

As of December 2009, no proposed legislation had been released, and there were few specific details of how the program would be administered, which federal agencies would be involved, or how the tax incentives would be paid (or to whom). The program is expected to involve preliminary energy audits by private contractor energy experts, who then recommend a series of steps for each homeowner to upgrade their home's energy efficiency. As proposed, the plan was for the government to pay 50% of the cost of each home improvement project through a rebate, tax credit, or funds paid to manufacturers and retailers, up to a maximum of $12,000 paid for each home. Alternatively, there was speculation that the federal government might give funds to local governments to run their own programs. There was no limitation on eligibility based on tax bracket or income.

Items under consideration for the program included weatherization of homes by installing additional insulation, new doors, and windows, and replacing old appliances with more energy-efficient new ones. Expensive items such as washing machines, dishwashers, refrigerators, air conditioners, and heaters would be covered.
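The rebate mechanics described above reduce to a simple rule: half of each project's cost, with total payments capped at $12,000 per home. Here is a minimal Python sketch of that rule (my own illustration; the program was never finalized, so the assumption that the cap applies to a home's combined projects simply mirrors the proposal as reported):

```python
HOMESTAR_SHARE = 0.50     # government pays 50% of each project's cost (as proposed)
HOMESTAR_CAP = 12_000.00  # maximum total paid per home (as proposed)

def homestar_rebate(project_costs: list[float]) -> float:
    """Total proposed HOME STAR payment for one home's improvement projects."""
    subsidized = sum(HOMESTAR_SHARE * cost for cost in project_costs)
    return min(subsidized, HOMESTAR_CAP)

# Example: insulation ($4,000), new windows ($9,000), efficient furnace ($15,000).
# Half of the $28,000 total is $14,000, which the $12,000 per-home cap trims.
print(homestar_rebate([4_000, 9_000, 15_000]))  # -> 12000.0
```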
The program was expected to cost approximately $10 billion over the course of one year, paid for out of unspent Troubled Asset Relief Program funds, and would reduce energy consumption of homes that took full advantage of the program by up to 20%. To become effective it would have to be part of a bill passed by the United States Congress.
https://en.wikipedia.org/wiki/Cash_for_Caulkers
Who’s to Blame for the Deficit Numbers?

Forty percent of the new deficit numbers is due to President Bush, 20 percent to the economic downturn, and 16 percent to Obama's efforts to save the economy, write Michael Ettlinger and Michael Linden.

The revised deficit numbers reported by the Congressional Budget Office and the Office of Management and Budget today show a lower deficit than previously estimated for 2009, with higher deficits for 2010 and beyond. Political opportunists will be busy looking for chances to score points over these numbers—pinning the dismal fiscal picture on the Obama administration. The real story is, however, fairly obvious. The policies of the Bush administration, which included tax cuts during a time of war and a floundering economy, are clearly the primary source of the current deficits. The Obama administration policies that are beginning to give the economy a needed jumpstart—the American Recovery and Reinvestment Act in particular—place a distant third in contributing to the 2009 and 2010 deficit numbers. The deficit picture for the years beyond still needs to be painted.

To come to these conclusions, we calculated the relative importance of the several factors contributing to the 2009 and 2010 deficits by looking at the impact in those years of various policies. A detailed description of our approach is at the end of this column. Below is the percentage share of the major contributing factors to the total deterioration from the surpluses projected in 2000 to the current deficits according to our analysis. The policies of President George W. Bush make up the largest share, followed by the current economic downturn, and then President Barack Obama's policies.

Before explaining these further, it should be said that the generally worse deficit numbers reported today aren't all that surprising. Since the last projections in May, it's been plain that this recession has been worse than most analysts thought. With a weak economy comes lower tax revenue and higher safety net expenditures—with the loss in tax revenue causing the lion's share of the deficit problem. The effects of a deeper recession have a long-lasting impact. Even as growth is restored, it is growth from a reduced starting point—a smaller economy in 2009 usually means a smaller economy than previously predicted for several years hence.

Encouragingly, there have been signs of late that the administration's policies to end the recession are starting to take hold. Without such efforts, the picture would be much gloomier, particularly in the short term. One piece of good news is that the government is no longer expecting to spend another $250 billion rescuing financial institutions through the Troubled Assets Relief Program—which explains the improved deficit picture for 2009. And the projections for deficits in future years would be far more pessimistic if the American Recovery and Reinvestment Act policies were not starting to get traction.

As for the deficit's cause, the single most important factor is the legacy of President George W. Bush's legislative agenda. Overall, changes in federal law during the Bush administration are responsible for 40 percent of the short-term fiscal problem. For example, we estimate that the tax cuts passed during the Bush presidency are reducing government revenue collections by $231 billion in 2009.
Also, because of the additions to the federal debt due to Bush administration policies, the government will be paying $218 billion more in interest payments in 2009. Had President Bush not cut taxes while simultaneously prosecuting two foreign wars and adopting other programs without paying for them, the current deficit would be only 4.7 percent of gross domestic product this year, instead of the eye-catching 11.2 percent—despite the weak economy and the costly efforts taken to restore it. In 2010, the deficit would be 3.2 percent instead of 9.6 percent.

The weak economy also plays a major role in the deficit picture. The failure of Bush economic policies—fiscal irresponsibility, regulatory indifference, fueling of an asset and credit bubble, a failure to focus on jobs and incomes, and inaction as the economy started slipping—contributed mightily to the nation's current economic situation. When the economy contracts, tax revenues decline and outlays increase for programs designed to keep people from falling deep into poverty (with the tax impact much larger than the spending impact). All told, the weak economy is responsible for 20 percent of the fiscal problems we face in 2009 and 2010.

President Obama's policies have also contributed to the federal deficit—but only 16 percent of the projected budget deterioration for 2009 and 2010 is attributable to those policies. The American Recovery and Reinvestment Act, designed to help bring the economy out of the recession, is by far the largest single piece of additional public spending under this administration. The cumulative cost of the financial sector rescue, mostly initiated under President Bush in response to the financial markets collapse, is also significant—contributing to 12 percent of the problem. A variety of other changes, described in the methodology section, are also contributors.

For the longer term, it's a bit disingenuous to assign any responsibility for the deficits. That's a story yet to be told, and CBO and OMB provide a selection of numbers to choose from for the long run. Much will depend on how the economy fares. If the Bush tax cuts, scheduled to expire at the end of 2010, were to be continued in their entirety, there would be large deficits. If, as the Obama administration has proposed, they are only extended for those making under $250,000, then they still contribute to the deficit but not as substantially. There are a number of similar budget items that have a long history for which one can, with equal legitimacy, assign responsibility to either their originators or current policymakers for continuing them.

New Obama program initiatives, it's important to note, contribute little to future deficits. The administration has insisted that its additional spending, especially on health care, be fully paid for with savings elsewhere in the budget and additional revenues. In fact, to address our budget challenges it is critical to reform health care which, through Medicare, Medicaid, and other programs, is the single biggest budget headache in the long run. Regardless of responsibility, of course, the long-run deficit situation is one that needs to be addressed.

Michael Ettlinger is the Vice President for Economic Policy and Michael Linden is the Associate Director for Tax and Budget Policy at American Progress.
Methodology

Contributors to the nation's fiscal situation in 2009 and 2010 (in billions of dollars), as measured against surpluses projected in 2001:

| | 2009 | 2010 |
| --- | --- | --- |
| President Bush's policies | -$923 billion | -$918 billion |
| Current economic downturn | -$426 billion | -$469 billion |
| President Obama's policies | -$225 billion | -$497 billion |
| Financial rescues begun by President Bush | -$422 billion | -$123 billion |
| All other | -$302 billion | -$262 billion |

Three times each year, the Congressional Budget Office releases revised estimates of its budget projections going forward 10 years. In each of these revisions, the CBO describes how its current estimate has changed from its previous estimate, and why. By studying these estimates, we can attribute the change in the federal bottom line to various factors: specific legislative policies, changing economic conditions, and technical modifications.

Specifically, in January of 2001, just as President George W. Bush was taking office, the Congressional Budget Office projected that in fiscal year 2009, the federal budget would enjoy a $710 billion surplus. Today the Congressional Budget Office says that the budget will have a $1.6 trillion deficit, a swing of $2.3 trillion. Our analysis looks at the component causes of that swing. Note that this is somewhat different from determining the sources of the deficit—the numbers we derive add up to more than the deficit because they include loss of surplus. It is reasonable, however, to allocate the costs pro-rata between the surplus reduction and the deficit increase. Thus, the percentages presented above can be fairly characterized as the percentage contribution of each factor to the deficits for each year.

In order to determine what caused that swing, we allocated changes in CBO's projections to one of five categories. To President Bush we attributed all changes that CBO marked as "legislative" from its January 2002 update until its September 2008 update. We then modified this total in several ways. First, we subtracted more than $40 billion due to later revisions in CBO's estimate of the costs of Medicare Part D. CBO categorizes these changes as "technical." Second, we added about $60 billion in costs stemming from the economic stimulus of 2008 that CBO also classifies as "technical." Finally, we adjusted downward the current cost of President Bush's tax cuts. CBO's estimates of the cost of President Bush's tax proposals for 2009 and 2010 were based on its economic assumptions for those years. Because the economy is worse than CBO expected at the time it made those estimates, the cost of those tax cuts is also somewhat smaller than expected—as the tax system in general is producing less revenue, the cost of enacted tax reductions is less. To account for this, we adjusted the cost estimates of both the Economic Growth and Tax Relief Reconciliation Act and the Tax Increase Prevention and Reconciliation Act (the Job Growth and Tax Relief Reconciliation Act had no budgetary effect for 2009 and 2010) by the same ratio as CBO's GDP projections at the time and current projections. This adjustment has the effect of reducing the amount of the fiscal deterioration attributable to President Bush. We believe this is more generous to the former president's contribution to the current problems than a similar analysis recently conducted by The New York Times.
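To make the attribution arithmetic concrete, here is a small Python sketch (my own; it sums each row of the table above across both years and divides by the grand total) that reproduces the rounded shares cited earlier: about 40 percent for President Bush's policies, 20 percent for the downturn, and 16 percent for President Obama's policies:

```python
# Two-year (2009, 2010) deficit contributions in billions of dollars,
# taken from the methodology table above.
contributions = {
    "President Bush's policies": (923, 918),
    "Current economic downturn": (426, 469),
    "President Obama's policies": (225, 497),
    "Financial rescues begun by President Bush": (422, 123),
    "All other": (302, 262),
}

grand_total = sum(sum(pair) for pair in contributions.values())  # 4,567

for factor, (y2009, y2010) in contributions.items():
    share = (y2009 + y2010) / grand_total
    print(f"{factor}: {share:.0%}")

# Output (rounded):
#   President Bush's policies: 40%
#   Current economic downturn: 20%
#   President Obama's policies: 16%
#   Financial rescues begun by President Bush: 12%
#   All other: 12%
```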
The impact of the current economic downturn was calculated by summing all of the changes attributed to "economic factors" in CBO's estimates from January 2008 through August 2009. To these we added revenue adjustments made in January and March 2009 that CBO classifies as "technical" but describes as being mostly due to economic changes.

To President Obama, we attributed all legislative changes since CBO's March 2009 update. The "financial rescues begun by President Bush" category consists of expenditures stemming from TARP and the Federal Deposit Insurance Corporation, and from CBO's decision to bring Fannie Mae and Freddie Mac onto the federal books. The remaining causes, including the economic changes from 2001 to 2007, CBO's technical changes not accounted for elsewhere, and policies enacted at the very end of 2008 (such as Alternative Minimum Tax relief) were allocated to "all other." We added $100 billion in additional expenditures for 2010 because CBO's baseline does not include an additional AMT "patch" for fiscal year 2010, though such a "patch" is exceedingly likely.
https://www.americanprogress.org/article/whos-to-blame-for-the-deficit-numbers/
The Prospect is proud to exclusively release the book Take Back Our Party: Restoring the Democratic Legacy by James Kwak. We will release one chapter every other day over the next two weeks. Read the Introduction.

Our New Choice plainly rejects the old categories and false alternatives they impose.… Liberal or conservative—the truth is, it's both, and it's different.
—Bill Clinton, at the 1991 annual conference of the Democratic Leadership Council

I have a poor memory of my own life, but I remember where I was when Bill Clinton was elected president: Kroeber Hall, on the campus of the University of California, Berkeley, on the evening of November 3, 1992. A friend and I were listening to the radio on a boombox when we heard the news. The nightmare of the Reagan-Bush years was over. We hugged. The Democrats were back in power.

Not surprisingly, I learned my politics from my parents. They came to the United States from South Korea, as they say, in search of a better life. My father arrived in San Francisco in 1953 on a freight ship that returned to America after delivering food to GIs fighting the Korean War. He then took a train and a bus across the country, arriving at Wesleyan College two months after the semester started (never having received the letter from Wesleyan telling him it was too late to come). My mother came nine years later to study at the University of Michigan. From them I learned that the Democrats were the party of workers, unions, and the poor, while the Republicans were the party of business and the rich.

The archetypal Democratic hero at the time was Franklin Roosevelt, the president of the New Deal, which included massive public works projects to fight unemployment, comprehensive regulation of the financial system, and the creation of Social Security. In 1944, he called for a "second Bill of Rights," which would guarantee all Americans a job, basic material necessities, housing, health care, and an education—but which never materialized. His successor, Harry Truman, proposed a universal, single-payer health care program. And in 1964, President Lyndon Johnson declared "unconditional war" on poverty, pushing forward a legislative agenda that included Medicare, Medicaid, food stamps, and federal subsidies for schools with poor children. That was what the Democratic brand still meant in the 1970s, when I was a child.

At that moment, however, the legacy of the New Deal was in danger, weakened by the Vietnam War and economic stagnation. With a growing backlash against regulation, high taxes, and the civil rights movement, the modern American conservative movement—which had languished in obscurity as recently as the 1950s—was poised to take over the Republican Party and then the country. Democrats were aware that President Johnson's anti-poverty platform was losing viability on the national level. Jimmy Carter represented an alternative to traditional liberalism, but he was soundly rejected by voters in 1980 in favor of Ronald Reagan, who gave off an aura of breezy confidence that lower taxes and smaller government would unleash the American innovative spirit and usher in a new age of prosperity. The reality of the Reagan Revolution, with its tax cuts for the rich and assault on the welfare state, was the fulfillment of Democrats' worst nightmares. After 12 years of Reagan and George H.W. Bush, we were desperate for change.

The change that Clinton represented, however, went beyond a simple partisan shift in the White House.
His historical significance lay in what he did not do. He did not reverse the conservative revolution and restore the core values of the New Deal. Instead, Clinton signaled something far more important: a realignment of the traditional account of American politics, in which Democrats represented wage workers and Republicans represented executives and business owners. Things were never quite so simple, of course—Democrats had long been financially dependent on particular segments of the business class, increasingly throughout the 1980s—but in general terms the economic identities of the two parties were clear.

The Clinton election was a crucial turning point in a transformation of the Democratic Party that has lasted to this day, more than a quarter of a century later. As the Republicans shifted to the right, metamorphosing from perfunctory defenders of corporate America to rabid zealots for unregulated markets and minimal government, the Democrats followed in their wake. The party that was once our country's closest approximation to social democrats instead became technocratic cheerleaders for the market economy. Democratic politicians abandoned government policies that directly aided the disabled, the unlucky, and the poor, consoling themselves with the idea that facilitating overall economic growth was the best way to help the less fortunate.

[Photo: Stephan Savoia/AP. Bill Clinton and Al Gore at a campaign stop in Carlisle, Pennsylvania, July 1992]

The Clinton nomination in 1992 was itself the end product of a concerted effort to move the Democratic Party away from the left, with its embarrassing baggage of welfare programs and labor unions, and toward a more modern center. In the public's mind, Republicans had successfully portrayed their opponents as tax-and-spend liberals who suppressed individual initiative, gave handouts to welfare queens, coddled criminals, and appeased Communists. Instead of trying to stand up for the social safety net and progressive taxes (and a humane criminal justice system), many Democratic leaders and strategists chose to distance themselves from their liberal past, accepting the conservative caricature of the New Deal and its legacy. The new Democratic Party would be responsible with the nation's finances, strong on national defense, tough on crime, friendly to business, unfriendly to unions (except at election time), and supportive of free markets. In other words, it would be a lot like the Republican Party—but somehow, drawing on the credit it had built up from the 1930s through the 1960s, it would still claim to be the party of ordinary people.

The shift toward a market-centric ideology was foreshadowed by President Carter's 1980 re-election campaign, as he unsuccessfully tried to co-opt Reagan's vocabulary of individual initiative and freedom. "Every day millions of economic decisions are made in factories, in automobile showrooms, in banks and in brokerage houses, on farms and around kitchen tables … according to private needs and private individual judgments," he said, attempting to rebut the charge that Democrats represented government control over the economy. Pressure to move the party to the right increased after Carter's stinging loss (and the loss of the Senate majority for the first time since the Eisenhower administration). Louisiana Representative Gillis Long, the new chair of the House Democratic Caucus, sponsored the creation of a new Committee on Party Effectiveness, staffed notably by Al From.
This committee began formulating a new platform—one focused on increasing private-sector competitiveness, reducing government spending, fighting inflation, and strengthening national defense. “We want to move away from a temporary economic policy of redistribution … to a long-term policy of growth and opportunity,” Representative Tim Wirth told reporters in 1982. The takeover of the party began in earnest, however, after President Reagan routed Democratic nominee Walter Mondale in the 1984 presidential election, which, many people thought, demonstrated the impotence of traditional liberalism (despite the fact that Mondale had promised higher taxes to reduce the national debt). Only weeks after the election, a group of major Democratic fundraisers met to discuss “how they might use their fund-raising skills to move the party toward their business-oriented, centrist viewpoints.” The same month, at an event held by the centrist Coalition for a Democratic Majority, Governors Bruce Babbitt of Arizona and Charles Robb of Virginia rejected the New Deal and emphasized the need for the party to embrace the business community. In 1985, former Caucus Committee director Al From founded the Democratic Leadership Council, which rejected the populist economic agenda inherited from Roosevelt and Johnson in favor of market-based solutions that could broaden the party’s appeal to the upper middle class and corporate America. “We cannot afford to become a liberal party,” From wrote in a memo for Long; “our message must attract moderates and conservatives, as well.” Their new message was a paean to growth and opportunity. “The private sector, not government, is the primary engine for economic growth,” From wrote later. “Government’s proper role is to foster private sector growth and to equip every American with the opportunities and skills that he or she needs to succeed in the private economy.” The DLC was one of the central institutions of what came to be known as the New Democrats, who distinguished themselves from the presumably “old” Democrats by staking out moderate or conservative—that is, Republican—positions on issues like defense spending, crime, inflation, budget deficits, and free trade. The Progressive Policy Institute, a think tank spun off from the DLC, even adopted one of the conservatives’ favorite attack lines, arguing that the minimum wage harmed poor people by raising prices. In 1990, Arkansas Governor Bill Clinton became chair of the DLC. He later wrote in his memoir, “the so-called New Democrat philosophy … was the backbone of my 1992 campaign for President.” Clinton spent his term as chair promoting the key themes of economic growth, limited government, and personal responsibility around the country—particularly in key primary states—before officially announcing his presidential candidacy in October 1991. He was joined on the campaign trail by Bruce Reed, former DLC policy director, who later became head of the president’s Domestic Policy Council; Robert Shapiro, who moved from the Progressive Policy Institute to become a campaign adviser and later an undersecretary in the Commerce Department; and, of course, vice-presidential nominee Al Gore, a longtime DLC stalwart. After the election, Al From headed domestic policy on the transition team; DLC vice chair Mike Espy was named secretary of agriculture; and DLC member Lloyd Bentsen became Treasury secretary. In many ways, Clinton was the perfect standard bearer for the New Democrats. 
He was a Southern governor, apparently unstained by the corruption of the big city or of Congress. He had an impressive reputation as a policy wonk, having won a Rhodes Scholarship and graduated from the Yale Law School. He could fly back to Arkansas during the campaign to order the execution of the mentally disabled Ricky Ray Rector to demonstrate how "tough on crime" he was. He emphasized welfare reform and school choice to highlight his willingness to break with past orthodoxies. He openly rejected his own party: "The choice we offer is not conservative or liberal. In many ways, it is not even Republican or Democratic," he said, accepting the nomination of the Democratic Party.

[Photo: Joe Marquette/AP. President Clinton shakes hands with Secretary of State Madeleine Albright before beginning his State of the Union address, February 4, 1997.]

And yet, in public, Clinton played as a populist. He promised opportunity for a middle class that "worked hard and played by the rules." His optimistic post-partisan rhetoric papered over the fact that many of his substantive positions—tax cuts, smaller government, welfare limits, increased policing, charter schools, and so on—were taken straight from Republicans. He was campaigning during the aftermath of a mild recession, promising jobs and health care. More importantly, he could act like a man of the people and, like no one else, he could feel your pain.

President Clinton sustained that magic throughout his eight years in the White House. As his policies became more and more firmly anchored in the center, his personal charisma and political gifts preserved his image as a defender of the common man. Two decades later, Democrats' memories of the Clinton administration tend to retain the Monica Lewinsky scandal, eight years of uninterrupted economic expansion, and little else. But in that period, Clinton put his stamp on the identity of his party like no president since Franklin Roosevelt.

Because he wanted to be all things to all people, Clinton won the 1992 election promising not just welfare reform but also a middle-class tax cut, health care reform, and more jobs. Even before his inauguration, however, his economic team sat him down and explained that his top priority would have to be not programs aimed at ordinary Americans, but … the bond market.

On January 7, 1993, the president-elect held a meeting with his economic team in Little Rock, Arkansas. According to his advisers—including Robert Rubin, former co-chairman of Goldman Sachs and Clinton's choice as director of the National Economic Council—Wall Street was concerned about the large budget deficits incurred by Republican tax cuts and spending increases. Investors were demanding high interest rates to buy Treasury bonds, which raised rates throughout the economy, making it harder for businesses and households to borrow and constraining growth. The solution was to reduce budget deficits, which would lower interest rates and encourage more economic activity. Clinton supposedly responded, "You mean to tell me that the success of the program and my reelection hinges on the Federal Reserve and a bunch of fucking bond traders?" But he sided with the bond market, giving up his hoped-for domestic programs for a package of tax increases and deficit reduction targets. The bill squeaked through Congress and helped fuel the Republicans' sweeping victories in the 1994 midterm elections.

In retrospect, there are arguments in favor of the 1993 tax increase.
It did reduce budget deficits, interest rates did fall, and the economy did expand for the next seven years. By the time Clinton left office, the federal government was actually running a surplus on an annual basis. Still, it is impossible to definitively say what policy produced what macroeconomic outcome; interest rates were already falling, and the economy had already begun growing in March 1991. It is clear, however, that the decision to focus on deficits left a lasting imprint on the identity of the Democratic Party. The economic boom of the 1990s—the longest in modern American economic history at the time—and the budget surpluses of Clinton’s second term allowed his party to claim the high ground when it came to fiscal rectitude and macroeconomic management. In the past, Democrats had been big spenders, willing to shove money at the nation’s problems, while Republicans had been the mature adults worrying about deficits. Now the tables were turned. The Republican Party was in the hands of conservatives who cut taxes at every opportunity, fervently insisting that deficits would require spending cuts in the future. By contrast, Clinton and his successors could claim to be the prudent, responsible ones who understood how the economy worked. This is why Democrats have objected to every Republican tax cut on the grounds that it would increase deficits, and why Barack Obama made a “grand bargain” to reduce the national debt a top political priority, even while the economy was struggling to recover from the Great Recession. Clinton remained determined to follow through on his campaign promise of health care reform. One challenge was that he and his team (led by Hillary Clinton), wary of undermining their newfound credibility as deficit fighters, insisted that their plan had to be budget-neutral and could not require additional new taxes beyond those in the 1993 budget bill. Perhaps more important, they rejected the basic model proposed by earlier generations of Democrats: a universal, single-payer health insurance program similar to Medicare. Instead, they were convinced that managed market competition, not a new government entitlement, could provide the solution to America’s health care problems. Under the Clinton plan, consumers would choose their insurance from among plans offered by private insurers, either via large employers or via regional health alliances. Competition, the argument went, would give insurers the incentive to deliver superior health plans at reasonable prices, and insurers would then put pressure on providers to ensure quality and control costs. There are many reasons why health care reform suffered an ignominious defeat in Congress, never even coming up for a vote. The episode, however, was another landmark in the shift of the Democratic Party toward market-based, technocratic policies and away from the New Deal blueprint of large-scale government social programs. The political backlash to the Clinton tax increase and health care debacle was overwhelming and immediate: The 1994 midterm elections gave Republicans control of both houses of Congress for the first time since the 1950s. The president’s response was to double down on “triangulation”—staking out a third position on the political landscape distinct from both Democrats and Republicans. 
And so it was Clinton, a Democratic president, who announced in his 1996 State of the Union address that “The era of big government is over.” What began as a clever strategy to secure re-election in the wake of the crushing midterm defeat has since become the standard playbook of the party: splitting the difference between conservative Republicans and a caricature of a liberal Democratic base whose primary function is to make the party establishment seem reasonable by comparison. In most cases, triangulation has led to the adoption of positions once characteristic of moderate Republicans.

Mike Albans/AP Photo: New York City police officers stand guard over a wounded man in Times Square, February 1994.

One way to seek out the middle was to enact tough-on-crime bills, including the Violent Crime Control and Law Enforcement Act of 1994, which increased funding for police and prisons while lengthening sentences for federal crimes, and the Antiterrorism and Effective Death Penalty Act of 1996, which virtually eliminated the ability of the federal courts to review state court decisions, particularly in death penalty cases.

After 1994, the Clinton administration’s only significant contribution to tax policy was a handout to the rich: the reduction in the maximum tax rate on capital gains from 28 percent to 20 percent in 1997. (Capital gains are the profits realized from selling assets, and thus are mainly collected by the wealthy.)

Another of the president’s meager economic policy achievements was the ratification and implementation of the North American Free Trade Agreement. Although the effects of NAFTA on the American economy were relatively slight, it helped Democrats distance themselves from their traditional roots in labor unions while ingratiating themselves with the business community and burnishing their image as advocates of free trade. And it led to more aggressive trade actions, like establishing permanent normal trade relations with China, which had a far more disastrous impact on the nation’s industrial base, responsible for an 18 percent reduction in manufacturing employment from 2001 to 2007.

At the core of the New Democrats’ agenda was disassociating their party from the welfare state. Since the 1960s, conservatives had criticized government assistance programs for undermining personal responsibility and encouraging dependency on the state. By the 1980s, association with welfare was an enormous political weakness for Democrats. Republicans successfully popularized the image of the “welfare queen” and tarred Democrats as soft-hearted, weak-minded liberals who raised taxes on people who worked to fund lavish benefits for people who didn’t. In 1984, Charles Murray’s hugely influential book Losing Ground (written with funding from two conservative think tanks, the Manhattan Institute and the Heritage Foundation) argued that welfare programs harmed poor people by undermining their incentive to work.

Instead of defending the social safety net, the New Democrats simply co-opted the issue, adopting the idea that the poor needed better incentives to participate in the labor market. In 1986, DLC chair Charles Robb called for “a social policy that rewards self-discipline and hard work, not one that penalizes individual initiative.” In his 1992 campaign, Clinton staked out a position well to the right of President Bush, promising to “end welfare as we know it” and to impose a new work requirement that would kick in after two years of government assistance.
In a 1993 interview, Clinton said, “[Murray’s] analysis is essentially right,” although he did not necessarily agree with the author’s policy prescriptions. Once in office, Clinton backed his words with action. After his own welfare reform proposal fizzled out, the 1994 elections gave the initiative to congressional Republicans. While the president attacked some of their more extreme proposals, his administration quietly signaled his openness to a bill that he eventually signed into law as the Personal Responsibility and Work Opportunity Reconciliation Act of 1996 (an Orwellian name even by Washington standards). As promised, the act completely transformed the welfare system. It eliminated Aid to Families with Dependent Children, which had ensured a federal entitlement to cash support, and replaced it with Temporary Assistance for Needy Families, a new program that allocated money as block grants to states, which could spend it more or less however they wished. (The block grants were also designed to decline over time, after accounting for inflation.) PRWORA set a lifetime maximum of five years of welfare support from federal funds and placed work requirements on recipients, while giving states the latitude to impose their own, more onerous restrictions. This “welfare to work” bill also included severe cuts to food stamp benefits and, in a premonition of worse things to come, cut off most legal immigrants from both food stamps and Supplemental Security Income (paid to the disabled and the very poor elderly). Politically, welfare reform paid off handsomely. Clinton followed through on a 1992 campaign promise just in time for re-election while neutering a potent Republican attack line. By breaking with much of his own party (PRWORA was passed mainly with Republican votes, while congressional Democrats were roughly evenly split), the president perfectly executed the triangulation strategy, distinguishing himself from both poles of the political system. Or, as Republican presidential nominee Bob Dole put it, “By selling out his own party, Bill Clinton has proven he is ideologically adrift.” At the same time, by vetoing two earlier, more conservative bills, Clinton could still claim to be the protector of the downtrodden—or, at least, not as bad as the Republicans. He swept to victory over Dole just a few months later, while the balance of power in Congress remained virtually unchanged—seeming to demonstrate the benefits of running to the middle and portraying both liberals and conservatives as out of touch with America. Welfare reform signaled that the Democrats were no longer the party of handouts to the poor. Instead of cash, the New Democrats promised to provide opportunity, which meant giving people the education necessary to compete in the new economy. Not surprisingly, President Clinton saw market incentives as the key to improving the public-school system. He embraced the idea of accountability, which was euphemistic shorthand for making the educational market more competitive. His administration supported nationwide K-12 standards and a requirement that states turn around or close schools that failed to meet those standards—in theory, a way to mimic the competitive forces of capitalism. The president also endorsed the budding charter school movement, which sought to provide a new source of competition to traditional public schools, which would be forced to improve or risk losing their students to the new entrants. 
The Federal Charter School Program, created in 1994, provided funds to support the development of charter schools by states and towns.

When it came to higher education, President Clinton helped launch the Federal Direct Student Loan Program, under which the government lends directly to students, instead of subsidizing and guaranteeing loans made by banks (thereby ensuring them risk-free profits). This was a small step in the right direction, but the focus on loan programs typified the New Democrats’ approach to education. Instead of thinking of a college education as something that an advanced society should provide to its citizens, they saw it as an individual’s private investment in her human capital; the government’s role was simply to provide a nudge to make that investment easier to finance.

At the same time, the Higher Education Amendments Act of 1998 made it more difficult to discharge student loans in bankruptcy. While this provision may have caused banks to lower interest rates on student loans (because they now faced less risk of not being paid back), it also had the effect of punishing those people who were unable to turn their college studies into higher-paying jobs.

With welfare reform and education policy, the New Democrats repudiated the vocabulary of social solidarity and economic security in favor of their cherished themes of opportunity, accountability, and competitiveness. But to recast the party as architects of a dynamic, fast-growing economy, it was not enough to simply cut deficits and lower interest rates. By explicitly identifying themselves with the financial sector, and with Wall Street in particular, the Democrats rebranded themselves as the party of an innovative, prosperous future. This historic rapprochement with the bankers—traditionally a Republican stronghold—began with an open door to financial-industry executives unprecedented for a Democratic White House.

Dennis Cook/AP Photo: Treasury Secretary Robert Rubin in 1995

Robert Rubin, the former co-chair of Goldman Sachs, was Clinton’s first director of the National Economic Council and was promoted to Treasury secretary in 1995. Other Wall Street appointees included Gary Gensler of Goldman (Treasury undersecretary), Roger Altman of Lehman Brothers (deputy Treasury secretary), and Lee Sachs of Bear Stearns (assistant secretary for financial markets). Most surprisingly, in 1996 and again in 2000, Clinton reappointed Alan Greenspan—a libertarian free-market ideologue and onetime devotee of conservative intellectual darling Ayn Rand—as chair of the Federal Reserve Board of Governors.

More important, Clinton definitively reversed the Democratic Party’s historic support for tight financial regulation. This dated back to the New Deal, when the Roosevelt administration and Congress had imposed sweeping constraints on the industry with the Glass-Steagall Act of 1933, the Securities Act of 1933, and the Securities Exchange Act of 1934. Financial deregulation had been one of the top priorities of President Reagan, but a decade later it was eagerly embraced by Democrats in both the White House and Congress. The Clinton years saw the Riegle-Neal Act of 1994, which opened the door to interstate banking and a wave of consolidation; the Gramm-Leach-Bliley Act of 1999, which dismantled the remaining barriers separating investment banking, commercial banking, and insurance; and the Commodity Futures Modernization Act of 2000, which effectively prohibited regulation of financial derivatives.
All three bills were ultimately passed with major Democratic support in Congress. The Clinton administration even suppressed attempts by regulators to monitor excessive systemic risks—most famously when Rubin, Deputy Treasury Secretary Lawrence Summers, and Greenspan shut down a 1998 attempt by Brooksley Born, chair of the Commodity Futures Trading Commission, to study the possibility of increased oversight over derivatives. The Democrats’ new love affair with Wall Street was based on the theory that relaxing the rules governing financial institutions would stimulate innovation and promote the flow of capital to where it would do the most good for the economy. In particular, deregulation gave Democrats a shiny new housing story, free from the complications and negative connotations of public housing and Section 8 vouchers. Freeing markets for mortgages and mortgage-backed securities would increase the funding available to homebuyers, making it possible for more and more people to buy real estate—even if their incomes remained stagnant. As we know today, deregulation also made possible the highly concentrated and unstable financial system that collapsed in 2008, when the world suddenly realized that millions of those same mortgages could never be repaid. At the time, however, catering to the financial industry dramatically increased the flow of campaign donations from Wall Street. Historically, banks and securities firms had leaned Republican, but the Clinton administration’s embrace of deregulation—along with the Republican Party’s increasingly hostile stances toward women, gays, and minorities—attracted more and more contributions precisely at the time when shrinking unions were unable to keep the party afloat. Bill Clinton, Barack Obama, and Hillary Clinton were all able to raise as much or more money from the financial sector than their Republican opponents for president. The reconciliation between the Democratic Party and the financial industry also successfully redefined the party’s economic identity. Prior to the 1990s, Democrats were associated with the old economy: heavy manufacturing, with its smokestacks, acid rain, and largely unionized workforces. But after the stagnation of the 1970s and the invasion by cheap Japanese cars, the American manufacturing industry was widely perceived to be in decline. By contrast, finance seemed a hotbed of innovation, beginning with the leveraged buyouts and junk bonds that marked the 1980s. Oliver Stone’s 1987 movie Wall Street made corporate raider Gordon Gekko, best known for his “Greed is good” speech, a cultural icon; in 1989, Michael Lewis’s memoir Liar’s Poker, though written as an indictment of finance culture, helped make Wall Street the destination of choice for graduates of America’s elite colleges. Finance was perceived to be clean, modern, and sophisticated—an industry that moved dollars instead of steel, that relied on brains instead of brawn. Once upon a time, the Democratic Party helped the working class by supporting unions and promoting the social safety net. Now it claimed to help the working class by lubricating the flow of capital, which would promote economic growth and eventually create good jobs for everyone. Tired of being seen as for labor and against capital, the party could now be for everyone; or, in practice, it could be for capital while claiming to help labor as well. 
In less than a decade, the Democrats successfully repositioned themselves as the party of deficit reduction, welfare reform, market incentives, and financial innovation. The fact that the Clinton presidency coincided with a long economic expansion only sealed the bargain in Democrats’ minds. From this point the party’s platform would be technocratic management of a growing economy in which markets fulfill everyone’s needs.

The 2000 presidential election demonstrated the weakness of this political strategy. Al Gore was the heir apparent, not only the two-term vice president, but also a fellow overeducated white Southerner and a longtime member of the Democratic Leadership Council. But when he tried to campaign on the continuation of the Clinton years (minus the sexual peccadilloes), it turned out that no one really knew what that meant. Without Clinton’s personal charisma to close the deal, it wasn’t clear that many voters wanted to buy the New Democrat brand—or that they could distinguish it from the empty “compassionate conservatism” that George W. Bush was peddling.

In the late stages of the campaign, Gore tried to reinvent himself as a populist. “They’re for the powerful. We’re for the people,” he declared, but by that point it was just an empty slogan. Although overall economic growth had boosted the wages of lower-income workers, the administration he served in could point to few identifiable accomplishments that benefited “the people,” and while the president himself had the political skill to square that circle, his lieutenant did not. And so the moderation of the Clinton years gave way to the disaster of the Bush years: two major tax cuts for the rich, an all-out campaign against economic regulation, and the most severe financial crisis and recession in 70 years.

Bob Childs/AP Photo: Al Gore and his running mate Joseph Lieberman during a campaign stop in Stamford, Connecticut, August 2000

The New Democrats, however, escaped the shipwreck of Gore’s defeat with their prestige intact and even flourished during the next eight years. Indeed, they followed the model defined by conservative think tanks decades before, creating a network of Washington institutions that would house the “government in waiting” until the next shift in power. In addition to the Democratic Leadership Council, think tanks such as Third Way, the Progressive Policy Institute, the Center for American Progress, and the Hamilton Project (housed at the Brookings Institution) became refuges for Clinton administration alumni and sources of policy research favoring market-based solutions to economic problems. The Hamilton Project was founded by Robert Rubin, Clinton’s Treasury secretary, and the Center for American Progress by John Podesta, Clinton’s chief of staff.

In Congress, Democrats used their new stature as responsible economic technocrats to criticize President Bush’s tax cuts on the grounds that they threatened to increase budget deficits. For the most part, however, corruption, incompetence, and the Iraq War tarnished the Republican brand enough that Democrats could win an emphatic victory in 2006 without having to articulate a coherent economic agenda.

Hillary Clinton was the obvious first choice of the Democratic establishment for the 2008 presidential nomination, but Barack Obama fit the bill equally well—a smart, well-educated policy wonk who claimed to transcend party divisions.
“There is not a liberal America and a conservative America—there is the United States of America,” he proclaimed as a Senate candidate giving the keynote speech at the 2004 Democratic National Convention. (After his election, Obama spoke at the launch of the Hamilton Project in 2006.) Obama was arguably the most moderate of the main primary contenders; on health care, for example, he opposed the individual mandate proposed by Clinton (and later incorporated into Obamacare). He won the presidency largely on his personal story and charisma, an uplifting but vague promise of change, and general disaffection with Republicans that was exacerbated by the 2008 financial crisis. Obama’s first inaugural address in 2009 was a masterpiece of triangulation, stereotyping and discarding both conservative and liberal positions in favor of an idealized center. “The question we ask today is not whether our government is too big or too small, but whether it works,” the new president said. His economic philosophy was even more squarely in the Clinton tradition: “Nor is the question before us whether the market is a force for good or ill. Its power to generate wealth and expand freedom is unmatched. But this crisis has reminded us that without a watchful eye, the market can spin out of control.” In short, markets are the source of prosperity, and government’s role is limited to ensuring that they function properly. Barack Obama borrowed not only Bill Clinton’s ideology, but also much of his staff. He turned over his transition team to Podesta, a longtime Clinton loyalist. Several of his Senate and campaign economic advisers were sidelined in favor of Clinton veterans. Among those taking top economic positions in the new administration were: Lawrence Summers, Robert Rubin’s understudy and successor as Treasury secretary in the Clinton years; Timothy Geithner, an undersecretary to Summers and Rubin protégé; Peter Orszag, a former Clinton adviser and director of the Hamilton Project; Jason Furman, director of the Hamilton Project after Orszag; Michael Froman, Rubin’s chief of staff in the Treasury Department; Gary Gensler, undersecretary for domestic finance under Summers; Mary Schapiro, head of the CFTC under Clinton; Neal Wolin, former general counsel of the Treasury Department; Jack Lew, former head of the Office of Management and Budget; and Michael Barr, a deputy assistant Treasury secretary under Rubin. When it came to economic affairs, it was clear that the new president was eager to embrace the legacy of the Clinton years. Coming into office in the depths of the Great Recession, Obama’s first task was to shore up a global financial system that had imploded the previous year and was only functioning thanks to oceans of liquidity provided by the Federal Reserve. At the same time, he had to rescue an economy that was shrinking rapidly and shedding jobs by the millions. Some members of the president’s team considered nationalizing some of the sickest megabanks, particularly Citigroup and Bank of America, which clearly would have failed without government assistance (and whose recklessness had helped produce the financial crisis in the first place). But instead, the administration continued where its predecessor had left off, giving Citigroup a third bailout in February 2009 and effectively pledging unlimited support for the megabanks. 
Given the close relationships with the financial sector that had been fostered by the Clinton administration and maintained during the intervening years, the instinct of the Obama White House was to ride to the rescue of the banks they knew. Whereas Franklin Roosevelt positioned himself as an opponent of Wall Street, Obama styled himself as its protector. “My administration is the only thing between you and the pitchforks,” he said to the CEOs of 13 large banks at the White House in March 2009. As it turned out, that was a promise, not a threat, and Obama lived up to it, ensuring that the remaining large banks survived the crisis intact, with their CEOs in place.

Charlie Neibergall/AP Photo: The Obamas at the Democratic National Convention in Boston, July 27, 2004

The paralysis of the financial system was an unprecedented emergency, but the choice to rescue the megabanks that caused the crisis was a natural one for a Democratic establishment that prided itself on its sophisticated appreciation of modern finance. The Obama administration’s broader response to the crisis and Great Recession demonstrated where the party now stood on key economic issues.

The stimulus bill passed early in 2009 was far better than nothing, but it was constrained by the administration’s dedication to fiscal responsibility, which the Clinton veterans saw as the keystone of both the 1990s boom and the Democrats’ return to power. To preserve the party’s deficit-fighting credentials, the stimulus had to be coupled with a “strategy to return to long-term fiscal discipline,” in Furman’s words. Or, as Rahm Emanuel, Obama’s first chief of staff, put it, “No fucking way is this number coming anywhere near a trillion dollars.” Therefore, the stimulus was too small to fill the hole created by the recession, while also designed to phase out quickly and avoid creating long-term government programs that could have had a lasting structural impact on the economy.

While the 2009 stimulus was merely inadequate, the administration’s response to the housing crisis was unconscionable. The collapse of the housing bubble and the financial system inexorably produced wave after wave of delinquencies and foreclosures as homeowners, no longer able to refinance their houses, could not make their monthly payments. Yet the federal government never addressed the problems faced by ordinary families—once the backbone of the Democratic Party—with anything like the imagination or financial firepower it used to rescue the big banks. President Obama failed to follow through on his campaign proposal for mortgage “cramdown,” which would have allowed bankruptcy judges to reduce the principal balance on mortgages. A plan by Treasury official Herb Allison to force banks to recognize losses on their bad loans was rejected by higher-ups because, in Allison’s words, “We don’t want to appear as though we’re socialists.” (Allison was himself a seasoned capitalist, previously chief operating officer of Merrill Lynch and CEO of asset management giant TIAA.) The administration declined to pressure mortgage servicers to reduce the principal owed on mortgages, even in federally funded programs purportedly designed to help people stay in their homes. Any of these ideas would have hurt the banks’ balance sheets, weakening them further.
Instead, the administration’s main vehicle to help homeowners, the Home Affordable Modification Program, was designed as a voluntary program for mortgage servicers, promising them cash subsidies in exchange for reducing borrowers’ monthly payments—another attempt to achieve public ends by giving a gentle nudge to private-sector institutions. The significant discretion handed to mortgage servicers enabled them to use the program as a predatory lending scheme, squeezing extra payments out of struggling borrowers before ultimately pursuing foreclosure. As Treasury Secretary Geithner infamously said, HAMP’s real purpose was to “foam the runway for [banks]”—that is, to space out foreclosures long enough so that banks could absorb their losses without going under. In other words, the administration’s strategy was to let struggling families lose their homes in order to protect banks—an approach that clearly showed where their priorities lay.

In the end, only a small fraction of delinquent homeowners received permanent loan modifications under HAMP, while more than nine million households lost their homes to foreclosure or financial distress. In total, the Treasury Department only spent $29 billion on (mostly indirect) aid to homeowners, less than half of what the administration initially promised; to put this in context, families collectively lost more than $7 trillion in equity in their houses during the financial crisis.

Alan Diaz/AP Photo: A foreclosed home in Homestead, Florida, 2009

The disintegration of the banking system also demonstrated the need—and created the opportunity—for comprehensive reform of the financial sector. It was clear that decades of unchecked innovation, rampant deregulation, and excessive concentration had produced a financial system in which a handful of colossal banks preyed on unsophisticated borrowers while accumulating risks they scarcely understood, forcing the government to come to their rescue when they finally exploded. Unlike in 1933, however, the Obama administration and Democrats in Congress chose not to pursue structural reform of the industry. Instead, the Dodd-Frank Act of 2010 re-engineered the regulatory framework of the financial sector, rearranging and in some cases increasing the powers available to government officials to oversee and potentially intervene in banks’ operations. The administration opposed proposals either to separate investment and commercial banking (by repealing the Gramm-Leach-Bliley Act) or to impose size limits on banks. After an amendment to establish size caps failed in the Senate, a senior Treasury official said, “If we’d been for it, it probably would have happened. But we weren’t, so it didn’t.” Lobbyists swarmed over Capitol Hill, picking off moderate Democrats—often longtime recipients of financial-industry support—to weaken the legislation even further.

In 1933, Franklin Roosevelt had shut bankers out of his inner circle. By 2010, however, the Democratic Party was locked in a marriage with Wall Street, even if it was going through a rough patch at the time. A core tenet of the New Democrats was that finance is good and more finance is better, and one crisis was not enough to shake that belief. The result was a bill that largely preserved the financial system that was responsible for the 2008 crisis—and, indeed, America’s largest banks today are even bigger than ever.

While financial reform was a battle that was thrust upon President Obama, he always intended health care to be the centerpiece of his legacy.
The Patient Protection and Affordable Care Act of 2010 was the clearest demonstration yet of the Democratic Party’s infatuation with market-based solutions to broad social problems. As late as the 1970s, party leaders had been proposing government-financed universal health insurance programs; even Republican Senator Jacob Javits introduced a Medicare for all bill in 1970. In 2010, by contrast, the party united behind a bill whose centerpiece was exchanges, in which insurers would supposedly compete for consumers by providing better health plans at lower prices. The administration’s experts knew that health insurance markets, left on their own, would produce unwanted outcomes. Poor people simply wouldn’t be able to afford coverage; people might be tricked into buying deceptive policies that turned out to provide minimal benefits when they were actually needed; and, because of adverse selection, insurers would set high prices that only sick people would be willing to pay (if they could afford them). To address these problems, the Affordable Care Act included subsidies for lower-income families, minimum coverage requirements, and the individual mandate, which was designed to force healthy people to buy insurance, bringing down prices for everyone. Obamacare, as it came to be known, was an improvement on the unregulated individual market that preceded it. But as an exercise in Democratic policymaking, it was remarkable for its insistence on using markets and the private sector to achieve public ends—which, in this case, could be much more simply accomplished with a traditional social insurance program such as Social Security or Medicare. The root problem with the American health care system is that care is expensive; the average total premium for an employer-sponsored family plan is more than $20,000 even before deductibles and co-payments, far more than many workers could afford. In a properly functioning market, people who can’t afford something don’t get it. But when it comes to health care, that’s not an outcome we are willing to accept. No one will say (in public, at least) that how much you suffer, or whether you live or die, should depend on how much money you have—although in practice that’s often how it works. Instead, Republicans and Democrats agree that all people should have access to decent health care at a price they can afford. The most direct way to realize this goal is a universal health insurance plan (often known as “single payer”) paid for by a progressive tax system; that way everyone has coverage, and the amount you pay depends on your income. But the core assumption behind the Affordable Care Act was that competitive markets are the best way to provide goods and services, and the role of government is limited to ensuring that markets function properly. That’s why we ended up with a complicated system designed to contort markets into producing socially acceptable outcomes, whose features include private insurers whose costs are significantly higher than Medicare’s; complicated risk adjustment mechanisms designed to ensure that insurers aren’t profiting by skimming off the healthiest customers; subsidies that aren’t enough for many families; coverage for the poor that is subject to the whims of state governors and legislatures; annual negotiations in which insurers threaten to pull out of the exchanges unless they can raise prices; and, because underlying health care costs continue to rise, plans that increasingly require more out-of-pocket spending by customers. 
Despite all the attention paid to the private health insurance exchanges, it is telling that the part of the Affordable Care Act that expanded coverage the most was actually an expansion of public insurance: the increase in Medicaid eligibility up to 138 percent of the federal poverty level. This one provision alone gave health insurance to 14 million Americans, far more than the increase in coverage in the individual market—even though many Republican governors and legislatures chose not to adopt the Medicaid expansion.

Charles Dharapak/AP Photo: Obama is applauded after signing the Affordable Care Act into law, March 23, 2010.

Defenders of Obamacare point out that there were not enough votes for a universal single-payer system. But there is no evidence that the architects of the Affordable Care Act would have preferred single-payer; instead, they seem to have been firmly in favor of private markets. More to the point, the question of votes only deflects the issue. Exactly zero Republicans voted for the final version of the Affordable Care Act. The health care reform we got was the health care reform that the Democratic Party wanted—whether for ideological reasons, or because its members wanted to stay on good terms with the insurance industry.

The Democratic position was once that we should pool our resources to ensure that everyone is protected against certain shared risks, including lack of access to health care. Now the default approach was to assume that markets can provide all good things and then, if necessary, figure out how to make those markets work better. Obamacare fit perfectly with the new worldview of Democratic insiders, who rejected anything that might be seen as socialist and portrayed themselves as sophisticated, business-friendly architects of enlightened policies informed by the latest economic research.

Although finance and health care dominated President Obama’s legislative agenda, other components of his economic agenda betrayed the same preference for technocratic, market-based solutions. Retirement security has been a ticking time bomb for decades, made worse by the collapse of home values beginning in 2006. The traditional Democratic approach to retirement was Social Security—a mandatory, government-run program that provides minimum benefits to virtually every worker. Obama’s answer, however, was a proposal to require companies to automatically enroll their employees into 401(k) individual savings plans administered by the private asset management industry. This was a classic New Democrat proposal, giving private markets a nudge to help them achieve public ends—in this case, a nudge based on the hot new field of behavioral economics. It also fit the New Democrat mold by failing to address the root cause of retirement insecurity: After decades of wage stagnation, many people just don’t make enough money to save. Unwilling to propose anything that smacked of redistribution, Obama was left talking about the power of the stock market to multiply wealth—scant consolation to workers who have none to begin with.

As for Social Security, President Obama was willing to offer a major long-term reduction in benefits (by changing the way cost-of-living adjustments are calculated) in his pursuit of a budgetary deal with Republicans. When negotiations failed, he even included the same benefit reduction in his 2014 budget proposal as a way to reduce long-term deficits—which, since the Clinton administration, has been Democrats’ way of showing their tough-mindedness on economic issues.
President Obama also inherited Clinton’s market-oriented approach to education. His administration took up the buzzword of “accountability,” backing the development of the Common Core standards and providing incentives to states to develop teacher evaluation systems based on standardized test scores. The Race to the Top program dangled money to states strapped for cash after the recession, rewarding those that attempted to expand the market share of charter schools. Obama went one further than Clinton in the student loan market, completely eliminating federal subsidies to private banks. In addition, the administration created a new loan repayment plan that links payments to the borrower’s income, while requiring universities to demonstrate that their graduates were actually able to get jobs, and issuing a rule helping people who had been defrauded by for-profit institutions. These were welcome changes. But after a generation of rapidly rising tuition costs, and with outstanding loans doubling to $1.4 trillion during the president’s tenure, they barely made a dent in the ballooning student debt crisis. Making it a little easier to borrow money was no solution to the fundamental problem: that college costs were rising much faster than incomes. With time running out on his second term, President Obama pinned his hopes for one more economic policy victory on the Trans-Pacific Partnership, a trade agreement negotiated between a dozen countries bordering the Pacific Ocean. For the administration, the TPP was simply a matter of basic economics: International trade contributes to economic growth, creating jobs in export industries and lowering prices for consumers. Opponents of the agreement, Obama’s team insisted, were simply old-fashioned protectionists who didn’t understand the magic of markets. Even in the economics textbook, however, trade creates winners and losers, and since the invasion of our markets by Chinese exports, the United States has not done a good job of protecting the losers—primarily people dependent on industries threatened by foreign competition. In addition, TPP was far more than a free-trade agreement. Among other things, its intellectual-property rules forced other countries to adopt laws protecting the (U.S.-dominated) media and pharmaceutical industries, and its system for “investor-state dispute settlement” allowed multinational corporations to bypass domestic legal systems—fostering the belief that TPP was primarily drafted to benefit big business. Politically, TPP ultimately became a casualty of the populist rebellion of 2016, leaving a Democratic president as the last supporter of a trade agreement rejected by both the left and the right—an outcome that would have seemed fantastic only 30 years before. To be sure, there are other voices in the Democratic Party, which has not been entirely taken over by pro-business, pro-market ideas and policies. Presidents Clinton and Obama, however, almost by definition constitute the center of gravity of the party. Their ideas, their legacy, and their people make up the current Democratic establishment. In addition, in an age of short memory spans, they represent what the party means to most Americans today; its identity is largely a creation of their words and their actions. Hillary Clinton certainly did little to reposition the Democratic Party in the eyes of most people. 
Her economic platform was a caricature of New Democrat technocracy, with its catalog of bulleted plans, its painstaking care to distinguish itself from opponents on both the right and the left, the overbearing sophistication with which it lectured that Bernie Sanders’s proposals were impractical, and its lack of any message beyond a promise to create economic growth and good jobs. Clinton’s few, faltering endorsements of progressive positions—such as her switch from supporting to opposing the TPP—only came after considerable pressure from the left, and only reinforced the idea that the Democratic standard-bearer stood for nothing more than grinding out every last vote possible in the upcoming election. This brings us to the Democratic Party of today. Leaving aside the recent progressive insurrection—inspired by Sanders and embodied by Alexandria Ocasio-Cortez and the Squad—it is a party devoid of any compelling idea of how to address the fundamental economic challenges our country faces today: wage stagnation, the rising cost of health care and urban housing, the precariousness of most jobs, and extreme inequality. After defining themselves in opposition to old-fashioned government spending programs that smacked suspiciously of redistribution, after embracing the doctrine of market-based solutions, and after insisting for decades that economic growth would solve all problems, establishment Democrats today have nothing left to offer. Their economic policy agenda is anemic, constrained as it is by the premise that all good things must come from the market. Infrastructure spending is a perennial favorite, because it addresses a market failure (private companies have insufficient incentive to build or maintain shared goods like roads and bridges), can be touted as a productive long-term investment, and can be channeled through the private sector. Job retraining programs are another staple, because they promise to help workers adapt to changes in the labor market—a promise that, unfortunately, they often fail to keep. Otherwise, there is precious little. The idea of a $15 minimum wage was the product of progressive groups on the state and local levels, only later reluctantly embraced by party leaders. Medicare for All likewise was born on the party’s left wing, and the establishment currently appears to be trying to figure out how best to squash the idea without being blamed for doing so. The idea that the government should increase taxes on the rich and give stuff to ordinary people (a college education, for example) is anathema, condemned as class warfare by moderate Democrats even more vigorously than by Republicans. This is the economic vision of their Democratic Party. Economic growth is the goal, markets are the means, and the role of government is to maximize efficiency by correcting for specific market failures. The things people actually want—such as education, jobs, housing, and health care—are a by-product of that growth. Heavy-handed attempts to intervene in the economy will only backfire. The fact that we would have called this a moderate Republican vision only a few years ago doesn’t necessarily make it wrong. And it isn’t nonsensical on its face. The problem is that it has failed, both as policy and as politics.
https://prospect.org/takebackourparty/chapter-1-their-democratic-party/
In a 2014 Quinnipiac poll of American voters, those surveyed ranked President Barack Obama as the worst president in post-World War II history. They ranked President Ronald Reagan, icon of so much Republican rhetoric in this (as in every other) election year, as the greatest.

It isn't hard to dismiss this sort of faux information as meaningless, but I must admit that a wave of dark energy coursed through my veins. It made me want to hold a nationwide webinar where I show America how, under Obama, unemployment is well below where it was when the recession started and how deficit spending is down. I want to remind Americans that millions more of them can live without the fear of losing everything due to illness. Then I want to show them stock market graphs, deficit gaps, and unemployment figures from 2001 through 2009 and ask them again: "Who was the worst president?"

As troubling to me as the Obama-fail was in the poll, even more troubling was the Reagan canonization. The Legend of Reagan grows because the Republicans have not had a legend since Lincoln. They marginalized Eisenhower during his eight years of presidential prosperity because America had also moved to the left, and Republicans created a more extreme ideology in order to define some relevance — ergo the second Red Scare, McCarthyism, Goldwater conservatism, and the great alliteration himself: The Ronald Reagan Republican Revolution. I contend that Reagan's greatness is a fable woven from selective memory to put a noble face on failed policy.

To be sure, in 1980 (Jimmy Carter's last year in office) inflation averaged a very high 12.5 percent, and America was heading into a recession. Carter's failed economic policy was the perfect platform on which to build the Reagan myth. Reagan immediately implemented supply-side economic policies — which meant tax cuts across the board — and expanded the tax base to offset the resulting revenue loss. "Reaganomics" entered our lexicon, and certain economic indicators began to improve. During Reagan's administration, the unemployment rate averaged 7.5 percent over eight years, after reaching a recession-driven high of 10.8 percent in late 1982. Reagan's legacy was set halfway through his first term, because he was the man who lowered taxes and turned the tide of recession. Production was up, unemployment was down. Mount Rushmore couldn't be far behind!

But there was a virus deep within Reagan's great plan: There wasn't enough revenue to pay for his defense initiatives and for the government programs that he supported. So along came the Tax Equity and Fiscal Responsibility Act of 1982, the largest peacetime tax increase in history. He then sold Congress and the American public on the Tax Reform Act of 1986, which "simplified" the tax code while raising the bottom bracket rate by four percentage points (from 11 to 15 percent) and lowering the top rate by another 22 points (from 50 to 28 percent). In theory he could call these tax cuts, since the headline rates were lower, but the new tax burden fell on everyone but the wealthy.

Democrats were on the Trickle Down Train, as well — proof of the historical journey toward oligarchy that has seen a 250 percent increase in the wealth of the upper class over the past three decades. It must be said that the widening gap between the rich and poor had already begun during the 1970s, before Reagan's economic policies took effect. However, it must also be stated that Reagan's policies exacerbated that trend.
When Reagan left office there were 7 million more Americans living in poverty than when he started. Reagan remains popular as the anti-tax hero despite raising taxes 11 times over the course of his presidency — in the name of fiscal responsibility. Overall, the 1982 tax increase undid about a third of the 1981 cut. Even Reagan admitted that his greatest regret was having tripled the national debt and turned the United States into a debtor nation for the first time. Reaganomics was a short-term fix with long-term negative consequences.

The proof of Reagan's successes and failures will not be found in polls, party rhetoric, or platitudes. It is there for serious-minded people to examine and judge for themselves. Greatest president since FDR? Depends on your income.

Gary Kroeger is a former member of the Saturday Night Live cast, now an advertising executive in Iowa and a Democratic candidate for Congress. This essay is condensed from a version posted on his blog.
https://m.memphisflyer.com/memphis/the-myth-of-reaganomics/Content?oid=4494534
The overstated celebrations and commemorations of the centennial of Ronald Reagan's birth, with their razzle-dazzle of Super Bowl tributes and marathon deifying in Simi Valley, are fitting tributes to a president whose public relations guru, Michael Deaver, was a pioneer of this same kind of flim-flammery. But the Reagan Centennial's flashy hagiography masks a far more complicated reality. He set the nation on an economic course, imitated by Democrats Bill Clinton and Barack Obama, that continues to this day. "Reagan taught us deficits don't matter," Dick Cheney once boasted. But not even the staunchest of Reaganites (Democrat or Republican) would make that assertion today. Those who are celebrating the centennial of President Reagan's birth are rejoicing in the trajectory he put the nation on. And what a trajectory these last three decades have been.

At the start of his 1980 campaign, after undergoing several cram sessions with enthusiasts of "supply side" economic theory, Reagan told an interviewer: "an across the board reduction in tax rates, every time it has been tried, it has resulted in such an increase in prosperity . . . that even government winds up with more revenue." But Reagan had no evidence to support this assertion, which his own vice president, George Herbert Walker Bush, famously denounced during the Republican primaries as "voodoo economics."

There was little question that the new ethos the Reaganites brought to Washington reflected a profoundly different attitude toward the poor than had been seen in the Capitol for many years. Reagan's friend and ally from California, Edwin Meese III, who later became his Attorney General, told reporters that the Administration "had considerable information that people go to soup kitchens because the food is free and that's easier than paying for it." Throughout his two-term presidency Reagan blocked any increase in the federal minimum wage. His domestic policymakers sought to roll back federal help to the working poor, who were reeling under the worst economic conditions in a generation. In fact, during the 1980s, the word "welfare" itself became strongly associated with failure and waste.

Thirty years ago this month, on February 18, 1981, during Reagan's first address to a joint session of Congress, the House chamber boomed with applause when he proposed cuts totaling $41.4 billion in the federal budget. In his first budget he dropped about 400,000 households from the food-stamp program. At the same time he planned to boost military spending by $7.2 billion, which added significantly to Carter's earlier defense buildup.

On August 13, 1981, Reagan held an outdoor signing ceremony for the Economic Recovery Tax Act (ERTA) at his 688-acre ranch outside Santa Barbara. Forty-eight House Democrats crossed over to join the Republicans' overhaul of the nation's tax code. The new legislation reduced marginal income tax rates for all Americans by 25 percent. The wealthiest Americans, who paid 70 percent in 1981, would see their tax rate lowered to 50 percent. Rates for lower-income people fell more modestly, from 14 percent to 11 percent. Today, the Republicans scream bloody murder at the thought of raising the top rate from 35 percent to 39.6 percent. Not only did the rich reap the greatest windfall from the changes in the tax code, but during the "sausage-making" markup of the legislation, members of Congress larded the bill with billion-dollar tax breaks for corporations, oil conglomerates, and other special interests.
One corporation that benefited from the new legislation was Reagan's former employer, General Electric. Citizens for Tax Justice, a liberal advocacy group in Washington, D.C., estimated that the new law yielded $1 billion for GE over the course of five years. GE also paid no income taxes during the first years of the Reagan presidency. In keeping with Reagan's spirit of bipartisan servitude to wealthy elites, President Barack Obama recently tapped the CEO of GE, Jeffrey Immelt, to head his economic team.

The supply-siders' prognostications for increased government revenues turned out to be as whimsical as their critics had suspected. Even with the harsh cuts in social programs and Reagan's signature on subsequent tax hikes designed to mitigate the negative fiscal impact of the initial ERTA, the federal budget deficit swelled from $74 billion in 1980 to $300 billion by the middle of the decade. So any honest evaluation of Reagan's legacy on taxes and budget cuts would have to acknowledge that he tilted the playing field in favor of the rich at the expense of the poor.

PATCO

Reagan's National Labor Relations Board (NLRB) favored management over labor far more than any previous administration. Companies now had greater latitude to impose speed-ups, hire scab workers, and violate labor contracts. No battle better illustrates this new order in labor relations than Reagan's handling of the strike by the Professional Air Traffic Controllers Organization (PATCO) that began early in his presidency. Ironically, in 1980, PATCO had been one of three national labor unions that endorsed Reagan for president (the other two were the Teamsters and the Air Line Pilots Association). During the previous decade, strikes by public workers were usually settled quickly with good-faith negotiations and federal arbitration. Reagan gave PATCO 48 hours to end the strike, and when the workers refused, he fired all 11,600 of them. He then brought in supervisors and air traffic controllers from the U.S. military to break the strike. Reagan's Justice Department even arrested some of the union's leaders and promised to criminally prosecute them. In a few short weeks PATCO was history. It was the most aggressive stand against organized labor by the federal government since the passage of the anti-labor Taft-Hartley Act of 1947.

The President's decisive stand against PATCO altered long-established norms and provided a context for a string of private sector strikes that ended badly for the unions. Like today, in the recessionary context of the early 1980s it was easy for Reagan and his corporate allies to gain the upper hand against organized labor. Not long after the PATCO action, the Greyhound Bus Company and Eastern Airlines fired striking workers and replaced them with non-union substitutes. The labor leader and co-founder of the United Farm Workers, Dolores Huerta, later noted: "We found that right after the PATCO people were fired the United Auto Workers union accepted an agreement to freeze their wages. That put a lot of pressure on the other unions to do the same thing. So, what you had was a tremendous weakening of the power of labor." Labor's long slide (which has gone on for thirty years and took down a big chunk of the middle class with it) was initiated early in Reagan's first term.

Foxes In Henhouses

Reagan attempted to defang federal regulatory agencies that conservatives had long railed against by elevating lobbyists, corporate lawyers, and executives to pivotal positions inside the Administration.
The tactics included appointing people to high government posts from the regulated industries themselves, saddling the regulatory agencies with debilitating budget cuts, and encouraging bureaucratic inertia. For example, to head his NLRB, Reagan tapped John Van de Water, who ran a West Coast consulting firm that specialized in union busting. Other similar appointments followed. To lead the Department of Agriculture's marketing and inspection division, Reagan appointed C.W. McMillan, formerly vice president of the National Cattlemen's Association; Richard Lyng, a lobbyist for the American Meat Institute, became undersecretary of agriculture. The assistant secretary of energy for conservation and renewable energy, Joseph Tribble, came from the Georgia Pulp and Paper Company, a corporation that had been charged with dumping toxic waste into rivers. James Watt, who became Reagan's Secretary of the Interior, had been a lawyer for the Mountain States Legal Foundation, a law firm that represented some of the nation's largest mining and timber corporations. Anne Gorsuch (Burford), who like Watt came from Colorado and had built her career fighting environmental regulations as a state legislator, was put in charge of the Environmental Protection Agency (EPA). Secretary of Labor Raymond Donovan, a former construction company executive, instructed the agencies under his command, including the Mine Safety and Health Administration (MSHA), to emphasize "voluntary" compliance by mine operators with health and safety laws.

On the 1980 campaign trail, Reagan had called the Occupational Safety and Health Administration (OSHA) "one of the most pernicious of the watchdog agencies" that sought "to minimize the ownership of private property in this country." He slashed OSHA's budget by 10 percent and chose Thorne Auchter, whose family-run construction business in Jacksonville, Florida, had been charged with forty-eight safety violations, to be the assistant secretary in charge of the agency.

To chair the Securities and Exchange Commission (SEC), Reagan recruited Wall Street insider John Shad, who worked diligently to transform the once feared enforcement arm of the federal government into a partner in the prerogatives of brokerage houses and investment banks. Shad also emphasized "voluntary" compliance with financial regulations. During his seven-year tenure at the SEC he froze the number of investigators at its 1981 level even though the number of stock traders nearly doubled in that period. Reagan cut the SEC's budget by roughly 30 percent, and the enforcement division's staff was reduced from 200 investigators to fifty.

Reagan's Secretary of the Treasury, Donald Regan, who entered government directly from his post as CEO of Merrill Lynch, moved to deregulate the financial services industry and nurtured an environment where big Wall Street players could manufacture new debt instruments, such as "junk bonds," and engage in leveraged buyouts (LBOs). Both of these innovations strained the financial system and sapped the productivity of other sectors of the economy. So the appointing of corporate "foxes" to guard the public's "hen houses" began under Reagan and has continued through both Democratic and Republican administrations ever since.

Deregulation

On February 17, 1981, Reagan signed Executive Order 12291 mandating that all federal regulations undergo a "cost-benefit" analysis.
Proposals for new guidelines were to be submitted to David Stockman's Office of Management and Budget (OMB) to determine their effects on big business's bottom line. Any rule that corporations did not like would be subjected to an industry-friendly review. Reagan gave the OMB new powers to reject "burdensome" regulations, and he named a corporate lobbyist, Jim Tozzi, to be the deputy administrator of the Office of Information and Regulatory Affairs (OIRA), a division of OMB. Tozzi had specialized in finding ways around federal rules for his clients, and as a high-ranking official he now could "review" many of the same directives he had fought against.

Reagan appointed J. Peter Grace, chief of W. R. Grace & Company, to head the "President's Private Sector Survey on Cost Control." The Grace Commission had an executive committee consisting of CEOs from some of the nation's largest corporations. Accustomed to working behind closed doors, the panel refused to cooperate when Congress demanded a list of its members. In the mid-1980s, Grace's own company was forced to settle a civil lawsuit that accused it of poisoning two wells in Woburn, Massachusetts, leading to the leukemia deaths of five children and one adult. The Woburn case was later chronicled in a best-selling book and even made into a Hollywood movie, A Civil Action.

The end result after Reagan's two terms was a tripling of the national debt, from about $900 billion to $2.9 trillion. No president in U.S. history had tripled the debt in peacetime. Then, as today, the same politicians who brought us the tax cuts and military spending point to the deficit as an excuse to gut programs that serve not only the poor but the working middle class.

Reaganomics

In his best-selling 1981 book, Wealth and Poverty, the conservative author George Gilder offered a spirited defense of laissez-faire capitalism and bluntly stated the underlying premise of supply-side economics. "A successful economy depends on the proliferation of the rich," he wrote; "to help the poor and middle classes, one must cut the taxes of the rich."

What transpired throughout most of Reagan's time in office was a patchwork of fiscal measures designed to blunt the negative budgetary effects of the original 1981 ERTA and shift the tax burden from the wealthy to the working and middle classes. The Tax Equity and Fiscal Responsibility Act (TEFRA) of 1982 closed some of the loopholes and raised specific taxes that the ERTA had dropped. Richard Darman, a top White House aide, labeled the $37.5 billion in new taxes contained in TEFRA (along with the Highway Revenue Act's $3.3 billion) "the single largest tax increase in history." The 1983 Social Security Amendments raised payroll taxes and imposed new restrictions on workers' benefits. The Deficit Reduction Act of 1984 and the Omnibus Budget Reconciliation Act of 1987 both found ways to raise revenues while cutting social spending. In addition, Congress stepped in with its own initiatives in the form of the Gramm-Rudman-Hollings Act of 1985, followed by the Gramm-Rudman Act of 1987, which set fixed deficit targets and a means of theoretically achieving them.

During the Reagan years, labor unions suffered their most precipitous decline in the post-war period. The share of private sector workers who belonged to unions fell from close to 20 percent in 1980 to 12.1 percent in 1990. (By the 2000s it had dropped to about 7 percent.)
This decrease in private-sector unionization is sometimes attributed to changing attitudes among the workers themselves, but public employee unions grew steadily during this period and accounted for most of the new unionization. It was far more difficult for governmental institutions to practice the kind of aggressive anti-union tactics that have become the norm in the private sector since the 1980s. The Harvard economist Benjamin Friedman calculated that the portion of national income invested in plant and equipment during the Reagan Administration averaged about 2.3 percent. During the previous three decades it had averaged 3 percent, a figure never reached in the 1980s. Friedman's analysis undercuts the view that supply-side tax cuts had produced greater investment in domestic plant and equipment. Throughout the post-World War Two period the United States had run modest trade surpluses. But on September 16, 1985, the Commerce Department announced that the United States had become a debtor nation. For the first time since 1914, the United States had to borrow money from abroad to pay for its imports. In 1980, the U.S. still held a net surplus of $166 billion in its international accounts, but by 1987 the nation owed foreigners $340 billion. The trade imbalances were, in part, the product of "neo-liberal" trade policies that rewarded American companies that outsourced production to low-wage countries. In early 1981, Reagan's Secretary of Health and Human Services, Richard Schweiker, caused a stir when he called for reducing Social Security benefits for those who retired before the age of sixty-five and imposing new requirements to punish early retirees. Reagan had been a harsh critic of Social Security, which he considered a "coercive" government program, throughout his public career. Reagan appointed a fifteen-member "bipartisan" commission headed by one of the Administration's favorite free-market economists, Alan Greenspan, to examine the condition of Social Security and make recommendations. Greenspan had been a close associate of the free-market guru and Atlas Shrugged author Ayn Rand and, along with Milton Friedman, was among the economists most famous for holding an almost religious devotion to the precepts of laissez-faire capitalism. The Greenspan Commission imposed higher payroll taxes on working people, which accounted for about half of the hike in taxes from 1984 to 1989. The Commission's work was widely praised because the legislation that sprang from it was bipartisan. But the higher payroll taxes, along with the regressive tax increases contained in TEFRA and other acts of Congress during the 1980s, constituted nearly a 50 percent tax hike on lower- and middle-class workers. Cash-strapped state and local governments also raised taxes to offset the reductions in federal assistance. When viewed in the context of the substantially lower tax rates for the highest income earners, the changes in the tax structure associated with Reaganomics amounted to one of the largest upward redistributions of wealth in U.S. history. By 1984, Reagan had largely succeeded in realigning the economic debate away from Keynesianism, with its positive view of the role of government, and toward a culture that valued deregulation and free markets over all else. Large swathes of the public had become suspicious of social programs and contemptuous of government.
In 1987, Reagan appointed Greenspan to chair the Federal Reserve Board, a post he held for the next eighteen years, thereby institutionalizing many of the tenets of Reaganomics. Deregulation, along with "free trade" and cutting welfare spending, became bipartisan orthodoxy in Washington as domestic policy moved definitively in the Republicans' direction. What came after Reagan were bipartisan "free trade" agreements (NAFTA, GATT and the WTO), which ended up outsourcing millions of good-paying American jobs to low-wage countries. Then came the bipartisan deregulation of the telecommunications industry that gave us Fox News, and, at the close of Clinton's second term, the bipartisan deregulation of the financial services industry that took a mere eight years to bring the nation's economy to its knees. Now, out of the wreckage of the last thirty years of bipartisan Reaganite economic policy designed to serve the richest of global elites, we have bipartisan calls for shredding what's left of the social safety net, including Social Security, as a way to "make hard choices" and tackle the deficits that were produced by more or less the same politicians who brought on the catastrophe in the first place. Today, with states, counties, and municipalities reeling under a load of debt brought to us by failed Reaganomics, the public sector -- by which I mean health care services for the poor and elderly, schools, libraries, police and firefighters, child protective services, and social programs of all kinds that help people -- is being cut back past the bone and into the marrow. What we're seeing at the state and local levels is nothing short of the systematic dismantling of public institutions that took decades to build. "Jobs, Jobs, Jobs" is a nice slogan, but it tells us nothing about the quality of those jobs. What's happening all over the country today is across-the-board layoffs of public employees who had decent jobs with okay benefits; in their place are either McJobs or no jobs at all. What we've seen happening over the past three or four years is a further deskilling and downgrading of the living standards of the average American worker. The legacy of Reaganomics continues with the aggressive attempt to turn public school teachers into Wal-Mart workers. Put in the context of austerity and debt reduction, this concerted attack on teachers is just the latest onslaught against the American working middle class. They've already wiped out the manufacturing workers and their unions; now they're going after public employees and their unions. Across the country, right-wing Republican governors are teaming up in a spirit of "bipartisanship" with clueless "education reform" zealots like Michelle Rhee to eliminate teacher tenure, slash pensions, and generally make public school teaching a profession that anyone would have to be crazy to want to join. If you like the way things are in the United States today -- with Gilded Age levels of inequality, weak labor unions, low-wage service jobs for most of the workforce, and a public sector that's dying on the vine -- then you can thank Ronald Reagan.
If you could have seen the parade of disabled people (many of them severely disabled) who came to the California State Capitol in Sacramento on February 3rd begging their elected leaders to block a proposed cut of $750 million from programs that help them live better lives -- one by one, approaching a microphone at a recent hearing, speaking eloquently and poignantly, and calling out for human dignity and compassion -- you'd have a better idea of the kind of suffering that this brand of heartless economics has wrought in this country. That's the true Reagan legacy.
https://www.huffpost.com/entry/the-reagan-centennial-cel_b_819163
— End of week Seven — Edit: This contract is the first draft that was presented before my partner dropped out. Contents - Project Description - Tools to be Utilized - Work to be Completed by Team Members - Schedule of Milestones - Full Research Proposal Project Description This research project is designed to consider the historical significance and influence American Indian Tribes had on the Northern Frontier of Texas. The main goal of this project is to track the migration patterns of the Wichita Indians within the historical context of Wichita Falls. It will consider their initial habitation, forced migrations and their continued influence on regional culture even after their departure. Tools to be Utilized Throughout the entirety of this project, multiple social media platforms will be utilized in order to keep up with the progress of the work being done collectively as a team and individually, such as:
http://xroads.coplacdigital.org/pena/2017/10/05/
The purpose of the Implementation of Promoting Safe and Stable Families (PSSF) by American Indian Tribes study was to examine the ways in which Indian Tribes use funds received under title IV-B, subpart 2 to provide services that strengthen families’ abilities to care for their children. This study documented the process of implementation by focusing on planning and monitoring processes, service delivery systems and resources utilized by Tribes. As context to PSSF implementation, the full range of child welfare and related human services utilized by Tribes was explored, along with the resources used to fund these services. This study provided a historical perspective on Tribal implementation of the legislation. It explored how the usage of funds had evolved and how effective the funds had been in meeting the goals of PSSF and addressing the needs of American Indian children and families. The study had the following components to provide both breadth and depth to the analysis: - Review of Tribal Plans: This component of the study examined the Child and Family Services Plans (CFSPs) for information on planning, funding and services. - Formation of a Technical Working Group (TWG): A group of leading experts was convened to provide input and guidance into the study. - Case Studies: A total of 12 Tribes representing a wide array of practices in terms of service delivery, funding and resource utilization and collaborative arrangements were visited in-person.
http://www.acf.hhs.gov/opre/research/project/implementation-of-promoting-safe-and-stable-families-by-indian-tribes-pssf
Looking at politics through the lens of film allows audiences to separate from reality long enough to understand that sometimes fact truly is stranger than fiction. When film critics examine political cinema we should be aware of several things: - Bias - Purpose/Message/Meaning - Historical Context - Genre impact Bias: We all have bias and political leanings towards a party, candidate or an issue. We may deny it, but unless we are completely apathetic about nearly everything in society, we are connected to something within the political scope of life. We will at some time in our lives be drawn to certain causes and individual leaders, and yet repulsed by others. That is bias, and it exists for all people. When we view political films we will discover that viewers are not the only ones with bias; filmmakers creating projects can present their leanings in cinema as well. Some filmmakers attempt to put aside political feelings for an objective narrative or story, while others wear politics on their sleeve for all to see. For those filmmakers who choose to expose their bias to others, film becomes a medium to share their message, their agenda, and their politics. As a film critic you should examine films with that understanding. If you feel that bias exists (Democrat, Green Party, Libertarian, Republican, Tea Party, etc.), mention it, but most of all you should be able to defend why you feel that way. If you are torn and are unsure, you may need to research the writer, director, or producers to get a closer understanding of their political leanings. You will also need to examine your own personal political baggage/bias. If you feel your bias is coming through in your review of the film, let your audience know that up front; then they will know where you are coming from. Purpose/Message/Meaning: Political cinema (regardless of genre) speaks to the topics, ills, and issues of society. In doing so, there is usually a meaning or message the director (or writer) wants to share. These messages relate to the varied topics featured earlier in this chapter. Sometimes the message in political cinema is under the surface of the story, while in other situations there is no doubt about what the movie (and filmmaker) is trying to say. Audiences may end up asking more questions than they find answers to. That is the beauty of politics and film: most times there will be more than one correct answer; it is all about your point of view. When you walk away from a political film and continue to think critically about what you've just witnessed, these films have probably done their job. Historical Context: One can actually research the history, the time period, the events, locations and personalities through factual accounts on film. These films (biopics or based-on-a-true-story films) can be set against the context of reality vs. fiction. To understand the messages, meanings, or purpose of these films is to understand the historical time periods (cultural, political, societal, international) surrounding the settings of these political movies and the people involved in their creation. As critics we should ask: Why does this film matter? Why should we care? Is this person, place, situation or event still relevant in the context of today's society, and if so, why and how? Are the lessons learned from this film universal (regardless of time period)? To understand our history is to appreciate our present; to appreciate our present is to anticipate our future.
As critics (and audience members) we must also understand that fictional political films can be viewed within historical context and significance as well. As mentioned in earlier chapters, sci-fi does an amazing job of exploring society through a future or alternate universe. But there are also genres like drama, action, and animation that can ask and address the "what if" questions within historical context. Genre impact: Political cinema can be examined through any genre. Because each genre is different, the approach to sharing messages and meanings will be diverse as well. Comedy may offer a more relaxed way of getting to meaning, while biting satire may smack you in the face with it. In an action film you may see an undercurrent of political topics as a backdrop for the narrative, while family animation may present a message that only the most astute will catch.
https://thefilmcritic.blog/2019/05/05/ch-11-reviewing-political-films/
This course examines basic learning processes within the context of classical, instrumental, and operant learning situations. Course content focuses on classical conditioning, instrumental learning, principles of reinforcement, punishment and avoidance conditioning, stimulus generalization and discrimination, retention and forgetting, nature and functioning of memory, and learning and performance of motor skills. CO1: Examine the historical development of cognitive psychology, outlining the philosophical and psychological schools of thought contributing to the field's foundation. CO2: Investigate the methods of research applied to the study of cognitive psychology. CO3: Examine cognitive psychology as a discipline of study and identify the significance of cognitive psychology. CO4: Delineate the various types of learning, expressing the degree to which learning has occurred based upon measures traditionally applied. CO5: Identify the structures and functions of the brain, examining the role of cognitive processes in controlling the systems and functions throughout the body through an understanding of the field of cognitive neuroscience. CO6: Examine the major processes of consideration under cognitive psychology, identifying the related functions, theoretical explanations, and research models associated with each process. ASSIGNMENTS AND PROJECTS: Assignments and projects are due during weeks 3, 5, and 7 of the course. Instructions are in the Assignments link on the classroom left-screen menu.
https://www.amu.apus.edu/course-schedule/details.html?c=PSYC303
The course therefore provides knowledge of the historical context of law and its theoretical underpinning. Students will learn that there is more than one solution to common legal problems, building critical analytical skills in so doing. At a practical level, the course aims to introduce students to Civil Law systems for those who wish to work or study further in this area in Britain or abroad. It enhances research skills and gives an insight into conflicting doctrinal approaches. Most of the legal systems of the modern Western world are based on one of two legal traditions. The first is that of the English Common Law, on which the systems of most Commonwealth countries and the United States are based. The other is that of the Civil Law, which has its origins in Roman Law. Today, Civil Law forms the basis of the legal systems of most European countries, as well as many in Africa, Asia, North and South America and the Middle East. It has also been very influential in the development of European Law. Accordingly, the course offers students, who will largely come from the common law tradition, the opportunity firstly to study Roman Law, and goes on to compare the legal reasoning, structure and substantive law of some modern-day Civil Law legal systems with our own. The study of comparative law is important for two obvious reasons. First, it allows us to consider the law in other systems. This includes an examination not merely of the rules of substantive law but a consideration of the historical, social, economic and political factors which have influenced the shape and content of the law in those jurisdictions studied. Secondly, the study of law on a comparative basis provides an opportunity critically to examine the principles of the common law and their practical effects. There is a particular, and increasing, need for British lawyers to understand the principles and the detail of civil law systems. This arises from our membership of the European Union and the opportunities presented for greater trade and commerce between Union members. Knowledge of French or any other foreign language is NOT required for the course.
Summary of Course Content
The course begins with an historical introduction to the Civil Law. It will examine the Roman legal system and explain its significance as a foundation for the systems of modern continental Europe. The Roman Law of Obligations will then be considered. In the second part of the course, we first consider the function and methodology of comparative law. We then examine the history and current-day structure of the French legislature and hierarchy of courts, with comparisons made with Britain and some other modern European countries. Differences of approach in legal reasoning and statutory interpretation will be considered, as well as the different sources of law. The significance of the more inquisitorial Civil Law approach to criminal justice issues in France will be examined and compared with that of the adversarial British system, against the background of both systems facing mounting criticism from the public and the European Court of Human Rights. In the final part of the course we compare the substantive law of obligations in France and Germany. This is done by examining a range of special topics within the French and German law of contractual and delictual obligations. Throughout, students will be expected to compare and evaluate not only French and German law but also civilian approaches with those of the common law.
Where possible, two lectures are given by visiting French and German lecturers to provide students with first-hand experience of these jurisdictions.
https://www.aber.ac.uk/modules-archive/2004/LA36520.html
Hepatitis C Screening and Linkage to Care in an Urban Emergency Department Background: Hepatitis C virus (HCV) is a major cause of chronic liver disease and is the most common chronic bloodborne pathogen in the United States. The Emergency Department (ED) is potentially a high-yield site in ...
- Here Comes the Sun (Oakland University, 2009-10-01) Recollections of memorials of the Kent State University shootings and the Students for a Democratic Society.
- Higbie Manufacturing Company Annual Report cover (7/31/1951) Higbie Manufacturing Company. Annual Report for the Year Ending July 31, 1951. J.A. Coxe; Bronson; Avon Lube; McAleer; General Crafts Division
- High Context vs. Low Context Political Messages and Audience Preference Research has shown that the differences in Eastern and Western communication styles may affect political messages and diplomatic relations. The difference in these communication styles includes the context, such as high ...
- High School Factors as Predictors for College Success: Students’ Satisfaction, Motivation, and Success This study focused on evaluating and analyzing high school and demographic factors as accurate predictors for college success as well as considering other factors impacting students’ motivation, success, and self-efficacy. ...
- Higher Education Leadership: Where and Who are the Interdisciplinarians? An Introductory Story (Association for Interdisciplinary Studies, 1991)
- Hill Holler (1967-07-06)
- Hispanic Celebration 2009: Opening Ceremony - September 14, 2009 (2011-01-18) Opening Remarks: Dr. Virinder Moudgil, Senior Vice President for Academic Affairs and Provost Keynote Speaker: Martina Guzmán Martina Guzmán is a Detroit journalist who won the Associated Press Award as Best Individual ...
- Historic Gift to the OU Community (2012-12-21)
- Historical and Cultural Preservation of Resources In Oakland County (Oakland University, 2000-04-01) The Oakland County Historical and Cultural Resources Inventory Map displays the key archaeological, architectural and historical sites that can be found in Oakland County, Michigan.
- The Historical Jewish Ghettos of Venice (Oakland University, 2002-10-01) The purpose of this paper is to examine the Jewish Ghetto within the Venetian urban context, to detail its sociological and economic aspects, and, in so doing, establish the historical and cultural significance of its built ...
- History Department Newsletter (Spring 2005) (2005-04) Newsletter of the History Dept. Faculty, student and alumni news.
- History Department Newsletter Spring 2006 (2006-04) Newsletter of the History Dept. News of faculty, students and alumni.
https://our.oakland.edu/browse?rpp=20&sort_by=1&type=title&offset=1635&etal=-1&order=ASC
Amid the removal of historic Works Progress Administration murals from schools in Oak Park following complaints that they do not reflect the community’s diversity, a Park Ridge WPA mural depicting Native Americans and white government agents is staying put inside the city’s library — but getting some additional historical context, the library’s director says. Heidi Smith, executive director of the Park Ridge library, said printed pamphlets describing the history and restoration of the “Indians Cede the Land” post office mural will be updated to include expanded historical context of the scene, based on information provided to the library last year by Julie Pelletier, an associate professor of indigenous studies at the University of Winnipeg. Pelletier served as acting director of the Newberry Library’s McNickle Center for American Indian and Indigenous Studies. The additional information explains that when U.S. government treaties were signed with Native American tribes, they often were not honored by the government, Smith said. Such information was recently added to the “news” section of the library’s website. “The act of ceding land by Native Americans was involuntary and typically done under duress,” Pelletier’s information states. “In return for vast tracts of land, tribes might be promised goods, money, reserved lands (reservations) and protection from encroaching settlers …. The Treaty of Chicago gained over a million acres of land for the United States. In return, signatory tribes received $100,000 in trade goods, $280,000 in twenty annual payments of $14,000 each, and $150,000 for the erection of mills, houses, etc. The treaty does not list any land to be held for the tribes so one wonders where the houses and mills would be built. The United States government often did not honor its treaties with Native Americans and most tribes do not receive what they were promised as payment for land cessions.” Officials said they did not yet know when the library’s informational brochure on the mural would be updated to include the new language. The oil-on-canvas mural, painted by George Melville Smith and restored in 2013 through a volunteer-led fundraising campaign, hung for many years in the former Park Ridge Post Office at 164 S. Prospect Ave. According to the New Deal Art Registry, it was created in 1940 as part of a government program that commissioned art to be created for federal buildings. Three other New Deal-era murals, created through the WPA Federal Art Project, have been or are in the process of being removed from schools in Oak Park after complaints were raised that the 80-year-old paintings featured no children of color, even though the school district today is diverse. The first mural to be removed depicted a scene of white children and parents enjoying winter activities. It was painted in 1937. Smith acknowledged that upon joining the Park Ridge Public Library last year, she initially found its own New Deal mural — which shows a white uniformed officer in a handshake with a bare-chested native man — to be “a very beautiful, but challenging, piece of art” due to the subject matter. “I think it has cultural and historical significance here in Park Ridge,” Smith said of the mural.
“I’m thrilled we have it here in the library, but I also strongly feel it’s our responsibility to educate and share the context — both when the portrait was created and the historical period in which it is meant to portray.” Smith said she reached out to Pelletier last summer around the time that moving the mural to the Quiet Reading Room was briefly considered. It was also around this time that a patron inquired if the mural was insulting to Native Americans and questioned whether it should have a “place of honor” in the library, Smith said. Pelletier recently said that the library’s initial description of the piece included wording that “acted as if the tribal people of that area had either simply handed over their land or been beaten into submission. I really encouraged [the library] to put context in place, to look at the long period of colonization — not just the unfair, but illegal, ways in which territory was taken from indigenous people — and to really keep that context clear so that people looking at the mural would understand it expressed a particular moment in history from a particular perspective.” That perspective, Pelletier said, is of white America, which, in the 1930s, was experiencing a time of “nationalistic fervor.” Such feelings are reflected in the “Indians Cede the Land” mural, she said. “The expression of Manifest Destiny is clear to my eyes. While the indigenous men are painted as physically powerful, they are bowing to the military might and right of settler colonial society,” Pelletier said. Pelletier acknowledged that the portrayal of Native Americans in such a prominent piece of art was “unusual” for the time it was created, but it was not necessarily an accurate portrayal. “It shows them as ‘noble savages’ and that they have willingly moved out of the way for progress,” she said. It also shows indigenous people living in the past, in a seemingly different world than the white men, Pelletier said, although by the 19th Century, native people wore modern clothing of the time, spoke multiple languages and were “politically savvy.” “American Indians in the Great Lakes area had been intermarrying with, trading with and negotiating with settlers for 200 years,” she explained. “So it’s not like these folks showed up in this community and the natives didn’t know who they were. They had a pretty good idea of what the desire was: Land and for peaceful movement in the territory. They would have known they didn’t have a lot of options.” The mural can also be viewed as offensive to native people, Pelletier acknowledged, which is why it is important to “put the mural into perspective” by considering the heightened nationalism of the period when it was painted and actual history of government treaties, she said. Pat Lofthouse, a former library trustee who was involved in the mural’s restoration effort, said she fully supports additional historical context accompanying the mural. At its dedication in 2013, the restoration group invited a Native American scholar who pointed out the “romanticism” of the piece and the fact that the indigenous people did not give up their land happily. “It was always my intent that it be used for educational purposes to demonstrate Manifest Destiny,” Lofthouse said of the mural. The painting also reflects the attitudes of the time when it was created, she said. “Now, with our heightened sensitivity, we realize the sacrifice of all people of color,” Lofthouse acknowledged. 
For Smith, the mural is one of many different viewpoints reflected among the materials that are available within the library.
https://www.chicagotribune.com/suburbs/park-ridge/ct-prh-library-mural-tl-0523-story.html
Very few subjects in American history are as full of contradictions, fantasy and misplaced nostalgia as the history of the Native Americans. I have spent a significant portion of my adult years researching tribes and their associated culture and history. This website represents the fullness of what I’ve learned and will grow as I continue to learn. It’s important to understand tribal people in historical context, because our “modern” society too casually paints indigenous populations with a monochromatic brush. And this is dangerous with American Indians, because their origins and spirituality and symbols and music and art are quite diverse across the panorama of tribes recognized throughout North America. My own family history is woven together by two prominent tribes, and so a great deal of information has been passed down to me through informal conversations and the sharing of various documents over the years. Very little of this is authoritative, but what it did was ignite my interest in the larger subject of Native American history. That morphed into understanding as much as I can about the various cultural distinctions within the tribes, as well as the fascinating history of interaction between early European settlers and American Indians, who experienced all manner of interaction throughout the decades with their white counterparts. From Christopher Columbus on up to current U.S. policy, there has been war and famine, spanning the plains and mountains, affecting every tribe. I want to examine the history of Native Americans without the filtered lens that most historians have applied. School textbooks have taught tens of millions of our children a sanitized version of what First Nation peoples experienced, and so we will look back at the battles and the policy decisions from our states and Washington. We will look at the real motivation behind federal government action through presidents such as Andrew Jackson and the controversial Bureau of Indian Affairs, much of which had devastating impact on the families and resources of the American Indian. I’ve written also about Indian reservations, both where they stand currently and their historical creation. It represents a history of betrayal and violated terms and rights, littered with starvation and poverty that many find hard to believe exist in America today. But no story about American Indians begins with the United States. Long before Europeans discovered the eastern seaboard, tribes were flourishing in a land full of agriculture and buffalo, beauty and simplicity. This led to intricate artwork, moving music, celebrated jewelry and a tapestry of symbolism and spirituality that is still being unearthed by archaeologists thousands of years later. There was a thriving system of language, education, architecture and community that is rarely discussed outside of a museum or library.
Look Up Native American Tribal Histories
Already on this site, you’ll find tribal histories summarized for the Sioux, Navajo, Apache, Cherokee, Chippewa, Choctaw, Iroquois, Pueblo, Creek and Blackfeet nations. There’s also a master list of tribes that you can use to learn more history. There will soon be more covering the Comanche, Seminole and a host of other tribes both large and small. We have a full video section, recommendations on books that are not tainted by revisionist scandal, and resources for students about a wide array of subjects that are often covered poorly in school systems.
This has included balanced articles about the Indian Reorganization Act, the Wounded Knee massacre, and a full series about the Trail of Tears. Our weakest area to date has been in covering current American Indian news, but we are increasing our staff to give that the attention it deserves. Your input is both welcomed and anticipated. I acknowledge that “the story” of these Native American tribes is way beyond my current knowledge and imagination, but with your help, we can see it all unfold here over time.
https://americanindiancoc.org/?blackhole=cd459f2a32
In A Discourse and Register Analysis of the Prophetic Book of Joel, Colin M. Toffelmire offers a thorough analysis of the text of Joel from the perspective of Systemic Functional Linguistics. While traditional explorations of Joel largely engage the book from a historical or literary perspective, here Toffelmire examines syntactic and semantic patterning in the book, and builds from there toward a description of the linguistic register and context of situation that those linguistic patterns suggest. This work also showcases the usefulness of discourse analysis grounded in Systemic Functional Linguistics for the study of ancient texts.
Read Online or Download A Discourse and Register Analysis of the Prophetic Book of Joel PDF
Similar Old Testament books
Ziva Shavitsky's The Mystery of the Ten Lost Tribes: A Critical Survey of PDF
There have been many legends and traditions regarding the ten lost tribes of the Northern Kingdom of Israel. This book draws upon extensive discoveries and information published concerning the movement of the people of Israel and Judah from Davidic times to the dawn of the Hellenistic period. The author has tested the biblical records against archaeological evidence, testimony and inscriptions found in Syria, Assyria, Babylon and Persia.
Be Holy. Becoming "Set Apart" for God by Warren W. Wiersbe PDF
Discover what matters most to God.
Download e-book for iPad: Psalms 38 and 145 of the Old Greek Version by Randall X Gauthier
One of the critical, ongoing discussions in Septuagint studies today concerns the issue of how texts were understood by their translators, and how those translations are able to provide the modern reader with clues to that original interpretation. In Psalms 38 and 145 of the Old Greek Version, Randall X. Gauthier ...
Get Food in Ancient Judah: Domestic Cooking in the Time of the PDF
The study of food in the Hebrew Bible and Syro-Palestinian archaeology has tended to focus on kosher dietary laws, the sacrificial system, and feasting in elite contexts. More everyday ritual and practice - the preparation of food in the home - has been neglected. Food in Ancient Judah explores both the archaeological remains and ancient Near Eastern sources to see what they reveal about the domestic gastronomical lifestyle of ancient Judahites within the narratives of the Hebrew Bible.
- The Asterisked Materials in the Greek Job
- The Metaphor of Illness and Healing in Hosea and Its Significance in the Socio-Economic Context of Eighth-Century Israel and Judah
- Biblical Hebrew in Transition: The Language of the Book of Ezekiel
- The Face of Old Testament Studies: A Survey of Contemporary Approaches
Extra resources for A Discourse and Register Analysis of the Prophetic Book of Joel
Sample text
Firth, who was deeply influenced by Malinowski's work, carried the issue of context into his work in linguistics.
Footnotes from the sample page: M.A.K. Halliday, "The Notion of 'Context' in Language Education," in Text and Context in Functional Linguistics, ed. Mohsen Ghadessy (Amsterdam: John Benjamins, 1999), 3; Helen Leckie-Tarry, Language and Context: A Functional Linguistic Theory of Register, ed. David Birch (London and New York: Pinter, 1995), 20.
In systemic terms, Hasan describes the system of culture as “an organization of the possible features of all possible situations in all their possible permutations, where ‘possible’ means socially recognizable …” (Hasan, “Place of Context,” 169; see also Hasan, “Speaking with Reference to Context,” 238). I will elaborate on the relationship between context of situation and register below. … situation provides is a theoretically adequate account of linguistic context that can serve as the basis for statements about the represented context of some given text. These descriptions of the context of situation must, of necessity, be somewhat general and tentative, but, of course, general and tentative is significantly preferable to nothing at all. … it does not distinguish between specific social situations that produce texts and the broader cultural situation in which those specific situations exist; nor does it provide a clear distinction between material situation and linguistic situation.
http://jacobthefakejeweler.com/pdf/a-discourse-and-register-analysis-of-the-prophetic-book-of-joel
Good friends in Oslo (Margunn Aanestad, Miria Grisot, Ole Hanseth and Polyxeni Vassilakopoulou) have just launched their edited book on Information Infrastructure within European Health Care. The book is open access, meaning you can download it for free here. Our team’s contribution is chapter 8, which discusses England’s Electronic Prescription Service that we evaluated for NPfIT over a number of years. This service moved UK GPs away from paper prescriptions (FP10s – the green form) to electronic messages sent directly to the pharmacy. We examine the making of the EPS temporally by looking at: (1) how existing technology (the installed base) and historical actions affected the project; (2) how present practices and the wider NPfIT programme influenced it; and (3) how the desired future, reflected in policy goals and visions, influenced present actions. To go to our article directly click here. England’s Electronic Prescription Service Ralph Hibberd, Tony Cornford, Valentina Lichtner, Will Venters, Nick Barber. Abstract We describe the development of the Electronic Prescription Service (EPS), the solution for the electronic transmission of prescriptions adopted by the English NHS for primary care. The chapter is based on both an analysis of data collected as part of a nationally commissioned evaluation of EPS, and on reports of contemporary developments in the service. Drawing on the notion of an installed infrastructural base, we illustrate how EPS has been assembled within a rich institutional and organizational context including causal pasts, contemporary practices and policy visions. This process of assembly is traced using three perspectives: as the realization and negotiation of constraints found in the wider NHS context, as a response to inertia arising from limited resources and weak incentive structures, and as a purposive fidelity to the existing institutional cultures of the NHS. The chapter concludes by reflecting on the significance of this analysis for notions of an installed base.
https://binaryblurring.com/2017/05/21/englands-electronic-prescription-service-infrastructure-in-an-institutional-setting/
Oregon Folklife Network The Oregon Historical Society (OHS) is proud to be an Operational Partner of the Oregon Folklife Network (OFN), whose mission - to make a meaningful difference in Oregon communities and Tribes by documenting, supporting, and celebrating our diverse cultural traditions and by empowering our tradition-bearers - strongly aligns with our own goals. Until budget restrictions forced its closure in 2009, OHS was proud to house the former Oregon Folklife Program, which was widely praised for its work with a variety of culture groups, communities, and tradition-bearers to document Oregon's cultural traditions and offer high quality state-wide programs and services. As an Operational Partner, OHS collaborates with OFN to foster new and ongoing projects that reach diverse communities across the state. Our unrestricted financial support to OFN allows us to do this important work without overspending our resources. We are grateful to be able to continue our long tradition of leadership in statewide folklore projects and look forward to serving Oregonians through this new partnership. About the Oregon Folklife Network Folklife encompasses the everyday knowledge, art, and lore passed within communities through imitation, conversation, and practice. Such arts, knowledge, and skills are rooted in the cultural life of a community whose members share a common language, ethnic heritage, religion, occupation, or geographic region. The OFN, Oregon's official Folk and Traditional Arts Program, represents a network of statewide culture and heritage partners that operate on state, regional, county, and community levels to document, support, preserve, and celebrate Oregon's traditional art forms and cultural practices. The OFN conducts folklife fieldwork and collaborates with communities and organizations to sponsor activities that increase public awareness of the significance of Oregon's living cultural heritage. For information about OFN programs and resources for traditional artists and communities, please visit their website. Oregon Culture Keepers Roster The Oregon Culture Keepers Roster is an online juried selection of excellent folk and traditional artists and cultural experts. Those included preserve and present Oregon's diverse heritage for a variety of audiences. As OFN completes more fieldwork around Oregon, they will be expanding the Roster. At present, it lists nearly 75 culture keepers from the following counties and Tribes as well as TAAP (Traditional Arts Apprenticeship Program) master artists from a few more: Klamath, Lake, Harney, Malheur, Hood River, Wasco, Jefferson, Sherman, Gilliam, Morrow, and Umatilla as well as with the Confederated Tribes of Umatilla, the Confederated Tribes of Warm Springs, and the Burns Paiute Tribe. On the Oregon History Project Oregon Folklife: Our Living Traditions explores community-based arts and culture in the context of local history. In each region of the state, ethnic, religious, occupational, and recreational communities tell stories, create material arts, and participate in rituals and celebrations. Joanne Mulcahy, who directed the Oregon Folk Arts Program from 1988 to 1991, has conducted fieldwork throughout the state. She is the author of Birth and Rebirth on an Alaskan Island, which chronicles the life of an Alaska Native healer, and Remedios: The Healing Life of Eva Castellanoz. She teaches writing at the Northwest Writing Institute at Lewis and Clark College.
https://ohs.org/about-us/affiliates-and-partners/oregon-folklife-network.cfm
As a sovereign nation, an Indian Tribe has the general power to create and enforce its own rules to govern activities within the Tribe’s jurisdiction. In this two-part series, we will look at one set of such rules: the preferential treatment of Indians in contracting and in employment. In this first article, we will look at Indian preference in the context of vendor selection. The second article will examine Indian preference in the context of employment. The emphasis in both articles will be on Indian preference as applied in the engineering & construction industry. Indian preference in vendor selection gives Indian-owned vendors preferential treatment when the Tribe or funding agency is awarding contracts. If an outside funding source is funding the project (such as the federal government), then the funding agency’s rules typically (but not always) will govern the preference. However, if Tribal money is funding the project, which is quite common, then the Tribe’s preference rules will typically govern the vendor selection. There is no “one size fits all” rule for how Tribes use Indian preference in vendor selection. Because each Tribe is free to establish its own rules for vendor selection, there is variety among Tribes in how this is done. A Tribe does not have to have an Indian preference policy. But for those that do, three common elements of Indian preferential treatment in vendor selection can be found. First, the Tribe can restrict the competition to only Indian-owned businesses. In this method, the Tribe certifies such businesses after an application process and then allows only those certified businesses to compete. Second, the Tribe can allow any business to compete, but an Indian-owned business can be awarded the contract if its bid is within a certain percentage of the low bid from a non-Indian-owned business. This element is often used in the award of construction contracts based solely on price. Third, a business can be awarded a higher number of points in a proposal if it is Indian-owned. This element is often used in the selection of architects and engineers. (A code sketch of the second and third mechanisms appears at the end of this piece.) Some Tribes have no formalized preference policy in place, choosing to deal with it in an informal, relationship-based approach. But other Tribes have in place a written and published procurement policy that details their Indian preference rules. Tribes that follow such formal policies include the Salt River Pima-Maricopa Indian Community, the Navajo Nation, the Colville Tribes, and the Salish and Kootenai Tribes, to name but a few. When considering how to approach a potential business opportunity in Indian Country, it is important to understand if and how the policy of Indian preference in selecting the vendor is applied. Each Tribe can have its own unique set of rules, which may or may not be influenced by an outside funding agency. Such rules will apply in the Tribe’s selection of the vendor, whether a contractor, architect, engineer, or other business or professional.
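To make the second and third mechanisms concrete, here is a minimal sketch in Python. It is an illustration only: the Bid structure, the 5 percent price threshold, and the 10-point proposal bonus are all assumptions invented for this example, not any Tribe's or funding agency's actual procurement rules, which vary as the article describes.

```python
# Hypothetical illustration of two common Indian-preference mechanisms.
# The thresholds and data shapes below are assumptions, not real rules.

from dataclasses import dataclass

@dataclass
class Bid:
    vendor: str
    price: float                 # bid amount in dollars
    indian_owned: bool           # certified Indian-owned business?
    proposal_score: float = 0.0  # technical score, for A/E selections

def award_low_bid_with_preference(bids, preference_pct=0.05):
    """Price mechanism: an Indian-owned bid wins if it falls within
    preference_pct of the lowest bid overall (5% is an assumed figure)."""
    low = min(bids, key=lambda b: b.price)
    threshold = low.price * (1 + preference_pct)
    eligible = [b for b in bids if b.indian_owned and b.price <= threshold]
    return min(eligible, key=lambda b: b.price) if eligible else low

def score_with_preference_points(bid, bonus_points=10.0):
    """Points mechanism: add bonus points to an Indian-owned proposal
    (the 10-point bonus is likewise an assumed figure)."""
    return bid.proposal_score + (bonus_points if bid.indian_owned else 0.0)

bids = [
    Bid("Acme Construction", 980_000, indian_owned=False),
    Bid("Eagle Builders", 1_015_000, indian_owned=True),
]
# Eagle Builders' bid is within 5% of the low bid, so it wins the award.
print(award_low_bid_with_preference(bids).vendor)
```

The first mechanism, restricting competition to certified Indian-owned businesses, would simply be a filter on the bid list before any of this scoring runs.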
https://kaibabllc.com/indian-preference-engineering-construction-projects/
By Craig Cooper Epigraphy is the technique of inferring and reading historical information from inscriptions found on ancient artifacts such as stones, coins, and statues. It has proven invaluable for archaeologists and classicists, and has considerable potential for the study of ancient history at the undergraduate and graduate levels. Epigraphy and the Greek Historian is a collection of essays that explore the various ways in which inscriptions can help scholars reconstruct and understand Greek history. In order to engage with the study of epigraphy, this collection is divided into two parts, Athens and Athens from the Outside. The contributors maintain the importance of epigraphy, arguing that, at times, inscriptions are the only tools we have to recover the local history of places that stand outside the focus of ancient literary sources, which are often frustratingly Athenocentric. Ideally, the historian uses both inscriptions and literary sources to make plausible inferences and thereby weave together the disconnected threads of the past into a connected and persuasive narrative. Epigraphy and the Greek Historian is a comprehensive examination of epigraphy and a timely resource for students and scholars interested in the study of ancient history.
Read Online or Download Epigraphy and the Greek Historian (Phoenix Supplementary Volumes) PDF
Similar ancient history books
Download e-book for kindle: Lysimachus: A Study in Early Hellenistic Kingship by Helen S. Lund
Though short-lived, Lysimachus' Hellespontine empire foreshadowed those of Pergamum and Byzantium. Lund's book sets his actions firmly within the context of the volatile early Hellenistic world and views them as part of a continuum of imperial rule in Asia Minor. She challenges the assumption that he was a vicious, but ultimately incompetent, tyrant.
Enrico Dal Lago, Constantina Katsari's Slave Systems: Ancient and Modern PDF
A ground-breaking edited collection charting the rise and fall of different types of unfree labour in the ancient Mediterranean and in the modern Atlantic, employing the methodology of comparative history. The eleven chapters in the book deal with conceptual issues and different approaches to historical comparison, and include specific case-studies ranging from the ancient forms of slavery of classical Greece and of the Roman empire to the modern examples of slavery that characterized the Caribbean, Latin America and the United States.
Read e-book online A Literary History of Persia PDF
Browne's celebrated work, first published in 1902, was for many years the essential text on literary history in Persian studies. As an overview of Persian literature from the earliest times until Firdawsi, it remains a valuable reference. Out of print for some time, it is now reissued as a library edition, in facsimile to capture the feel of the original edition.
Hellenistic Science at Court (Science, Technology, and by Marquis Berrey PDF
The development of science in the modern world is often held to depend on such institutions as universities, peer-reviewed journals, and democracy. How, then, did new science emerge in the pre-modern culture of the Hellenistic Egyptian monarchy?
Berrey argues that the court society formed around the Ptolemaic pharaohs Ptolemy III and IV (reigned successively 246-205/4 BCE) provided an audience for cross-disciplinary, learned knowledge, as physicians, mathematicians, and mechanicians clothed themselves in the virtues of courtiers attendant on the kings.
http://daffydefreitas.com/read/epigraphy-and-the-greek-historian-phoenix-supplementary-volumes
A brilliant opportunity to explore the Omo Valley, home to a diverse and fascinating range of distinct tribal groups. Explore by 4WD with your local leader, stopping at local markets and villages to discover one of Africa’s last great wilderness areas.
Trip Name: Lost Tribes of Ethiopia
Last Updated: 2019-07-18
Days: 9
Capacity: 12
Highlights
- Uncover the eight distinct tribes that call the Omo Valley home, and gain a deeper understanding and respect for their lifestyle and traditions.
- Venture out on a game walk through the grasslands of the Nechisar Plain in search of zebra, kudu, and gazelle.
- Learn about the importance of scarification and hair grooming for the Karo & Hamar people in the market town of Turmi.
- Perched on a cliff crest, our Feature Stay, Paradise Lodge, has spectacular views across to Arba Minch’s ‘Bridge of God’ – take in the scenery during your two-night stay.
- Enjoy a sundowner, local music and dancing on the terrace of our Konso lodge.
- Visit the less well-known Daasanach and Ari tribes, where your potential custom helps these unique communities to remain independent and preserve their culture.
https://www.peregrinetraveladelaide.com.au/trip/africa/ethiopia/lost-tribes-of-ethiopia/
American Studies is a field defined not only by the critical questions it asks but by the interdisciplinary methods it uses to answer those questions. In considering the United States as a cultural, ideological, geographical and historical formation, students of American Studies examine how cultural configurations of and within the nation-state operate as social forces, contested archives of change, loci of power and resistance, and sites of historical meaning and memory. How are ideologies and arrangements in the U.S. amplified, altered, challenged or contested? American Studies seeks to address these questions by critically examining how ideas and assumptions about the U.S. have been constituted through a range of competing, corroborating and resistant discourses. Barnard students majoring in American Studies engage in the critical and interdisciplinary study of race, gender, class, sexuality, Indigeneity, political economy, imperialism and social movements in contemporary, historical, hemispheric and transnational contexts. All majors are introduced to the field through AMST 1001 (“What Is American Studies?”) and hone their understanding of the theories and methods of American Studies in an intensive junior colloquium. Their individually-chosen electives include three historically-situated Foundations courses as well as a five-course concentration that culminates in a two-course senior capstone project. The American Studies major aims to teach students to recognize, question and analyze, within an international context, the formation, implementation and contestation of power in both the nation-state and in other institutions of collective life. We work closely with our colleagues in Africana Studies and Women's, Gender, and Sexuality Studies who, along with the faculty in American Studies, constitute Barnard's Consortium for Critical Interdisciplinary Studies. Spotlight on Faculty Work Empire's Tracks: Indigenous Nations, Chinese Workers, and the Transcontinental Railroad by Manu Karuka Empire’s Tracks (University of California Press, 2019) boldly reframes the history of the transcontinental railroad from the perspectives of the Cheyenne, Lakota, and Pawnee Native American tribes, and the Chinese migrants who toiled on its path. For more details, click here. Columbia Center for the Study of Social Difference Working Group Grant on Racial Capitalism Racial capitalism is a concept that delineates the interlinked relationships of race and class constitutive of global capitalism. The racial capitalism working group is a site of sustained collaborative research and study. Our collective work is rooted in a commitment to Black radicalism, historical materialism, feminism, and anti-imperialism. For more details, click here. Collaborative Archaeology of Indigenous Agriculture In 2018, Prof. Severin Fowles initiated a community-based archaeology project, working in partnership with Picuris Pueblo in New Mexico to document the tribe’s agricultural history. Barnard students work on tribal lands under the supervision of Picuris leaders, gathering data to assist the tribe in exploring their agrarian past and also in legally protecting the water rights that are central to their future. To participate, contact Prof. Fowles.
https://americanstudies.barnard.edu/
The song refers to the forcible removal and relocation of the Five Civilized Tribes, including the Cherokee people, from the southeastern states of Georgia, Florida, Mississippi and Alabama to the southern Indian Territory in present-day Oklahoma. The removal of these tribes throughout the 1830s is often referred to as the "Trail of Tears". The removal of the Cherokee, Chickasaw, Choctaw, Creek and Seminole came on the heels of President Andrew Jackson's key legislation, the Indian Removal Act of 1830. The Cherokee were the last of the Five Civilized Tribes to be removed, after signing the Treaty of New Echota. The removal caused great turmoil within the tribe, as members of the Treaty Party were marked for death by Principal Chief John Ross. During the American Civil War the Cherokee were divided between the Ross Faction and the Ridge Faction. The Ross Faction, which had not supported removal and was made up mostly of full-blood members of the tribe, remained loyal to the Union. The Ridge Faction, led by Stand Watie, was made up mostly of half-blood members of the tribe and, due to their southern ways (including owning slaves), sided with the Confederacy. Stand Watie became the last Confederate general to surrender, following the Battle of Doaksville. Following the Civil War, United States Indian policy turned to war and forced reservation life for the nations of the Great Plains. The Dawes Act of 1887 was adopted to allow the President to survey Indian lands and divide them up into individual allotments. Under the Dawes Act many Natives were "registered" with the Federal government. However, the law did not apply to the Five Civilized Tribes; instead the Dawes Commission was established in 1893 to convince members of the Five Civilized Tribes to adopt the individual allotments under the Dawes Act. Many Cherokee refused to be registered, and as a result another split in the Cherokee Nation occurred. Today the Cherokee maintain their Federal reservation in Oklahoma, with pockets living in their ancestral lands of Georgia.
https://www.primidi.com/indian_reservation_the_lament_of_the_cherokee_reservation_indian/historical_context
Celebrating Freedom: A Historical Narrative

This year, Parks & People had its first official Juneteenth observation. Since this is a relatively young national holiday, our Diversity Equity Inclusion Justice Committee thought it would be beneficial to our staff and board to host a lecture on the background of the holiday. We wished to provide everyone with historical context and an understanding of what exactly Juneteenth is, to examine its significance, and to follow with an open seminar.

We were blessed to host noted historian and Founder of the National Great Blacks in Wax Museum, Dr. Joanne Martin. Dr. Martin provided us with an incredibly enriching presentation entitled "Celebrating Freedom: A Historical Narrative" in which she took us through all of the national and regional liberation celebrations, including Circle Celebrations, Pinkster's Day, Maryland's Jubilee Day, and of course Juneteenth itself.

Dr. Martin reminded us that all citizens of this country are tied together through history, so it will never be enough to simply learn your own story. When we teach ourselves and future generations our shared histories, we gain vital context for our world today. Our staff gained valuable insight on power, race, and class, and we will use this education to inform our equity work. We offer a huge thank you to Dr. Martin for her time and talent!
https://www.parksandpeople.org/2022/06/21/parks-peoples-juneteenth-observation/
8) Strength, peace and security are considered to be the pillars of international relations. Elucidate.
7) What is permaculture? Can permaculture make agriculture sustainable? Examine.
6) What are biotoilets? Are they effective in managing the sanitation problem in trains? Examine the alternatives that Indian Railways can explore.
5) What is DNA barcoding? Discuss its applications, especially in conservation.
4) Why are pregnant women advised to gain weight during pregnancy? Examine the causes of high IMR and MMR in India.
3) Critically evaluate the functioning of the Monetary Policy Committee.
2) What are the important findings of the Global Burden of Disease study with respect to India? Discuss the significance of its findings to policymaking in the health sector for Indian states.
1) Who were 'criminal tribes'? What was the British policy on these tribes in colonial India? Examine.
https://www.insightsonindia.com/2017/11/25/
Synaptogenesis is the formation of synapses between neurons in the nervous system. Synapses are also formed with targets that are not part of the nervous system, e.g. muscle fibers. Although synaptogenesis occurs throughout a healthy person's lifespan, an explosion of synapse formation occurs during early brain development, known as exuberant synaptogenesis.

Psychology definition of synaptogenesis: the formation of synapses between neurons as the axons and dendrites grow. See also: the experience-dependent process; the experience-expectant process.

At the neuromuscular junction, the synapse itself is composed of three cells: the motor neuron, the muscle fiber and the Schwann cell. The entire synapse is sheathed within a myelin cover provided by the Schwann cell to insulate and encapsulate the junction. In the selective paths, the axons recognize the fiber type, either by factors or signals released specifically by the fast or slow-twitch muscle fibers. The pioneer axon is of crucial importance because the new axons that follow have a high propensity for forming contacts with well-established synapses.

A high concentration of AChR at the synapse is achieved through clustering of AChR, up-regulation of AChR gene transcription in the post-synaptic nuclei, and down-regulation of the AChR gene in the non-synaptic nuclei. Note that this affects gene transcription at a distance. The depolarization caused by AChR induces muscle contraction and simultaneously initiates repression of AChR gene transcription across the entire muscle membrane. Rapsyn contains domains that allow for AChR association and multimerization, and it is directly responsible for AChR clustering in the post-synaptic membrane. Wnt contribution to synaptogenesis has been verified in both the central nervous system and the neuromuscular junction.

Compared with other primates, in humans a relatively large proportion of brain-size growth takes place postnatally, allowing social and environmental factors to powerfully impact the establishment of neural connectivity [2–6]. Neocortical development in humans is characterized by an extended period of synaptic proliferation that peaks in mid-childhood, with subsequent pruning through early adulthood, as well as relatively delayed maturation of neuronal arborization in the prefrontal cortex compared with sensorimotor areas. It should be noted, however, that prior data suggesting a relative delay of synaptogenesis in the human prefrontal cortex are based on a single study of a modest sample that lacked statistical analysis.

Lack of human studies: there are no documented human studies suggesting that synaptogenesis occurs during adulthood. In a rodent study, it was found that adult rats that had been given difficult acrobatic training experienced synaptogenesis, whereas rats assigned to a physical exercise task or to inactivity failed to. Despite the fact that this study was conducted on rodent models, the effects were significant. Adults with lower levels of synapses often experience reductions in cognitive function compared to those with an increased number of synapses. There is evidence linking neurodegenerative diseases with synaptic deficits.

This near-infrared light reduces the size of brain lesions, minimizes inflammation, and induces neurogenesis.

One technique employed both stimulated emission depletion (STED) microscopy and photoactivated localization microscopy (PALM) on ultrathin sections for protein localization at the super-resolution nanoscale level, and subsequently correlated the protein localizations with ultrastructure by electron microscopy.

In the present study we examine the number and type of synapses in the molecular layer of the dentate gyrus in order to determine the extent of synaptogenesis at various developmental states. According to autoradiographic determinations of granule cell birth dates [2,3], there exist dorsal–ventral and possibly temporo-septal gradients in cell age.

Cited study titles: "Effects of unilateral and bilateral training in a reaching task on dendritic branching of neurons in the rat motor-sensory forelimb cortex"; "Number of parallel fiber synapses on an individual Purkinje cell in the cerebellum of the rat".
https://qybowuwikox.michaelferrisjr.com/a-study-on-synaptogenesis-2336eu.html
What Is Synaptogenesis?

The term "synaptogenesis" refers to the process that occurs when a new synapse is created within the central nervous system of an organism. The literal meaning of "synapse" is "to clasp." In less abstract terms, the synapse is where the end of one nerve cell, or neuron, meets another. Specifically, the synapse is where the end of the axon of one neuron meets the dendrite or cell body of the other. The synapse is particularly important as it is where information is passed from one cell to another. The various components of a synapse include the synaptic knob of the presynaptic neuron, the synaptic cleft, and the postsynaptic knob. The information is passed across the synaptic cleft either through chemical or electrical means, although chemical neurotransmitters are the most common method of transmission.

There are two different places along an axon where synaptogenesis occurs. If the presynaptic ends form along the length of the axon, the formation of the synapse is said to be en passant. A new synapse that forms at the end of the axon, on the other hand, is called terminaux.

When a neuron is carrying a nerve impulse towards or away from the brain, it rarely passes the information to just one neuron. In most cases, each neuron has many synapses meeting the dendrites or cell bodies of other neurons. Each neuron can have several synapses with a following neuron, and can also interact with several different neurons at the same time. If an organism has a high degree of synaptogenesis, it will have an increased number of synapses forming. With more synapses, the central nervous system can pass messages at a quicker rate. As a result, the more synaptogenesis there is, the faster and more efficient the central nervous system.

The process of synaptogenesis usually occurs throughout the lifespan of an organism. This does not mean that synapses are forming at the same rate all the time. In most cases, there is a much higher level of synaptogenesis occurring when the brain is developing during early life. This process is of particular importance when pathways are initially forming within the brain. When the brain is developing, there is competition between neurons to create strong pathways. Those that are used more often will develop into stronger pathways and have more neurons and synapses in them. Inhibiting, or not even using, a particular process during this developmental stage can lead to decreased numbers of synapses, and even the loss of neurons. This may result in problems later in life as the less-used processes may not go on to develop properly.
https://www.wise-geek.com/what-is-synaptogenesis.htm
Almost all of the neurons in the brain are generated before birth, during the first three months of pregnancy, and the newborn child's brain has a similar number of neurons to that of an adult. Many more neurons are formed than are needed, and only those which form active connections with other neurons survive. In the first year after birth the infant brain undergoes an intense phase of development, during which excessive numbers of connections between neurons are formed, and many of these excess connections have to be cut back through the process of synaptic pruning that follows. This pruning process is just as important a stage of development as the early rapid growth of connections between brain cells. The process during which large numbers of connections between neurons are formed is called synaptogenesis.

For vision and hearing (visual and auditory cortex), there is extensive early synaptogenesis. The density of connections peaks at around 150% of adult levels between four and 12 months, and the connections are then extensively pruned. Synaptic density returns to adult levels between two and four years in the visual cortex. For other areas such as the prefrontal cortex (thought to underpin planning and reasoning), density increases more slowly and peaks after the first year. Reduction to adult levels of density takes at least another 10–20 years; hence there is significant brain development in the frontal areas even in adolescence.

Brain metabolism (glucose uptake, which is an approximate index of synaptic functioning) is also above adult levels in the early years. Glucose uptake peaks at about 150% of adult levels somewhere around four to five years. By the age of around ten years, brain metabolism has reduced to adult levels for most cortical regions.

Brain development consists of bursts of synaptogenesis, peaks of density, and then synapse rearrangement and stabilisation. This occurs at different times and at different rates for different brain regions, which implies that there may be different sensitive periods for the development of different types of knowledge. Neuroscience research into early brain development has informed government education policy for children under three years old in many countries, including the USA and the United Kingdom. These policies have focused on enriching the environment of children during nursery and preschool years, exposing them to stimuli and experiences which are thought to maximise the learning potential of the young brain.
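The proliferation-then-pruning trajectory quoted above can be sketched as a toy curve. The sketch below is purely illustrative: the peak of roughly 150% of adult density between 4 and 12 months and the return to adult levels by about three years follow the visual-cortex figures in this passage, while the value at birth and the linear interpolation are assumptions standing in for the real, much smoother biology.

```python
import numpy as np

# Hypothetical milestones for visual-cortex synaptic density, expressed
# as a percentage of the adult level. The 150% peak between 4 and 12
# months and the return to adult levels by ~3 years follow the figures
# quoted above; the value at birth is a placeholder assumption.
ages_months = [0, 4, 12, 36, 120]
density_pct = [60, 150, 150, 100, 100]

def density_at(month: float) -> float:
    """Linearly interpolate the toy density curve at a given age."""
    return float(np.interp(month, ages_months, density_pct))

for m in (0, 6, 24, 48):
    print(f"{m:3d} months: ~{density_at(m):.0f}% of adult synaptic density")
```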
https://www.k12academics.com/education-theory/educational-neuroscience/early-brain-development
If it seems like every time you turn around your child has learned a new skill, you're not mistaken. These years are a period of extremely rapid brain development, ushered in by two simultaneous processes: synaptogenesis and myelination. Synaptogenesis links neurons together into sophisticated networks through the creation of new synapses in the brain. Meanwhile, myelination sheathes the nerves in a fatty, protective coating that enables faster transmission of brain signals. Here are some of the ways you may see brain development displayed in your child right now.

Intellectual
Children's brains grow in a preprogrammed, bottom-up sequence, that is, from the primitive sections of the brain to the more sophisticated sections. Experts agree that there are critical periods in brain development during which children are especially sensitive to the environment for learning. From there, cognitive advancements are a matter of stimulating the neurons and their connections, which strengthens them and broadens their functionality. While most children aren't reading by the age of 5, between 37 months and 5 years old they are learning to recognize letters and associate them with sounds.

Motor
As your child grows more agile, his brain continues to hone the processes that are key to balance and coordination. During the preschool years, he's also developing executive functions, abilities that are essential for more complex physical activities. Repetition is key to advancement; neural connections are strengthened with the use of both large- and small-muscle movements. At this age, fine motor skills also become more important, as your child learns to write, draw, build, and create in ways that require his hands to do exactly what his brain tells them to.

Emotional
The "use it or lose it" rule of brain development is key to understanding your child's socialization. Studies have shown dramatic differences between children who experience frequent interactions with parents and other caregivers and children who are raised with less stimulation. Social interactions reinforce the synaptic connections involved with language and other forms of communication and social expression, while those not used become weak and disappear with disuse. You will see similar changes in other areas of social interaction, particularly as your child begins to establish relationships with his peers.

Communication
Ever wonder why it's difficult to distinguish and understand sounds and syllables that aren't used in your native language? Your child's brain first forms the synaptic connections necessary to hear and produce all the sounds used by all languages around the world. But with use and disuse, these connections either become strengthened or are pruned away to favor those he hears and uses in his native language. Your child's communication skills take a quantum leap during these years. He becomes a much better listener and responds more readily when spoken to.
https://www.enfagrow.com.ph/development/milestones/3-years-old/your-child%27s-brain-development-37-months
Neurons communicate with each other in the brain through specialized junctions, called synapses. During brain development, numerous new synapses are established, and new synapses continue to form throughout life. The long-term goal of the research proposed in this application is to determine the molecular basis of synapse formation in the vertebrate brain. The first proteins have now been identified that organize synapse formation and development. One such protein is SynCAM 1, a synaptic cell adhesion molecule that connects pre- and postsynaptic sides. Importantly, SynCAM 1 induces the formation of new, fully functional excitatory synapses between neurons. It is highly expressed in the developing brain during intense synaptogenesis, indicating a broad function for this molecule in synapse formation. Such synaptogenic functions have been validated in cultured neurons and in vivo.

The objective of this application is to define the signaling pathways through which SynCAM 1 organizes synapses and determine how other trans-synaptic proteins act in concert with it. The central hypothesis of this application is that SynCAM signaling organizes developing synapses and regulates synaptic function at later stages. To attain the objective of this application, three specific aims will be pursued. The first aim is to determine the intracellular signaling pathways through which SynCAM 1-mediated synaptic adhesion instructs synapse development, focusing on changes in the synaptic cytoskeleton. Second, it will be analyzed how trans-synaptic interactions act in concert to assemble synapses and shape their structure. Third, it will be determined to which extent SynCAM 1 functions in vivo together with other synaptic adhesion molecules to organize synapses.

These experiments involve the biochemical characterization of SynCAM binding partners and their activities. Functional analyses of SynCAM interactions will be performed by quantitative immunocytochemistry, imaging of synapses in cultured hippocampal neurons, and electrophysiological recordings. In addition, the in vivo relevance of these interactions will be tested using structural and functional studies of synapses, including ultrastructural analyses, electrophysiological recordings, and behavioral analyses. Achieving these goals is important for human health, as altered synapse organization affects the wiring of neuronal circuits and synaptic plasticity. These changes are associated with alterations in human behavior, the ability to learn and remember, and addiction to drugs of abuse. Furthermore, deficits of synapse formation likely underlie neurodevelopmental disorders such as autism. In summary, this application aims to identify the molecular interactions involved in synapse formation. The progress under this application will allow testing to which extent these synapse-organizing processes are affected in disorders of the human brain and whether they represent novel points of therapeutic intervention.

Nerve cells communicate with each other in the brain through specialized junctions, called synapses. These junctions form in the human brain soon before birth, and changes in this process impair the wiring of the brain and can cause mental retardation. This research program is relevant to public health because it will analyze how nerve cells connect to each other in the healthy brain, allowing us to understand what steps go wrong in developmental disorders.
http://grantome.com/grant/NIH/R01-DA018928-08
The human brain forms synapses, microscopic connections between neurons, to record thoughts, memories and ideas. When 100 billion neurons need to find their connections, the biology behind the process is complex, to say the least.

Assistant Biochemistry Prof. Hisashi Umemori said many debilitating diseases, including autism, epilepsy and schizophrenia, could be linked to certain neurodevelopmental dysfunctions that occur when brain structures fail to properly mature. Umemori's research was published in the scientific journal Nature on Sept. 15. At the molecular level, these dysfunctions are caused by improper wiring of synapses. Recently, Umemori's lab identified an important new molecule, SIRP-alpha, which is involved in the process of synapse maturation in the brain, thus opening the door to possible therapeutic treatments.

"These diseases are caused by defects during synapse formation, so that's why understanding the steps of these molecules — by which the brain is formed — we hope to contribute to the treatment and prevention of those diseases," Umemori said.

The lab is exploring neuron connectivity and brain development, especially the pathways by which the brain systems become wired early in life. "Neurons are precisely connected to each other, meaning each neuron knows exactly where to connect," Umemori said. "We're interested in how such a precise network is formed."

Neuronal pathways in the brain are formed in two distinct steps, Umemori said. The first step, which begins at birth and continues until adolescence, establishes the initial connections between neurons and forms a preliminary network. In the second step, the connections are either reinforced or eliminated based on the amount of activity they encounter, establishing a "functional circuitry," Umemori said. "In the beginning, we usually have excess synapses, so we choose good ones," he said. "Active ones will be stabilized and inactive ones will be eliminated, so that we will basically have the most efficient circuitry in the brain."

While the lab's research involves development of the brain over time, the recent findings focus on the molecular mechanisms that underlie the process of synapse maturation in the second step. In particular, Umemori's lab has confirmed the role of a new molecule in this step: signal regulatory protein-alpha. SIRP-alpha travels between pre- and post-synaptic neurons, binding with specific receptors that tell the neuron to reinforce the synaptic connection. "SIRP is basically used as a communication tool between pre- and post-synaptic cells to tell them that this is an active synapse," Umemori said.

The molecule was discovered in 2010 in a research study focusing primarily on the first step of neuron development, but it wasn't until the lab's most recent publication that its importance in synaptic reinforcement was realized.

In the search for molecules involved in the synapse maturation process, Umemori's lab screened brain tissue samples in culture, placing neurons in contact with a variety of molecules thought to play a role in synapse development. After identifying cultures with active synapse formation, the tissue samples underwent a procedure known as biochemical purification, which separates molecules based on different characteristics such as size or charge.

In future studies, Umemori said, the lab hopes to analyze the effects of synapse dysfunction in schizophrenia using genetically modified mice, often called knockouts.
While these mice are thought to express schizophrenia, the lab plans to run behavioral studies to confirm the presence of this trait — or "phenotype" — and its link to synaptic development. "We have the knockout animals, and knockout animals do have synaptic changes, but we don't know if they have different phenotype yet," Umemori said. "If the animals show schizophrenic phenotype then we can try to treat (them) and see if that can be a disease model."

Additionally, future research in the lab will examine other areas of the brain, since the recent findings were isolated to specific regions like the hippocampus. Erin Johnson-Venkatesh, a postdoctoral research fellow in the lab, plans to expand the research to cover other aspects of synapse development. "The paper focuses only on excitatory synapses," Johnson-Venkatesh said. "Inhibitory synapses also may be affected, so I'm trying to figure out why and how."

Although the research has potentially broad implications for clinical treatments of neurodevelopmental diseases, Johnson-Venkatesh said the molecular processes tend to dominate the day-to-day focus of the lab. Only when a project reaches the publication stage does she come to fully realize the impact of such work. "You get really engrossed in a particular set of experience and sometimes you forget to even come up for air and all of a sudden … we need to write a paper and share these results," she said. "Usually at the beginning and the end you sort of think more larger picture, and in the middle you're just focused."

Two University alumni, Anna Toth and Lily Zhang, both contributed to this recent publication. Given their success in this field, Johnson-Venkatesh offered advice to undergraduates interested in pursuing research. "I think finding something you're interested in is probably the most important," she said. "And the second most important is finding an environment that you like working in … because then you're going to enjoy being there."

—Alexandra Soos and Madison Dettlinger contributed reporting.
https://www.michigandaily.com/uncategorized/sirp-alpha-found-play-key-role-neuron-maturation/
- Neurons in your brain are interconnected through synapses, thus constituting the so-called neural networks. Following a simplified but conceptually useful view, each synapse can be seen as a link between two given neurons, and it has a certain associated strength that gives an idea of how strongly one neuron influences the other.
- A synaptic pattern or neural pattern was the unique configuration of neurons and synapses in the brain. Since memories, thought patterns, and aspects of personality were encoded in this pattern, it was often considered to represent a person's consciousness.
- At birth, the mammalian brain contains excess connections, or synapses, between neurons. To form mature, precise neural circuits, the brain must remove these excess connections in an activity-dependent process called synaptic pruning.
- Neuroplasticity, also called brain plasticity, is the process in which your brain's neural synapses and pathways are altered as an effect of environmental, behavioral, and neural changes.
- Synapse: a specialized junction at which a neural cell (neuron) communicates with a target cell. At a synapse, a neuron releases a chemical transmitter that diffuses across a small gap and activates special sites called receptors on the target cell. The target cell may be another neuron or a specialized region of a muscle or secretory cell.
- Neurons and synapses: the brain is the source of thoughts, perceptions, emotions, memories and actions. Neural signaling, the foundation of brain activity, must be precisely regulated to prevent neuronal disorders that may cause Parkinson's disease, schizophrenia, compulsive behaviors and addiction.
- "Memristance can explain Spike-Time-Dependent Plasticity in Neural Synapses", Bernabé Linares-Barranco and Teresa Serrano-Gotarredona, CSIC (Spanish Research Council).
- Compared to electrical synapses, the probability that a presynaptic action potential will produce a postsynaptic action potential is termed the safety factor (a concept similar to security that will be discussed under neurotransmitter release); neither electrical nor chemical synapses are perfectly secure.
- A team of scientists from the Moscow Institute of Physics and Technology (MIPT) have created prototypes of electronic synapses based on ultra-thin films of hafnium oxide (HfO2).
- There are two broad types of synapse: electrical and chemical. An electrical synapse, also known as a gap junction, is a mechanical link between two neurons that allows for the conduction of electricity. Electrical synapses contain channels that allow charges (ions) to flow from one cell to another.
- Artificial synapses could lead to smarter AI: by replicating the function of the human brain's 100 trillion synapses, scientists hope to boost the versatility of artificial neural networks.
- Electrical synapses are found in vertebrate and invertebrate nervous systems. The cellular basis of these synapses is the gap junction, a group of intercellular channels that mediate direct communication between cells.
- Recognition of spike sequences is demonstrated after supervised training of a multiple-neuron network with resistive switching synapses. Due to its sensitivity to precise spike timing, the spatiotemporal neural network is able to mimic the sound-azimuth detection of the human brain.
- Neurons, synapses, and brain development: what happens to synapses during development, and why, are fundamental questions for modern neuroscience. As one prominent textbook, Eric Kandel and James Schwartz's Principles of Neural Science, says, "Behavior depends on the formation of appropriate interconnections among neurons in the brain."
- Synapses can weaken or strengthen in response to a number of factors, a phenomenon which is termed neural plasticity. Neurodegenerative diseases, such as Alzheimer's and Parkinson's, display a loss of synaptic function and a subsequent degradation of these structures.
- Knowing Neurons is an award-winning neuroscience education and outreach website that was created by young neuroscientists. The global team members at Knowing Neurons explain complicated ideas about the brain and mind clearly and accurately, using powerful images, infographics, and animations to enhance written content.
- Cocaine in the brain (image by NIDA): in the normal neural communication process, dopamine is released by a neuron into the synapse, where it can bind to dopamine receptors on neighboring neurons.
- Synapses can be thought of as converting an electrical signal (the action potential) into a chemical signal in the form of neurotransmitter release, and then, upon binding of the transmitter to the postsynaptic receptor, switching the signal back again into an electrical form, as charged ions flow into or out of the postsynaptic neuron.
- Spiking neural P systems (SN P systems) are models of computation inspired by biological spiking neurons. SN P systems have neurons as spike processors, which are placed on the nodes of a directed and static graph (the edges in the graph are the synapses).
- Synapses let neurons send signals to other neurons, or to non-neural cells such as muscle fibres. In contrast to other cellular signalling mechanisms, signal transmission through synapses is very fast: glutamatergic synapses, for instance, can generate a postsynaptic current in less than 0.5 ms after the arrival of the presynaptic action potential.
- In both the perforant path synapses from entorhinal cortex to dentate gyrus and the Schaffer collateral synapses from CA3 to CA1 pyramidal neurons, LTP follows learning rules first postulated by Hebb: it requires that presynaptic activity be closely followed by postsynaptic activity.
- Healthy development in the early years (particularly birth to three) provides the building blocks for educational achievement, economic productivity, responsible citizenship, lifelong health, strong communities, and successful parenting of the next generation.
- Signaling at neuro/immune synapses (Michael L. Dustin, Skirball Institute of Biomolecular Medicine, New York University School of Medicine): immunological and neural synapses share properties such as the synaptic cleft, adhesion molecules, and stability.
- In this way, the only cell that is acted upon is the cell with which the T cell forms the synapse, and this function is clearly analogous to that of the neural synapse. In the immunological synapse, the gasket is integrin dependent, whereas at the neural synapse cadherins and other molecules form the gasket.
- "Neural networks with dynamical synapses: from mixed-mode oscillations and spindles to chaos", KyoungEun Lee, A. V. Goltsev, M. A. Lopes and J. F. F. Mendes, Departamento de Física da Universidade de Aveiro, I3N, 3810-193 Aveiro, Portugal.
- Neural network (NN) VLSI implementations are massively parallel analog systems. The Multi-Layer Perceptron (MLP) trained by the back-propagation algorithm is one of the most important neural networks. It contains neurons and synapses, and each synapse needs a multiplier to multiply the input times the weight.
- A neural network is a collection of neurons with synapses connecting them. The collection is organized into three main parts: the input layer, the hidden layer, and the output layer. Note that you can have n hidden layers, with the term "deep learning" implying multiple hidden layers.
- With approximately 10^15 synapses in the human brain, the synapse is a critical component of neural circuitry. Hence, finding a simple, low-energy artificial synapse is an important step in making a neuromorphic computer that can approach the level of complexity of the human brain.
- There is an explosion of synapses, the connections that allow neurons to send and receive signals (The New York Times, Aug 22, 2014).
- We studied an autoassociative neural network with dynamic synapses which include a facilitating mechanism, and have developed a general mean-field framework to study the relevance of the different parameters defining the dynamics of the synapses.
- (Left) A schematic representation of a giant synapse from the mammalian auditory brainstem in lab cultures. Notice the large pre-synaptic terminal (green) surrounding the postsynaptic cell.
- The power to harness the synapses and neural impulses in the body: a combination of Bio-Electricity Manipulation and Nerve Manipulation. The user can sense and control neural impulses, the electrical discharges that travel along the nerve fibers within organisms.
- As the neurons mature, more and more synapses are made. At birth, the number of synapses per neuron is 2,500, but by age two or three it's about 15,000 synapses per neuron. This is like going from 100 to 600 friends on Facebook, where each of those friends is in turn connected to 600 more people! The neural network expands exponentially.
- These signals are conducted axonally through ascending pathways, across synapses, and finally to specific sites in the brain. Other neural cells in the brain process the coded signals, and direct the actions of muscles and other organs in response to the various sensory inputs.
- "Silent Synapses in Neural Plasticity: Current Evidence", Harold L. Atwood and J. Martin Wojtowicz, Department of Physiology, University of Toronto. Silent synapses are defined as structural specializations for neurotransmission that do not produce a physiological response.
- Synapses: a PyTorch implementation of Sparse Evolutionary Training (SET) for neural networks, based on research published by Mocanu et al., "Scalable Training of Artificial Neural Networks with Adaptive Sparse Connectivity inspired by Network Science".
- Possibly because of the movie Amélie, and possibly because people like to quote statistics or calculations without knowing what they mean, it's fairly easy to find some absurd claims about the number of synapses or neural connections in the human brain and how they compare to the number of stars/atoms/something else in the universe.
- Neural circuits are useful theoretical constructs to probe brain function but, as theoretical constructs, suffer from inherent limitations that need to be considered when evaluating the role of synapses in neural information processing. Two features of neural circuit organization in particular are notable.
- Synaptic pruning enables neural connections that no longer serve any purpose to be removed, while more useful ones are strengthened. Neural plasticity is crucial to the development of the brain, the formation of memories and the ability to learn from experience.
- The basic kinds of connections between neurons are chemical synapses and electrical gap junctions, through which either chemical or electrical impulses are communicated between neurons. Neural networks are primarily made up of axons, which in some cases deliver information as far as two meters.
- Rather than simulating a neural network with software, they made a device that behaves like the brain's synapses — the connection between neurons that processes and stores information — and completely overhauled our traditional idea of computing hardware.
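Several of the snippets above describe the same basic picture: a feed-forward network in which each synapse is a multiplier (input times weight) and neurons are arranged in input, hidden and output layers. A minimal sketch of that idea in Python follows; the layer sizes, the tanh nonlinearity and the random weights are arbitrary illustrative choices, not taken from any of the sources quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W, b):
    """One layer: each 'synapse' multiplies an input by a weight,
    and each 'neuron' sums its inputs and applies a nonlinearity."""
    return np.tanh(W @ x + b)

# Toy network: 3 inputs -> 4 hidden neurons -> 2 outputs.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

x = np.array([0.5, -1.0, 0.25])
hidden = layer(x, W1, b1)
output = layer(hidden, W2, b2)
print(output)
```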
http://barnet-ned.fun/938345163/1619/neural-synapses.html
Are new connections continuously forged in the adult human brain, or do we just start all wired together and then prune until we have something we like?

Comment: I believe it is a bit of both. But that will likely depend per brain region. – Robin Kramer

Answer: Development of patterns of synaptic connection contains elements of both outgrowth and pruning. For example, this paper from zebrafish development illustrates both those mechanisms, which appear to be tightly linked: Meyer MP, Smith SJ. 2006. Evidence From in Vivo Imaging That Synaptogenesis Guides the Growth and Branching of Axonal Arbors by Two Distinct Mechanisms. Journal of Neuroscience 26:3604–3614.

In more advanced animals (e.g. cats), there is also evidence for both outgrowth and pruning: Callaway EM, Katz LC. 1990. Emergence and Refinement of Clustered Horizontal Connections in Cat Striate Cortex. Journal of Neuroscience 10:1134–1153.

There is also evidence for generation of new neurons in adult humans, which implies that new synaptic connections must be forged to integrate newborn neurons into existing networks: Eriksson et al. 1998. Neurogenesis in the adult human hippocampus. Nature Medicine 4:1313–1317.

The phrase "fire together, wire together" comes from an explanation of Hebbian learning and refers to the adaptation of synapses in response to the firing of already connected neurons. This is one of several synaptic plasticity mechanisms. Two others that exist are long-term potentiation (strengthening and creation) and long-term depression (weakening and destruction). All of these mechanisms are still being researched heavily to understand their detail better. However, I hope this gives you a good starting point.
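The answer above names the mechanisms: Hebbian learning ("fire together, wire together"), with LTP strengthening connections and LTD weakening them. Below is a minimal rate-based sketch of a Hebbian update; the learning rate, the decay term standing in for LTD, and the toy firing patterns are all illustrative assumptions, not a model taken from the cited papers.

```python
import numpy as np

def hebbian_step(w, pre, post, lr=0.01, decay=0.001):
    """Strengthen weights where pre- and postsynaptic rates are
    jointly high (LTP-like); the decay term weakens unused
    connections (a crude stand-in for LTD/pruning)."""
    return w + lr * np.outer(post, pre) - decay * w

w = np.zeros((2, 3))              # 3 presynaptic -> 2 postsynaptic cells
pre = np.array([1.0, 0.0, 1.0])   # firing pattern of presynaptic cells
post = np.array([1.0, 0.0])       # firing pattern of postsynaptic cells

for _ in range(100):
    w = hebbian_step(w, pre, post)
print(np.round(w, 3))             # co-active pairs end up strongest
```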
https://psychology.stackexchange.com/questions/15404/can-fire-together-wire-together-be-undone
The Social Brain of a Teenager

It has been known for many decades that the brain undergoes critical periods of development during the early years. However, only recently has it been discovered that brain development does not stop after early childhood. Indeed, new evidence shows that certain regions of the human brain continue to develop during adolescence and beyond. Some of these regions, in particular the prefrontal cortex (PFC) and superior temporal cortex, are involved in social cognitive processes, such as understanding others' minds. Thus, it might be expected that certain social cognitive processes undergo refinement during adolescence. While there is a mass of self-report data on social development during puberty and adolescence, until recently very little empirical research had investigated social cognitive development after childhood. Here, I describe studies that fill that gap.

Early brain development – A brief history

An adult brain has about 100 billion neurons; at birth, slightly fewer. However, during development many changes take place in the brain. Neurons grow, which accounts for some of the change, but it is the wiring of connections between cells (synapses) that undergoes the most significant transformation. Early in development, the brain begins to form new synapses, so that the synaptic density (the number of synapses per unit volume of brain tissue) in young animals greatly exceeds adult levels. This process of synaptic proliferation, called synaptogenesis, lasts up to several months, depending on the species of animal. It is followed by a period of synaptic elimination (or pruning) in which frequently used connections are strengthened, and infrequently used connections are eliminated. This experience-dependent process, which occurs over a period of years, reduces the overall synaptic density to adult levels.

Research on rhesus monkeys demonstrated that synaptic densities reach maximal levels two to four months after birth, after which time pruning begins. Synaptic densities gradually decline to adult levels at around three years of age, around the time monkeys reach sexual maturity (Rakic, 1995). Educational literature often suggests that the crucial phase of brain development in humans occurs from birth to three years and that during this time children should be exposed to all sorts of learning experiences. However, this claim makes the assumption that the time course of synaptogenesis is the same for humans as it is for rhesus monkeys. In the next section I describe the first experiments that looked at development of the human brain.

Human brain development

Cellular studies

The time course of synaptogenesis and synaptic pruning is different for different brain areas, and different classes of neurons in the same brain region gain and lose synapses at different rates. Moreover, brain development varies between species. The only available data on the development of the human brain suggest that synaptogenesis follows a different time course from that in animals. In the human visual cortex, there is a rapid increase in the number of synaptic connections at around two or three months of age, which reaches a peak at eight to 10 months. After that there is a steady decline in synaptic density until it stabilises at around age 10 years and remains at this level throughout adult life (Huttenlocher, 1979). Different areas of the human brain develop at different rates.
In the human frontal cortex – the brain area responsible for planning, integrating information and decision making – synaptogenesis occurs later and the pruning process takes longer than in the visual cortex. In this area, neuronal development continues throughout adolescence: synaptic densities peak at around age 11 and then decline during adolescence and into the twenties (Huttenlocher, 1979).

MRI studies

The scarcity of post-mortem brains meant that knowledge of the adolescent brain was until recently extremely scanty. However, since the advent of magnetic resonance imaging (MRI), a number of brain-imaging studies, using large samples of participants, have provided further evidence of the ongoing cortical maturation into adolescence and even into adulthood. One of the most consistent findings from these MRI studies is that there is a linear increase in white matter, and a net decrease in grey matter, in certain brain regions during childhood and adolescence. These changes are most significant in frontal and parietal regions (Giedd et al., 1999; Sowell et al., 1999). The rise in white matter with age reflects an increase in the myelin sheathing surrounding axons in the frontal cortex.

While the increase in white matter is linear, the changes in grey matter density appear not to be. In one of the first MRI studies of human brain development, Giedd et al. (1999) scanned 145 healthy boys and girls ranging in age from about four to 22 years. The volume of grey matter in the frontal lobes increased during childhood, with a peak occurring at around 12 years for males and 11 years for females. This was followed by a decline during adolescence. A similar non-linear pattern was found for other cortical regions including parietal and temporal lobes. These findings have been replicated by a number of MRI studies (e.g. Gogtay et al., 2004). Generally, it has been found that brain areas associated with more basic motor and sensory functions mature first, followed by brain regions related to higher cognitive function.

The non-linear pattern of grey matter development during adolescence has been interpreted as reflecting, at least in part, the synaptic reorganisation that occurs at the onset of and after puberty (Huttenlocher, 1979). Thus, the peak in grey matter at the onset of puberty (Giedd et al., 1999) is thought to reflect a wave of synapse proliferation, which is followed by synaptic pruning during adolescence. Given that certain brain areas are subject to protracted development, it might be predicted that cognitive abilities that depend on the functioning of these brain areas would also develop. In the next section, I consider the implications of structural brain development during adolescence for cognition.

Social cognitive development during adolescence

Cognitive abilities that rely on the brain regions that undergo the most protracted development – including PFC and superior temporal sulcus (STS) – include executive function and social cognition. There is evidence that a variety of executive function abilities undergo refinement during adolescence (e.g. Anderson et al., 2001; see Blakemore & Choudhury, 2006, for a review). Here, I focus on the development of social cognition during adolescence.

Theory of mind, or mentalising, refers to the inferences that we naturally make about other people's intentions, beliefs and desires, which we then use to predict their behaviour.
A number of neuroimaging studies, using a wide range of tasks, have reported activation in a highly circumscribed 'mentalising network', comprising the medial PFC, the STS and temporo-parietal junction (TPJ), and the temporal poles (see Frith & Frith, 2006, for a review). Lesion studies have also implicated the frontal cortex and STS/TPJ in mentalising (see Apperly et al., 2005, for a review). It has been known for many decades that the ability to attribute mental states develops over the first few years of life, culminating in the ability to pass complex false belief tasks by about age four or five (Barresi & Moore, 1996). While typically developing children are able to pass theory of mind tasks by five, the brain structures that underlie mentalising undergo substantial development beyond early childhood. Yet the development of social cognitive abilities such as mentalising after early childhood has been neglected. I now turn to two recent studies that have investigated development of social cognition during adolescence.

The first concerns perspective-taking ability, a skill that is crucial for successful social communication. In order to reason about others, and understand what they think, feel or believe, it is necessary to step into their 'mental shoes' and take their perspective. Perspective taking includes awareness of one's own mental states ('first-person perspective') and requires the ability to ascribe viewpoints, mental states or emotions to another person ('third-person perspective'). Functional neuroimaging studies have revealed that medial PFC, inferior parietal lobe (IPL) and STS are associated with making the distinction between third and first person (e.g. Ruby & Decety, 2001, 2004).

We recently investigated development of perspective taking during adolescence (Choudhury et al., 2006). We tested pre-adolescent children (mean age 9 years), adolescents (mean age 13 years) and adults (mean age 24 years) on a perspective-taking task that required participants to imagine either how they would feel (first person) or how a protagonist (third person) would feel in various scenarios. Participants were asked to choose one of two possible emotional faces in answer to each question, as quickly as possible. Each participant's reaction time difference between first and third person perspective was calculated. The results showed that this reaction time difference decreased significantly with age. This finding suggests that the efficiency of perspective taking develops during adolescence, perhaps in parallel with the underlying neural circuitry. Whether this response pattern is because younger participants found it difficult to differentiate between the first and third person, or younger children are less inclined, or find it more difficult, to enter into another person's 'mental shoes', requires further investigation. The differences between age groups may also be influenced by differences in social experience. Perhaps adults show no significant difference between the time it took them to answer first and third person perspective questions as a result of their mature neural circuitry supporting social cognition, as well as their greater social experience.

Next we turned our attention to the development of intention understanding. In a recent fMRI study, we investigated how the functioning of the mentalising brain network changes with age (Blakemore et al., in press). The aim of the study was to explore adolescent development of the brain regions involved in thinking about intentions.
A group of adolescents (mean age 15) and a group of adults (mean age 28) responded to scenarios related either to their own intentions and consequential actions (intentional causality) or to physical events and their consequences (physical causality). We investigated how activity during these tasks in the adult brain compares with activity in the adolescent brain. The results showed that both groups recruit the mentalising network (medial PFC, STS/TPJ and temporal poles) during intentional causality relative to physical causality. However, adolescents activated the medial PFC part of this network to a significantly greater extent than did adults. Activity in a particular part of medial PFC during intentional causality occurred only in the adolescent group. This suggests that adolescents use additional regions of the medial PFC to achieve the same performance as the adults. The results imply that the demand on medial PFC circuitry during mentalising tasks is higher in adolescence than in adulthood.

One possible explanation is that cortical development, in particular grey matter reorganisation in the PFC (e.g. Giedd et al., 1999; Gogtay et al., 2004), mediates this developmental change in medial PFC recruitment. As described above, it has been suggested that the loss of grey matter in PFC during adolescence reflects synaptic pruning. It is unknown whether the synaptic pruning that occurs during human adolescence in parts of the brain (including the PFC) fine-tunes neural tissue into specialised networks in the same way as during early development. If this is the case, then such regions may not function as efficiently in adolescents as in adults. As a result, it is possible that they contain less efficient connections, which may result in more widespread, diffuse activity for tasks that involve processing in these areas. The results of our study suggest that adolescents require more activity in medial PFC when using mental-state representations during the intentional causality task.

Part of the right STS, which, like medial PFC, is part of the mentalising network, was activated by intentional causality for adults only. This suggests that activity within the mentalising network shifts from anterior (PFC) regions to posterior (STS) regions with age over the period of adolescence.

Many unanswered questions

While children start to pass explicit theory of mind tasks by five years, the data described in this review suggest that the neural basis of theory of mind continues to develop well past early childhood. Social cognitive development during adolescence is a new and rapidly expanding field and yet many questions remain unanswered. The relative roles of hormones, culture and the social environment on the development of the social brain are unknown. Future research is needed to disentangle the contributions of biological and environmental factors to the developing social brain.

- Dr Sarah-Jayne Blakemore is at the Institute of Cognitive Neuroscience, University College London. E-mail: [email protected].
Weblinks
UCL Institute of Cognitive Neuroscience, Developmental Group: www.icn.ucl.ac.uk/research-groups/Developmental-Group/index.php
Sarah-Jayne Blakemore's homepage, including The Learning Brain: Lessons for Education: www.icn.ucl.ac.uk/sblakemore
New Scientist report: www.newscientist.com/channel/being-human/teenagers
YoungMinds: www.youngminds.org.uk
TeenIssues: www.teenissues.co.uk

Discuss and debate
How does brain development during adolescence interact with hormonal and social environmental changes occurring during this period of life?
Is synaptic pruning during adolescence susceptible to environmental influence as it is during early development?
Recent studies have shown that cannabis consumption during adolescence increases the risk of developing psychosis (e.g. Arseneault et al., 2002). How exactly does cannabis affect adolescent brain development?

References
Apperly, I.A., Samson, D. & Humphreys, G.W. (2005). Domain-specificity and theory of mind. Trends in Cognitive Science, 9(12), 572–577.
Anderson, V., Anderson, P., Northam, E., Jacobs, R. & Catroppa, C. (2001). Development of executive functions through late childhood and adolescence in an Australian sample. Developmental Neuropsychology, 20, 385–406.
Arseneault, L., Cannon, M., Poulton, R. et al. (2002). Cannabis use in adolescence and risk for adult psychosis: longitudinal prospective study. British Medical Journal, 325, 1212–1213.
Barresi, J. & Moore, C. (1996). Intentional relations and social understanding. Behavioral and Brain Sciences, 19, 107–154.
Blakemore, S-J. & Choudhury, S. (2006). Development of the adolescent brain: Implications for executive function and social cognition. Journal of Child Psychology and Psychiatry, 47(3–4), 296–312.
Blakemore, S-J., Ouden, H.E.M. den, Choudhury, S. & Frith, C. (in press). Adolescent development of the neural circuitry for thinking about intentions. Social Cognitive and Affective Neuroscience.
Choudhury, S., Blakemore, S-J. & Charman, T. (2006). Social cognitive development during adolescence. Social Cognitive and Affective Neuroscience, 1(3), 163–164.
Frith, C.D. & Frith, U. (2006). The neural basis of mentalizing. Neuron, 50(4), 531–534.
Giedd, J.N., Blumenthal, J., Jeffries, N.O. et al. (1999). Brain development during childhood and adolescence: A longitudinal MRI study. Nature Neuroscience, 2, 861–863.
Gogtay, N., Giedd, J.N., Lusk, L. et al. (2004). Dynamic mapping of human cortical development during childhood through early adulthood. Proceedings of the National Academy of Sciences, USA, 101, 8174–8179.
Huttenlocher, P.R. (1979). Synaptic density in human frontal cortex: Developmental changes and effect of aging. Brain Research, 163, 195–205.
Rakic, P. (1995). Corticogenesis in human and nonhuman primates. In M.S. Gazzaniga (Ed.) The Cognitive Neurosciences (pp. 127–145). Cambridge, MA: MIT Press.
Ruby, P. & Decety, J. (2001). Effect of subjective perspective taking during simulation of action: A PET investigation of agency. Nature Neuroscience, 4, 546–550.
Ruby, P. & Decety, J. (2004). How would you feel versus how do you think she would feel? A neuroimaging study of perspective-taking with social emotions. Journal of Cognitive Neuroscience, 16, 988–999.
Sowell, E.R., Thompson, P.M., Holmes, C.J. et al. (1999).
Localizing age-related changes in brain structure between childhood and adolescence using statistical parametric mapping. Neuroimage, 9, 587–597.
https://thepsychologist.bps.org.uk/volume-20/edition-10/social-brain-teenager
Perhaps the most remarkable feature of the developing brain is its ability to self-organize into functional circuits. How does this happen? We know that the formation of appropriate connections requires the targeting of axons and dendrites to specific regions of the brain and the selection of appropriate synaptic partners within those regions. Our lab is interested in understanding the mechanisms that mediate these decisions, and in the past few years we have identified a number of extracellular factors, as well as transcription factors, that regulate different aspects of hippocampal and cortical connectivity. The major areas of current research are described below.

While there has been significant progress in our understanding of the molecular control of axonal and dendritic development during the last decade, we know very little about the mechanisms that allow neurons to select appropriate synaptic partners. This fundamental unsolved problem in neural development is investigated in several projects in the lab. We use the hippocampus as a model system to study the mechanisms of synaptic specificity because of its highly structured connectivity, and we use electrophysiological and molecular approaches to identify and characterize molecules that regulate synaptic specificity during the establishment of neural circuits.

The establishment of functional neuronal circuits relies on the formation of excess synapses, followed by the elimination of inappropriate connections. Although the stabilization of presynaptic inputs is critical for the development and maintenance of functional circuits, the signals that regulate presynaptic stability are not known. We have found that synapse formation in cortical and hippocampal cultures is highly dynamic and involves the stabilization of a subset of synapses against a backdrop of a high rate of synapse formation and elimination. During the peak of synaptogenesis, only about 50% of putative synapses are stable over an hour. We have found that presynaptic stability is strongly correlated with the presence of postsynaptic AMPA, but not NMDA, receptors, and we are examining the mechanisms by which postsynaptic AMPA receptors regulate presynaptic stability.

Once the initial connections are formed, neuronal activity exerts a major influence on the organization of neuronal circuits by regulating changes in synaptic strength. At many synapses, the direction and extent of the change in synaptic strength depend on the stimulus parameters. For example, at the CA3-CA1 Schaffer collateral synapse in the hippocampus, low-frequency stimulation leads to long-term depression (LTD) and high-frequency stimulation leads to long-term potentiation (LTP). The relationship between stimulus frequency and change in synaptic strength is often depicted by a function called the Bienenstock, Cooper, Munro (BCM) function. This function can itself be modified by various manipulations, and the resulting shift in the BCM function is a measure of metaplasticity. Despite the importance of metaplasticity, the molecular mechanisms that regulate it are not well understood. We are examining the hypothesis that activity-dependent transcription plays a critical role in regulating the metaplasticity of synapses by controlling the AMPA:NMDA ratio. We have found that CREST, a transcription factor cloned in our lab, exerts a significant influence in regulating AMPA and NMDA receptor levels.
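For reference, the BCM function mentioned above has a standard textbook form (Bienenstock, Cooper & Munro, 1982). The sketch below is that canonical formulation, with one common choice of threshold dynamics; it is intended only to make the "sliding threshold" idea concrete, not to stand in for the particular model variant used in any given experiment:

\[
\frac{dw_i}{dt} = \phi(y, \theta_M)\, x_i,
\qquad
\phi(y, \theta_M) = y\,(y - \theta_M),
\qquad
\theta_M \propto \langle y^{2} \rangle
\]

Here \(x_i\) is the presynaptic input, \(y\) the postsynaptic response, and \(\theta_M\) the sliding modification threshold. Responses below \(\theta_M\) weaken the synapse (LTD) and responses above it strengthen the synapse (LTP). Because \(\theta_M\) tracks the recent average of \(y^2\), the LTD/LTP crossover point itself moves with activity history; in these terms, metaplasticity is a shift of \(\theta_M\), and a change in the AMPA:NMDA ratio is one plausible way such a shift could be implemented.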
Another area of recent investigation in the lab is the use of molecular genetic approaches to study the role of specific cell types in the development of cortical circuits. We are exploring the possibility that GABAergic inputs, which regulate postsynaptic depolarization with great spatial and temporal precision, play a critical role in determining cortical connectivity by influencing input selectivity. GABAergic inputs are ideally suited to mediate input selectivity, since different classes of GABAergic neurons innervate different parts of the principal (pyramidal) neurons and serve different functions. While the dendritic inhibitory inputs locally regulate the amplitude of excitatory postsynaptic potentials (EPSPs), the perisomatic inhibitory inputs regulate the overall firing rates of neurons. In principle each of these inputs could influence which excitatory inputs are stabilized, but whether they serve such a function is not known. By selectively silencing the activity of different GABAergic subpopulations, we should be able to assess their contribution to the function of cortical circuits.

A new area of research in our lab concerns the differentiation of human ES and iPS cells into forebrain neurons, and the use of these neurons to develop stem cell-based models of neurological and psychiatric disorders. Our specific interests are in developing models of synaptic dysfunction associated with autism and Alzheimer's disease.

Selected publications

Ghosh, A., Carnahan, J. & Greenberg, M.E. (1994). Requirement for BDNF in activity-dependent survival of cortical neurons. Science 263:1618–1623.
Shieh, P.B., Hu, S.-C., Timmusk, T. & Ghosh, A. (1998). Identification of a signaling pathway involved in calcium regulation of BDNF expression. Neuron 20:727–740.
Polleux, F., Giger, R.J., Ginty, D.D., Kolodkin, A.L. & Ghosh, A. (1998). Patterning of cortical efferent projections by semaphorin-neuropilin interactions. Science 282:1904–1906.
Polleux, F., Morrow, T. & Ghosh, A. (2000). Semaphorin 3A is a chemoattractant for developing cortical dendrites. Nature 404:567–573 (research article; cover).
Redmond, L., Kashani, A. & Ghosh, A. (2002). Calcium regulation of dendritic growth via CaM kinase IV and CREB-mediated transcription. Neuron 34:999–1010.
Aizawa, H., Hu, S.-C., Bobb, K., Balakrishnan, K., Ince, G., Gurevich, I., Cowan, M. & Ghosh, A. (2004). Dendrite development regulated by CREST, a calcium-regulated transcription activator. Science 303:197–202 (research article; cover).
Ince-Dunn, G., Hall, B.H., Hu, S.-C., Ripley, B., Huganir, R.L., Olson, J.M. & Ghosh, A. (2006). Regulation of thalamocortical patterning and synaptic maturation by NeuroD2. Neuron 49:683–695 (cover).
Qiu, Z. & Ghosh, A. (2008). A calcium-dependent switch in a CREST-BRG1 complex regulates activity-dependent gene expression. Neuron 60:775–787.
de Wit, J., Sylwestrak, E., O'Sullivan, M.L., Otto, S., Tiglio, K., Savas, J.N., Yates, J.R. III, Comoletti, D., Taylor, P. & Ghosh, A. (2009). LRRTM2 interacts with Neurexin1 and regulates excitatory synapse formation. Neuron 64:799–806.
https://medschool.ucsd.edu/education/neurograd/faculty/Pages/anirvan-ghosh.aspx
SANTA CRUZ, CA. New connections begin to form between brain cells almost immediately as animals learn a new task, according to a study published this week in Nature. Led by researchers at the University of California, Santa Cruz, the study involved detailed observations of the rewiring processes that take place in the brain during motor learning.

The researchers studied mice as they were trained to reach through a slot to get a seed. They observed rapid growth of structures that form connections (called synapses) between nerve cells in the motor cortex, the brain layer that controls muscle movements. "We found very quick and robust synapse formation almost immediately, within one hour of the start of training," said Yi Zuo, assistant professor of molecular, cell and developmental biology at UCSC.

Zuo's team observed the formation of structures called "dendritic spines" that grow on pyramidal neurons in the motor cortex. The dendritic spines form synapses with other nerve cells; at those synapses, the pyramidal neurons receive input from other brain regions involved in motor memories and muscle movements. The researchers found that growth of new dendritic spines was followed by selective elimination of pre-existing spines, so that the overall density of spines returned to the original level.

Understanding the basis for such long-lasting memories is an important goal for neuroscientists, with implications for efforts to help patients recover abilities lost due to stroke or other injuries. "We initiated the motor learning studies to understand the process that takes place after a stroke, when patients have to relearn how to do certain things. We want to find out if there are things we can do to speed up the recovery process," Zuo said.

The lead authors of the Nature paper, Tonghui Xu and Xinzhu Yu, are a postdoctoral researcher and doctoral student, respectively, in Zuo's lab at UCSC. Coauthors include Andrew Perlik, Willie Tobin, and Jonathan Zweig of UCSC and Kelly Tennant and Theresa Jones of the University of Texas, Austin.

The study used mice that had been genetically altered to make a fluorescent protein within certain neurons in the brain. The researchers were then able to use a special microscopy technique (two-photon microscopy) to obtain clear images of those neurons near the surface of the brain. This noninvasive imaging technique enabled them to view changes in individual brain cells of the mice before, during, and after the mice were trained in the seed-reaching task.

Results from the study suggested that the newly formed dendritic spines are initially unstable and undergo a prolonged selection process during the course of training before being converted into stable synapses. When previously trained mice were reintroduced to the reaching task four months later, their skill at the task remained high, and images of their brains did not show increased spine formation. When previously trained mice were taught a new skill, however, they showed enhanced spine formation and elimination similar to that seen during the initial training. Furthermore, spines that had formed during the initial training persisted after the remodeling process that accompanied the learning of a new task. These findings suggest that different motor behaviors are stored using different sets of synapses in the brain, Zuo said.

One of the questions she would like to explore in future studies is how these findings apply to different types of learning. "In China, where I grew up, we memorize a lot in school.
What are the changes that take place in the brain during learning and memorizing, and what are the best ways to consolidate those memories? We don’t really know the best way to learn and memorize,” she said. This work was supported by grants from the Ellison Medical Foundation, the DANA Foundation, and the National Institute on Aging.
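As a back-of-the-envelope illustration of the dynamic the study describes (new spines forming while pre-existing ones are selectively eliminated, leaving overall density unchanged), the toy simulation below replaces a few randomly chosen spines per training step. Every number in it (population size, turnover rate) is an arbitrary choice for the example, not a value from the paper, and it compresses the paper's form-then-prune sequence into single swap events:

import random

random.seed(0)  # reproducible toy run

# Toy model of spine turnover at constant density: each training step
# replaces a few randomly chosen spines with newly formed ones, so the
# total count never changes while the population's composition turns over.
# (Replacing an already-"new" spine loosely mirrors the finding that newly
# formed spines are themselves initially unstable.)
spines = ["old"] * 1000        # arbitrary starting population size
turnover_per_step = 30         # arbitrary formation/elimination rate

for step in range(1, 11):
    for _ in range(turnover_per_step):
        # one spine eliminated, one formed in its place
        spines[random.randrange(len(spines))] = "new"
    new_fraction = spines.count("new") / len(spines)
    print(f"step {step:2d}: density = {len(spines)} spines, new = {new_fraction:.1%}")

The printed "new" fraction climbs while the density stays fixed, which is the qualitative signature reported above: learning rewires which synapses exist without changing how many there are.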
https://scienceblog.com/27606/study-shows-new-brain-connections-form-rapidly-during-motor-learning/
The human brain is perhaps the most fantastic machine in the universe. It has 100 billion (10^11) cells, or neurons, and each neuron interconnects with hundreds of other neurons via, on average, 10 thousand (10^4) connections, or synapses. The brain thus has on the order of 10^15 synapses; this staggering number of connections is one reason for the complexity of brain processing. Another is the precision of the wiring between neurons that forms networks and modules. Neurons are not simply interconnected with any and all other neurons; rather, they make precise connections with a subset of cells and form networks that process information. Networks are the engine of the brain, for they transform simple inputs into complex outputs, often via nonlinear operations. The brain has been compared to a computer, but differs in one fundamental respect: the brain wires itself. Understanding how the brain is wired may be key to understanding how it gives rise to the mind and creates intelligence. Research in our laboratory demonstrates that specificity and plasticity are both fundamental requirements for brain wiring. The underlying mechanisms provide critical clues for repairing the brain after damage or disease.

Short bio

Mriganka Sur is the Newton Professor of Neuroscience, Head of the Department of Brain and Cognitive Sciences, and Director of the Simons Initiative on Autism and the Brain at the Massachusetts Institute of Technology (MIT). Professor Sur studies the organization, development and plasticity of the cerebral cortex of the brain using experimental and theoretical approaches. He has discovered fundamental principles by which networks of the cerebral cortex are wired during development and change dynamically during learning. His laboratory has identified gene networks underlying cortical plasticity, and pioneered high-resolution imaging methods to study cells, synapses and circuits of the intact brain. Recently, his group has demonstrated novel mechanisms underlying disorders of brain development, and proposed innovative strategies for treating such disorders.

Professor Sur received the B.Tech. degree in Electrical Engineering from the Indian Institute of Technology, Kanpur, in 1974 and the Ph.D. degree in Electrical Engineering from Vanderbilt University, Nashville, in 1978. He has received numerous awards and honors, including the Charles Judson Herrick Award of the American Association of Anatomists, the A.P. Sloan Fellowship, the McKnight Development Award, the Hans-Lukas Teuber Scholar Award, the Distinguished Alumnus Award of the Indian Institute of Technology, Kanpur, the Sigma Xi Lectureship, and the Foundation Day Medal of the National Brain Research Center, India. At MIT, he has received awards for outstanding teaching and has been recognized with the Sherman Fairchild and Newton Chairs. He has been elected Fellow of the Royal Society of the UK, and a member of the Institute of Medicine of the National Academies, the American Academy of Arts and Sciences, the American Association for the Advancement of Science, the Neuroscience Research Program, the National Academy of Sciences, India, the Rodin Academy, Sweden, and the Third World Academy of Sciences.
http://brain-mind-institute.org/mriganka-sur.html
Editorial Design Inspiration: White

Time for some editorial design inspiration: first, because it's been a long time since the last post on this topic, at least by me, and second because this project I saw on Behance is just beautiful.

08.27.19

Shrenik Ganatra and Ninad Kale shared a beautiful editorial design project on their Behance profiles, commissioned for a design event called Design Fabric Fest. Working in an awesome graphic design style, they played with bold typography and really nice graphic elements to create the booklet. The colors are also very well chosen, mixing a vibrant red with dark purple and yellow.

There is a lot to love about this project. It reminds me of when I started my design career in college, where our focus and dream was to create printed pieces like this one. For more information about Shrenik and Ninad, make sure to check out their Behance profiles.
https://abduzeedo.com/node/84963
Throughout your research on the Committee for Dynamic Success, you and your fellow colleagues identified that one of the most important ways to establish an ethical workplace is through training, both during the onboarding process and on a continuous cycle. With this in mind, the President has asked your committee to create a basic presentation entitled Ethics 101. Your presentation should cover the following topics:
- Introduction: Why do you need ethics training?
- Identify 5 current ethical issues. For each issue, include the following information:
  - Define the issue: how does it take place, who are the usual perpetrators, and where does it typically occur?
  - How does the issue impact the business, the employees, the stakeholders, etc.?
  - What recommendations would you make to resolve the unethical conduct?
- Define conflict and its causes in the workplace.
- How does unresolved/unaddressed conflict lead to unethical actions?
- Provide a basic roadmap/plan that any employee could follow if they encounter a conflict with an employee from within their department.
- 10 rules for ethical conduct for Luxor employees
- Conclusion: What should you do if you see unethical conduct?
Your presentation should be a minimum of 12 slides, including a title slide as well as a references slide containing APA citations for the information you used throughout the presentation. You will need to include research in the body of the presentation by using in-text citations. Feel free to be as creative as you want. Use photos, charts, or even embed videos; just make sure that you are clearly conveying your message on your slides and that you cite your sources (including multimedia) according to APA style.
You will need to include accompanying "reader/presenter" notes in the notes section below each slide. These are the key notes that describe, explain and teach what the slide is trying to convey. Please note that PowerPoint slides are generally not wordy, and paragraphs should not be included on a slide. The slide itself is a visual for the audience; the notes section, which would be read by the presenter, is where the substance comes in.
General PowerPoint tips:
- Fonts
  - Avoid fonts that are difficult to read.
  - Use no font size smaller than 24 point.
  - Use different colors, sizes and styles (bold, underline) for impact.
  - No more than 6-8 words per line.
  - For bullet points, use the 6 x 6 rule: one thought per line, with no more than 6 words per line and no more than 6 lines per slide.
  - Use dark text on a light background or light text on a dark background.
  - To test the font, stand back six feet from the monitor and see if you can read the slide.
- Graphics and design
  - Keep the background consistent and subtle.
  - When using charts or graphs, use only enough text to clearly label the graphic.
  - Keep the design clean and uncluttered.
  - Try to use the same style of graphics throughout the presentation (e.g. cartoons, photographs).
  - Limit the number of graphics on each slide.
- Color
  - Limit the number of colors on a single screen.
  - Bright colors make small objects and thin lines stand out; however, some vibrant colors are difficult to read when projected.
  - Use no more than four colors on one chart.
Make sure you have a reference page as well. If you are using the book, please include page numbers, and if you are using information from a website, please make sure you include the site as well.
https://varsitytermpapers.com/05-course-project-final-deliverable-ethics-training-presentation/
I am interested in an innovative blend of graphic design, photography, and interactive communication technology. At the start of college I knew I was going to become a design nerd, seeking education from passionate designers around the world. Design allows one to connect globally through a universal visual language. I want people to think deeply about their place in a world of 8 billion people. With a high attention to detail, I work with digital graphics, typography, and imagery to expose personal encounters with global issues and different cultures.

Design embodies our world. We are continuously creating ourselves rather than discovering exactly who we are. In life there are no endings, only new beginnings: one idea initiating multiple ideas for future designs. Curiosity leads me to follow the idea that will allow me to gain the most knowledge about global issues.

Seeking inspiration abroad, I studied at Massey University's College of Creative Arts in Wellington, New Zealand from July to December 2013. The vibrant natural landscapes and adventurous lifestyle of the Kiwis influence my choice to contrast natural colors while highlighting designs with a few bright colors. This can be seen in my Biodiversity Infographic, which I chose to create based on my interest in sustainability. Utilizing digital graphics, typography, and color to visualize statistics, I successfully convey the global need to work together to combat environmental destruction.

Most recently, I developed several print designs for the company Colourworks in Cape Town, South Africa, where I interned from June to August 2014. Gaining in-house studio experience in a different country, I overcame language barriers while beginning the international life I feel is necessary for creating myself and a sustainable world.

In addition to personal international influences, Otl Aicher's active color palette for the 1972 Munich Olympic posters serves as inspiration for my choice of vibrant color to emphasize natural settings and movement. I am also inspired by the balanced movement between letterform and the human figure in the work of Alexey Brodovitch. In my Personal Brand Identity, the natural curves in my logo create an interesting contrast with the rigidness of a standard envelope and letterhead. By manipulating the logo I was able to create unique patterns and a dynamic suggesting an interaction between the design and the actual shape of the letterhead and envelopes.

The keen eye I have for nature and movement is also found in the application design I created for the late photographer Carleton Emmons Watkins. Developing an interactive app, I was able to animate the natural world digitally to emphasize Watkins's photography in Yosemite National Park, California.

Combining my passion for design, travel, adventure, nature, culture and life, my ambitions have led me to adapt to many different situations with a compassionate, creative mindset. If you would like to learn more, feel free to contact me. I am waiting to help you produce creative solutions and expand my knowledge and experience in the design field.
https://www.nikoledesign.com/about
CHALLENGE: Zella Day is a singer and songwriter with a free-spirited and confident attitude. She has a unique lyrical voice, and her music falls in the genre of indie-pop with a western flair. Her popular album Kicker marks the transition from her youth in Pinetop, Arizona to her worldwide fame. The songs on the album reflect her childhood in Arizona and how nature played a large role in her creativity. The challenge was to create an album cover design that would show who Zella Day is as an artist and as a person, while also showing what Kicker means to her.

SOLUTION: The final iteration of the album cover design consists of vibrant colors, organic flowers, and bold typefaces. The gradient of colors has a calming effect, like a sunrise, with the flowers flowing throughout, relating to Zella Day's love of nature and carefree energy. I chose to contrast this organic style by giving the title "Kicker" a bold, western feeling. The texture of the overall design is also inspired by Zella Day's vintage style.
https://www.alyssadesigns.net/work/kicker-album-cover
The International Typographic Style, also known as the Swiss Style, is a graphic design style that emerged in Russia, the Netherlands, and Germany in the 1920s and was further developed by designers in Switzerland during the 1950s. The International Typographic Style has had a profound influence on graphic design as part of the modernist movement, impacting many design-related fields including architecture and art. It emphasizes cleanness, readability, and objectivity. Hallmarks of the style are asymmetric layouts, use of a grid, sans-serif typefaces like Akzidenz Grotesk, and flush left, ragged right text. The style is also associated with a preference for photography in place of illustrations or drawings. Many of the early International Typographic Style works featured typography as a primary design element in addition to its use in text, and it is for this that the style is named. The influences of this graphic movement can still be seen in design strategy and theory to this day.

The style emerged from a desire to represent information objectively, free from the influence of associated meaning. The International Typographic Style evolved as a modernist graphic movement that sought to convey messages clearly and in a universally straightforward manner.

Two major Swiss design schools are responsible for the early years of International Typographic Style. A graphic design technique based on grid-work that began in the 19th century became inspiration for modifying the foundational course at the Basel School of Design in 1908. Shortly thereafter, in 1918, Ernst Keller became a professor at the Kunstgewerbeschule Zürich and began developing a graphic design and typography course. He did not teach a specific style to his students; rather, he taught a philosophy of style that dictated that "the solution to the design problem should emerge from its content." This idea of the solution emerging from the problem itself was a reaction to previous artistic processes focused on "beauty for the sake of beauty" or "the creation of beauty as a purpose in and of itself". Keller's work uses simple geometric forms, vibrant colors and evocative imagery to further elucidate the meaning behind each design. Other early pioneers include Théo Ballmer and Max Bill.

The 1950s saw the distillation of International Typographic Style elements into sans-serif font families such as Univers. Univers paved the way for Max Miedinger and collaborator Eduard Hoffmann to design the typeface Neue Haas Grotesk, which would later be renamed Helvetica. The goal with Helvetica was to create a pure typeface that could be applied to longer texts and that was highly readable.

The movement began to coalesce after a periodical publication began in 1959 titled New Graphic Design, which was edited by several influential designers who played major roles in the development of International Typographic Style. The format of the journal represented many of the important elements of the style, visually demonstrating the content, and was published internationally, thus spreading the movement beyond Switzerland's borders. One of the editors, Josef Müller-Brockmann, "sought an absolute and universal form of graphic expression through objective and impersonal presentation, communicating to the audience without the interference of the designer's subjective feelings or propagandist techniques of persuasion."
Many of Müller-Brockmann's works feature large photographs as objective symbols, meant to convey his ideas in particularly clear and powerful ways.

After World War II, international trade began to increase and relations between countries grew steadily stronger. Typography and design were crucial to helping these relationships progress: clarity, objectivity, region-less glyphs, and symbols are essential to communication between international partners. International Typographic Style found its niche in this communicative climate and expanded further beyond Switzerland, to America.

One of the first American designers to integrate Swiss design with his own was Rudolph de Harak. The influence of International Typographic Style on de Harak's own works can be seen in his many book jacket designs for McGraw-Hill publishers in the 1960s. Each jacket shows the book title and author, often aligned with a grid, flush left, ragged right. One striking image covers most of the jacket, elucidating the theme of the particular book. International Typographic Style was embraced by corporations and institutions in America from the 1960s on, for almost two decades. One institution particularly devoted to the style was MIT.

During the 1900s, other design-based movements were forming, influencing and influenced by the International Typographic movement. These movements emerged within the relationships between artistic fields including architecture, literature, graphic design, painting and sculpture.

De Stijl was a Dutch artistic movement that saw prominence in the period between 1917 and 1931. Referred to as neoplasticism, this artistic strategy sought to reflect a new Utopian ideal of spiritual harmony and order. It was a form of pure abstraction through reduction to the essentials of form and colour, employing vertical and horizontal layouts and using only black, white and primary colors. Proponents of this movement included painters like Piet Mondrian, Vilmos Huszár and Bart van der Leck, as well as architects like Gerrit Rietveld, Robert van 't Hoff and J.J.P. Oud.

Bauhaus was a German-based movement that emphasized purity of geometry, absence of ornamentation and the motto 'form follows function'. It was a school of thought that combined craftsmaking with the fine arts, and it was founded by Walter Gropius. The goal was to work towards the essence of the form-follows-function relationship in order to facilitate a style that could be applied to all design problems: the International Style.

Constructivism was an art and architectural philosophy that emerged from Russia in the 1920s. The style developed by combining assorted mechanical objects into abstract mobile structural forms. Hallmarks of the movement include geometric reduction, photomontage and simplified palettes. Suprematism, which arose in 1913, is another Russian art movement similarly focused on the simplification and purity of geometric forms to speak to values of spirituality. All of these movements, including International Typographic Style, are defined by reductionist purity as a visually compelling strategy of conveying messages through geometric and color-based hierarchies.

The Bauhaus mantra of 'form follows function' applies to design in the spirit of the International Typographic movement. The movement was structured by a focus on detail, precision, craft skill, systems of education and approach, technical training, high standards of print, and the innovative application of lettering.
The theory revolves around critically approaching the development of a system specific to the design problem presented. For example, a father of the style, Ernst Keller, argued that a design solution should always be respectful of its content. A good comparison is the structure that defines a math problem: one uses specific equations for specific types of problems, and one can only work through those equations in specific ways. With the International Typographic and other related philosophies, the design context is critical to deriving a response.

Each design done with International Typographic Style in mind begins with a mathematical grid, because a grid is the "most legible and harmonious means for structuring information." Text is then applied, most often aligned flush left, ragged right. Fonts chosen for the text are sans serif, a type style believed by early designers in the movement to "[express] the spirit of a more progressive age". Objective photography is another design element meant to present information clearly, without any of the persuading influences of propaganda or commercial advertising.

Such a strong focus on order and clarity is drawn from the early pioneers' belief that design is a "socially useful and important activity ... the designers define their roles not as artists but as objective conduits for spreading important information between components of society."
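To make the grid-first workflow described above concrete: given a page width, a margin, a column count and a gutter, every column width and position follows mechanically. The short Python sketch below illustrates that arithmetic; the `column_grid` helper and all of the dimensions are invented for this example, and are not part of any historical system or design tool:

# A minimal, illustrative column-grid calculator.
def column_grid(page_width_mm, columns, gutter_mm, margin_mm):
    """Return (x, width) for each column of a simple modular grid."""
    usable = page_width_mm - 2 * margin_mm
    col_width = (usable - (columns - 1) * gutter_mm) / columns
    return [
        (round(margin_mm + i * (col_width + gutter_mm), 2), round(col_width, 2))
        for i in range(columns)
    ]

# An A4 page (210 mm wide) with a 6-column grid, 4 mm gutters, 15 mm margins.
for i, (x, w) in enumerate(column_grid(210, 6, 4, 15), start=1):
    print(f"column {i}: x = {x} mm, width = {w} mm")

Text blocks and images then snap to these column edges (or to spans of several adjacent columns), which is what produces the flush-left, ragged-right settings the style favours.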
https://everything.explained.today/International_Typographic_Style/
Study Time with JoAnn

Well, it's the run down to the finish line for my Interior Design and Decorating course with Mercer.

How have my personal tastes developed over the duration of the course? I have adopted the use of vibrant colour and natural light in my course designs. I love to take the indoors outdoors, fostering a relaxed lifestyle concept which is so distinctively Australian. While not committing to one style, I take inspiration from a range of modern design influences. Anna Spiro is one designer who inspires me with her use of gorgeous colours, prints and layered textures. "I am drawn to light-filled rooms on neutral backdrops, adorned with vibrant splashes of colour and print in textiles, soft furnishings and upholstery." Other partialities include the bold lines of Art Deco, the natural forms of Art Nouveau, the sleek glamour of Hollywood Regency and the sophisticated coastal atmosphere of the Hamptons style.

What trend developments have inspired me? It's so great that wallpaper and murals are back in fashion. I have been playing around with design, creating my own patterns to produce a range of original wallpapers. I admire that fashion is open to exploring more colour in interior design in 2016. Neutral palettes have been around far too long in Queensland now. Let's embrace colour!
http://joanncaseyclarke.com/studytime/
My approach is minimal and is always based on experimenting with typography, photography and primitive shapes mixed with vibrant colors, in order to create imagery that is both bold and modern.

stefanlucut gmail com / DE 173 218 5192

Design & Zeit

Typopassage Timisoara is an open-air micro museum with and about lettering. Synopsis studio is well known for its self-initiated projects, and this is the project I was really attracted by and involved in from the beginning. During my period working with the Synopsis studio, I helped with the design of the Typopassage newspaper, Issue No. 2.

typopassage.ro

Credits
Art direction:
https://stefanlucut.com/Design-Zeit-typopassage
Movements

1.2. For each of these movements, find examples from their eras, as well as current designs that are influenced by these styles. Explain in your own words how these designs were inspired by the movements.

Question 2: Research, written & practical assignment (problem solving), 1.5 days

The Bauhaus

The Bauhaus was an art school in Germany, founded in 1919 by Walter Gropius in Weimar. In 1925 the school moved to Dessau, and later to Berlin in 1932, but it had to shut down in 1933 because of pressure from the Nazi party. The Bauhaus school has played a big role in the history of art, design and architecture. The Bauhaus contributed hugely to modern design, among other things by popularizing sans-serif typography. The school focused on functionality, simple forms and rationality; the idea behind the school was to create a place that brought all the elements together, where architecture, painting, sculpture and craft engineering were in harmony, and where the artistic spirit and individuality could live together with mass production. The Bauhaus school has since become one of the most influential forces in modern design. Today the world sees the Bauhaus as the home of the avant-garde of classical modernism in every field of the liberal and applied arts.

The Bauhaus had many great contributors, artists and teachers, such as László Moholy-Nagy and Herbert Bayer, who were both important in the development of graphic design. Moholy-Nagy resigned from the Bauhaus in 1928 and moved to Chicago in 1937, where he started the New Bauhaus, a school with the same philosophy as the original that later became part of the Illinois Institute of Technology.

De Stijl

De Stijl means "The Style" in Dutch. De Stijl was a movement, begun in 1917, that wanted harmony and order; it strove for ultimate simplicity and abstraction in the way ideas could be expressed. The style removed or reduced elements, using only primary colors with geometric forms. The most important artists connected to this movement were Theo van Doesburg, Piet Mondrian and Gerrit Rietveld. There was also a journal called De Stijl, published by van Doesburg, in which he discussed the group's theories. Their reduction of color and form was important and a great influence on the development of graphic design. They used only the primary colors blue, yellow and red, together with black and white, and their compositions were simplified down to vertical and horizontal directions. De Stijl influenced the style of the Bauhaus, as well as architecture, fashion and interior design. Theo van Doesburg actually applied to the Bauhaus school but was not accepted, so he started his own school right next to it.

Swiss Design

Swiss Design is a movement that started in Switzerland in the 1940s and 1950s, developed largely from the graphic design movements of the 1920s. Swiss Design is often referred to as the International Typographic Style, or the International Style. Two schools led the way: the Zurich School of Arts and Crafts, led by Josef Müller-Brockmann, and the Basel School of Design, led by Armin Hofmann. They focused on simplicity, legibility and objectivity.
Among the many contributions from these two schools were the use of sans-serif typography, asymmetrical layouts and grids, and the combination of photography and typography in visual communication. One of their most influential bodies of work was posters; they saw the poster as the most influential means of communication.

Examples from these movements and current design

The Bauhaus

The Bauhaus from its period

The Bauhaus influence today

Modern architecture is inspired by the Bauhaus movement. You can see this in the simple forms and functionality of what we in Norway today call "funkishus". The orange chair from Ikea is clearly inspired by the Bauhaus period. The Bauhaus wanted to serve the needs of the people, and Ikea functions in the same way today. You can see the simple, clean lines in the Ikea chair; it has the same lines as the original Bauhaus chair. The chair is minimalistic, combining function and usability, and it is timeless.

The Bauhaus movement has also had an influence on how the web looks today. Art, technology and craft were combined in the Bauhaus movement, where the idea was to bring everything together: arts, architecture, all of it. The web works in the same way; Microsoft, for example, focuses on functionality and usability, taking away unnecessary clutter. The web should be visually attractive, but first and most important, it needs to be practical. You see the same in the company Apple and its products.

De Stijl

De Stijl from its period

De Stijl influence

You can clearly see how modern design has been influenced by De Stijl. The band The White Stripes made an album called De Stijl, and you can see how they used the color red combined with black and white, with horizontal and vertical lines. The same goes for the fashion and interior industry: they have used primary colors and horizontal and vertical lines. The style is minimalistic and clean, with strong lines and primary colors.

Swiss Design

Swiss Design from its period

Swiss Design influence today

There is a geometric grid in the new posters, clearly influenced by Swiss Design. You see how the typography is striking: bold, with a lot of contrast, clean and minimalistic. The message is clear in these posters from recent years; there is a strong visual impact. Grids, sans-serif type and photos play a huge part in Swiss Design, and you can see how Swiss Design has influenced graphic design today. There is an asymmetrical balance between the negative and positive elements in the design.

Look at the history timeline at the beginning of this lesson. Gather information from 1900–2000, and design your own timeline using the Swiss Design style as your theme. Each movement should be described in a creative way.

In a different perspective

Click the image below to see my history timeline in PDF.
https://grafisk.torilsorlie.no/2017/09/03/learning-activity-graphic-design-history/