54,029
There is a scenario going around the Internet (Google link showing extent of reach) saying the following (with possible slight variations which I haven't spotted yet)... Why should we pay cash everywhere with banknotes instead of a card? I have a $50 banknote in my pocket. Going to a restaurant and paying for dinner with it. The restaurant owner then uses the bill to pay for the laundry. The laundry owner then uses the bill to pay the barber. The barber will then use the bill for shopping. After an unlimited number of payments, it will still remain a $50, which has fulfilled its purpose to everyone who used it for payment and the bank has jumped dry from every cash payment transaction made.. But if I come to a restaurant and pay for digitally - Card, bank fees for my payment transaction charged to the seller are 3%, so around $1.50 and so will the fee $1.50 for each further payment transaction or owner re laundry or payments of the owner of the laundry shop, or payments of the barber etc..... Therefore, after 30 transactions, the initial $50 will remain only $5 and the remaining $45 became the property of the bank thanks to all digital transactions and fees. (Copied from Reddit r/Anarco_Capitalism ) First of all, (grammar and actual average fees aside) the mathematics do not work on this, as 30 transactions with 3% charges per transaction would reduce $50 to just under $20 in the economy (if the fees are rounded up). $19.98 to be exact if my mathematics are correct. This is because the first charge would reduce the $50 to $48.50, then $48.50 - 3% charge = $47.05, then another would leave $45.64 and so on... However, $30 is still a big chunk lost from $50 to card transaction fees. I don't know how much you spend per month on average for food, but my wife and I in the UK spend £300 per month, and using this scenario reduces that £300 to around £120 if my mathematics are correct. This is surely going to ruin the economy and bust the banks in the end. That is because a strong economy needs money sloshing around the system . So, would this be something the central banks would really entertain? Is this argument for paper money and coins to remain realistic?
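The compounding itself is easy to check with a few lines of code (a minimal sketch, assuming a flat 3% fee skimmed off each successive payment and ignoring rounding):

```python
# Minimal sketch: how much of an initial sum is still circulating after n
# card payments, assuming a flat 3% fee is skimmed from each payment.
# (Illustrative only; real fee schedules differ, as the answer that follows explains.)

def remaining_after(initial: float, fee_rate: float, n_payments: int) -> float:
    """Amount still in circulation after n payments with a proportional fee."""
    return initial * (1 - fee_rate) ** n_payments

print(remaining_after(50.0, 0.03, 30))   # ~20.05 -> roughly $20 left, not $5
print(remaining_after(300.0, 0.03, 30))  # ~120.3 -> the £300/month example scaled the same way
```

The small gap between roughly $20.05 here and the $19.98 figure above comes from whether each fee is rounded; either way, far more than $5 remains after 30 transactions.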
I'm assuming the $ in the question refers to US Dollars, so I have tried to find US sources, although the principles apply in all markets. Does 3% of your payment go to fees? Possibly, although this is at the top end of fees. According to this article from the Motley Fool, US card transactions are subject to three types of fee: interchange fees (paid to the bank that issued the card), assessment fees (paid to the payment network, mostly Visa or MasterCard), and processing fees (paid to the retailer's choice of payment processor). Each part, particularly the processing fees, can be made up of a percentage of the transaction value, a fixed cost per transaction, and a fixed cost per month. The article gives examples of Visa fees as low as 1.29% + $0.05, plus an extra 0.30% + $0.08 for the cheapest listed card processor, well below 3%, but other combinations are higher. Does this money disappear? No. The money isn't simply burnt; it ends up in the pockets of other businesses - the payment processor, the card network, and the issuing bank. Does the money continue to circulate in the economy? This is harder to answer - some would argue that the large banks and payment institutions accumulate wealth into the hands of a few individuals, which then fails to "trickle down". But some of it, for instance, pays the salaries of bank branch staff and of the software developers maintaining the payment network, who can use it to pay the barber just as in the original example. Is the cost of handling cash zero? No. There are many costs associated with handling cash, including equipment, staff time and training, and security. Some research estimates that U.S. retail businesses lose about $40 billion annually because of the theft of cash alone. But none of that goes to the greedy banks, right? Wrong. Retailers do not want to hold their entire balance of trade in cash, so they need to deposit it in a business bank account. Business bank accounts often charge fees for cash deposits (see e.g. this comparison), and those that don't may not be practical for a cash-based business (online only, or no branches in convenient locations).
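To put the fee rates quoted above in context, here is a rough sketch of what those example fees would add up to on a single $50 payment (illustrative only; the split between interchange, assessment and processing varies by card, network and processor):

```python
# Rough sketch: total fee on one $50 card payment using the example rates
# quoted above (1.29% + $0.05 for the Visa fee, plus 0.30% + $0.08 for the
# cheapest listed processor). Illustrative only.

def card_fee(amount: float, pct_parts, fixed_parts) -> float:
    """Sum of percentage-based and fixed per-transaction fee components."""
    return amount * sum(pct_parts) + sum(fixed_parts)

fee = card_fee(50.0, pct_parts=[0.0129, 0.0030], fixed_parts=[0.05, 0.08])
print(fee, fee / 50.0)  # ~0.93 total (~1.85% of the payment), well below the claimed 3%
```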
{ "source": [ "https://skeptics.stackexchange.com/questions/54029", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/43717/" ] }
54,051
Rabbi Menachem Mendel Schneerson was the most recent (and final) rebbe of Chabad . In 1975 he published a series of correspondence with a scientist about whether the Sun revolves around the Earth and not the other way around. [Schneerson's first response] It is my firm belief that the sun revolves around the earth, as I have also declared publicly on various occasions and in discussion with professors specializing in this field of science. [Schneerson's second response] One of the conclusions of the theory of relativity is that when there are two systems, or planets, in motion relative to each other—such as the sun and earth in our case—either view, namely, the sun rotating around the earth, or the earth rotating around the sun, has equal validity. My knowledge of physics is limited to the undergraduate level, but, after reading up on relativity and a biography of Albert Einstein, everything he said seems to be correct as regards the current scientific knowledge. Answers here and here tackle this question, but the answers, to my understanding, ultimately come down to "Mathematically modelling the Earth revolving around the sun and not the other way around makes our models simpler and physics calculations easier". While true, I don't see how this constitutes any sort of proof of an underlying physical truth. Nevertheless, I remain skeptical. Is Rabbi Schneerson indeed correct: that we have no absolute way to determine whether the Earth revolves around the sun or the other way around and that his belief is no less valid than the one accepted by scientists? Mod Note I don't normally do this, but this question has garnered an incredible range of answers - often contradictory! - that do not belong on this site. Please read our Welcome to New Users before answering. It is not sufficient to share your philosophy of science. It is not sufficient to assert that modern physics says something without references. Answers and comments that don't follow the site's standards will be deleted. - Oddthinking.
The referenced link is correct; all frames of reference are equally valid in general relativity -- with a catch or two. One catch is that the mathematics can get a bit ugly (or more than a bit ugly), but it is possible to describe the behavior of the solar system from a geocentric perspective. Another catch is that in general relativity, frames of reference are local, where local means extremely local (i.e., infinitesimal). However, the rabbi's first response makes his point of view much clearer: It is my firm belief that the sun revolves around the earth, as I have also declared publicly on various occasions and in discussion with professors specializing in this field of science. The view, shared amongst geocentrists, that this Earth-centered (geocentric) point of view is the only valid one is nonsense. It is, however, a series of big leaps from "all frames of reference are equally valid" to "that means a geocentric point of view is valid" and then to "that means that only a geocentric point of view is valid". Just because it is possible to explain the behavior of the solar system from a geocentric perspective does not mean that that is the only valid perspective. Ignoring the rest of the universe, the equations that describe the behavior of the solar system take on their simplest form when expressed in terms of a non-rotating frame of reference with the solar system's center of mass as the origin of the frame. This is the solar system barycentric frame of reference. The equations of motion take on their simplest form in a non-rotating, non-accelerating frame of reference such as this. Complex mathematics are involved in transforming from a barycentric frame to a geocentric frame. One way to look at it is to apply those complex mathematics (and we do do this, all of the time, for spacecraft orbiting the Earth) so as to come up with a geocentric view of those spacecraft's orbits. Another way to arrive at this point of view is to claim that these complex mathematical expressions are the right (and only) way to do it. Occam's razor (the simplest answer is most likely the best answer) comes into play here. The best ephemerides (postdictions / predictions of the positions of the planets as a function of time) of the solar system (there are three competing organizations, the Jet Propulsion Laboratory in California, the Russian Academy of Science in St. Petersburg, and the Observatoire de Paris) all use a barycentric frame of reference in which the nastiness of general relativity is reduced to perturbations on top of Newtonian physics. A barycentric frame of reference makes the extreme ugliness of a geocentric frame vanish. Occam's razor prevails.
{ "source": [ "https://skeptics.stackexchange.com/questions/54051", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/66664/" ] }
54,065
Elon Musk's Wikipedia page states that he has bachelor's degrees in physics and economics, referencing various sources including a biography by Ashlee Vance. … the University of Pennsylvania, where he completed studies for a Bachelor of Arts degree in physics and a Bachelor of Science degree in economics … … In 1995, he was accepted to a Doctor of Philosophy (PhD) program in materials science at Stanford University. However, Musk decided to join the Internet boom, instead dropping out — Elon Musk - Wikipedia However, various websites and some celebrities recently have claimed that Musk lied about his degrees and that he does not have a bachelor's in physics or in science in general, some of them claiming that there is evidence of this in court transcripts. Someone has to say it: Elon Musk has lied for 27 years about his credentials. He does not have a BS in Physics, or any technical field. Did not get into a PhD program. Dropped out in 1995 & was illegal. Later, investors quietly arranged a diploma - but not in science," Is this correct? Does Musk have a bachelor's in physics as he claims, or not?
According to Ron Ozio (Director, Media Relations at UPenn) in an email on Plainsite : Elon Musk earned a B.A. in physics and a B.S. in economics (concentrations: finance and entrepreneurial management) from the University of Pennsylvania. The degrees were awarded on May 19, 1997. About Stanford, their director confirms Musk's acceptance (and lack of enrollment): Dear Elon, As per special request from my colleagues in the School of Engineering, I have searched Stanford's admission data base and acknowledge that you applied and were admitted to the graduate program in Material Science Engineering in 1995. Since you did not enroll, Stanford is not able to issue you an official certification document. Sincerely, Judith Haccou, Director The source is a document that also contains scans of Musk's degrees from a lawsuit .
{ "source": [ "https://skeptics.stackexchange.com/questions/54065", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/40504/" ] }
54,088
According to Wikipedia, Frederick Jaffe, former vice president of the Planned Parenthood Federation of America and founder of what is now the Guttmacher Institute, wrote a memorandum discussing population reduction methods including: fertility reduction agents in water; encourage women to work; provide few child caring facilities; encourage increased homosexuality; compulsory abortions/sterilisation; postpone or avoid marriage; alter image of ideal family size; discouragement of private home ownership; various financial obstacles for parents; abortion and sterilisation on demand; improve contraceptive technology; make contraception truly available; etc. The memo presents these measures in a table. Did Frederick Jaffe really write this memo, as Wikipedia claims? Some context, as requested. On page 492 Jaffe states: ... the table takes a number of measures which have been discussed in the literature as possible elements in a population policy to reduce fertility ... He also states: ... neither I nor the Planned Parenthood Federation of America advocates any of the specific proposals embodied in the table which go beyond voluntary actions by individual couples ...
I understand the claim you want to be tested is the claim by the Wikipedia page of Frederick S. Jaffe that there existed a memo he wrote that contained the table provided. The Wikipedia page provides two sources for the memo: The original memo is available online [12] or in the record [13] of a 1973 Senate hearing. The second reference is to a freely-downloadable copy of the hearing: Family Planning Services and Population Research Amendments of 1973, Hearings Before the Special Subcommittee on Human Resources..., 93-1, on S. 1708..., S. 1632 ..., May 8, 9, 10, and 23, 1973 The memo appears starting page 493, with the controversial table appearing on page 501.
{ "source": [ "https://skeptics.stackexchange.com/questions/54088", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/66719/" ] }
54,123
In 2019, John Anderson interviewed Konstantin Kisin. The discussion is transcribed by me; I is the interviewer and K is Konstantin. K: In Russia last year 400 people were arrested for things that they posted on social media. Obviously this country is very different. How many do you think were arrested in Britain for what they said on social media? I: ... K: Take a guess. I: I've no idea. K: 3300. I: Really? Arrested for things that they said on social media? ... Were over 3000 people arrested in Britain for social media posts before 2019? What are the numbers today? Ref: YouTube clip (note that the clip was posted in 2022 and the original video was published in 2020). Edit: Please note that 1) the question is about the UK, so Russia is irrelevant, and 2) this discussion took place before the current war.
I can't speak for the exactness of the figure, but it certainly seems plausible. According to this article about arrests for online posts in London, 857 arrests were made in 2015 in London alone as a result of online activity. However, this can include emails as well as social media. The reason is the breadth of the law: the Communications Act 2003 defines illegal communication as "using public electronic communications network in order to cause annoyance, inconvenience or needless anxiety". That's a very wide definition. Offences include: alleged sexual offences, including grooming, as well as complaints of stalking, racially aggravated conduct and fraud. Note that there is no suggestion that any of the arrests are solely for posting things that disagree with the government. By contrast, in Russia you can be arrested for saying online that Crimea does not belong to Russia. This is useful information because the claimant (Kisin) is clearly trying to compare the UK and Russia.
{ "source": [ "https://skeptics.stackexchange.com/questions/54123", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/56721/" ] }
54,129
I was reading about Ron DeSantis on Wikipedia and was a little surprised by the section about his handling of COVID-19 in Florida. It lists a huge number of actions to oppose standard COVID preventative measures, including outlawing mask and vaccine mandates and minimal shutdowns of the state. It also claims that Florida had some of the highest death tolls in later parts of the epidemic. However, it also has these two statements: "Florida's death rate from COVID-19 (75,000 deaths) ended up being within the national average and Florida's economy fared better than many other U.S. states" and "in May 2022, a Bloomberg News op-ed claimed that, when adjusting state death tolls based on what they would be if age distribution were equal between the states, Florida's COVID-19 death toll would be less than the national average". I'm skeptical that Florida could average better than the national average while resisting all precautions. A quick search found Statista.com, which lists Florida's current per capita death toll as of this month as the 14th highest in the USA, with a death rate of 426/100,000. By contrast, the Johns Hopkins Coronavirus Resource Center shows the current average death rate for the USA as 329.68/100,000, which would seem to refute at least the first claim I quoted, unless there are differences in how death rate or average is being measured. The one potential discrepancy I could see is that the Wikipedia links are a bit older, so Florida may have caught up with or exceeded the national average since then, as it's still a hotspot for COVID as I understand it. But it's hard to believe a few months could make that huge a difference in numbers. So, is Wikipedia's claim that Florida managed as well as, or better than, the national average true at the time the links were cited, and is it still true now?
The source provides no methodology. tl;dr The claim that "if age distribution were equal" is not complete. The original source adjusts COVID mortality by age and an ill-defined "metabolic health". They provide no methodology for either. Without knowing the details, they could have done any amount of monkeying with the numbers. The results are meaningless. The Britannica Claim Florida's death rate from COVID-19 (75,000 deaths) ended up being within the national average and Florida's economy fared better than many other U.S. states. This is from a Britannica article about Ron DeSantis and provides no citations how they concluded Florida is within the national average. The Bloomberg Claim in May 2022, a Bloomberg News op-ed claimed that, when adjusting state death tolls based on what they would be if age distribution were equal between the states, Florida's COVID-19 death toll would be less than the national average This is from an op-ed by Justin Fox. He is a Bloomberg Opinion columnist covering business, not health. The same op-ed was posted in the Washington Post with no paywall. Hiding inequality in averages The relevant claim appears to be this: Adjust state Covid mortality statistics to what they would be if all the states’ populations had roughly the same age distribution as the nation — as conservative activist and policy analyst Phil Kerpen has been doing for much of the pandemic — and Florida’s rate drops to 275 per 100,000, well below the national rate of 302, while California’s rises to 267. Phil Kerpen is a founder of the Committee To Unleash Prosperity. More on them in a moment. We must address whether ignoring a state's demographics is a relevant statistic. It smacks of the "sacrifice the elderly" modest proposals of the early pandemic. Population demographics are not a hypothetical. Those are real people, really getting sick, and really dying. Instead of arguing that Florida would have done better with average demographics, one can argue Florida failed to adapt their policies to their population. It's very tempting to boil everything down into a single number for the "average person", but there is no such thing as an average person. Populations are messy. For example, (50, 50, 50, 50) and (-100, 100, 200, 0) are very different sets of numbers, but they average (mean) out the same. Abusing population averages is a way to hide inequalities. I would argue that if we really want to compare states without being biased by their age distributions, we shouldn't be mashing all ages together into one statistic. We should be looking at COVID mortality rates by age. This removes the age distribution bias without artificially smoothing out bumps. We should compare people apples-to-apples, not applesauce-to-applesauce. That's just what this chart already does (sorry about the poor quality)... ...we can see that, relative to California, Florida did as good or slightly better with their 75+ population. However, they did far worse with their younger populations. That is more meaningful information than one number for all ages. One could argue that doing worse in younger populations is less relevant because their death rates are orders of magnitude lower than the 75+ rates, but this all teeters on the verge of devaluing lives. Devaluing the elderly by ignoring the demographic differences and devaluing the young by ignoring their higher death rates in Florida. 
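For readers who want to see what that kind of age adjustment actually does, here is a minimal sketch of direct age standardization; the age brackets and rates below are made-up illustrative numbers, not the actual Florida or US figures:

```python
# Minimal sketch of direct age standardization: re-weight a state's
# age-specific death rates by the *national* age distribution instead of
# the state's own. All numbers are hypothetical, purely to show the mechanics.

national_age_share = {"0-49": 0.65, "50-74": 0.25, "75+": 0.10}
state_age_share    = {"0-49": 0.55, "50-74": 0.25, "75+": 0.20}  # an "older" state

# Hypothetical deaths per 100,000 population within each age bracket.
state_rate_by_age = {"0-49": 40, "50-74": 400, "75+": 2500}

crude    = sum(state_rate_by_age[a] * state_age_share[a]    for a in state_rate_by_age)
adjusted = sum(state_rate_by_age[a] * national_age_share[a] for a in state_rate_by_age)

print(crude, adjusted)  # ~622 vs ~376: the older state looks much better after adjustment
```

The adjusted figure depends entirely on the bracket-level rates and on the reference age distribution chosen, which is why comparing the bracket-level rates directly, as the chart above does, is the more informative exercise.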
A Final Report Card on the States’ Response to COVID-19 This claim appears to be sourced from A Final Report Card on the States’ Response to COVID-19 by Phil Kerpen, Stephen Moore and Casey Mulligan of the Committee To Unleash Prosperity . The numbers don't exactly agree, maybe they were using an earlier version, but they are very close. The Committee To Unleash Prosperity is an organization to promote supply side economics . They are anti-regulation . Its founders are all economists, not public health professionals. All the authors of the paper are economists, not public health professionals. Critically, there's little discussion of their methodology; no formulas are provided. However, while the article only mentions age, the report also includes "metabolic health"... We adjust COVID mortality (through March 5, 2022) for age and “metabolic health,” by which we mean the pre-pandemic prevalence of obesity and diabetes – as these are highly correlated with higher death rates from the virus. There is no methodology about how they calculated or applied "metabolic health". How did they determine the prevalence? How did it affect the COVID mortality rate? Why only obesity and diabetes? Why not respiratory problems? Auto-immune diseases? Without that information, we can't check their math; their adjusted COVID mortality rates are meaningless. They could have used it as a fudge factor, tweaking the method until they got the result they wanted. With all this in mind, and the lack of methodology, the source cannot be used.
{ "source": [ "https://skeptics.stackexchange.com/questions/54129", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/8366/" ] }
54,137
According to Elon Musk : My plane is actually not trackable without using non-public data This is a part of an ongoing controversy surrounding the bans on accounts posting the location of his private jet . Can the location of said jet be obtained using public data alone?
Yes, it does appear to be trackable with only publicly available data, and it appears that Elon Musk has made various requests to make it harder to track his flights. Reddit now has a subreddit entirely dedicated to tracking Elon Musk's jet. The start of that article mentions a program he applied to that makes it harder to track his flights. Elon Musk has banned the Twitter accounts of the app that tracked his jet, the creator of that app and several high-profile reporters who covered the story. He has even, apparently, applied for a new FAA program, which makes it harder to track his plane. The ADS-B Exchange is a public site to track flights all around the world, and it is my understanding that a lot of journalists use it when they need to track flights. Musk has long disliked ElonJet tracker, which uses publicly available data gathered from ADS-B Exchange, a larger hobbyist site that assembles publicly available data from the transponders of different aircraft. However, he promised when he took over Twitter that he would not ban the account, in the name of free speech. As a side note, while it isn't relevant to the question about tracking planes, it seems this reaction came after Musk thought a stalker went after a vehicle that his two-year-old son was in. (Whether there was indeed a stalker is unclear.) On Wednesday, Musk alleged a car carrying his two-year-old son X was followed by a "crazy stalker (thinking it was me)" on Dec. 13, who blocked and climbed on the hood of one of the vehicles. He followed up the allegation by saying that he would take legal action against Jack Sweeney, creator of the app and a freshman at the University of Central Florida. Suspensions of several journalists' Twitter accounts soon followed. More on ADS-B Exchange: ADS-B Exchange rightfully calls itself "the world's largest source of unfiltered flight data." The key word is "unfiltered," meaning that the site relies on ADS-B signals and does not filter out information about US aircraft that have requested anonymity through the US government, which makes it attractive to journalists. As the only tracking service to do this, ADS-B Exchange has proved to be a disruptive force in the tracking industry since it was started by US pilot Dan Streufert. Billing itself a cooperative, ADS-B Exchange relies on a worldwide community of more than 2,000 people who send in real-time MLAT and ADS-B data. This is uploaded on a searchable website. It's free for non-commercial use (contributions requested). Commercial users are required to license the data.
{ "source": [ "https://skeptics.stackexchange.com/questions/54137", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/23144/" ] }
54,149
In "Industrial Policy Comes Full Circle" (Dec. 15), Clyde Prestowitz (Wall Street Journal, 2022-12-11, 2022-12-15 print edition p. R17) writes: Lincoln said: "I don't know much about tariffs, but I do know that when we buy steel abroad, the foreigner gets the money and we get the steel, but when we buy steel made in America, we get the steel and the money too." Did Lincoln say the above?
No. From Frank Taussig ( 1914 ), "Abraham Lincoln on the Tariff: A Myth": It seems certain that the phrase is apocryphal. There is no evidence that Lincoln ever used it. ... By dint of repetition it has come to be associated with Lincoln almost as much as the cherry tree is associated with Washington. So crude is the reasoning (if such it can be called), so vulgarly fallacious is the antithesis, that we must hope that it will cease to be invested with the sanction of a venerated name. A bit more from Taussig: The very first mention which we have found is in 1894, in the American Economist , a weekly protectionist sheet published in New York ...: "Lincoln's first speech on the tariff question was short and to the point. He said he did not pretend to be learned in political economy, but he thought that he knew enough to know that 'when an American paid twenty dollars for steel to an English manufacturer, America had the steel and England had the twenty dollars. But when he paid twenty dollars for the steel to an American manufacturer, America had both the steel and the twenty dollars.'" ... the phrase was not current before 1894 ... But after 1900 it turns up repeatedly ... After the very first appearance, the commodity mentioned seems to be invariably rails, — sometimes iron rails, sometimes steel rails. ... The first appearance for express campaign use appears to be in 1904. ... In the Campaign Book of 1904, there is an extended quotation from Lincoln's tariff notes of 1846-47 (referred to a moment ago) and then at the close we find: - "On another occasion Mr. Lincoln is quoted as saying: 'I am not posted on the tariff, but I know that if I give my wife twenty dollars to buy a cloak and she buys one made in free-trade England, we have the cloak, but England has the twenty dollars; while if she buys a cloak made in the protected United States, we have the cloak and the twenty dollars.'" Etc. In the century-plus since Taussig (1914), this nonsensical "Lincoln quote" has been repeated many more times (usually with some slight alterations), usually by those with a poor understanding of economics (and hence for whom the quote has appeal), and most recently in late 2022 by Clyde Prestowitz in the Wall Street Journal . (Today if you Google "lincoln tariff quote", Taussig (1914) appears as the first result. So it's quite amazing that the "quote" is still being repeated as fact.)
{ "source": [ "https://skeptics.stackexchange.com/questions/54149", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/62611/" ] }
54,169
One headline on the December 21, 2022 episode of Tucker Carlson Tonight claimed that "Zelensky has declared war against Christianity", referring to Ukrainian President Volodymyr Zelenskyy. In this episode, Carlson says: "You will not hear a word on television tonight about the fact that Zelensky has banned an entire ancient Christian denomination in Ukraine, and then seized churches, and then thrown priests into jail." What is the basis for these claims?
The best way to tell a lie is to tell a partial truth. It is true that the Ukrainian government has raided several churches associated with the Russian Orthodox Church. What Carlson failed to mention was that key elements in those raided churches were suspected of providing intelligence to Russia. Some of those churches are hotbeds of treacherous behavior. It is true that Zelensky has proposed legislation to ban this particular church. What Carlson failed to mention was that Zelensky and his government do not have any issues with the Catholics and Protestants in Ukraine, or with the Orthodox Church of Ukraine. What Carlson also failed to mention was that the Ukrainian branch of the Russian Orthodox Church split into three branches. Two of those branches merged and asked for (and received) independence from the Russian Orthodox Church. This is the Orthodox Church of Ukraine. The third branch (the Ukrainian Orthodox Church of the Moscow Patriarchate) remains a part of the Russian Orthodox Church. It is this Ukrainian Orthodox Church with which Zelensky and his government have a problem. Some of that church's members (including church leaders) are actively working against the Ukrainian government and are providing intelligence to Russia regarding Ukrainian troop movements. This is not "a war on Christianity". What it is is a war against the Ukrainian Orthodox Church. Some of the Ukrainian Orthodox Church's churches are loyal to Ukraine, but others definitely are not.
{ "source": [ "https://skeptics.stackexchange.com/questions/54169", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/25893/" ] }
54,175
According to this video, Rand Paul states the United States spent $118,000 USD to see if a metal replica of Thanos could snap his fingers . Is this what they were trying to figure out with these funds? We spent $118,000 to study if a metal replica, a robot, of Marvel Comics evil warlord Thanos could snap his fingers. [...] They apparently hired some dude to wear metal gloves and then try to snap his fingers. You know what? They found out it's impossible to make a snapping sound with metal fingers. So robots of the world, be warned: it's hard to snap your fingers.
The grant is #2019371, and the $118k+ figure is only the 2020 funds. From the description, it seems clear that they published a paper and are planning another that builds on the first. That first paper is open access; a snippet reads: Using high-speed imaging and force sensors, we analyse the dynamics of the finger snap… Our analysis reveals the central role of skin friction in mediating the snap dynamics by acting as a latch to control the resulting high velocities and accelerations. We evaluate the role of this frictional latch experimentally, by covering the thumb and middle finger with different materials to produce different friction coefficients and varying compressibility… We also develop a soft, compressible friction-based latch-mediated spring actuated model to further elucidate the key role of friction and how it interacts with a compressible latch. They did not test a "metal glove". The closest test was of thimbles under medical gloves: Force data are collected for 5 snaps made while wearing a nitrile glove with lubrication, while wearing latex rubber on both fingers, and while wearing a metal thimble on both fingers underneath the nitrile glove. "Hired some dude" does not seem to be an accurate description either: We thank two members of the BhamlaLab for volunteering to participate in the finger snap experiments. Thanos was an inspiration for the work, but it's not about him. The authors suggest that the research will be helpful in understanding the mechanics behind creatures like termites and ants which have snapping mandibles, in addition to being useful for developing prosthetics and "soft robotics".
{ "source": [ "https://skeptics.stackexchange.com/questions/54175", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/3835/" ] }
54,191
I can't square this one. Greg Abbott is a Republican governor who espouses conservative Christian values, so when I read that he deliberately redirected a busload of migrant families to a political opponent's house, on Christmas Eve of all days, it makes me wonder whether we're getting the whole story. It doesn't seem to be in his political interest to do this. According to Reuters: WASHINGTON, Dec 26 (Reuters) - The White House on Monday accused Texas Governor Greg Abbott of endangering lives after busloads of migrants from the southwest border in Texas were dropped near Vice President Kamala Harris' home in Washington, D.C., on a cold Christmas Eve. Source: White House assails Texas Governor Abbott over Christmas Eve migrant drop As far as I can tell there is no direct evidence linking Abbott to this incident. But I also don't think the White House would make such an accusation without basis. So my question is, what's the basis for the accusation that Abbott orchestrated this incident?
Yes, it does appear that Greg Abbott did send them to the VP's residence and sent a letter to Biden to justify those actions. Washington, DC: Bus of migrants dropped off outside VP Kamala Harris' home Gov. Abbott began sending migrants from border cities to the nation’s capital in April, in an effort to pressure the Biden administration to take action on immigration enforcement and border security. This practice has been widely criticized. In a letter Abbott sent to President Biden on Tuesday, the Texas governor cited freezing temperatures in cities like El Paso as his reason to transport the migrants as migrant housing facilities already at capacity have been forced to release people outside onto the streets. "Your policies will leave many people in the bitter, dangerous cold as a polar vortex moves into Texas," Abbott wrote. "Texas has borne a lopsided burden caused by your open border policies."
{ "source": [ "https://skeptics.stackexchange.com/questions/54191", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/25893/" ] }
54,201
After the arrest of controversial internet personality Andrew Tate on December 29th, 2022, it has been claimed in multiple highly viral posts (example on Reddit) that the arrest was made possible because, in a video he created during an online spat, a pizza box from a local Romanian pizza chain was visible. The claim further goes that thanks to the pizza box (coupled with the proven recency of the video) Romanian authorities knew that Andrew Tate was in the country, which led them to raid his home. Is there an actual source for this? My personal research did not turn up anything concrete. None of the various viral posts I have seen have a source attached, typically relying on wording such as "reportedly". (Example on Twitter.) I am skeptical of this claim because I would assume that Romanian police would have other methods of ascertaining his whereabouts, such as records of movements into or out of the country through immigration checkpoints at airports or borders.
(edit: Oddthinking found an explicit denial that the pizza box had anything to do with it, so I accordingly have edited this answer.) Report says "police monitored social media" I believe the initial "pizza box" story came from this Twitter account which cites a scoop by the Romanian language newspaper Gândul . Here is a Google translation of their article : Sources close to the investigation stated to Gândul that shortly after the completion of the computer analysis, the authorities waited for the right moment to catch the Tate brothers, who were always out of the country. After seeing, including on social networks , that they were together in Romania, the DIICOT prosecutors mobilized the special troops of the Gendarmerie and descended, by force, on their villa in Pipera, but also on other addresses. Emphasis added, for my analysis. Claim is making two false inferences People seem to be inferring from the article that the "social media" posts which prosecutors used to decide their timing included the pizza box video. This was questionable (now known to be false), because the article specifically mentions that they needed to know that both brothers were together in Romania, something which cannot be confirmed by the pizza box video. The tweet makes a second incorrect inference which many others have repeated as well: she made him so angry he inadvertently tipped off Romanian authorities of his presence in Romania The article makes it clear that they relied on multiple sources of information to establish Tate's presence. They already knew he was in the country and, given the time that paperwork takes, they may have already been in the midst of writing a warrant at the time that he posted the pizza box video. The claim that the video "tipped off Romanian authorities" is false , because they needed to locate both brothers. The claim that the video assisted authorities is explicitly denied by police: Speculation swirled overnight Thursday that Romanian authorities were able to locate Tate after he posted a video in response to Thunberg containing a pizza box from a local spot that gave away his location. Bolla denied that this played a role in the detention or its timing. “It was a hard job gathering all the evidence” in the months-long investigation, Bolla said. ( WaPo )
{ "source": [ "https://skeptics.stackexchange.com/questions/54201", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/43383/" ] }
54,206
A September 2022 article in The Daily Sceptic discusses a recent article in the New England Journal of Medicine about the effectiveness of the Pfizer-BioNTech vaccine on children between 5 and 11 years old. A new study published in the New England Journal of Medicine (NEJM) shows not only that the effectiveness of the Pfizer Covid vaccine becomes negative (meaning the vaccinated are more likely to be infected than the unvaccinated) within five months but that the vaccine destroys any protection a person has from natural immunity. The study is a large observational study that looks at 887,193 children aged 5 to 11 years in North Carolina, of whom 273,157 (30.8%) received at least one dose of Pfizer vaccine between November 1st 2021 and June 3rd 2022. The study includes 193,346 SARS-CoV-2 infections reported between March 11th 2020 and June 3rd 2022. The researchers used a form of statistical modelling with adjustments for confounding factors (such as underlying conditions) to calculate estimates of vaccine effectiveness over time and against the different Covid variants. The findings are depicted in the charts below. In chart A, notice that the green and blue lines, representing children vaccinated in November and December respectively, go through zero into negative territory at a sharp gradient within five months of the first injection. It’s unclear why the green line is not continued past April, as the researchers presumably had the data, but from what is shown it looks very much like the vaccine effectiveness will continue declining deep into negative territory. [...] Does the Pfizer vaccine destroy natural immunity in children?
No. If you read the original, rather short article, you'll notice that it never makes any kind of claim about the vaccine destroying any protection. To the contrary, the article argues that the data supports booster vaccinations for children. Reuters has done a fact check of this claim which includes quotes from the original author of the study. "The statement that 'the vaccine destroys any protection a person has from natural immunity' is unfounded," Lin told Reuters in an email. "Our data showed that the vaccine was effective against infection for 4 months. In addition, vaccination conferred greater protection against hospitalization than against infection. Finally, no vaccinated children died whereas 7 unvaccinated children died." So the author of the study that was used to make this claim directly refutes it. In both figures, the lines' continuation past the boundaries of the graph is strictly an illustrative technique, to show the overall trajectory more clearly, according to Lin. They do not indicate that vaccine effectiveness becomes "negative" at any point, or that children become more vulnerable to infection than they would be without vaccination, he said. I really dislike these graphs; they don't show clearly which parts are actual data points and which parts are interpolated or extrapolated. But no matter how bad these graphs are, according to the author they are not intended to show that the vaccines reduce protection; that is simply an extrapolated line and not measured data. As for graphs C & D, they are very hard to read in my opinion, as they don't clearly indicate which data points were measured (there are fewer for the vaccinated case than for the unvaccinated case, so the timeframe observed is different in the two cases). The error bars are also somewhat confusing, as there is also the shading for the prevalence of the different mutations in the background. I created a very rough plot myself from the data in the supplementary material (only for the delta variant) to understand this: blue are the unvaccinated, red the vaccinated children. The x-axis is the number of months. The error bars for the vaccinated children are much, much larger than the error bars for the unvaccinated children. The series also doesn't continue as long as it does for the unvaccinated children. The "trend" you see in graph D for the vaccinated children seems to be an extrapolation that is based mostly on the last two data points, which have absolutely enormous error bars. This is extremely misleading and not at all what the raw data shows.
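For anyone who wants to reproduce that kind of sanity check, here is a minimal sketch of plotting two series of estimates with their confidence intervals; the values are placeholders only, not numbers from the paper's supplementary tables:

```python
# Minimal sketch: point estimates with 95% CI error bars for two groups.
# All values are hypothetical placeholders -- the real numbers are in the
# study's supplementary material.
import matplotlib.pyplot as plt

months = [1, 2, 3, 4, 5, 6]
unvax, unvax_ci = [5.0, 5.2, 5.1, 5.3, 5.2, 5.4], [0.2] * 6           # narrow CIs, long series
vax, vax_ci     = [2.0, 2.6, 3.4, 4.1], [0.3, 0.6, 1.5, 3.0]          # wide CIs, short series

plt.errorbar(months, unvax, yerr=unvax_ci, fmt="o-", capsize=3, label="unvaccinated")
plt.errorbar(months[:len(vax)], vax, yerr=vax_ci, fmt="s-", capsize=3, label="vaccinated")
plt.xlabel("Months")
plt.ylabel("Estimated rate (hypothetical units)")
plt.legend()
plt.show()
```

Plotted this way, it is immediately visible how little weight the last, widest error bars can bear, which is the point being made about the extrapolated trend in graph D.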
{ "source": [ "https://skeptics.stackexchange.com/questions/54206", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/63175/" ] }
54,227
Jordan B Peterson tweets and is further quoted in The Post Millennial BREAKING: the Ontario College of Psychologists @CPOntario has demanded that I submit myself to mandatory social-media communication retraining with their experts for, among other crimes, retweeting @PierrePoilievre and criticizing @JustinTrudeau and his political allies. I have been accused of harming people (although none of the complainants involved in the current action were clients of mone [sic], past or present, or en [sic] were even acquainted with any of my clients. I am to take a course of such training (with reports documenting my "progress" or face an in-person tribunal and suspension of my right to operate as a licensed clinical psychologist. About a dozen people from all over the world submitted complaints about my public statements on Twitter and Rogan over a four year period (out of the 15 million who follow me on social media) claiming that I had "harmed" people (not them) with my views. Can Ontario College of Psychologists , the regulating body for Psychologists in Ontario Canada, demand that registered psychologists attend "communication retraining"? Have they demanded that Jordan must attend? If so, can they suspend his clinical psychologist license if he refuses?
Yes, they have required Jordan Peterson to attend training sessions. The College has required Jordan Peterson to complete a SCERP . In a decision released on November 22, 2022, the Inquiries, Complaints and Reports Committee decided to require Dr. Jordan Peterson to successfully complete a prescribed Specified Continuing Education or Remedial Program (SCERP). The substance of the SCERP is a Coaching Program to address issues regarding professionalism in public statements. Dr. Peterson has filed a Notice of Application for Judicial Review with the Ontario Divisional Court. Yes, The College of Psychologists of Ontario can regulate their members. Jordan Peterson is a member and subject to their rules of professional conduct and has had a "Certificate of registration for a psychologist authorizing autonomous practice" since 1999. The College of Psychologists of Ontario is the governing body for psychological practitioners, Psychologists and Psychological Associates, in Ontario . The College is not a university, school or community college; its mandate is to protect the public interest by monitoring and regulating the practice of psychology. Its FAQ goes further... Regulated professionals are required by law to deliver professional services competently and ethically. They are accountable to the public, through their professional regulatory body, for their professional behaviour and activities. As members of the College of Psychologists of Ontario, Psychologists and Psychological Associates must meet rigorous professional entry requirements, adhere to prescribed standards, guidelines, and ethical principles and participate in quality assurance activities to continually update and improve their knowledge and skill . The Inquiries, Complaints, and Reports Committee (ICRC) must investigate each complaint and should take no action if the complaints were considered frivolous. From the Regulated Health Professions Act ... Complaint in bad faith, etc. (4) If the panel considers a complaint to be frivolous, vexatious, made in bad faith, moot or otherwise an abuse of process, it shall give the complainant and the member notice that it intends to take no action with respect to the complaint and that the complainant and the member have a right to make written submissions within 30 days after receiving the notice. His activity has been deemed of "Moderate risk" The ICRC Risk Assessment Framework says it can "order a SCERP if it identifies moderate risks. A SCERP is remedial and can include a specific course of study." SCERP has been applied 15 times in the last 5 years. "Public trust in and perception of the psychology profession" is a risk indicator and moderate risk is defined as "conduct may cause moderate disapproval and/or sustained implications for the client and/or others. In addition, a "moderate display of a pattern in Member's conduct history" and "the Member demonstrates some awareness and/or plans no/insufficient changes" are also moderate risk. A note about professional licensing and public trust. Peterson has attempted to characterize this as a free speech issue; that he is being persecuted for his political views as a private citizen. He has attempted to diminish the complaints because he is a public figure with millions of followers on social media, and because they did not directly involve his clients. 
The College has been levying accusations and conducting investigations in relationship to me since 2017 (although not once in the twenty years I operated as a clinical psychologist before my rise to public awareness). About a dozen people from all over the world submitted complaints about my public statements on Twitter and Rogan over a four year period (out of the 15 million who follow me on social media)... ...none of the complainants involved in the current action were clients of mone, past or present, or en were even acquainted with any of my clients... In reality, Peterson has made these public statements while presenting himself as a Clinical Psychologist. This has a very special meaning in Canada. Only members of the College of Psychologists of Ontario may use the title "Psychologist" in Ontario. As a licensed psychologist, he is granted the power, authority and trust necessary for his profession to function; people are expected to share their deepest and most intimate thoughts and behaviors with a complete stranger. It is Peterson's responsibility to protect this "public trust in and perception of the psychology profession". As a public figure who has chosen to develop a large following, he has more responsibility to do so. Accusations are quite rare; the ICRC receives about 25-30 new matters every quarter. This makes a dozen complaints against a member over any period of time quite a lot. He is being disciplined by the organization which licenses him as a psychologist. To obtain and retain his license he agreed to uphold a code of conduct which includes his public behavior when presenting himself as a psychologist. Many professions which involve the public trust have similar codes of conduct. For example, a police officer can be disciplined for offensive public remarks made while presenting themselves as a police officer. This protects the public trust in the police, which helps them do their job. Peterson is not going to jail. He is not being silenced; he can continue to post as he likes on Twitter without a license to practice psychology. His licensing organization is presenting him with a choice: protect the public trust in his profession, or lose his license to practice psychology in Ontario. The complaints Peterson said he would "make all the concrete allegations 100% public (except identifying the complainants) tomorrow", which he did, and then he deleted it and claims it will be "back up soon, in identical form" while insulting the person who pointed out that he'd deleted it. However, copies of the original were retained, which Peterson claims include "all relevant correspondence". Peterson makes two claims about the complaints... ...retweeting @PierrePoilievre and criticizing @JustinTrudeau and his political allies... ...About a dozen people from all over the world submitted complaints about my public statements on Twitter and Rogan over a four year period... The first is grossly misleading. The second is wrong; they came in over a few months in early 2022. The Case Summary You can find the "Summary of Report and Investigative Steps" on page 11 of Peterson's document. Here is my summary of the case summary. The complaints cited came in between Jan 2022 and April 19, 2022 about his current behavior. Joking about suicide. Misleading statements about COVID-19. Concerning remarks on The Joe Rogan Experience Episode #1769. A long, insulting public argument with Gerald Butts. "Unprofessional, embarrassing, threatening, abusive and harassing" behavior.
Falsely presenting that he works at the University of Toronto. Claiming white supremacists do not exist in Canada. Claiming a plus-size Sports Illustrated swimsuit model was a "conscious progressive attempt to manipulate & retool the notion of beauty" by the "oh-so virtuous politically correct". Calling the physician who performed the gender affirming surgery on Elliot Page (age 35) a "criminal". Private and public combative and dismissive behavior towards the College and the complaints. Peterson's document contains the details and screenshots of his Tweets. The report identifies these issues are to be addressed. Disgraceful, Dishonorable, or Unprofessional Conduct. Does it appear that Dr. Peterson's Tweets... constitute abuse and/or harassment? Does it appear that Dr. Peterson's conduct on the Joe Rogan Experience and/or his use of Twitter would... be reasonably regarded by members as disgraceful, dishonorable, or unprofessional? Provision of information to the Public: Does it appear that the information Dr. Peterson shared on the Joe Rogan Experience Podcast is accurate and supportable based on current professional literature or research; and is consistent with the professional standards, policies, and ethics currently adopted by the College? Combative and dismissive behavior towards the complaints. During the whole process, Peterson has been combative, dismissive, and unrepentant in private and in public. He has stated the complaints and the investigation process are designed to punish him . The process is the punishment, and those who levy complaints... know this full well. And I'm not participating in it anymore. Take my license if you must. At this point, it would be a relief. This adds "recurrence risks" to his issues (see ICRC Risk Assessment Table). "The Member does not demonstrate awareness and/or plans no/insufficient changes." "Significant concerns identified with respect to practices, processes, and/or systems." "Significant display of patterns in member's conduct history." Yes, they can revoke or suspend his license if he refuses. Under the risk assessment for "Awareness of the Identified Practice Concerns", Peterson's refusal may be defined as high risk. The Member does not demonstrate awareness and/or plans no/insufficient changes. High risk cases will be referred to the Discipline Committee which conducts a legal hearing. The ICRC will refer allegations when it believes the member’s conduct poses a high risk to the public. The Discipline Committee will hold a hearing to decide whether there was professional misconduct or incompetence. A discipline hearing is a formal, legal process. Evidence is presented under oath. Witnesses are subject to examination and cross-examination. The College is represented by legal counsel. The member is often represented by their own legal counsel as well. The Discipline Committee also has its own legal advisor, who is independent of both the College and the member. They have a range of options, including suspension. After a hearing, the Discipline Committee can make a finding of professional misconduct or incompetence. It can then impose a penalty, which may include: Revocation of the member’s certificate of registration; Suspension of the member’s certificate of registration; Terms, conditions and limitations on the member’s certificate of registration; A reprimand; and A fine payable to the government of Ontario.
{ "source": [ "https://skeptics.stackexchange.com/questions/54227", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/56721/" ] }
54,230
The Wall Street Journal claims that this document is a legitimate initiative of Stanford University. It recommends, for example, that the term "blind study", widely used in experimental trials to avoid bias, should be replaced by "masked study" because it "Unintentionally perpetuates that disability is somehow abnormal or negative, furthering an ableist culture." Though it is really hard for me to see by what logic that conclusion is derived. OK, it also recommends against the use of terms like "retard" which seem perfectly reasonable as the word is undeniably prejudiced in most actual use describing people (but is also a perfectly useful neutral term in physics or mechanics). But it also calls for a stop to the casual use of the word "guru" as this might offend Buddhists and Hindus where it is a term of respect. I thought it was a term of respect in most uses. It also recommends not using terms like "black-box" or "cakewalk" for reasons I find hard to take seriously. Is this a serious document? Is it a subtle parody? PS There are some issues about how the original story was reported in the media as an outrageous overreach by woke academics. The WSJ neglected to mention some details about the source to make it seem more like a Policy Statement by overly woke university academics. What they should have noted was that it was more like a "style guide" produced by the internal IT team (see the explanation in this blog from Stanford Academic Adrian Daub .) This observation doesn't alter the question but it might mitigate the degree of anti-woke outrage some seem to feel about it.
While Brian's answer was correct when they posted it 13 hours ago, as of now Stanford has apparently backpedaled: The feedback that this work was broadly viewed as counter to inclusivity means we missed the intended mark. It is for this reason that we have taken down the EHLI site. Source: the same link as in Brian's answer. EHLI is "Elimination of Harmful Language Initiative", Stanford's name for the initiative in question. So: the website clearly isn't a spoof, but Stanford also seems less than committed to and convinced by the whole thing. To provide a little more context as an academic working in IT: there has been an ongoing debate about technical terminology such as "master/slave" or "blind studies" for a while now, so this initiative certainly did not come out of nowhere.
{ "source": [ "https://skeptics.stackexchange.com/questions/54230", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/3943/" ] }
54,269
Theoretical physicist Richard Feynman is often credited with the quote Anyone who claims to understand quantum theory is either lying or crazy. Example of the claim. Is this apocryphal, or did Feynman (or another notable physicist) actually say that? Note: in November 1964 Feynman similarly stated I think I can safely say that nobody understands quantum mechanics. (This is sometimes quoted to lend authority to mysticism or anti-intellectualism, but in context it seems to be voicing his lack of an intuitive, classical-mechanics-based explanation for why nature obeys quantum laws. This was also the same month that Bell's theorem was published, so contemporary physicists still anticipated uncovering a classical explanation of quantum theory.)
I can’t find hard evidence for the “lying or crazy” quote, but as OP already mentioned, Feynman said something very similar in content, if perhaps less colorfully phrased: I think I can safely say that nobody understands quantum mechanics. Here he is on YouTube saying it. Here is a discussion of what he meant. Here is a 2019 New York Times opinion piece with a modern take on the issue by Sean Carroll, a well known physicist. Because of the similarity between the two quotes, I don’t think it’s of material significance whether he actually made the more colorful statement. It expresses an identical sentiment to what he is shown saying in the video. As the discussions I linked to above illustrate, the common view among physicists and mathematicians is that while quantum mechanics works amazingly well as a way of predicting the results of experiments, and is a phenomenally successful model of how the physical universe behaves, there is still something fundamentally mysterious and unintuitive about what its predictions really “mean”. That’s likely the sentiment that Feynman was trying to capture with his quote. The point is that he meant something pretty specific. The reason that this quote has such potential to be used out of context and in misleading ways is that there isn’t an objectively correct definition of what it means to “understand” something. At a practical level, if we can use quantum mechanics to build amazing technological inventions such as lasers, MRI machines, and much more, then we can claim to understand it pretty well. That doesn’t mean we understand everything that we would like to understand, or that it’s some kind of heresy or admission of failure to express frustration about the aspects of the theory we don’t understand, including through pithy, colorful statements. The issue about the slippery and subjective nature of “understanding” in science is illustrated quite well by another quote from a famous mathematician and physicist, John von Neumann, who once said to a colleague: “Young man, in mathematics you don't understand things. You just get used to them.” See the discussion here . Sorry for editorializing in the above couple of paragraphs, I realize some people might object that this is off-topic, but I thought it was important to discuss not just the literal question of whether Feynman said something, but also the implied question of whether what he said actually means what some of the people citing his quote seem to think it means.
{ "source": [ "https://skeptics.stackexchange.com/questions/54269", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/67127/" ] }
54,270
Twitter account @EndWokeness has claimed (in a tweet with almost 5 million views, 6000 retweets, 500+ quotes and 30k+ likes) that Joe Biden has a seventh granddaughter This is Joe Biden’s granddaughter He still pretends she doesn’t exist despite the fact that DNA tests have proven that Hunter is her father This doesn't match some reporting - e.g. Yahoo Insider says "Meet his 6 grandchildren" Does Joe Biden have a granddaughter from Hunter Biden, the existence of which he denies?
Joe Biden has 7 biological (and legal) grand-children. Between Joe Biden's four children (two of whom are deceased), there are six legitimate grand-children:
Two from Beau Biden: Natalie and Robert Hunter II
Four from Hunter Biden: Naomi, Finnegan, Maisy, and Beau
In addition, Hunter Biden has been declared the biological and legal father of a fifth child (therefore Joe Biden's seventh grand-child), identified in court as "NJR". The court case establishing paternity and agreeing maintenance payments was widely reported in late 2019 / early 2020, e.g. CNN, CNBC, New York Post.
{ "source": [ "https://skeptics.stackexchange.com/questions/54270", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/60414/" ] }
54,290
On January 6, Ohio Gov. Mike DeWine signed legislation that legally defined natural gas as a source of "green energy." In a January 10 opinion piece U.S. representative Troy Baldwin (of Ohio) argues the case that natural gas is green, including this statement (emphasis added): In Ohio, natural gas has already reduced carbon emissions from power generation by 38 percent. Further, increased production and usage in the United States of this green energy is key to helping lower emissions and to reaching emission reduction goals. Is this a true statement? The words "38 percent" are linked in the article to this page from the U.S. Energy Information Agency , which is just a list of data tables showing energy-related CO 2 emissions by state. The data starts in 1970, so that would seem to be the starting year for the supposed 38% reduction.
From the linked tables , Ohio's carbon production fell from a high of 296.5 million metric tons in 2013 with regard to energy production to 185.6 million metric tons in 2020. That's a 37.4% reduction, which is close enough to 38% to make the claim true. The issue with this claim is that compared to coal, natural gas is quite "green". Coal is mostly carbon while natural gas is mostly hydrogen (by atomic count). However, compared to solar or wind or nuclear power, natural gas is not anywhere close to "green". Ohio used to use a lot of coal-powered electrical generation plants, many of which are no longer economically viable, so it was not all that hard to reduce carbon emissions by 38 percent by switching from coal to natural gas.
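As a quick check of the percentage quoted above (a worked calculation using only the two figures cited from the linked EIA tables; the rounding is mine):

$$\frac{296.5 - 185.6}{296.5} \approx \frac{110.9}{296.5} \approx 0.374 \;\Rightarrow\; \text{roughly a } 37.4\% \text{ reduction}$$

which rounds to the "38 percent" claimed in the opinion piece.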
{ "source": [ "https://skeptics.stackexchange.com/questions/54290", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/38403/" ] }
54,327
I read from several places that Bertrand Russell spent many pages in Principia Mathematica to prove 1 + 1 = 2, e.g. here said "it takes over 360 pages to prove definitively that 1 + 1 = 2 ", while here said 162 pages. I do not believe that is the case, however, as I don't see why you'd need to prove 1+1=2 in the first place. But Wikipedia's article for Principia Mathematica mentions: "From this proposition it will follow, when arithmetical addition has been defined, that 1 + 1 = 2." – Volume I, 1st edition, p. 379 So did Bertrand Russell actually spend 360 pages proving that 1 + 1 = 2? What did Bertrand Russell want to accomplish by doing that?
If you have only studied mathematics at school, the way it works at university/academic level can be quite alien. By looking at the original Principia Mathematica , by Alfred Whitehead AND Bertrand Russell (e.g. this large PDF ), we can confirm the claim. It isn't until page 359 that the concept of "2" is introduced (as a "cardinal couple" - it isn't until later that they show that this is equivalent to the cardinal number, 2, that we are familiar with.) On page 362 there is the quoted claim that Proposition 54.43 provides the basis for 1 + 1 = 2 It is worth noting: Whitehead & Russell don't spend 360-odd pages just adding two numbers together, like you were taught in school. They spend the treatise defining what was hoped to be a complete and consistent basis for all of mathematics. That means they weren't just proving that 1+1=2 (under their system of mathematics) but also defined (amongst a lot of other propositions) what "1", "2", "+" and "=" meant. They based this on a minimum set of "axioms" or assumptions. They tried to avoid allowing paradoxes and contradictions [before Kurt Gödel came along and proved that to be impossible.]
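For readers curious what the cited proposition actually states, ∗54.43 is usually paraphrased in modern set-theoretic notation roughly as follows (this is a paraphrase, not Whitehead and Russell's original dot notation):

$$|\alpha| = |\beta| = 1 \;\Longrightarrow\; \big(\,\alpha \cap \beta = \varnothing \iff |\alpha \cup \beta| = 2\,\big)$$

That is, if α and β each have exactly one member, then they are disjoint exactly when their union has two members. Only once arithmetical addition has been defined later in the work does this yield the familiar 1 + 1 = 2, which matches the wording quoted from Wikipedia in the question.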
{ "source": [ "https://skeptics.stackexchange.com/questions/54327", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/66003/" ] }
54,342
Eric Brown writes in How Marketing Created the Most Important Meal of the Day : In 1944, a marketing campaign for Grape Nuts would be unleashed called “Eat a Good Breakfast — Do a Better Job”. During radio ads, it would be mentioned, “Nutrition experts say breakfast is the most important meal of the day.” This marketing phrase became soaked into our lexicon ever since. As you can imagine, this made the cereal industry powerful, giving them a monopoly over breakfast. Is the claim that this marketing campaign is central for the belief true? Did the belief not exist before 1944 and was common afterward?
The expression predates that These may not be the oldest examples (especially with so many variations of the expression), but these examples predate the radio entirely and even nearly predate Grape Nuts itself. Saints Herald , 1887: We commend the reading of the following; it contains a deal of good sense: "EAT BEFORE YOU DRINK. "A large proportion of intemperance in the use of stimulants," philosophized a physician, "may be laid to the light breakfasts eaten by most people. Breakfast is the most important meal of the day and sufficient importance is not attached to it in the majority of households." That was reprinted in Good Housekeeping , 1889. Donahoe's Magazine , 1897: The breakfast is the most important meal of the day, and the thought and care bestowed upon its preparation will do much to make the day happy. But opinions at the time varied Nobody has ever thought that lunch was important (except maybe the French ). Dinner, on the other hand... The Illustrated Oarsman's Manual (1874?) says Dinner, whether taken in the middle of the day or late in the afternoon, is the most important meal of the day. When did it become popular? Google Ngrams shows that growth of the expression was exponential, but 1944 was on the long tail of that. It saw a slight boost in the 1970s and took off in the 1990s. I didn't have time to sift through all the timeframe, but a cursory glance shows that it's the motto of several breakfast companies, such as Ovaltine . Kellogg's went a step further and funded research so they could push that conclusion.
{ "source": [ "https://skeptics.stackexchange.com/questions/54342", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/196/" ] }
54,372
In September 2022, the Nord Stream natural gas pipelines off the coast of Denmark were sabotaged Seymour Hersh , a Pulitzer-winning investigative journalist, recently published a blog post, How America Took Out The Nord Stream Pipeline reconstructing a top-secret CIA operation allegedly behind the sabotage. Last June, the Navy divers, operating under the cover of a widely publicized mid-summer NATO exercise known as BALTOPS 22, planted the remotely triggered explosives that, three months later, destroyed three of the four Nord Stream pipelines, according to a source with direct knowledge of the operational planning. The reconstruction the post appears very detailed but it is difficult to confirm it with publicly-available information. Did the CIA conspire to sabotage the Nord Stream pipelines?
There is a good OSINT (open-source intelligence) based assessment of Hersh's claims by Oliver Alexander, which you can read in full at Blowing Holes in Seymour Hersh's Pipe Dream . The executive summary is: When first reading through Hersh's account of the events, the level of detail he provides could add credence to his story. Unfortunately for Hersh's story, the high level of detail is also where the entire story begins to unravel and fall apart. It is often stated that people who lie have a tendency to add too much superfluous detail to their accounts. This attempt to "cover all bases" is in many cases what trips these people up. Extra details add extra points of reference that can be crosschecked and examined. In Hersh's case, this is exactly what appears to have happened. On the surface level, the level of detail checks out to laymen or people without more niche knowledge of the subject matter mentioned. When you look closer though, the entire story begins to show massive glaring holes and specific details can be debunked. It isn't really possible to summarise the analysis without quoting the whole thing, but the gist is that Hersh provides various details that look like they give credibility to the story that turn out to be incorrect:
NATO General Secretary Jens Stoltenberg is said to have worked with the US intelligence since the Vietnam War, although he was actually only 16 when the war ended.
The sabotage was supposed to be done by units involved in the BALTOPS 2022 exercises. Divers were supposed to have deployed from a "Norwegian Alta class mine hunter", but no Alta class vessels were involved in BALTOPS 2022.
Hersh claims that a "Norwegian Navy P8 surveillance plane" on a routine flight (and thus visible to open source flight tracking) would drop a sonar buoy which would be used to detonate the charges. The Norwegian Navy doesn't operate any P8s. The Norwegian Army has taken delivery of some, but they aren't due to enter active service until later in 2023, and weren't in use by any Norwegian forces in 2022.
Much of Hersh's narrative seems to assume that the explosions of the Nord Stream 1 and Nord Stream 2 pipelines were close together. In fact the explosion sites were 80 km apart.
In summary, on several of the points which can be checked with publicly available information, the information contradicts Hersh's narrative.
{ "source": [ "https://skeptics.stackexchange.com/questions/54372", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/48414/" ] }
54,379
On the 12th of February 2023 Douglas Murray, author and political commentator, wrote the following tweet: Evidence from official UK government report shows radical far-left group 'Hope Not Hate' (@hopenothate) is in fact a radical extremist hate-group. The tweet then references the following Mail Online news article: 'Rees-Mogg claims Brexit led anti-terror body to link him to far-right'. The Mail Online article makes no such claim that an official UK government report recognises 'Hope Not Hate' as a radical extremist hate group. The only reference to 'Hope Not Hate' in the Mail Online article is the following: Those on the course were handed an essay by the Hope Not Hate campaign group which flagged up columns by Douglas Murray at the Spectator magazine, Rod Liddle at the Sunday Times and Melanie Phillips on the Times. Essentially the Mail Online references the Independent Review of Prevent by William Shawcross CVO, which reported that a Prevent workshop had used a report published by 'Hope Not Hate' called State of Hate 2020: Far Right Terror Goes Global. In the report Douglas Murray is listed as a person responsible for the "normalisation and mainstreaming of Islamophobia". The follow-up response to the Independent Review of Prevent also makes no reference to 'Hope Not Hate'. Only some of the replies to the tweet question the validity of the claim made by Douglas Murray. Would this be an example of blatant misinformation?
I am addressing the claim that the UK government officially recognises HOPE Not Hate as a radical extremist hate group. The UK Government tracks hate crime statistics but doesn't track hate groups. Neither does the FBI in the USA - these groups are monitored by non-profit organisations like the Anti-Defamation League, the Southern Poverty Law Center, the Canadian Anti-Hate Network and... yes: HOPE Not Hate. [Ref: The report mentioned in the question]. What the UK Government does maintain is a list of Proscribed terrorist groups or organisations. It is of no great surprise that HOPE Not Hate, who "exists to challenge all kinds of extremism and build local communities" (according to their own site), do not appear on that list. This is not the first set of accusations against HOPE Not Hate by right-leaning public figures. Former leader of UKIP Nigel Farage withdrew accusations that they were linked to extremism and pursued violence after a libel case against him was mounted. As expressed in a comment, I am not convinced the original claim actually meant that HOPE Not Hate was officially listed as an extremist organisation by the UK government. It may have merely meant that Douglas Murray felt that their report itself demonstrated that, to his mind, they were an extremist hate-group. If so, this is his political opinion, not a statement of fact. Both his political opinion and my opinion of his opinion are off-topic here.
{ "source": [ "https://skeptics.stackexchange.com/questions/54379", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/67360/" ] }
54,401
An episode of Last Week Tonight claims that the US Patent Office once patented a stick: Okay, but we shouldn't necessarily trust the US Patent Office as an arbiter of good judgement. It's the same place that issued patents for an umbrella for your beer, a tricycle with a lawnmower attached to it and... a stick. That's it. Just a stick, for animals to play with. Someone in the patent office saw that and said "we've never seen a stick before... approved." It seems to be referring to this patent . It seems legit, yet I find it hard to believe Oliver's simplified version of events. I'm betting there is some nuance to this story that's been omitted. Question : Did the US Patent Office issue a patent for a stick?
The patent office did issue the patent, but later determined that none of the claims were patentable and therefore cancelled all the claims. From the "Legal Events" section:
Date | Code | Title | Description
2002-07-23 | CO | Commissioner ordered reexamination | Free format text: 20020606
2005-06-03 | FPAY | Fee payment | Year of fee payment: 4
2006-07-04 | FPB1 | Reexamination decision cancelled all claims |
The PDF version of the page says: EX PARTE REEXAMINATION CERTIFICATE ISSUED UNDER 35 U.S.C. 307 THE PATENT IS HEREBY AMENDED AS INDICATED BELOW. AS A RESULT OF REEXAMINATION, IT HAS BEEN DETERMINED THAT: Claims 1–20 are cancelled. The proceedings of the reexamination are available on the USPTO Patent Center page.
{ "source": [ "https://skeptics.stackexchange.com/questions/54401", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/25398/" ] }
54,436
According to CNN , 90% of ice around Antarctica has disappeared in less than a decade. In the video, CNN Climate Correspondent, Bill Weir states: ... the ice around the continent was growing. In 2014, it was about 7,000,000 square miles, but in less than ten years, the National Snow and Ice Center, out of Colorado, has confirmed: It has broken the record again. Now, it is just over 700,000 square miles. So, that's over 90% of that ice around Antarctica has disappeared in less than a decade ... Is this true?
Each southern summer, the sea ice around Antarctica melts. The peak of sea ice is usually in September and the trough in February. Currently, the sea ice is at a record low, as seen in red below. (Only the 3 highest years and 2023 are shown.) ( image source ) Taking the seasonal peak of the highest year, 2014, and contrasting it with the seasonal low of this year, CNN can come up with a 90% loss! A better analysis is the below graph, which shows the minimum for each year: (caption: "...The linear trend line is in blue with a 1.0 percent per decade downward trend, which is not statistically significant. A five-year running average is shown in red.") ( image source ) Note also that in the above data "sea ice" excludes not just ice on the land, but also excludes ice shelves which float on the sea around Antarctica. The ice shelf area is an additional 1.4 million sq km, which would make the data more constant if included.
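To make the arithmetic behind CNN's figure explicit (a worked calculation using only the round numbers quoted in the claim itself):

$$\frac{7{,}000{,}000 - 700{,}000}{7{,}000{,}000} = 0.9 = 90\%$$

but this compares a September maximum with a February minimum, i.e. a winter peak against a summer trough, which is why the like-for-like minimum-to-minimum comparison above shows no statistically significant trend.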
{ "source": [ "https://skeptics.stackexchange.com/questions/54436", "https://skeptics.stackexchange.com", "https://skeptics.stackexchange.com/users/25579/" ] }
105
I've asked a lot of questions, but very few of them have any answer that nicely covers the whole issue, or that is so much higher quality than the other answers that it deserves to be highlighted. My current accept rate is only 34%. Should I increase it by going back and selecting the answer with the highest number of votes, or doesn't accept rate matter on this site? Related: Why would anyone accept an answer?
I would suggest that we remove it from the OP user box, since it's irrelevant to the effort that the user puts into Programmers. This way, SO people who join won't have a mindset that <50% = not worth answering.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/105", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/86/" ] }
171
What's the difference between Programmers and Stack Overflow?
From Introducing Programmers.StackExchange.com : In a nutshell, Stack Overflow is for when you're in front of your compiler or editor working through code issues. Programmers is for when you're in front of a whiteboard working through higher level conceptual programming issues. Stated another way, Stack Overflow questions almost all have actual source code in the questions or answers. It's much rarer (though certainly OK) for a Programmers question to contain source code. Additional Resources:
Programmers.SE FAQ
Stack Overflow FAQ
Good Subjective, Bad Subjective
The Six Subjective Question Guidelines — Enforcement Notice
Real Questions Have Answers
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/171", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1112/" ] }
213
One of the topics that's come up several times on Meta.StackOverflow is the perception that Programmers.SE is a joke proposal, existing solely to help keep Stack Overflow free of all the "crappy" questions that it gets plagued with every day. A few of examples of this perception ( note: see Edits 2 and 3 below for clarification ): From Is the Software Engineering site proposal a duplicate of programmers.SE? : You absolutely nailed my resentment of Programmers.SE; it was created as a tongue-in-cheek, NOT-Stack-Overflow site: the stuff we keep in the basement because it's not fit for keeping upstairs. — Robert Cartaino The problem is that Programmers.SE was created as an amorphous blob where nothing is off topic (read: everything not Stack Overflow is on topic). The travesty is that Programmers.SE is, at least in perception, [your quote] "an ocean of jokes and polls." That's where the problem lies. I would have created the site very differently: Subjective/soft programming topics okay; Industry/engineering issues okay; The business of software okay; Banal polls/jokes off-topic. — Robert Cartaino From Programmers on stackexchange : So what you are saying is that Programmers.SE is the skin of evil left behind to free the race of titans from the bonds of destructiveness: http://youtube.com/watch?v=Ims5W5bJl9s&feature=related#t=11s — Robert Cartaino If you browse questions on Meta for a bit, you'll quickly realize that there are a fair number of people who want to use SO for things that SO wasn't designed to be used for: discussions, polls, flame-wars, endless lists... And over time, this has created some amount of irritation on the site: some folks really, really want to, um, participate in questions like, What's the single best way to type code? Do you hurt sometimes (as a programmer)? How can I get a job drawing cartoons featuring the funniest programmer joke about my favorite hidden feature in the worst language ever (on a boat)? So rather than just stamping them out, like so many ducks putting out flaming elephants, some rose-tint-spectacled user got the idea of proposing a site for all the questions that shouldn't be asked on SO . Programmers.SE is that site. Soon, it'll have its own set of rules and standards, and the cycle will begin anew. But until then, it remains an anarchic paradise of freedom and love. — Shog9 I'm not going to challenge the premise behind the sentiment: in many ways, that was the purpose of the proposal, and Programmers.SE is flourishing despite it. But I'm wondering if there's a way we can spin the content of this site more positively rather than being merely a wretched hive of scum and villainy. Edit One of the things that's popping up here is we should leave Programmers.SE alone because it's popular and as Nathan Taylor puts it, "Haters gonna hate." I don't think the content or community Programmers.SE needs changing per se . Rather, I think we just need to provide a clear way to show value to SOIS and others, because the current perception is that we don't, and this site is more or less a joke that's gotten out of hand. So how can we do that? 
Edit 2 Looks like SOIS has decided to publicly address the banality of Programmers.SE on its blog: There’s an even longer list of things that really belong on the new Programmers Stack Exchange, which appears to be degrading into fairly stupid water-cooler nonsense, and could benefit from an infusion of more meaty subjects, like these proposals: I've created a new question, What questions are on topic, and what are off topic? , to see if we can hash out the proposed merges and other off-topic questions that have come up so far. Edit 3 Robert Cartaino clarified his position and the quotes above in the comments: I don't actively despise the concept of Programmers.SE at all. Read my quotes (above) very carefully. What I lament is the perception created; the "anything goes" ethos that simply aims to flout what makes Stack Overflow great. I wanted a site about the "Programmers' Life: a site for discussion of the business, careers, issues, and memes concerning professional developers." Subjective talk and soft topics okay if it follows a "back it up" philosophy or hard-earned experience. What I got instead was "what is the coolest/stupidest/weirdest/funniest thing you saw/did/tasted today?" Software Q&A will live on SO, Programmers, and a few academic sites. That's it. Not dozens of ridiculously niche and redundant proposals. But the mere mention of merging proposals with Programmers horrifies most. "I don't want my subject on that site." That's a huge problem. Programmers.SE is the "Park Place" to Stack Overflow's "Boardwalk"; a tremendously valuable resource in a bad neighborhood. I'd prefer to paint over the graffiti and fix broken windows; to reestablish community pride so the police come out on occasion. But that's a far cry from "hater" or "actively despise. Edit 4 Jeff Atwood has created two topics with the basic premise that Programmers.SE as it is now is too undisciplined, and SOIS will be taking steps to correct that: Adding discipline to programmers.stackexchange.com Should “Developer Testing” be folded into a more general “Programmers” site?
One idea I had was to position ourselves as a sort of mechanical turk: yes, there is a proliferation of polls and greatly subjective questions, but that has value to a passerby because it's a great forum to get a variety of possible opinions about something. Take Job hopping, is it a problem? , for example. There are a ton of different opinions, some similar to others, some completely divergent. There's very little evidence to support any one answer's perspective as being the canonical answer. But the value in the question is being able to accumulate all possible perspectives on the issue in one shot so one can synthesize their own answer to a related question. This could even be applied to the categorically silly questions like What is your favorite “programmer” cartoon? : if I was looking for programming cartoons for whatever reason, I could spend a lot of time performing Google searches, or I could just ask "What's your favorite programming cartoon?" on Programmers.SE and relatively quickly have a fairly-curated list of them. Another example that would be beneficial to Stack Overflow (while still being off-topic there) are questions like What is your opinion of the new Java 7 features? . You can ask here to get a list of various opinions about a specific programming topic to get up to speed and so you can create an informed question or answer on Stack Overflow. In short, I think we can keep being the garbage dump for Stack Overflow and still have some selling point when it comes time for Programmers.SE to be reviewed as being viable.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/213", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/-1/" ] }
655
Possible Duplicate: New Design Launched First, I want to thank you for your valuable feedback for the first design concept . Programmers.SE is a site about people and conceptual design more than coding. The old design simply did not reflect that; so it failed. I talked to Jeff about this, and he agrees that Programmers.se is much different from Stackoverflow. I quote him: Stack Overflow is when you sit down in front of the compiler, Programmers is when you sit down in front of the whiteboard Dan Grossman and Jon Purdy also thought a whiteboard theme was more appropriate from their comments regarding the first design review, and I agree. Since late last week I've been working on a whiteboard theme, and I'm pleased with the outcome. I feel it gives the site a more positive and personal touch, and has some similarities from the Beta theme. (click to see full res version.) The handwritten font in the title, top nav and section header is House Whiteboard . Yes, it is actually Hugh Laurie's handwriting. ("It's Not Lupus!") For the question title typeface, I tried a few "handwritten" ones; they looked great visually, but they scored rather poorly on readability. I eventually went with Yanone Kaffeesatz . I feel it has that soft look, and works well with the whiteboard theme. Please let me know what you think. bonus: (the making of)
This design is AWESOME. I think it captures the spirit of Programmers and sets us apart from SO (IMO that's a good thing). I'm not even going to offer any suggestions - Spot ON.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/655", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1685/" ] }
2,948
This site is called programmers.stackexchange, while the FAQ states that this site is about "conceptual questions about software development". Since a large number of people never read the FAQ, or don't read it before posting, this means we have a community of people who think the site is about Programmers, but a team of moderators who try to maintain a site on "conceptual questions about software development". This causes a large number of unhappy people who don't understand why their questions, or other questions they think are good, get closed, downvoted, etc. So my suggestion is to either rename the site to something that matches its FAQ, like softwaredevelopment.SE, or change the FAQ so that this site is about Programmers, not software development only. Personally, I hate the suggestion of renaming the site. I want to participate in a site about programmers, not software development. But as long as the site is called one thing, and the FAQ states another, we're never going to be at peace. Edit #1 I have been told by a few different users to create separate specific requests to try and implement these changes, so: Here's the proposal to change the FAQ so it matches the site name. Here's the proposal to change the site name to match the FAQ. Edit #2 Since posting this, I have learned the history behind the site scope change, and doubt SE will ever allow it to go back to simply being a Q&A site for programmers to get answers from other programmers. So please, change the name of the site to something that better reflects the site's scope, to reduce the number of frustrated and confused users we have posting bad and off-topic questions.
No answer so far states the opinion defined in the question, so I am posting one The site name matters. It is the first, and sometimes only thing someone judges a SE site by, so I feel it should accurately represent what the site is about. If a descriptive site name does not match its content, then the site is not at its full potential since they are losing what could potentially be a large part of their user base. The loss could occur because the content is not what the user expects, so they leave, or it could occur because users actually searching for a site like the one you provide are not finding your site, or maybe are seeing it, but are assuming it is something else. Cooking.SE is a site about cooking, Databases.SE is a site about databases, and Bicycles.SE is a site about cycling, however Programmers.SE is NOT a site about programmers. It is a site about conceptual, whiteboard-styled software development questions. The best analogy I can think of is creating an Athletes.SE , which is only for conceptual questions about sports, while questions about athletes themselves, or technical questions about sports are off-topic. It's very confusing, especially to new users. So either change the site name to something like softwaredevelopment.SE to accurately reflect what the site is about, or change the site scope so that this is a Q&A site for programmers, about programmers.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/2948", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1130/" ] }
3,136
From what I was seeing, overall question quality is going downhill, so on that hunch I ran the following query using Stack Exchange's Data Explorer:
Select CAST(year(p.CreationDate) as char(4))+'-'+CAST(DATEPART(wk,p.CreationDate) as char(2)) as week
      ,count(1) as Cnt
      ,avg(score) as Avg
      ,avg(VoteCount) as AvgVoteCount
from Posts p
JOIN (Select postid, count(1) as VoteCount from Votes group by postid) v on v.postid = p.id
where p.CreationDate > dateadd(yy, -1, getdate())
group by year(p.CreationDate), DATEPART(wk,p.CreationDate)
order by year(p.CreationDate), DATEPART(wk,p.CreationDate)
I then graphed the count and average: our average question score is down a full point over the last year. What's going on? Edit: I updated the SQL and added average vote count. Edit 2: For Thomas, SO results
I think we are. The site scope was changed dramatically from its original proposal, without the consent of the Programmers community, which drove many users away. Since then, strict enforcement of keeping questions on-topic with the new site scope, combined with the fact our site name doesn't match the FAQ has been causing a lot of users to misunderstand what the site is meant for, and to lose interest and stop asking/answering questions. For example, here's a graph of our new user growth . It remains fairly steady, because many programmers are interested in a site for programmers: And here's a graph of new questions asked . It is going down hill fairly steadily for a while now, and I don't believe it's because all the good questions have been asked/answered. I think it's instead people being aware of stricter rules, and/or being uncertain about if they should post something for fear of just getting downvoted and closed. It should be noted that the above graph shows posts which were closed as of 5/10/12 (which is when I noticed I was including answers in my old graph). To view a graph of actual open/close post history, use this query , which was added in May. It should also be noted that some of the larger spikes of posts getting closed can be attributed to the massive cleanup that some users undertook. Our number of votes is also declining, which is the result of less user participation. I can't speak for others, but personally I've gotten frustrated with the way our site is going and have kind of given up. I don't think I'm alone either judging by questions on meta, and the disappearance of several formerly-active users. This site is yours, so you can run it however you want, however I strongly believe that the downward nature of the graphs is caused by a lot of misunderstanding and confusion about the site scope, and the way SE treated the Programmers community when they decided to change the site scope without their consent. The site was meant to be a site for programmers about programmers, and it has since changed to a site for programmers about software development only. A lot of old active users have lost interest, and a lot of new users don't understand the site scope and ask low quality questions.
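Since the question's query counts all posts, and the answer notes that answers were at one point accidentally included in the graph, here is a minimal sketch of how the aggregation might be restricted to questions only, with closed questions counted separately. It assumes the standard SEDE Posts schema (PostTypeId = 1 for questions, a nullable ClosedDate); the aliases and thresholds are my own, not taken from the original query:

-- Weekly question count, average score, and closed-question count
-- for the last year, questions only (PostTypeId = 1).
SELECT
    DATEADD(week, DATEDIFF(week, 0, p.CreationDate), 0)        AS WeekStart,
    COUNT(*)                                                   AS Questions,
    AVG(CAST(p.Score AS float))                                AS AvgScore,
    SUM(CASE WHEN p.ClosedDate IS NOT NULL THEN 1 ELSE 0 END)  AS ClosedQuestions
FROM Posts p
WHERE p.PostTypeId = 1
  AND p.CreationDate > DATEADD(year, -1, GETDATE())
GROUP BY DATEADD(week, DATEDIFF(week, 0, p.CreationDate), 0)
ORDER BY WeekStart;

Separating closed questions this way helps distinguish the clean-up sweeps mentioned in the answer from a genuine decline in new questions.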
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/3136", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/7748/" ] }
3,412
How did the topic of this site change from "Not-Programming-Related" to "conceptual questions about software development"? I've tried looking through meta questions to find where the change in site scope was discussed, but haven't found much. The biggest change in site scope I see on meta is the Six Subjective Question Guidelines — Enforcement Notice, which doesn't really discuss site scope at all other than rules for subjective questions. If this was a community decision, I would expect there to be a lot of meta questions related to the site scope change. If this was an executive decision, I would expect to see some kind of notice posted that got a lot of attention. Maybe I'm just using the wrong keywords in my meta searches. Can someone explain the history of how this site changed from the original "Not-Programming-Related" proposal to be a site for programmers about issues not directly related to programming, to the current site scope of "conceptual questions about software development"? Update Mark's answer provides the timeline I was looking for, along with some great links, however Walter's comment pretty much sums it up: There really wasn't discussion on Meta about the site change, it was an edict from above saying this must change... and so it did. This can further be seen in the links Mark provides where site scope was discussed on P.SE meta, and the general consensus was not to change anything, other than to crack down on some of the nonsense questions that just had "as a programmer" tacked on. To quote Mark: A few days ago, I created the question, "How can we avoid Programmers.SE from becoming the SE black sheep?". There, the consensus was to not change anything. Unfortunately, SOIS has spoken, and it looks like that sentiment will not be able to sustain the site. While I am disappointed in this, at least now I know where the change came from. I actually didn't realize the change in site scope went that far back, because I never saw any kind of announcement telling us that NPR was no longer going to be tolerated. The only thing I saw was an announcement about subjective questions guidelines, which I thought was the attempt to crack down on the "as a programmer" nonsense questions, and many subsequent battles on meta over site scope where it seemed the users had one opinion, and the moderators had another. I did see the blog post when P.SE finally left beta, however I thought it was merely trying to advertise the more constructive side of P.SE, not that the NPR side was no longer valid.
The timeline of this was:
June 3, 2010 — NPR proposal created in Area51
September 1, 2010 — NPR enters private beta as "Programmers" (will continue to call it NPR for the sake of distinguishing it from the current site)
September 7, 2010 — NPR enters public beta
September 13, 2010 — MSO starts to grapple with how bad NPR is when trying to dupe Software Engineering proposals to it
September 15, 2010 — We on Meta Programmers.SE start to deal with that perception
September 17, 2010 — SE makes a blog post announcing the decision to start merging programming-related proposals to improve NPR
September 18, 2010 — We begin to redefine the scope of the site
September 23, 2010 — MSO and SE do the same, with different results
September 29, 2010 — "Good Subjective, Bad Subjective" is published, new scope and guidelines begin to be enforced, and the free-for-all NPR proposal dies. It's replaced with Programmers, the site for expert programmers who are interested in subjective questions on software development.
December 16, 2010 — Programmers launches, and the position statement is tweaked from "subjective questions" to "conceptual questions" to make it easier to explain why it exists compared to Stack Overflow.
December 17, 2010–Present — Clean-up of the colossal mess left over from beta, including incorrect UI strings and FAQ wordings, crappy questions, awful tags, misconceptions, and so on.
Within 29 days the site went from programmers hanging out and discussing their lifestyle to programmers discussing software development issues, now over 18 months ago, which is why I'm always surprised people still cling to that old proposal. I would've thought they'd all move on by now: the writing was on the wall almost immediately after the site left private beta.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/3412", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1130/" ] }
3,457
https://softwareengineering.stackexchange.com/users/26782/goma Two Questions: How did this user amass more than 2k worth of reputation? Does their history of asking non-constructive questions put them at risk of getting banned? UPDATE This user is suspended again for their activity.
The mods are privy to certain statistics about users that others do not have. Goma is an unusual case of somebody who goes above and beyond to continue spamming Programmers with garbage. He has used over 50 unique IPs, most of them behind proxy servers located in Saudi Arabia. He has opened multiple SE accounts with numerous unique email addresses. Previous attempts at notification and suspension have simply resulted in him changing IP and email address. He leaves a pretty reliable pattern though: almost always an IP address that originates in Saudi Arabia, among other tells. His questions are easy to spot too. I believe we keep this account open and merge duplicates into it so that we have a single consistent record of his activities for study and future consideration. This is why he has posted so many questions -- because many of them were on duplicate accounts that we had merged into Goma.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/3457", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1996/" ] }
4,013
@Rachel just pointed it out this morning. Pierre 303's account is gone! What happened to him? Was he at odds with the way Stack Exchange was being run? He was an all-star! The departure of a user of his caliber is a significant loss for this community! Can anyone explain? EDIT: If Pierre chooses to come back in a year, can we simply undelete his account?
Pierre contacted us and asked us to remove his Stack Exchange accounts. His reasons are his own, and unfortunately we weren't able to convince him to reconsider. His questions and answers here on Programmers were (and are - they're still around) valuable, and while I'm personally sad to see Pierre leave, I'm happy that he was around and shared his expertise with us for as long as he did.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/4013", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1996/" ] }
4,067
I recently came across the tag great-programmer . What does this even mean? Is this a garbage tag that should just be removed? For anyone working on the tag clean-up, please watch out for a few things: If it's a question that should be closed, don't retag - flag or vote. If it's closed and can't be saved, flag for deletion. If it's a good question, try to make as many edits at the same time as possible (consider the title, body, any other tag changes, and answers). Just don't flood the homepage with a bunch of minor edits to questions. It just buries new questions. If you aren't sure if you can save a question, feel free to discuss it somewhere - in this question or in chat. If an edit is going to be made, make it a good one. Just generally reduce the number of edits to a single question as well as the number of questions you edit at a given moment in time. Thanks!
It's a terrible tag and it should die an ignoble death. We had it on the chopping block before, but it slipped through the cracks while we were dealing with even worse tags that were far more prolific. Kill it. Kill it with fire.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/4067", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/483/" ] }
5,207
Since straight-up book recommendation Q&A is discouraged, book reviews seems like a perfect fit for the blog . And what self-respecting blogger would blog about a book without a link to an online shop where it could be purchased? However, this raises the thorny question of affiliate links. On the Q&A site, links to Amazon are automatically converted into links that reference Stack Exchange's affiliate ID, but the blogs have no such restriction - in theory, an author could use his own affiliate ID. How does the community feel about our blog writers earning a bit of money when writing articles for our blogs in this fashion? It should be noted that all articles get reviewed by a few different users before posting, and we would make it a requirement that the articles are truthful about the book (or whatever product is being reviewed). We also probably wouldn't publish too many of these kind of articles, as we don't want the blog to become just a series of sales pitches out to sell products.
Adding affiliate links or any type of commercial or promotional activity to blog posts would not be appropriate. These blogs are not the personal property of those who write the articles. These blog posts are supposed to be a resource for the community. Once folks start wondering "what kind of money you could earn if…" the motivations start to get muddled. Obviously, we cannot provide equal time for everyone who has a product to review or a book to mention. So even if we could assure all the stipulations you cited above, there is just too much potential for stepping over some arbitrary line. Folks here are willing to contribute their time and their knowledge into creating these resources. Whether they do it for fun, or to share their knowledge, or just to show off a bit, I wouldn't want to be put in the position of saying their activities have become too self serving. It's best to have a clear and unambiguous line up front. No affiliate links in blog posts.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/5207", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1130/" ] }
5,654
I'm going to try and keep this short, because a lot of it has already been discussed. If you are interested in the history, you can find it here on Programmers Meta, but here's a very brief recap: Programmers started out life as "Not Programming Related." It was supposed to be a haven for all of those "soft" questions that Stack Overflow didn't want. Stack Exchange decided to change the scope of the site, when it became apparent that the "Not Programming Related" concept was not working. But they waited a little too long. We lost a high-rep user, largely because "people factors" are no longer considered on-topic. We lost a good mod, who was probably tired of fighting the good fight. We've argued over the site scope ever since. Programmers is a site for professional programmers who are interested in getting expert answers on conceptual questions about software development. This is Programmers' site scope, taken directly from the FAQ: algorithm and data structure concepts design patterns developer testing development methodologies freelancing and business concerns quality assurance software architecture software engineering software licensing That's it. Now tell me, how does a question like Where can I find a printed copy of the C++ specification fit within this framework? I get the impression that the user community is asking themselves, "How can we make Programmers more friendly and more inclusive?" If that is the case, you're asking the wrong question. The question you should be asking is How can I make this site more friendly to experts? I am a moderator on Stack Overflow. What I have noticed is that people sometimes ask their conceptual programming questions on Stack Overflow instead of Programmers; when I ask why, they usually say "there are more people here to answer my question." What they really mean is, "the experts are here." Why are you here? Is it to talk about one of the bullet points above? Or is it to help people find books? Is it to help people design, architect and test their programs, or is it to talk about why end users are so unreasonable ? Do you really think the experts care why c languages use curly braces? There's a balance between helping people get answers to their questions, and keeping the site an interesting place for experts. There's nothing wrong with helping someone out, but too many questions that are only interesting to one person, or only interesting to non-experts, will drive away the experts. Without experts, there is no site.
As I've mentioned before, the blessing and curse of Programmers SE is that most all the allowed topics are interesting to most all the users. The curse part is that anytime there's a topic which not everyone is interested in, that topic gets pushed out, if not officially, then de facto by voting to close. StackOverflow is teeming with questions about Microsoft languages and frameworks. For the moment at least, I have zero interest in those questions, almost zero direct knowledge about them, and find them a complete waste of my browser space. However, I recognize that those questions are valuable and interesting to a lot of other people, and happily go on using StackOverflow because those tags are in my ignored list. However, that doesn't happen on Programmers. More and more, Programmers feels like StackOverflow would feel if only C++ questions were allowed. Any topic without a broad consensus is deemed off topic for everyone. I think this site would be better off if we broadened the scope considerably and encouraged people to use the tag system to filter out topics that are personally uninteresting. Consider the big four controversial topics: books, career, history, and getting started on a new technology. These are all things that one time or another almost all of us have gone down the hall to ask the advice of a more experienced programmer. I've been programming professionally for 15 years, and as an amateur 10 years before that. No one would consider me not to be an "expert" programmer, but if I took a .Net job tomorrow, the first thing I would do is ask an expert programmer which of the gazillion books out there on the topic are actually worth reading. Why there's so much confusion about this site's scope is that there are so many questions like that, that you would want to ask your colleague down the hall, but are inexplicably off topic here. So what if you might get 10 different answers from 10 different people? Those are expert opinions, and you've narrowed your options down considerably. The best answers get voted to the top, and countervailing evidence is expressed in the comments. The biggest thing holding back this site is the expectation that the only on topic questions are those that have a single clear and definitive answer. On the contrary, the best conceptual questions have more than one good answer. We have tools and a process to filter out bad answers. We shouldn't throw out the baby with the bathwater. We have a lot of questions that are deemed "unanswerable" or "not constructive" in the comments, then closed shortly after one or more excellent, constructive answers are posted. To me, that shows a fundamental misunderstanding about the kinds of questions both askers and answerers want to be on topic on our site.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/5654", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1204/" ] }
6,067
* This might best be answered by someone who fits this description. Usually when I am impressed by a certain user's answer, I will look at their profile to see what kinds of questions they are asking and to my surprise many times even high-ranked users have never asked any questions. How can this be? Certainly, they didn't get so far without asking questions. Are they relying on other sources to answer their questions? The other thoughts I had: Do questions no longer affect their reputation after they reach a certain rep? or Could it strictly be because a question upvote only gives you 5 points whereas an answer upvote gives you 10. Hopefully reputation wouldn't be the ulterior motive.
Speaking personally I'm at the stage in my career/life where I don't have many questions left to ask. What I do have is many years of experience in encountering the problems that other people ask about so I can offer my solutions, knowledge and experience to hopefully help them.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/6067", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/91765/" ] }
6,166
It is September once again (today is the 8755 th day of September ), and once again students are asking their homework problems on Stack Overflow and SoftwareEngineering.SE. We start seeing questions like: A car dealer has 10 salespersons. Each salesperson keeps track of the number of cars sold each month and reports it to the management at the end of the month. The management keeps the data in a file and assigns a number, 1 to 10, to each salesperson.... Write the code to store the number of cars sold by each salesperson in the array cars, output the total numbers of cars sold at the end each month, and output the salesperson number selling the maximum number of cars. (Assume that data is in the file cars.dat, and that this file has been opened using the ifstream variable inFile.) The first thing to understand is that we are not a code writing service. You can't just copy/paste a homework into the text area and expect someone to do your homework for you. Programmer education builds upon previous experiences. The compiler class has machine language and data structures as prerequisites because if you don't understand those, you will be hopelessly lost in the class and not even able to understand the lectures. Lets assume you do understand the code (the person answering the question did a good job explaining it)... the curriculum is designed to take you from A to Z with 24 steps between. As industry programmers we often take shortcuts and don't need say, steps ijkl to do something. Learning from us, you'll never get these steps. However you may find in your next assignment or class that understanding jk is assumed and critical to the understanding of some other concepts. Just because we don't need ijkl to do it doesn't mean it isn't understood. A programming class I took years ago used SPIM - a MIPS simulator. One of the students discovered a little-used DECStation in the lab that had gcc on it. Instead of writing the assignment (factorial) by hand with the concepts we had access to (we were supposed to write a recursive function to learn about the stack and frame pointer), he wrote it in C, compiled it with gcc -S and handed in the resulting MIPS assembler code. However, the compiler, recognizing an optimization, converted the entire code from a recursive subroutine into a for loop. He got a '0' on that homework and had trouble with the next one (which assumed you already understood the frame pointer and stack pointer). Copy and paste takes no skill. It cheats you out of the education you are paying to get. It cheats us of good interview candidates. Technical interviewers often complain about the quality of college graduates. You may be enthusiastic, but unless you can write code and explain concepts better than the other person, we're going to hire the other person. Your first resource to look at should be your instructor. They are there for you and want to have you follow a specific path to get to the end point of understanding. So, you've exhausted the resources. You've gone over your lecture notes. You've searched google. You've asked your peers and knocked on the TA's door during office hours. You've even tried asking your instructor. And you've come here... Don't expect an answer in any given time frame. The urgency of your question is not something we are concerned with. Good questions and answers are timeless - not something that needs to be done by 5pm today or 8am on Monday (you may find the rate of answers drops substantially on the weekends and evenings of various timezones). 
Describe the problem you are having, what your understanding of the problem is, and where you are confused. For a question from a student, the best questions are often the ones asking how to take a single step in understanding rather than trying to leap all the way to the solution.

Realize also that the answer we give you may be completely wrong for the path that your instructor is trying to get you to follow. Having previously fought through the problem ourselves, we know and understand when one can jump directly from i to l and when one needs to go through each of steps i, j, k, and l in a process. Our answers may skip over steps that aren't needed for this particular problem, but may be critical for understanding the next assignment or some problem years down the road where skipping j and k is the wrong answer. In many cases, it is important to follow the curriculum as best as you are able. Going above and beyond is good where one gains a deeper understanding of a problem domain, but one must have the foundation upon which to build.

We want you to do your homework to the best of your ability. Getting points off on an assignment and learning something from that produces a better interview candidate than one who can copy and paste code that got As in school but can't solve a simple problem they've never seen before.

If you decide to post your question anyway

Please make sure you read the tour and help center. Software Engineering focuses on software design and architecture. Questions about "how to write some code" or "help me debug this code" are off topic on SoftwareEngineering.SE, as they are issues with implementation rather than design. They may be on topic on Stack Overflow, but just posting the requirements or code and saying what amounts to "help me" is rarely enough for a good question - make sure you read How to create a Minimal, Complete, and Verifiable example before posting a question on Stack Overflow. If your question on SoftwareEngineering.SE is just a copy-paste of a homework problem, expect it to be downvoted, closed, and deleted - potentially in quite short order.
I agree with this open letter. I guess it's more directed at the people potentially answering this kind of thoughtless question, because the people asking them -- more often than not Help Vampires -- will never read meta, or the FAQ, or try to understand the issues involved. So I guess it could be rephrased to say: "potential answerers, don't encourage these questions with answers, but downvote and close them instead". Wasn't the purpose of Stack Exchange, as stated by Jeff Atwood, to increase the signal-to-noise ratio of the internet? These kinds of questions reduce the ratio, plain and simple. Don't encourage Help Vampires. Educate them, if you can, but know it's an uphill battle.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/6166", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/-1/" ] }
6,582
Difficulty with naming has a bit of a history in programming. There are two hard problems in computer science: caching, naming, and off-by-one errors. The topic has been repeatedly revisited on meta and elsewhere:

(P.SE Aug 2012) Question about naming conventions "not constructive"?
(P.SE Oct 2011) Are "name that thing" questions on-topic?
(P.SE Jan 2014) "Name that thing" questions
(MSO Nov 2013) What is the appropriate / preferred method for questions concerning programming nomenclature?
(Blog Feb 2012) Lets Play The Guessing Game

And yes, the guidance on these is confusing. So, please answer: Are terminology and naming questions on topic? If so, what should be expected of these questions so that they are not closed under a custom off-topic reason, as too broad, or as primarily opinion-based?
Yes, absolutely, these questions are on-topic, and here's why I think so:

1. knowing the right word enables you to find a precise definition, and to search through the literature for more background information
2. knowing the right word enables effective communication. If you don't know what a "free variable" is, I think it would be nearly impossible to have a good discussion about them
3. the answers are rarely opinion-based. Most have been established in the literature and are well-defined.
4. in the few cases where there are multiple possible names, the question provides an opportunity to cover the exact differences between what the names refer to

Criteria for good naming questions: here are some good answers. I would add that good naming questions:

focus on what already exists and has already been defined
aren't trying to come up with new names

Let's keep in mind that a bad question is a bad question, and a lazy asker is a lazy asker. Ultimately, good users will (probably) generate good content, and abusive users will game, abuse, and damage the system, no matter how complicated or precise the rules are.

Lastly, I have seen this comment appearing frequently under suspected naming questions:

This question appears to be off-topic because it is a "name that thing" question. "Name that thing" are bad questions for the same reasons that "identify this obscure TV show, film or book by its characters or story" are bad questions: you can't Google them, they aren't practical in any way, they don't help anyone else, and allowing them opens the door for the asking of other types of marginal questions. See http://blog.stackoverflow.com/2012/02/lets-play-the-guessing-game

I would like to request that users stop posting this. Oftentimes, it is inappropriate for the question under which it was posted, because:

you can Google them (see #1 above)
they are practical (see #1 above)
they do help others. Of course they don't help everybody, but no question does.
it uses unwarranted pejorative language ("guessing game")
the linked blog has little to do with the question under which the comment is pasted
the examples have nothing to do with precise terms such as "alpha substitution"
the intent of the question is totally different from a so-called "guessing game"

If you believe that a question is low quality, then (IMHO) please say why you think so, downvote it, or vote to close it. But please, do not copy-paste such comments when they don't apply. I believe it's simple harassment, incites pointless debate, turns people off from using the site, and is unhelpful.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/6582", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/-1/" ] }
6,629
One of the not-infrequently asked questions that shows up on Software Engineering.SE is along the lines of: I am trying to explain the differences between a NoSQL database and a relational database to a manager. How do I do this? These questions generally take the form of "How do I explain ${something} to ${someone}" where something is a programming topic and someone is not a programmer. This is particularly relevant given the oft-quoted "If you can't explain it to a five year old, you don't really understand it." Why do these questions keep getting closed? What can I do to ask them in a way that doesn't get closed?
Problems

So many targets...

The first problem with these questions is that there are so many possible audiences. How do you explain pointers to a 13 year old? How do you explain the internet to your grandfather? How do you explain testing to a manager who has pointy hair? The way you would explain something to a 13 year old is different than the way you would explain it to your grandfather or a manager (ok, maybe the manager and the 13 year old would get the same explanation). Key to this is that each type of person you explain something to is different. What's more, each individual person you explain something to is different (maybe your grandfather was given write enable rings for teething and had a very good grasp of UUCP back in the day). Your grandfather is different than my grandfather, and your boss is different than my boss. The explanations for one won't always work for the other. These questions inevitably end up refining the specifics of explaining it to one person. At that point, the question and answers become useless for anyone else who wants to explain it to another grandfather or 13 year old.

So many understandings...

So how would you explain pointers to a peer in computing? If you do not understand pointers fully yourself, the question will turn from how you explain them to someone else into the community explaining them to you. If you lack a full understanding of the topic at hand, you will not be able to properly answer the follow-up questions from whatever analogy you pick ("the internet is a series of tubes"... but how do they go to the right tube? - if you don't understand routing and DNS and all that fun stuff yourself, you won't be able to properly answer that question).

On winning arguments...

One form this type of question will take is: I disagree with my boss/coworker over [some issue]. How do I persuade them otherwise? These questions are problematic because:

we don't know who you are trying to persuade and what his or her reasons for those beliefs are, and
we don't know what you already understand.

They also suffer from another problem: "I'm trying to collect bullet points for ammunition in this argument", which can easily turn the question into a poll where every answer is a new argument/opinion. Such questions for trying to win arguments really fall down on these points and only very rarely produce good answers. The key to Stack Exchange itself is having high quality answers - not bullet points of "you could bring this up."

Solutions

So what can I do?

Make sure you understand the subject you are trying to explain, at least to the level you are going to explain it to and one deeper. Make sure you understand that. Ask a question about that misunderstanding if you have it, so the misunderstanding may be corrected and you'll be able to think of the proper car or train analogy to explain what you are trying to explain.

Consider asking in chat. While this isn't the main site and you don't get rep for it, it can help in being able to specifically address your understanding and the understanding of the person you are trying to explain it to. Consider, for example, this conversation where we went through several iterations of understanding the target audience before recognizing the proper way to explain pointers.

Asking about how to explain something and working through the solution is not a 'fire and forget' question where you ask it and come back to get the answers. Those are the questions that get closed.
They take a significant amount of work: explaining what you understand, and properly tailoring the explanation to your audience, which you know best. While we can help with your understanding, trying to answer it for your audience is not a good fit for the Q&A site.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/6629", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/-1/" ] }
6,878
So there was this question asked by a new user. And it's undeniably crap. It is a code dump showing zero research effort, without even a problem statement. This is the prime example of what a crap question is. https://softwareengineering.stackexchange.com/questions/255703/how-to-get-the-average-of-an-array?noredirect=1 : Now that question is on SO and the situation isn't any better. The question still is crap. I know what you all are probably thinking right now: "Another mindless rant by a new user". Well, you may think that way, and I'm okay with that, but I want you to ask yourselves a simple question: Has SO become the trash dump of Programmers? From what happened to that question, this one definitely looks like a yes. And I think that is very sad. Some users have even realized that this question should not be migrated. Please, Programmers users, don't become mindless crap-migrating SO users; think before pressing the migrate button.
Has SO become the trash dump of Programmers? The specific question certainly isn't... stellar, but you are over-reacting. In fact, of the 258 questions we've migrated to SO (in the last 90 days), only 13% have been rejected. Conversely, the numbers aren't as good the other way around: we've rejected 16% of the questions SO has sent us (but you don't see us complaining about it, do you? ;). Next time you spot a shady migration, just vote to close it and move on.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/6878", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/99078/" ] }
6,998
Winter Bash is an annual event that can run on any Stack Exchange site that chooses to participate. Users earn “hats” for their gravatars by completing certain tasks (analogous to badges). Certain actions trigger the user receiving a hat, which their gravatar can “wear”. We track everyone’s progress earning hats in a leaderboard. This event will run from 15 December 2014 to 4 January 2015. Users will be able to see their entire hat collection on winterbash2014.stackexchange.com. That site will also have a landing page, explaining the rules and other details of the event. Individual users who don’t want to participate, don’t want to see hats, and/or are generally anti-hat will have an “I hate hats” option available. And just like last time, at the end of the event, all hats will go back into storage. The only visual change to the Stack Exchange sites themselves will be the presence of the hats and the “I hate hats” button in the footer. Programmers has participated in this event in the past. Should we participate again this year?
HATS!!!! 'nuff said
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/6998", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/4/" ] }
7,182
Moderator Note: Programmers changed its name recently to Software Engineering. This post has been updated to reflect the new scope, but it still refers to Programmers for historical reasons. Wherever you see Programmers in this post, you can now mentally substitute "Software Engineering." You're on Stack Overflow and you've found a question that isn't about coding. It's about design or something squishy like that. You are trying to be helpful, and you put a comment on the question: You should try asking on Programmers.SE instead. --YourName 2 minutes ago ... and suddenly, out of nowhere, Programmers.SE users sweep in, saying: No, this isn't a good question for Programmers.SE --AnotherName 5 seconds ago Why are they saying that this isn't a good question for Programmers.SE? What is in scope on that squishy site with the out-of-focus whiteboard background and a coffee thing as its favicon? What should I look for when determining if a question should go on Programmers? And while you're at it answering these questions, why do they get so annoyed at these suggestions?
What is a good question?

For a start, a good question is:

not too broad
not opinion polling
not unclear

If you'd close a question on Stack Overflow for one of these reasons, don't suggest posting it here on Software Engineering.SE. Those close reasons span all of the Stack Exchange network. For a description or interpretation of the close reasons, which are specific to this Software Engineering SE site, see Why was my question closed or down voted?

What are common types of poor question?

Please, we don't want these, though it seems like people keep suggesting they be reposted here.

"I don't understand..."

Questions that boil down to "I don't understand ${concept}" with no additional information about what is understood tend to get downvoted and closed (too broad). There is an expectation that at the very least the person has read the Wikipedia article on the concept and explained what was understood and what was not.

"Here's my problem. Anyone have a solution?"

Another example of commonly (incorrectly) referred questions are those of "come up with a design or solution for me." Similar to how Stack Overflow has issues with people posting problem statement questions ("write a program that takes the average of three values"), Programmers.SE has similar issues with similar questions ("here is what I want the outcome to be, anyone have a solution?"). Neither Stack Overflow nor Software Engineering.SE is a code/solution writing service. Just because the question lacks code but has a problem statement doesn't mean it belongs here as is. The question needs to be fixed up significantly before it is reposted on the proper site, and suggesting to do so with the question in its current state does a disservice to the person asking the question. For a more detailed outline of what is generally expected at Programmers, see Why is research important?

"A blog I read said..."

Questions that are trying to get someone to explain more about something someone said in a blog (or worse - twitter) post also often have trouble being good questions here. More about those questions can be read at Discuss this ${blog}

Our custom off topic reasons

And then there's our off topic reasons that were alluded to above:

What tool, library, language, project, resource to take up or use (the wording was modeled after Stack Overflow's close reason)
What career to take or what to study
Assistance in writing or debugging code

These are specific forms of polling that the Q&A format really isn't geared to answer. While close voters may argue amongst themselves about too broad or the clarity of the question - these reasons often meet with very prompt close votes.

So what is in scope here?

The help center, i.e. What topics can I ask about here?, is the best page to read for our scope:

software development methods and practices
requirements, architecture, and design
quality assurance and testing
configuration management, build, release, and deployment

What we want are well thought out and researched questions about the Software Development Life Cycle that aren't code troubleshooting questions. Remember that algorithm questions are also on topic on Stack Overflow (so you don't need to migrate those). If you have a question, stop into chat and ask (we're a friendly bunch). If it's a good question, we might be able to prod a SO mod into migrating it or help flag it ourselves.
I don't believe I've ever had a flag to migrate a question from SO to P.SE declined - explaining that the person flagging the question, despite their low Stack Overflow rep, is a trusted user on the target site and will endeavor to keep it open there rather than letting it become a rejected migration goes a long way toward helping. Also consider that many of the "soft and squishy" questions are ones that can be answered in chat. We are easily distracted by actual questions when people ask them.

So, why so mad?

Software Engineering.SE has a much smaller community than Stack Overflow. We get about 30-40 questions per day. Stack Overflow gets about 8k questions per day. That is orders of magnitude different in what we look at. It also partially relates to how many people we have available to moderate the site. To put this into comparison, the Java tag on Stack Overflow gets more questions in one hour than we get in a day. We're much more in line with the Perl tag on Stack Overflow, or Matlab, and a bit more active than the Delphi tag. There are days when there are more suggestions to migrate or repost to Software Engineering.SE than there are questions posted here.

There's a bit of history here. Back when Software Engineering (Programmers.SE) was changing from its "not programming related" charter to its "conceptual software design questions" charter, we got crap hurled at us. Several times more crap questions were migrated to Programmers.SE each day than were asked on the site. This led to the post Please stop using SoftwareEngineering.SE as your toilet bowl, because we were getting all the questions that were "meh, no code, migrate it to Programmers.SE". As an aside, also give How can I encourage Stack Overflow to rein in the 'subjective' vigilantes? a read for some more history.

When a new user posts a question and then gets told to repost it on Programmers.SE... and then has it resoundingly trounced and downvoted - it's not a good experience for anyone involved. We really don't want that. We've got our own set of "why is Programmers.SE so negative" and "why do we get so much crap?" questions in meta too. Exacerbating the problem really doesn't help anyone, and it takes up the time of the community moderators. Stack Overflow can completely swamp us with "post this on Programmers.SE" comments, and when even a fraction of those questions show up here (and get closed) it's problematic. So we attempt to nip it in the bud: to help educate the person suggesting the repost and the person asking the question, and hopefully to prevent the "no code, repost elsewhere" meme from spreading.

If you have made it this far, I will apologize for any suggestion that we're a bit terse or angry in the comments. It can be true (though we're not angry - we're just not able to fully express our feelings - it's exasperation and weariness). But when there are a dozen or so comments a day with such suggestions, one can be a bit sparse with words. Imagine hanging out on the php tag and writing a customized comment each time someone has a SQL injection vulnerability in their code... yea, it's like that. And no, trying to fit all of this in a comment doesn't work either.

So, what can you do?

Think about whether it's a good question. Flag it for migration if it is. If it's a really good question, ask in chat here for us to help it move along.
Avoid bouncing users from one site to another (and having the question get closed on each).
Try answering good questions on this site to understand what we are looking for.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/7182", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/-1/" ] }
7,935
Four years ago we asked for a name change, which was rejected by SE because there was no evidence to support the claim it was causing user confusion, and there were other things the site should focus on first. Now, four years later, users have done all that they can and things are not improving in relation to off-topic posts and user confusion. We still have a very high percentage of closed questions here, and repeatedly get negative feedback on meta about this. (As a lurker of P.SE Meta, I can tell you there are many more links too - these are just the ones found with a quick search of recent questions.)

Steps taken to improve the site since four years ago:

The site scope and FAQ have been clarified, so it is very clear what is on or off topic now
Users have worked hard to generate a set of meta posts to help guide new users with common misconceptions
There is an active group of users who are very aggressive about maintaining questions according to the current site scope and standards
More focus is placed on guiding and educating new users

The subject of the site's name still keeps coming up time and again from both new and veteran users.

maple_shaft (mod): The solution to the vast majority of bad content that I speak of is Our Domain Name. It is as simple as that. Change the domain name of the site from Programmers to Software Engineering and Architecture or something akin to this and I will personally step down as a moderator if spam on the site doesn't cut by half.

enderland: I don't think anyone -- other than SE itself -- disagrees that the site is poorly named. However... that is well outside our ability to influence.

Carson2000: Is this site poorly named? Are people coming to "programmers.stackexchange.com" and immediately thinking "aha, the place for my programming questions! they will help me debug my code!"

Mage xy: I've never really understood why the site was called "Programmers.SE"... we're all programmers, aren't we? Pretty much the same over at SO... so from a newb's perspective, what's the difference?

AndresF: For example, the name change was rejected in 2012! Sites, people and rules evolve. Maybe it's time to re-evaluate those decisions? I happen to think the name programmers.SE is part of the problem

Alexei Averchenko: software architecture is a viable topic, but if it's the main one on this site, it's better to rename it. the name "Programmers" connotes that it's a site about people, not code and .. turning away people who seek advice from seasoned programmers on a site called "Programmers" is downright criminal

I could keep going if you want, but those are just some comments from recent posts. There are many more of them if you keep looking further back. So please, Stack Exchange, can we get your permission to change the name of our site? It doesn't need to be a complete re-design, just a domain name change and a title.
I think it's a valid request to find out whether this option is even feasible. There have been a number of conversations lately, including with SE Community Managers, that have revolved around the site's scope and focus. I think it can safely be stated that the current site name does not give a correct first impression of the site's scope.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/7935", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1130/" ] }
8,056
From Ana, a Stack Exchange Community Manager, in the previous question about changing the site name: It took us a good while, but the Community Team has circled up and here's where we stand on your request to change Programmers.SE's name. We agree that renaming this site is a good idea. "Programmers" is an affinity group, whereas "Software Architecture" (for example) is an action and a discipline. It is entirely possible that changing the site's name will send clearer signals and prime new users to ask questions more consistently in line with what y'all would be proud to see and answer, particularly given where this site's scope has solidified over the past few years. So yes, we're open to proceeding with a name change. That said, we'll need to make sure the change makes the kind of difference you want it to, because we can't repeat this process again several years down the line. We gotta get it right this time. Last time the site was renamed, the scope was very much still in flux; by now, hopefully you can identify much more precisely what this site is about and commit to both a name and description and introductory materials that communicate this body of knowledge clearly and succinctly.

Here's what the Community Team needs before we can move forward:

Tell us what you're proposing as the new site name. "Software Engineering" and "Software Architecture" both look like good options from where I stand. Let's not turn this into a bikeshed discussion; we need well-considered options here, not a popularity contest - Coffee McWhiskeyface is right out. The ideal name will suggest at a glance what this site is about, without suggesting that it might be "fix my code" even a little bit.
Lay out what the new site scope will be. Above all else, the scope needs to be simple. Seriously. No more than four bullets, no multi-line comma-separated lists, no gerrymandering - it needs to be easy for any new visitor who bothers to read and even mildly pays attention to what they're reading to know what they can and cannot ask about here. Eliminate ambiguity for a first time poster once and for all.
Let's talk again in a month.

Let's start answering the following questions:

What should our site name be?
What should our tour say? Specifically, the first paragraph below the site title that begins with "Programmers Stack Exchange is a question and answer site..."
What should our Help Center's on-topic page say about what is on and off-topic? Note that what this site is about and not about also does appear on the tour.
This answer is to summarize the requests for Stack Exchange in a single post. It is a community wiki, so if something else seems to be a consensus in this thread, feel free to add or edit it.

Name

Change to Software Engineering

Tour / Tag Line

Welcome

Software Engineering is a question and answer site for people involved in the Systems Development Life Cycle who care about creating, delivering, and maintaining software responsibly. We don't address questions about debugging code or how to use specific tools in software development. It's built and run by you as part of the Stack Exchange network of Q&A sites. With your help, we're working together to build a library of detailed answers to every question about software development.

Ask About

the systems development life cycle

Don't Ask About

writing or debugging code
support for software tools or packages
what to read, learn, buy or use
legal advice

On Topic Page

What topics can I ask about here?

Software Engineering is a question and answer site for people involved in the Systems Development Life Cycle who care about creating, delivering, and maintaining software responsibly. If you have a question about...

software development methods and practices
requirements engineering
software architecture and design
algorithms and data structures
quality assurance and testing
configuration, build and release management

... then you're probably in the right place to ask your question! Please make sure that your question is not too broad or strongly rooted in opinions. If you have questions that warrant an extended discussion, feel free to come to chat. Before asking, look around to see if your question has been asked. If you see similar questions, be sure to check out their answers and differentiate your question from other, related questions. If you still aren't sure, you can ask about our scope on our Meta site. We have a curated FAQ on our Meta site. You can also check out our list of related sites within the Stack Exchange network.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8056", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/4/" ] }
8,137
So programmers.se is going to become software-engineering.se? This means I still can't ask the burning questions about proper potted plants for programmers.... AND THAT IS OK BY ME. But, I do have one little nitpick: although my job title is "Software Engineer", I'm really just a hacker guy. I write scripts in php, python, perl, bash, tcsh, and hack on C and Java when the need arises. I know lots of SQL tricks in sqlite, postgres, mysql and tsql. Right now I'm spending most of my time "ansibilizing" things. But I'll spend an evening or so a week trying to learn some newfangled thing. Maybe I represent a minority, but the site I want is called "programmers" and it's for people who don't really know what they're doing and blindly trudge forth into the abyss hoping, because of experience, wisdom, hubris, whatever, that things will work out in the end. So... is that still gonna be here in a month or two the next time I happen to get pinged on an old post? I guess I don't care either way since I can't ask "whatever I want" and haven't been able to since Beta, but what about "softwareengineering.se" will cater to pro-sumers, enthusiasts and cobknobbers like myself?
Do you care about applying craftsmanship 1 or engineering to the planning, design, development, testing, deployment, and maintenance of software? Are you concerned with all aspects of software development (and not just writing/debugging/testing code)? Are you concerned with the practice (as opposed to the theory) of software development? If you can answer "yes" to any or all of those questions, then you're probably good here. It doesn't matter if you are a self-taught hobbyist working on an open source project in your spare time, a student, a professional, or an academic. We are a community of craftsmen and engineers who are concerned with all aspects of software development. We just don't want to see questions about writing/debugging/testing code, since those questions already have a good home on Stack Overflow. 1 So why isn't the site called "Software Craftsmanship"? Because Software Engineering is both a job title and an academic field of study at the graduate and undergraduate level. Plus, there are at least a few common definitions, even if there isn't one fully accepted definition.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8137", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1973/" ] }
8,169
People come to Programmers with their questions for a great many reasons. One apparently is that they've been question banned on Stack Overflow. If we are supposed to treat questions from such people differently, say by how we instruct them in the comments, then we need some way to detect the ban. Is there such a way? Do we just look at their history on Stack Overflow and guess?
No, there's absolutely no reason to treat a question differently if a user has been blocked from asking questions on a particular SE site. If a question isn't a good fit here on Programmers, here's a good workflow:

The first thing to look for is questions that aren't a good fit anywhere. If it's primarily opinion based, too broad, or unclear, it's highly likely to be those anywhere in the network. Some communities may be a little more lenient than others, but they are generally applicable. Use one of these reasons if it applies.
If the question is off-topic here, but on-topic on a site that we have a migration path to (currently only Stack Overflow), you should vote to migrate that question. If the user is question banned, the migration will be automatically rejected and the question will be closed here anyway.
If the question isn't on-topic here and isn't suitable for migration, use one of the custom off-topic reasons. We currently have three - career or education advice, writing/debugging code (that doesn't meet the SO minimum guidelines and isn't suitable for migration), or recommendation.
If there's anything else wrong with the question, use a custom close reason. Most of the questions should fall into an existing reason, though.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8169", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/131624/" ] }
8,266
Awww yeah. WOoooooooOOOOoooOOOOOoooT! PARTY TIME. Check out that domain name and logo! Thanks Stack Exchange :-) If you see any problems anywhere, please leave an answer to this question so we can track them.
We made it, y'all! We just pushed the changes and now, after 8 long months...

The name for the site has changed to Software Engineering Stack Exchange.
The logo for the site has been updated to the one shown in the top left of the page.
The default domain for the site has changed to softwareengineering.stackexchange.com.
The site scope has been simplified.
The topic string for the site changed to "software engineering".
The audience string for the site has been changed to "professionals, academics, and students working within the systems development life cycle who care about creating, delivering, and maintaining software responsibly".

We've tried to catch everything but...you know how software goes, so if you see any spots where site copy no longer makes sense or we're otherwise using the old, incorrect name, let us know, preferably in a separate meta question.

Oh, and one more thing

Congratulations! The community took the initiative to identify a critical, long overdue change, brought it to us, and remained (mostly) patient while we hashed out the implications and details together in endlessly drawn out stakeholder discussions. Yet everyone remained dedicated and resolute throughout, and at long last, we've shipped. Well played, everybody. :)
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8266", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/52929/" ] }
8,336
The Winter Bash is back! From 19 December 2016 - 8 January 2017, Stack Exchange will be running Winter Bash 2016. Like in the past, participating in Winter Bash will mean that users here can earn "hats" for their gravatars by doing certain stuff, and there'll be a leaderboard to see who is getting the most hats across the network. Individual users, of course, will be able to opt out. Do we want to participate in hats this year? This year, it's an opt-out program. I must confirm with the CM team by 13 December if we don't want hats. Questions? Comments?
No questions. Give us our hats.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8336", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/4/" ] }
8,342
The site was successfully renamed to Software Engineering. Great. The primary goal of the name change was to make it clear what this site is about, and what is off-topic here. I'm very positive about the idea itself of changing the old name, as well as the new name selected by the community. I find that, indeed, it makes it more obvious that some questions are off-topic. However, I have noticed no decrease whatsoever in the number of questions which are obviously off-topic. But this is only my personal impression, which may be wrong. What do the statistics say? Was there a clear drop in questions closed as off-topic since the name change? Is it too early to ask?
The closed questions statistics don't allow you to see windows that you can adjust, but I wonder if you can use SEDE to explore more. I'll also see if a CM can run some queries and do some data analytics. Here are the statistics for the percentage of total questions closed, by reason, in the last 30 days:

Assistance in explaining, writing, or debugging code: 18.71%
Find or recommend stuff: 19.01%
Legal advice or aid: 1.17%
Career or education advice: 9.36%
Other off-topic: 4.09%
Customer support: 6 questions

I took a look at a 90 day window, which starts before our name change, but includes the period after the name change:

Assistance in explaining, writing, or debugging code: 29.55% (split across two reasons due to a wording change)
Find or recommend stuff: 15.58%
Legal advice or aid: 0.44%
Career or education advice: 7.61%
Other off-topic: 4.61%

We're trending downward in the explaining, writing, and debugging code questions. This is a very good thing. There's a slight uptick in finding and recommending, career/education advice, and legal advice/aid. I'm willing to discount the uptick in legal questions; we've been slightly more discerning in what goes here vs. what goes to Open Source or Law. However, the uptick in find/recommend and career/education questions is something that we should take a look at. Shog was able to share one chart and gave permission to post it here. He didn't have any more time to do more queries or analysis, but notice the downward trend in October/November 2016? I'll ping a CM, though, and see if they can't do any cooler data analysis with pretty charts and such. I'd be interested to know what we've fixed and what needs work.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8342", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/6605/" ] }
8,347
After discovering Software Engineering (previously called Programmers), I've been using it to aid me in finishing a hobby project of mine: building an interpreter for a language I've been designing. However, I'm beginning to think I've been a bit selfish over these past few months. What prompted this question was me planning to ask another question some time before the holidays. While in the process of researching, I decided to take a quick look over my previous questions. I currently have eight questions posted. And while they all seem to be well received and generally on-topic, they all have the same thing in common: they're in a very specific genre. I say these questions are specific because I really don't think anyone else would find most of them useful. It seems that most everyone I've seen trying to design a language has a background in computer science or has been doing it for years. As for me, however, I was coming at this from zero knowledge on the subject. I don't think many language designers are wondering "What datatype should a lexer return?", or "How is an abstract syntax tree used to execute source code?". My question boils down to: am I being selfish by asking very specific questions that are on-topic but unlikely to help future visitors?
The questions which are most welcome here are those which come from real software projects, related to real problems which arise inside the project. It does not matter if it is a commercial project, or a hobby project, or if the topic of the project is very specialized, as long as it is an answerable, on-topic question with a reasonable focus. Opposed to this kind of question, questions which say "I just asked out of curiosity", or "I have seen this [vague description] two or three times, is there a name for it?", or "I have this [vague idea] for a project X, can someone give me a concept for this?", which do not address real-world problems of the OP, are regularly closed as either being too broad or too opinionated. So don't feel bad about your questions because of their lack of generality - this site suffers much more from questions which are too general for its Q&A format than from questions which are too specific. And who knows: just because you think a topic is unlikely to help future visitors doesn't mean it won't; some of those future visitors may think differently about it.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8347", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/242544/" ] }
8,359
I think we should highlight a question each week that represents the best of what we would consider the gestalt of our site, Software Engineering. Questions that are well-written, interesting, reasonably-scoped, and illustrate the kind of subject matter we'd like to see on the website. I think this would serve several purposes (increasing question quality is one), but mostly I'd like to be able to point newcomers to such a list as clear examples of what our site is all about. Here is my first nomination: My office wants infinite branch merges as policy; what other options do we have?
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8359", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1204/" ] }
8,371
Occasionally, beginner programmers ask questions here which show that they have slightly misunderstood the whole subject. This leads to questions which are both funny and difficult to read, since they don't make much sense in their current form. The questions cannot simply be answered with RTFM, since there is an actual underlying misunderstanding of a concept. The recent Does nginx support ajax? is a good illustration. The natural reaction is to downvote the question and/or vote to close it—honestly, that's what I did before later retracting my vote. However, I think it gives a very bad image of an elitist community. I imagine the same situation in real life, where another programmer asks you a similar question and you just laugh at him and tell him that what he asks makes no sense. Another possibility is to actually explain the concepts to the OP, so that the person can understand why the question made little sense in its current form, while also learning the concepts they were missing originally. Is that a good thing to do, or does it lead to some negative consequences I may have missed?
IMHO a question which is not off-topic per se, but is based on a wrong understanding, false assumption or premise, deserves a comment to give the OP a chance to edit it, or at least an explanation for the downvotes they get. In the given case, there were already some comments revealing the misunderstanding; however, there could have been an additional warning like "better edit your question if you don't want the community to close it". Lots of those questions can be saved by changing the wording slightly, avoiding the wrong assumption without changing the intent of the OP. For example, I am pretty sure the question you linked to could have been saved by writing something like "if I want to implement AJAX on the client side, does it make a difference which web server I use?" instead of presupposing that the choice of web server undoubtedly has a significant influence (especially in the question title). However, if neither the OP nor someone with enough rep edits the question in a reasonable amount of time, it should be closed as "unclear", as happened here. Otherwise it gives later readers the wrong message that the false assumption might be true. If a user wants to write an answer to such a question, they should probably first improve the question before answering (or ask the author to improve it, if they don't have enough rep themselves). Addendum: I have now edited the question myself; if you think it is better now and might be undeleted and reopened, feel free to vote accordingly.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8371", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/6605/" ] }
8,547
I often find myself tempted to downvote a question because the asker has a fundamental misunderstanding of a pattern they are using, or an absurd approach to the problem they are trying to solve. However, I hold myself back because, as a question and answer site, this is an opportunity to correct their misconceptions. The problem is their approach, not their question; the question is the messenger and should not be shot. I suspect that others may sometimes downvote for similar reasons. Do fundamental misunderstandings of technical concepts warrant downvotes, polite correction, or both?
Votes should be used to indicate quality and usefulness and the lack thereof, and not to show agreement. To be clear: it is great when askers write about their (possibly completely wrong) understanding, because that makes writing a focused answer much easier. The ideal life cycle of a question is:

OP has a concrete problem.
OP tries to solve the problem on their own, e.g. by searching on the web or by asking colleagues. Nothing helps.
OP asks a question in which they explain the concrete problem with relevant constraints, and what they already tried. For a conceptual question, this means explaining their current understanding of a concept.
Community members with relevant expertise answer the question.
In the future, other people with similar problems find this Q&A, so the same question will never be asked again.

In my experience, many downvote-worthy questions with fundamental misunderstandings or really weird ideas are not caused by writing a crappy question, but by problems in the steps before that:

The question might be completely speculative and is not motivated by a concrete problem. Aliens ate my Unicode! Why do we need more than one programming language? *sigh*
The OP didn't bother with any research of their own first. If an online search for terms in the question or the relevant Wikipedia articles would immediately address the problem, or if they ask an overly broad "what is X" type question, the question is a waste of the community's time.

This, in my opinion, leads to the following possible reactions for us:

If the absurd approach or misunderstanding is the focus of the question and is sufficiently explained, this misunderstanding can be addressed with an answer.
If it is tangential to the question, comments can be used to notify the OP about this. They might turn that into another great question!
If a fundamentally good question is obfuscated by unclear explanation, the question can be refined with edits or by asking for clarification in comments. For extensive discussions, a chat room about the question should be preferred.
If the OP doesn't show their research effort, downvoting + asking for elaboration in a comment is the way to go. In stronger cases, also vote to close as too broad: a comprehensive answer would effectively have to be half a Wikipedia article, but we are not an encyclopedia.
If the misunderstanding or unusual approach renders the question incomprehensible, downvoting and voting to close as unclear is in order.

As with all moderation actions, you are not required to do anything. In particular, you are not required to craft a kind comment to help the OP improve the question. That is a nice thing to do, but can take too much time to be feasible. Votes and especially downvotes are also valuable moderation actions, so do not hold back: actively moderate the site to filter for the quality content you want to see more of, within the time budget you have, with any moderation tools you have available.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/8547", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/173910/" ] }
9,000
I'm resigning as a moderator from all Stack Exchange sites, effective today. I didn't make this decision lightly, frivolously or suddenly. A persistent pattern of corporate missteps, and a monumentally deplorable moderator dismissal, have compelled me to re-evaluate my relationship with Stack Exchange. The decision to resign wasn't a difficult one, from that perspective. What is difficult is that I made a commitment to this community that I must now abandon. I'm deeply sorry about that. I've been with this site from the beginning, and like to think I played some small but important part in shaping the site into what it is today. And what a fine site it has become. Special thanks to Shog9, Tim Post and all of the CMs, who have been nothing but supportive, helpful and educational, even when I was probably a bit more volatile than I should have been. My problems with the network have nothing to do with them. It has been a privilege serving you.
Robert - well stated, and thank you for your service. I'm joining you in resigning from my post as moderator.
{ "source": [ "https://softwareengineering.meta.stackexchange.com/questions/9000", "https://softwareengineering.meta.stackexchange.com", "https://softwareengineering.meta.stackexchange.com/users/1204/" ] }
1
A coworker of mine believes that any use of in-code comments (i.e., not Javadoc-style method or class comments) is a code smell. What do you think?
Only if the comment describes what the code is doing. If I wanted to know what was happening in a method or block, I would read the code. I would hope, anyway, that any developers working on a given project were at least familiar enough with the development language to read what is written and understand what it is doing. In some cases of extreme optimization, you might be using techniques that make it difficult for someone to follow what your code is doing. In these cases, comments can and should be used not only to explain why you have such optimizations, but also what the code is doing. A good rule of thumb would be to have someone else (or multiple other people) familiar with the implementation language and project look at your code - if they can't understand both the why and the how, then you should comment both the why and the how. However, what's not clear in the code is why you have done something. If you take an approach that might not be obvious to others, you should have a comment that explains why you made the decisions that you did. I would suspect that you might not even realize that a comment is needed until after something like a code review, where people want to know why you did X instead of Y - you can capture your answer in the code for everyone else who looks at it in the future. The most important thing, though, is to change your comments when you change your code. If you change an algorithm, be sure to update the comments with why you went with algorithm X over Y. Stale comments are an even bigger code smell.
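To make the "what" versus "why" distinction concrete, here is a small illustrative sketch in Java; the retry scenario, ticket number, and method names are invented for the example and are not from the original answer:

    /** Illustrative only: the service, ticket number, and retry policy are hypothetical. */
    public class CommentExamples {

        private int retryCount = 0;

        public void whatComment() {
            // Redundant "what" comment: it merely restates the next line of code.
            // increment the retry counter
            retryCount++;
        }

        public boolean whyComment() {
            // Useful "why" comment: it records reasoning the code itself cannot express.
            // The upstream service intermittently times out under load (hypothetical
            // ticket #1234), so we retry three times before giving up rather than
            // failing on the first error.
            for (int attempt = 0; attempt < 3; attempt++) {
                if (tryOperation()) {
                    return true;
                }
            }
            return false;
        }

        private boolean tryOperation() {
            // Stand-in for a real remote call so the example compiles on its own.
            return false;
        }
    }

If the retry policy later changes, the second comment is exactly the kind that must be updated alongside the code, as the answer warns.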
{ "source": [ "https://softwareengineering.stackexchange.com/questions/1", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/6/" ] }
4
When starting a project for a company that's not primarily a programming company, one of the expectations is that there will be a finished product at the end that is free of all bugs and does everything needed right away. However, that's rarely the case. What are some ways to manage expectations and explain to non-programmers how software development differs from other types of product development?
Pretty much everybody with a computer has encountered the concept of "bugs" these days, so you might start there. "What's the most annoying way an application has ever failed on you? Multiply that by ten, and you'll have our users' experience if we don't devote enough resources to testing and maintenance." And don't underestimate the value of establishing a good working relationship with the non-programmers. If you can establish that your judgment may be trusted, they'll take you seriously when you sound the alarm that X is going to fail spectacularly if you don't do Y pronto, even if they don't completely understand your reasoning.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/4", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/-1/" ] }
16
I have read a few articles on the Internet about programming language choice in the enterprise. Recently, many dynamically typed languages have been popular, i.e. Ruby, Python, PHP and Erlang. But many enterprises still stay with statically typed languages like C, C++, C# and Java. And yes, one of the benefits of statically typed languages is that programming errors are caught earlier, at compile time rather than at run time. But there are also advantages with dynamically typed languages. (more on Wikipedia) The main reason why enterprises don't start to use languages like Erlang, Ruby and Python seems to be the fact that they are dynamically typed. That also seems to be the main reason why people on StackOverflow decide against Erlang. See Why did you decide "against" Erlang. However, there seems to be strong criticism of dynamic typing in the enterprise, but I don't really get why it is that strong. Really, why is there so much criticism of dynamic typing in the enterprise? Does it really affect the cost of projects that much, or what? But maybe I'm wrong.
Yes, I believe that they do. There are a few reasons that need to be considered in the selection of a language for a new project:

Run-time speed. Compared to C/C++/Fortran, Perl and Python are so slow it's funny.
Initialization speed. Compared to the above fast languages, Java falls over and cries as the JVM keeps loading and loading and... while(1)...
Prototype-ability. Exhaustively going through and doing the declaration/definition work required for C++ or Java increases the LOC, which is the only known metric that reliably correlates with bug counts. It also takes a lot of time. It also requires a bit more thinking about types and connections.
Internal fiddlability. Dynamically messing around with your internals is great until you begin to debug your self-modifying code. (Python, Lisp, Perl)
Correctness verification. A compiler can provide a quick once-over pass of semi-correctness of your code in C++, and this can be really nice.
Static analysis details. C and Java have pretty good static analysis. Perl is not completely statically analyzable at a theoretical level (possibly Python too). I'm reasonably sure Lisp isn't either.
Weird platforms only take C, in general.
Support chain. If you can have a contract that you will get your bugs looked at and worked on, that's huge. If you can presume that the organization you are working with has a principle of "going forward" (there's an accounting term for this), and won't just randomly decide to stop working on the software, then you have a much better case for using the software. Since there's no Major Business selling Python/Perl/$dynamic_language (selling carrying the implication of taking responsibility for maintaining it), depending on one carries considerably more risk. In my experience, open source maintainers often have an issue with fully taking responsibility for bugfixes and releasing updates. "It's free, YOU work on it!" is not an answer that is acceptable to most businesses (not their core competencies, among other things).

Of course, I'm not talking about the webapp/startup world, which tends to play by high risk/high reward rules and is very open to staying on the frothing edge of tech.
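To illustrate the "correctness verification" point, here is a minimal sketch in Java as the statically typed example; the price/quantity scenario is invented for illustration and is not from the original answer:

    public class TypeCheckDemo {
        // A trivially typed helper: both parameters must be ints.
        static int totalCents(int priceCents, int quantity) {
            return priceCents * quantity;
        }

        public static void main(String[] args) {
            int ok = totalCents(499, 3);        // fine: both arguments are ints

            // int bad = totalCents("499", 3);  // uncommenting this line makes the build
            //                                  // fail with an "incompatible types" error,
            //                                  // so the mistake never reaches production

            System.out.println(ok);
            // In a dynamically typed language, the equivalent mistake would typically
            // surface only at run time, and only if this code path is actually executed.
        }
    }

This compile-time rejection is the "quick once-over pass of semi-correctness" the answer refers to; whether it outweighs the prototyping speed of a dynamic language is exactly the trade-off enterprises weigh.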
{ "source": [ "https://softwareengineering.stackexchange.com/questions/16", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/18/" ] }
39
What's your favourite quote about programming? One quote per answer, and please check for duplicates before posting!
Walking on water and developing software from a specification are easy if both are frozen. — Edward V Berard
{ "source": [ "https://softwareengineering.stackexchange.com/questions/39", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/40/" ] }
44
I am finishing my college degree in programming soon and I'm exploring the next steps to take to further my career. One option I've been considering is getting a certification or a series of certifications in the area of development I want to work in. Are these certifications worth the time and money? Do employers place a lot of value in them?
The main purpose of certifications is to make money for the certifying body. Having said that, I think certifications are more important the earlier on in your career you are. As a hiring manager, I never use certifications or the lack thereof to filter potential employees, but I do think some companies may look for these as proof that you know what you are doing. Personally, I want the job candidate to show me they can do something (which is a whole other question, I realize!) The more experience you have, the more you can prove by examples that you know what you are doing and the less important certifications become.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/44", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/41/" ] }
49
During my four years at university we did a lot of functional programming in several functional programming languages. But I have also done a lot of object-oriented programming, and in fact I use object-oriented languages more when doing my own small projects to prepare for my first job. I often wish I were coding in a functional programming language for these projects, yet when looking for a job it is very rare to see one where knowledge of a functional programming language is required. Why aren't functional programming languages used more in industry? There is quite a lot of news about functional programming languages these days, so I wonder if functional programming is catching on in industry now?
I was a professor and, just like programmers, professors are always looking for the Next Big Thing. When they think they've found one, they make it a bandwagon, and everyone piles on. Since they are preaching to students who think professors must be really smart (else why would they be professors?), they get no resistance. Functional programming is such a bandwagon. Sure, it's got lots of nice interesting questions to investigate, and lots of sort-of-interesting conference articles to write. It's not a particularly new idea, you can do it in just about any modern language, and ideas don't have to be new to be interesting. It's also a good skill to have. Given that, functional programming is just one arrow to have in your quiver, not the only one, just as OOP is not the only one. My beef with computer science academia is the lack of practical interplay with industry to determine what actually makes real-world sense, i.e. quality control. If that quality control were there, there might be a different emphasis: on classifying problems and the ranges of solutions to them, with tradeoffs, rather than just the latest bandwagons.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/49", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/18/" ] }
57
The coding standards for the code hosted on drupal.org suggest using two spaces to indent the code; other sites suggest using tabs. What is the proper indentation character for everything, and in every situation? Please explain the answer you give.
Spaces A tab could be a different number of columns depending on your environment, but a space is always one column. In terms of how many spaces (or tabs) constitutes indentation, it's more important to be consistent throughout your code than to use any specific tab stop value.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/57", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/44/" ] }
135
Who in the software engineering and software development fields uses Twitter to tweet about relevant happenings in the field?
I'll probably get flamed for this but... 140 characters is hardly the format to get any real pearls of programming wisdom. Most (but not all) programming concepts/thoughts/ideas require more space to be articulated. I would follow the blogs of the list of programmers that everyone is suggesting.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/135", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/4/" ] }
188
For the longest time in places like Java's IRC channel , SO, and other places I've been told something along the lines of "Worry about how the code looks and its readability/understandability now, and performance later if absolutely necessary". So for the longest time, I haven't really been OCD about performance for my small desktop or web apps, just removing the obviously inefficient. Most responses are "What about scalability?". That's a legitimate point, but if my app was only built to parse, say, files 10,000 lines long, should I make my code a mess for the small percentage of people that are going to shove in a 1,000,000 line file? My main question is: when should I trade the easy but somewhat inefficient ways of doing tasks for big giant complicated beasts that do things extremely quickly, but destroy any possible way of upgrading and make the code excessively difficult and prone to rewriting by the next developer?
Worry about performance when it becomes a problem. If you write a small app to process 10,000 line files and you get a 1,000,000 line file every 100th file, it probably doesn't matter that it takes longer to process that one file. However, if you are regularly getting files that are 5-10 times larger than initially and your application is taking too long to do its job, then you start profiling and optimizing. Now, I said "too long to do its job". That is up to the user or sponsoring organization to decide. If I'm doing a task and it takes me 5 minutes to do something when it took me 3 without the software or with a different tool, I'd probably file a bug report or maintenance request to have that improved. If you are the user, how long you want your software to take to do its job is up to you - only you can decide if you want it done faster or if you are willing to wait longer to have more readable code.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/188", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/66/" ] }
192
If I have some code that has 80% test coverage (all tests pass), is it fair to say that it's of higher quality than code with no test coverage? Or is it fair to say it's more maintainable?
In a strict sense, it is not fair to make any claims until the quality of the test suite is established. Passing 100% of the tests isn't meaningful if most of the tests are trivial or repetitive with each other. The question is: in the history of the project, did any of those tests uncover bugs? The goal of a test is to find bugs, and if they didn't, they failed as tests. Instead of improving code quality, they might only be giving you a false sense of security. To improve your test designs, you can use (1) whitebox techniques, (2) blackbox techniques, and (3) mutation testing.

(1) Here are some good whitebox techniques to apply to your test designs. A whitebox test is constructed with specific source code in mind. One important aspect of whitebox testing is code coverage:

Is every function called? [Functional coverage]

Is every statement executed? [Statement coverage -- both functional coverage and statement coverage are very basic, but better than nothing]

For every decision (like if or while), do you have a test that forces it to be true, and another that forces it to be false? [Decision coverage]

For every condition that is a conjunction (uses &&) or disjunction (uses ||), does each subexpression have a test where it is true/false? [Condition coverage]

Loop coverage: do you have a test that forces 0 iterations, 1 iteration, 2 iterations? Is each break from a loop covered?

(2) Blackbox techniques are used when the requirements are available, but the code itself is not. These can lead to high-quality tests:

Do your blackbox tests cover multiple testing goals? You'll want your tests to be "fat": not only do they test feature X, but they also test Y and Z. The interaction of different features is a great way to find bugs. The only case where you don't want "fat" tests is when you are testing an error condition, for example invalid user input. If you tried to achieve multiple invalid-input testing goals (for example, an invalid zip code and an invalid street address), it's likely that one case would mask the other.

Consider the input types and form an "equivalence class" for the types of inputs. For example, if your code tests whether a triangle is equilateral, the test that uses a triangle with sides (1, 1, 1) will probably find the same kinds of errors that the test data (2, 2, 2) and (3, 3, 3) will find. It's better to spend your time thinking of other classes of input. For example, if your program handles taxes, you'll want a test for each tax bracket. [This is called equivalence partitioning.]

Special cases are often associated with defects. Your test data should also have boundary values, such as those on, above, or below the edges of an equivalence class. For example, in testing a sorting algorithm, you'll want to test with an empty array, a single-element array, an array with two elements, and then a very large array. You should consider boundary cases not just for input, but for output as well. [This is called boundary-value analysis.]

Another technique is "error guessing". Do you have a feeling that some special combination of inputs might break your program? Then just try it! Remember: your goal is to find bugs, not to confirm that the program is valid. Some people have the knack for error guessing.

(3) Finally, suppose you already have lots of nice tests for whitebox coverage, and have applied blackbox techniques. What else can you do? It's time to Test your Tests. One technique you can use is Mutation Testing.
Under mutation testing, you make a modification to (a copy of) your program, in the hopes of creating a bug. A mutation might be:

Change a reference of one variable to another variable;

Insert the abs() function;

Change less-than to greater-than;

Delete a statement;

Replace a variable with a constant;

Delete an overriding method;

Delete a reference to a super method;

Change argument order.

Create several dozen mutants, in various places in your program [the program will still need to compile in order to test]. If your tests do not find these bugs, then you now need to write a test that can find the bug in the mutated version of your program. Once a test finds the bug, you have killed the mutant and can try another.

Addendum: I forgot to mention this effect: bugs tend to cluster. What that means is that the more bugs you find in one module, the higher the probability that you'll find more bugs. So, if you have a test that fails (which is to say, the test is successful, since the goal is to find bugs), not only should you fix the bug, but you should also write more tests for the module, using the techniques above. So long as you are finding bugs at a steady rate, testing efforts must continue. Only when there is a decline in the rate of new bugs found should you have confidence that you've made good testing efforts for that phase of development.
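As a rough illustration of the boundary-value and mutation ideas above, here is a minimal C# sketch using NUnit (the Stats class and its Median method are invented for the example). The tests hit the empty, one-element, and two-element boundaries, and the comment marks the kind of one-character change a mutation tool would make, which the two-element test exists to catch.

    using System;
    using NUnit.Framework;

    public static class Stats
    {
        // Returns the median of the values (the input array is not modified).
        public static double Median(int[] values)
        {
            if (values == null || values.Length == 0)
                throw new ArgumentException("need at least one value");

            var sorted = (int[])values.Clone();
            Array.Sort(sorted);

            int mid = sorted.Length / 2;
            // A mutation tool might flip "== 0" to "!= 0" here; the
            // TwoElementsAreAveraged test below is what kills that mutant.
            return sorted.Length % 2 == 0
                ? (sorted[mid - 1] + sorted[mid]) / 2.0
                : sorted[mid];
        }
    }

    [TestFixture]
    public class StatsTests
    {
        [Test]
        public void EmptyInputThrows() =>
            Assert.Throws<ArgumentException>(() => Stats.Median(new int[0]));

        [Test]
        public void SingleElementIsItsOwnMedian() =>
            Assert.AreEqual(7.0, Stats.Median(new[] { 7 }));

        [Test]
        public void TwoElementsAreAveraged() =>
            Assert.AreEqual(2.5, Stats.Median(new[] { 2, 3 }));

        [Test]
        public void OddCountPicksTheMiddleValue() =>
            Assert.AreEqual(3.0, Stats.Median(new[] { 5, 1, 3 }));
    }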
{ "source": [ "https://softwareengineering.stackexchange.com/questions/192", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/55/" ] }
206
Test driven development. I get it, like it. But writing tests does require overhead. So should TDD be used universally throughout the code base, or are there areas where TDD provides a high ROI and other areas where the ROI is so low that it is not worth following.
I'd say avoid TDD in places where the code is likely to change structurally a lot. That is, it's great to have a pile of tests for a method whose signature changes rarely but gets refactored internally more frequently, but it sucks to have to fix your tests every time a highly volatile interface changes dramatically. The apps I've been working on recently have been data-driven webapps built on a Gui->Presenter->BusinessLogic->Data Access Layer architecture. My data access layer is tested like nobody's business. The business logic layer is pretty well tested. The Presenters are only tested in the more stable areas, and the GUI, which is changing hourly, has almost no tests.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/206", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/85/" ] }
220
How would someone implement Agile process concepts as a solo developer? Agile seems useful for getting applications developed at a faster pace, but it also seems very team oriented...
By doing test-driven development, by developing in small sprints, and by having a lot of contact with the customer. I remember reading a thesis about Cowboy Development, which is essentially Agile for solo developers. The thesis can be read here: Cowboy: An Agile Programming Methodology For a Solo Programmer (PDF)
{ "source": [ "https://softwareengineering.stackexchange.com/questions/220", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/75/" ] }
221
Possible Duplicate: Using “Foo” and “Bar” in examples I know AT&T labs used them in their Unix days, but do they have even deeper histories?
From the Jargon file : When ‘foo’ is used in connection with ‘bar’ it has generally traced to the WWII-era Army slang acronym FUBAR (‘Fucked Up Beyond All Repair’ or ‘Fucked Up Beyond All Recognition’), later modified to foobar. Early versions of the Jargon File interpreted this change as a post-war bowdlerization, but it now seems more likely that FUBAR was itself a derivative of ‘foo’ perhaps influenced by German furchtbar (terrible) — ‘foobar’ may actually have been the original form. For, it seems, the word ‘foo’ itself had an immediate prewar history in comic strips and cartoons. The earliest documented uses were in the Smokey Stover comic strip published from about 1930 to about 1952. Bill Holman, the author of the strip, filled it with odd jokes and personal contrivances, including other nonsense phrases such as “Notary Sojac” and “1506 nix nix”. The word “foo” frequently appeared on license plates of cars, in nonsense sayings in the background of some frames (such as “He who foos last foos best” or “Many smoke but foo men chew”), and Holman had Smokey say “Where there's foo, there's fire”.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/221", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/85/" ] }
247
Does learning COBOL still make sense?
I don't think so, unless you are already in the niche market where COBOL is still maintained.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/247", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/44/" ] }
252
There is a widely accepted opinion that Singleton is an anti-pattern. As usual, there are always exceptions to the rule. Can you explain why Singleton is a bad choice in general and give an example of some valid use cases for it?
The two main criticisms of Singletons fall into two camps from what I've observed: Singletons are misused and abused by less capable programmers, so everything becomes a singleton and you see code littered with Class::get_instance() references. Generally speaking, there are only one or two resources (like a database connection, for example) that qualify for use of the Singleton pattern. Singletons are essentially static classes, relying on one or more static methods and properties. All things static present real, tangible problems when you try to do unit testing because they represent dead ends in your code that cannot be mocked or stubbed. As a result, when you test a class that relies on a Singleton (or any other static method or class) you are not only testing that class but also the static method or class. As a result of both of these, a common approach is to create a broad container object to hold a single instance of these classes; only the container object modifies these types of classes, while many other classes can be granted access to them from the container object.
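Here is a minimal C# sketch of that second criticism (all the class names are invented for illustration). The first service reaches its dependency through the singleton's static accessor, so any unit test of it ends up talking to the real database; the second takes the dependency through its constructor, so a test can hand it a fake.

    // A classic singleton: one global instance behind a static property.
    public sealed class Database
    {
        public static readonly Database Instance = new Database();
        private Database() { }

        public int Query(string sql)
        {
            // Imagine a real database call here.
            return 0;
        }
    }

    // Hard to test: the static access point cannot be mocked or stubbed.
    public class ReportService
    {
        public int CountOrders() =>
            Database.Instance.Query("SELECT COUNT(*) FROM Orders");
    }

    // Easier to test: the dependency is injected, so a test can pass a fake IDatabase.
    public interface IDatabase
    {
        int Query(string sql);
    }

    public class InjectedReportService
    {
        private readonly IDatabase _db;
        public InjectedReportService(IDatabase db) { _db = db; }

        public int CountOrders() => _db.Query("SELECT COUNT(*) FROM Orders");
    }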
{ "source": [ "https://softwareengineering.stackexchange.com/questions/252", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/6/" ] }
262
Will Java have the same importance it had in the past, or it will be less relevant than nowadays?
Java is relevant and will continue to be relevant for many years in the Enterprise computing world. Whether it continues to be relevant in other areas depends a lot on what Oracle does. If they inject some life (and resources) into ME, desktop applications and other areas, and if they press on with the evolution of the Java language, then Java will do well. But if Oracle cuts back on R&D and/or tries to stomp other players in the Java space, there's a good chance that someone / some company will develop a better (and more open) Java-like language. If Oracle win their lawsuit against Google, I predict that the next generation of the Android platform will have a new language, just like happened with C#. If Google get the openness right ... then, the game is on!
{ "source": [ "https://softwareengineering.stackexchange.com/questions/262", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/44/" ] }
294
I just started working a year ago, and I want to join an open source project for the same reasons as anyone else: help create something useful and develop my skills further. My problem is, I don't know how to find a project where I'll fit in. How can I find a beginner-friendly project? What attributes should I be searching for? What are warning signs that a project might not be the right fit? Are there any tools out there to help match people with open source projects? There's a similar question here , but that question has to do with employment and is limited to PHP/Drupal.
My first open source contribution was for a library that I had previously used (and would've suffered greatly without) on a previous paid project. During my initial use I had spotted a bug in the code, so I created a patch, joined the project, and submitted it for review. About 8 months later, when I had some free time, I decided that I would give back (and work on my development skills) by contributing more to the project. So I cloned the repository and started getting familiar with the codebase. After a few weeks of submitting minor patch fixes to the codebase and monitoring the feature requests, I picked up a feature request to add a pretty substantial module to the project. Since generating many individual patch fixes is pretty tedious for any significant development, I cloned the repository to a branch on GitHub and started punching away code. A few weeks and several thousand lines of code later, the project leader and I worked through integrating and testing my fixes into the library in a way that worked consistently with the rest of the codebase. It was an invaluable process that I learned a lot from:

When I started I didn't know how to use Git; by the end I could proficiently create remote tracking branches and merge or rebase them into the master branch without breaking a sweat.

I started in VS 2008 and ended up migrating to Linux and MonoDevelop to work on writing code (because VS handles Unicode poorly and line endings are such a pain in git). It turns out that there's not much you can't do in *nix that you can do in *dows.

I had never really done any unit testing before; NUnit is a piece of cake to use and writing unit tests is pretty elementary stuff.

I had to learn to swallow my tongue and listen, as well as practice patience. There's no point in taking a hard stand on your position on an open source project, because everybody involved is knowledgeable (probably more so than yourself) and capable of accepting/rejecting your ideas based on substance, not delivery. It's extremely humbling and rewarding at the same time.

Just having one other skilled developer's eyes on a large base of my code pointed out flaws in my style that I had never considered before (and I pointed out flaws in his code). For me, I learned that it's easier/better to define constants than it is to use a bunch of magic numbers with detailed commenting.

That particular project was based around generating and decoding networking packets on all levels of networking protocols. I have a personal interest in lower level networking, so it was great to have discussions with another developer with shared interest and knowledge in the domain.

If you want to just get your feet wet: find a project that you already use; clone the repository; and start seeing if you can fix some bugs and/or add some unit tests. It seems intimidating to look at someone else's codebase with fresh eyes, but it's an extremely valuable skill to learn. Submit some patches. You can expect your code to be closely scrutinized at first. Don't worry about it, it's a normal part of the process to gain the trust of the project admin(s). After establishing a base of merit with the project's admin(s), start seeking more responsibilities, such as proposing new features or asking to be assigned to implement feature requests. If you can't find an already existing project on one of the main open source repository networks (GitHub, SourceForge, Google Code), think of an app that you'd really like to use that doesn't exist yet and start your own.
Be prepared to be humbled and expect work to be rejected in favor of further revisions. The myth that anybody can add code to an open source project is completely false. There's always a gatekeeper between you and push access. The better your code, the less it will be scrutinized in the long run as you gain the trust of the project admin(s). If it's your project, you'll be that gatekeeper. Update: I just realized that I didn't mention which project a lot of my answer references. For those who want to know, it's SharpPcap . The lead developer, Chris Morgan, is very professional and on point. He does a hell of a job managing the project and taught me a lot about what it takes to mature an OSS project. Due to personal time constraints I haven't been able to contribute code in over a year, but I still try to give back by lurking on Stack Overflow and answering questions about SharpPcap occasionally.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/294", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/94/" ] }
348
Elite developers can be 10x more productive than an average developer. Clearly it's easier to find an elite developer around the whole world than in a company's backyard. If a company is not located in a programming hot spot, should they consider hiring people who work from home?
I have worked as, and managed staff in both situations, and combinations of both. I've made the following observations: Junior staff do not work remotely. They require a good and personal working relationship with a mentor. I find my junior staff would rather wait for me to be available than to ask the rather senior (and good) remote developer anything. Ensure anyone you consider for working remotely is effective when self-guided and doesn't go off on tangents. Remote staff can get isolated really easily and not feel part of a team unless special effort is made to be inclusive of them. This isolation can lead to a misunderstanding of the specific business driver for a project, or to misinterpret events in a negative manner. Never get a contractor working remotely, unless they have the right incentive to perform. When working with a remote team member, make sure they get equitable access to resources, including source control, reference material, etc. Don't make them jump through hoops to get work done. Arrange those face to face meetings as often as practical. This encourages far better team collaboration as people are more comfortable with those they have met.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/348", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/24/" ] }
368
For a long time on SO and in other places, Java has had the reputation of being slow. From jokes to many comments in questions and answers, people still believe Java is slow based solely on experience with it in the 90s. This is my issue: we have disproved most of the reasons that people believe Java is slow. Outside of small things, Java is pretty fast. So why is it that people still refuse to believe Java is fast now? Is it part of their mindset that anything that's not C/C++ is slow? Is it because people don't check over time? Is it because people are just biased?
It's the applications. As you note, we have proved, time and time again, that in contrived scenarios Java code can meet or even beat the performance of so-called "performant" languages like C, C++, Lisp, VB6, or JavaScript. And when presented with such evidence, most sane, open-minded opponents will hang their heads in shame and promise never again to spread such slander. ...but then, they fire up Eclipse, or NetBeans, or Guiffy, or enable the Java support in their browser, or try to run an app on their favorite feature phone. And they wait for it to become responsive... ...and wait... ...and wait... ...and wait... ...and wait... ...and... ...what did I promise never to do again? Sorry, must have dozed off...
{ "source": [ "https://softwareengineering.stackexchange.com/questions/368", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/66/" ] }
370
I've been told that to be taken seriously as a job applicant, I should drop years of relevant experience off my résumé, remove the year I got my degree, or both. Or not even bother applying, because no one wants to hire programmers older than them. 1 Or that I should found a company, not because I want to, or because I have a product I care about, but because that way I can get a job if/when my company is acquired. Or that I should focus more on management jobs (which I've successfully done in the past) because… well, they couldn't really explain this one, except the implication was that over a certain age you're a loser if you're still writing code. But I like writing code. Have you seen this? Is this only a local (Northern California) issue? If you've ever hired programmers: 2 Of the résumés you've received, how old was the oldest applicant? What was the age of the oldest person you've interviewed? How old (when hired) was the oldest person you hired? How old is "too old" to be employed as a programmer? 1 I'm assuming all applicants have equivalent applicable experience. This isn't about someone with three decades of COBOL applying for a Java guru job. 2 Yes, I know that (at least in the US) you aren't supposed to ask how old an applicant is. In my experience, though, you can get a general idea from a résumé.
Having just got a new job at nearly 50 in the UK, I can say that it's possible and you're never too old. There are two approaches - both rely on your skills being relevant to the job. Stick with what you know and become a guru. This is risky, as jobs requiring "old" technologies are becoming fewer and further between as each year passes. However, as people retire from such jobs there will be openings. Keep refreshing your skills. I moved into Silverlight last year, which is what got me this job. That and my previous team leadership roles, which my new employer saw as relevant.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/370", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/99/" ] }
487
If you were to design a programming language, how would you do it? What features would you put in? What would you leave out? Statically or dynamically typed? Strongly or weakly typed? Compiled or interpreted? Justify your answers.
I definitely think that functional programming languages will catch on, so my language will be functional. See Taming Effects with Functional Programming .

I think CPUs will soon have hundreds of cores, and threads will be hell to manage. So the Actor Model is a must instead of threads. See Erlang - software for a concurrent world .

I also think that OOP has failed: the communication between objects was assumed to be asynchronous. So I think we need message passing , with immutable messages. Send and forget, as in the Actor Model. See Object Oriented Programming: The Wrong Path?

I think that it would be good to have static typing , so errors are caught earlier in the development cycle. But I would use type inference as in Haskell, so that the developer doesn't need to write the type everywhere in the code as in C, C# and Java. See Learn You A Haskell for Great Good .

I would also design a great UI library , with declarative layout as in WPF and Android, but I would like to have it work as in Functional Reactive Programming .

So my language would have concurrency as in Erlang, but with typing as in Haskell and a GUI framework as in .NET's WPF.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/487", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/61/" ] }
492
Did you learn to touch-type when you were already working as a programmer? If so, how did it affect your productivity? Or are you still unable to touch type, and do you think it holds you back? According to Steve Yegge it is essential. Personally, I did not notice much difference, possibly because I was spending less than 25% of my work time actually typing (I was working on a large legacy project at the time, and I was spending more time reading and debugging existing code).
Well, I said my piece on this here: When you're a fast, efficient typist, you spend less time between thinking that thought and expressing it in code. Which means, if you're me at least, that you might actually get some of your ideas committed to screen before you completely lose your train of thought. Again. Personally, I can't take slow typists seriously as programmers. When was the last time you saw a hunt-and-peck pianist?
{ "source": [ "https://softwareengineering.stackexchange.com/questions/492", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/113/" ] }
500
Rather than slavishly pair program all the time, we use pair programming selectively on our team. I think it works best in the following circumstances: Ramping up brand new team members on a project (instead of letting them wade through documentation or code on their own). Having junior and senior people work together (helps to show some of the skills and tricks of the more experienced developers, plus it allows the old dogs to learn new tricks sometimes). When someone is trying to track down a defect, it often helps to pair with a fresh set of eyes. When should you use pair programming, and why? When should you avoid it? Why?
Research compiled by Laurie Williams indicates that pair programming works best on industrial teams when: Pairs work on specification, design, and complex programming tasks - experiments indicate that no quality improvement is shown when working on simple tasks in a pair, but there may be speed improvements. Also note that pair "programming" often includes activities other than writing code. Each individual in a pairing has about the same level of expertise - while pair programming is great for training, pairs are most engaged when they are at about the same level. Roles rotate regularly - rotating regularly helps keep the current copilot engaged, as individuals tend to contribute most when they drive or sense they are about to drive. Pairs rotate regularly - teams have expressed comfort in knowing about different parts of the system they are building. Pair rotation helps with knowledge transfer, which reduces certain risks in the project. In an academic setting pairs are often assigned; in industry, however, they are generally self-assigned, often during stand-ups. In both cases, the pair is most effective when both individuals are willing participants who see value in the pairing activity. In my personal experience, I've found that my XP team spends about 60% of our development time pair programming on average. The remainder of the time is spent doing individual development. It is not uncommon to pair up to create an initial design, work alone on the design for a few hours, then come back together to finish tricky or difficult parts of the code. I've also found that pair programming is most effective in approximately 1.5 to 2.5 hour blocks. Anything much less tends to require too much overhead to set up, while much more and the pairs tend to get cranky and tired. Cranky and tired means you're not communicating well and might be letting defects slip into the system.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/500", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/38/" ] }
507
I have seen this on SO many times. Whenever a question is vague or asks for some magical answer, somebody or other leaves a comment saying the answer is 42. Even a book I am reading right now uses 42 whenever it wants to demonstrate some basic concept using an integer. So is there any history behind it, or is it just a coincidence?
It's the answer to Life, The Universe, and Everything from Douglas Adams' Hitchhiker's Guide to the Galaxy .
{ "source": [ "https://softwareengineering.stackexchange.com/questions/507", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/123/" ] }
566
goto is almost universally discouraged. Is using this statement ever worthwhile?
This has been discussed several times on Stack Overflow, and Chris Gillum summarized the possible uses of goto :

Cleanly exiting a function. Often in a function, you may allocate resources and need to exit in multiple places. Programmers can simplify their code by putting the resource cleanup code at the end of the function, and all "exit points" of the function goto the cleanup label. This way, you don't have to write cleanup code at every "exit point" of the function.

Exiting nested loops. If you're in a nested loop and need to break out of all loops, a goto can make this much cleaner and simpler than break statements and if-checks.

Low-level performance improvements. This is only valid in perf-critical code, but goto statements execute very quickly and can give you a boost when moving through a function. This is a double-edged sword, however, because a compiler typically cannot optimize code that contains gotos.

I'd argue, as many others would, that in all of these cases goto is used as a means to get out of a corner one coded oneself into, and is generally a symptom of code that could be refactored.
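As a small, hedged illustration of the "exiting nested loops" case, here is a C# sketch (C# does keep a goto statement; the grid and target are made up for the example). Whether this reads better than a flag variable or an extracted method is exactly the refactoring judgment the last paragraph is about.

    using System;

    class Program
    {
        static void Main()
        {
            int[,] grid = { { 1, 2, 3 }, { 4, 5, 6 }, { 7, 8, 9 } };
            int target = 5;

            for (int row = 0; row < grid.GetLength(0); row++)
            {
                for (int col = 0; col < grid.GetLength(1); col++)
                {
                    if (grid[row, col] == target)
                    {
                        Console.WriteLine($"Found {target} at ({row}, {col})");
                        goto Done;   // a plain break would only leave the inner loop
                    }
                }
            }
            Console.WriteLine("Not found");

        Done:
            Console.WriteLine("Search finished");
        }
    }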
{ "source": [ "https://softwareengineering.stackexchange.com/questions/566", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/86/" ] }
604
On a widescreen monitor one can easily see more than 80 characters at a time, without scrollbars. Even Linus Torvalds sees the 80 character limit as outdated . So, is the 80 character limit still relevant in times of widescreen monitors?
If I keep my lines to less than about 100 characters, I can have two editor windows side-by-side on a widescreen monitor. It's very useful to have both the class header file and implementation both visible at the same time, or have code on one side that calls into the code on the other. And, if I keep the lines short, I don't need a horizontal scrollbar on my editor windows, which gives me more vertical space. 80 characters may be outdated, but there's some merit in keeping things within reason.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/604", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/116/" ] }
616
When I am in a code or design rut, I tend to find a non-dev coworker to discuss the problem with. It forces me to explain the problem in great detail and I'll usually find something I missed in the process. What are your "unsticking" methods?
Some of my tactics:

Explain the problem to someone, or even to no one. My girlfriend used to explain problems to a potato she kept.

Work on something else for a bit (if the opportunity allows) - some other functionality, or even another project. Get your mind off the current project. A lot of times, problems that seem impossible at 4:30 pm seem trivial at 9:30 am the next day.

Go to the pub (if possible). Same principle as above.

Beat your head against it. This isn't often that productive for solving the problem, but at least for me, I tend to learn a lot. If my gridview isn't auto-sorting, I'll try and read everything I can about the problem. It'll still take me 3 hours to solve a stupid error on my part, but by the end, I'll have learned everything there is to know about gridviews and how they bind to data - I'll be able to solve any number of similar problems in the future.

Get another input - preferably someone who knows at least something about the context of the project. Most of my errors are stupid ones that only require a few minutes from a second set of eyes to solve, where it would take me hours alone.

Isolate the problem. I keep a folder labeled "proof of bugs" where I keep a pile of projects that each reproduce a specific issue outside the overall context of the large, complex project. This can be a little time consuming, but it allows you to narrow down the cause of the issue independent of the bazillion interfering factors of a large project.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/616", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/-1/" ] }
648
We, as programmers, are constantly being asked "How long will it take?" And you know, the situation is almost always like this: The requirements are unclear. Nobody has done an in-depth analysis of all the implications. The new feature will probably break some assumptions you made in your code, and you start thinking immediately of all the things you might have to refactor. You have other things to do from past assignments, and you will have to come up with an estimate that takes that other work into account. The "done" definition is probably unclear: when will it be done? "Done" as in just finished coding it, or "done" as in "the users are using it"? No matter how conscious you are of all these things, sometimes your "programmer's pride" makes you give/accept shorter times than you originally suppose it might take, especially when you feel the pressure of deadlines and management expectations. Many of these are organizational or cultural issues that are not simple and easy to solve, but in the end the reality is that you are being asked for an estimate and they expect you to give a reasonable answer. It's part of your job. You cannot simply say: I don't know. As a result, I always end up giving estimates that I later realize I cannot fulfill. It has happened countless times, and I always promise it won't happen again. But it does. What is your personal process for deciding and delivering an estimate? What techniques have you found useful?
From The Pragmatic Programmer: From Journeyman to Master : What to Say When Asked for an Estimate You say "I'll get back to you." You almost always get better results if you slow the process down and spend some time going through the steps we describe in this section. Estimates given at the coffee machine will (like the coffee) come back to haunt you. In the section, the authors recommend the following process:

Determine the accuracy that you need. Based on the duration, you can quote the estimate at different precision. Saying "5 to 6 months" is different than saying "150 days". If you slip a little into the 7th month, you're still pretty accurate. But if you slip into the 180th or 210th day, not so much.

Make sure you understand what is being asked. Determine the scope of the problem.

Model the system. A model might be a mental model, diagrams, or existing data records.

Decompose this model and build estimates from the components. Assign values and error ranges (+/-) to each value.

Calculate the estimate based on your model.

Track your estimates. Record information about the problem you are estimating, your estimate, and the actual values.

Other things to include in your estimate are developing and documenting requirements or changes to requirements specifications, creating or updating design documents and specifications, testing (unit, integration, and acceptance), and creating or updating user's manuals or READMEs with the changes. If 2 or more people are working together, there's the overhead of communication (phone calls, emails, meetings) and merging source code. If it's a long task, account for things like other work, time off (holidays, vacation, sick time), meetings, and other overhead tasks when picking a delivery date.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/648", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/76/" ] }
678
I know some people are massive proponents of test driven development. I have used unit tests in the past, but only to test operations that can be tested easily or which I believe will quite possibly be correct. Complete or near complete code coverage sounds like it would take a lot of time. What projects do you use test-driven development for? Do you only use it for projects above a certain size? Should I be using it or not? Convince me!
Ok, some advantages to TDD: It means you end up with more tests. Everyone likes having tests, but few people like writing them. Building test-writing into your development flow means you end up with more tests. Writing to a test forces you to think about the testability of your design, and testable design is almost always better design. It's not entirely clear to me why this happens to be the case, but my experience and that of most TDD evangelists seems to bear it out. Here's a study saying that although TDD takes a bit longer to write, there's a good return on investment because you get higher quality code, and therefore fewer bugs to fix. It gives you confidence in refactoring. It's a great feeling to be able to change one system without worrying about breaking everything else because it's pretty well covered by unit tests. You almost never get a repeat bug, since every one you find should get a test before it gets a fix. You asked to be convinced, so these were benefits. See this question for a more balanced view.
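For what it's worth, here is a tiny C# sketch of that rhythm, with invented names: the NUnit test is written first and fails, then just enough production code is added to make it pass, and the test stays behind as the safety net for the refactoring point above.

    using NUnit.Framework;

    public static class PriceCalculator
    {
        // Step 2: the simplest implementation that makes the test below pass.
        public static decimal WithVat(decimal net, decimal rate) => net * (1 + rate);
    }

    [TestFixture]
    public class PriceCalculatorTests
    {
        // Step 1: written before PriceCalculator existed, so it failed first.
        [Test]
        public void AddsVatToTheNetPrice()
        {
            Assert.AreEqual(120m, PriceCalculator.WithVat(100m, 0.20m));
        }
    }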
{ "source": [ "https://softwareengineering.stackexchange.com/questions/678", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/86/" ] }
724
When learning a new programming language, you sometimes come across a feature that makes you wish you had it in the other programming languages you know. What are some language features which were very new to you at the time you learned them, and that you wish your other programming languages had? An example of this is generators in Python or C#. Other examples may include list comprehensions in Python, templates in C++, LINQ in .NET, or lazy evaluation in Haskell. What other semi-unique language features have you come across which were completely new and enlightening to you? Are there other features of older programming languages which were unique and have fallen out of fashion?
Practically anything in Haskell:

Monads. Yes, the big scary word that makes parsers, IO, operations on lists and other things incredibly easy (once you notice the common pattern).

Arrows. The same, for advanced users ;)

Standard stuff like lambdas, currying functions, algebraic data types, and pattern matching.

And many more. PS. Yes, I am a Haskell fanboy, if anyone asked.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/724", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/24/" ] }
750
I'm freshly out of college, and starting university somewhere next week. We've seen unit tests, but we kind of never used them much; everyone talks about them, though, so I figured maybe I should write some. The problem is, I don't know what to test. Should I test the common case? The edge case? How do I know that a function is adequately covered? I always have the terrible feeling that while a test will prove that a function works for a certain case, it's utterly useless for proving that the function works, period.
My personal philosophy has thus far been: Test the common case of everything you can. This will tell you when that code breaks after you make some change (which is, in my opinion, the single greatest benefit of automated unit testing). Test the edge cases of a few unusually complex pieces of code that you think will probably have errors. Whenever you find a bug, write a test case to cover it before fixing it. Add edge-case tests to less critical code whenever someone has time to kill.
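Here is a minimal C# sketch of the "write a test for every bug you find" point, with invented names: the first test pins down a bug that was reported in the wild (a discount routine that blew up on an empty cart), so the fix can never silently regress, and the second covers the common case.

    using System.Linq;
    using NUnit.Framework;

    public static class Cart
    {
        // Fixed version: the original called prices.Max() unconditionally,
        // which throws on an empty array - the bug the first test pins down.
        public static decimal LargestItemDiscount(decimal[] prices, decimal rate)
        {
            if (prices == null || prices.Length == 0)
                return 0m;
            return prices.Max() * rate;
        }
    }

    [TestFixture]
    public class CartTests
    {
        // Written the day the bug report came in, before the fix above.
        [Test]
        public void EmptyCartGetsNoDiscount()
        {
            Assert.AreEqual(0m, Cart.LargestItemDiscount(new decimal[0], 0.10m));
        }

        // The common case, so future changes that break it are caught too.
        [Test]
        public void DiscountComesFromTheMostExpensiveItem()
        {
            Assert.AreEqual(2m, Cart.LargestItemDiscount(new[] { 5m, 20m, 10m }, 0.10m));
        }
    }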
{ "source": [ "https://softwareengineering.stackexchange.com/questions/750", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/90/" ] }
779
It doesn't have to be programming or software development related, but just asked during an interview for an IT related job. I know some "left field" questions are meant to see how the candidate copes with unexpected and novel situations, but here I'm looking for a question that appeared to be completely unrelated to the job they were interviewing you for, or something that made you think "what useful information could they possibly get from my answer to that question?".
Where do you see yourself in 5 years? Do they really think people are dumb enough to say that they want to do something completely different, or that they don't want to work for the company? I guess it can be useful as an indicator of who not to hire, but it's so stupidly easy to fake that you can't use it as an indicator of who to hire in any way if they answer correctly.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/779", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/47/" ] }
811
Literate programming has good ideals. Why do you think it isn't mainstream? Is it because it has failed to deliver?
I first saw it in a book of Knuth's writings, and thought it looked neat. Then I tried to use the literate programming display to comprehend what was going on in the program, and found it harder than it looked. It may have been that I was too used to going through program listings, but it seemed confusing. Then I looked at the source code, and that turned me off then and there. I'd have to learn to write programs in an entirely new way, with less correspondence between the program text and what the compiler saw, and I saw no corresponding benefit. In addition, people can write long and convincing arguments that the code is doing X when it's actually doing Y, and I've run into my share of misleading comments. I developed a fondness for reading the code to see what it's doing fairly early on. Literate programming is the antithesis of that.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/811", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/86/" ] }
812
I originally tried asking this on StackOverflow, but it was too subjective :-(. I am interested in methods of defining the power of programming languages. Turing completeness is one, but it is almost universally satisfied. What would be nice is to define a measure of power that discriminates among programming languages that are actually in use. For example, can anyone propose a non-subjective method that would discriminate between assembly and Java? Turing completeness means that a language is maximally powerful in what it can output (which pretty much means it can do anything non-time-based in the real world). So if we want to define a stronger measure of power, we need to take another approach. Shortness was suggested in the original question, but this is not easy to define at all. Does anyone have any other suggestions?
The notion you are looking for is called expressiveness and Matthias Felleisen has a mathematically rigorous definition: " On the Expressive Power of Programming Languages " www.ccs.neu.edu/scheme/pubs/scp91-felleisen.ps.gz (Postscript version) The intuition behind the idea is that if you have two equivalent programs in two different languages-- say, program A in language X and program B in language Y-- and if you make a local change to A that requires a global change to B, then X is more expressive than Y. One example Felleisen provides is assignment: In the Scheme programming languages you can remove the assignment operator and still have a Turing complete language. However, in such a restricted language, adding in a feature that would be localized if assignment was allowed would require a global change to the program without assignment. My discussion has simplified some details, and you should read the paper itself for the full account. To answer your other question: You can say that Java is more expressive than assembly because you can add a new class to your Java program, and then gain the benefits of polymorphism by having other parts of your program call its methods without global modification. Exception handling is another example where Java is more expressive than assembly: You simply need to write a single throw statement to transfer control up the stack. On a more elementary level, you can also add a new case statement near the beginning of a switch and you won't have to worry about recalculating any jump offsets by hand.
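To ground the Java-versus-assembly illustration in a few lines, here is a hedged C# sketch (the shape types are invented for the example): adding the Square class is a purely local change, because polymorphic dispatch means the existing TotalArea code and its callers never have to be touched, whereas the equivalent change in assembly ripples through every hand-written dispatch site.

    using System;
    using System.Collections.Generic;

    public interface IShape
    {
        double Area();
    }

    public class Circle : IShape
    {
        public double Radius;
        public double Area() => Math.PI * Radius * Radius;
    }

    // The "local change": this type is new, and nothing below had to be edited for it.
    public class Square : IShape
    {
        public double Side;
        public double Area() => Side * Side;
    }

    public static class Report
    {
        // Written before Square existed; it handles Square without modification.
        public static double TotalArea(IEnumerable<IShape> shapes)
        {
            double total = 0;
            foreach (var shape in shapes)
                total += shape.Area();
            return total;
        }
    }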
{ "source": [ "https://softwareengineering.stackexchange.com/questions/812", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/86/" ] }
827
For decades, the accepted degree to get to become a software developer was "Computer Science." We've had a few questions already about whether that degree really prepares students to develop software. Some schools have, in the last 8 years or so, started offering multiple distinct majors in programming. Using the curriculum from my school: Computer Science , which starts out with some intro programming courses in the first year, and then focuses on theoretical computer science, algorithms, and a bit of OS stuff. Most classes involve several smaller projects and homeworks, done solo or in pairs. Software Engineering , which starts out with the same intro programming courses, does a few theory classes, and then goes into software development practices (testing, process methodologies, software metrics, requirements gathering) and software design (distributed system design, info system design, real-time/embedded design, subsystem design, etc.) Different schools do it differently, so the above is just a real-world example I'm familiar with. What I ask is: is there a need for distinct majors in programming?
Yes, they should be. The relationship between computer science and software engineering is the same as the relationship between physics and mechanical engineering. One provides the theoretical background while the other takes those theories, along with good engineering principles, and applies them to the design and implementation of software. You need both in order to produce new and better software. A good computer science education trains people to produce new and better algorithms, data structures, programming languages and paradigms, compilers, and a number of other things that can be used to enhance software systems. A good software engineering education, on the other hand, trains you to take these tools and knowledge obtained through a scientific study of computation, along with a knowledge of the software development lifecycle and process models to actually build the system that a customer wants and needs.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/827", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/6/" ] }
870
One per answer please. I'll add my favourite as an answer.
Code Complete by Steve McConnell. I don't even think it needs explanation. It's the definitive book on software construction. Incredibly well written and covers all aspects of the practical (programming) side of creating software.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/870", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/38/" ] }
937
I know that they are implemented extremely unsafely in C/C++. Can't they be implemented in a safer way? Are the disadvantages of macros really bad enough to outweigh the massive power they provide?
I think the main reason is that macros are lexical . This has several consequences: The compiler has no way of checking that a macro is semantically closed, i.e. that it represents a “unit of meaning” like a function does. (Consider #define TWO 1+1 — what does TWO*TWO equal? 3.) Macros are not typed like functions are. The compiler cannot check that the parameters and return type make sense. It can only check the expanded expression that uses the macro. If the code doesn’t compile, the compiler has no way of knowing whether the error is in the macro itself or the place where the macro is used. The compiler will either report the wrong place half of the time, or it has to report both even though one of them is probably fine. (Consider #define min(x,y) (((x)<(y))?(x):(y)) : What should the compiler do if the types of x and y don’t match or don’t implement operator< ?) Automated tools cannot work with them in semantically useful ways. In particular, you can’t have things like IntelliSense for macros that work like functions but expand to an expression. (Again, the min example.) The side-effects of a macro are not as explicit as they are with functions, causing potential confusion for the programmer. (Consider again the min example: in a function call, you know that the expression for x is evaluated only once, but here you can’t know without looking at the macro.) Like I said, these are all consequences of the fact that macros are lexical. When you try to turn them into something more proper, you end up with functions and constants.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/937", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/86/" ] }
940
This is more a discussion question than an actual attempt to determine the "best", since that clearly varies by the needs of the organization. I'm more curious about the arguments in favor of different systems across categories (centralized vs distributed, open vs proprietary, etc). So, what do you think is the best version control system?
Mercurial. Because of its sophisticated ability to branch and merge code, it is the best I've used. The whole DVCS paradigm just makes so much sense. I've not used Git, but I suppose that it qualifies as well.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/940", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/6/" ] }
966
There are a lot of questions about what programming books should be on the programmer's bookshelf. How about non-programming related books that can help you become a better programmer or developer? It would also be interesting to know why they would help. My first choice would be Sun Tzu's "Art of War" (however cliché), because it made it obvious that the success of any project depends on the strength of its weakest link (and warfare is a big project).
The Design of Everyday Things by Donald Norman
{ "source": [ "https://softwareengineering.stackexchange.com/questions/966", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/170/" ] }
1,090
Language shortcuts can often be used to make code more concise. For example, ternary and null-coalescing operators can reduce the amount of code, but arguably to the detriment of readability. In C#:

    Person newGuy = new Person();
    if (boss == null) {
        newGuy.Boss = GetDefaultBoss();
    } else {
        newGuy.Boss = boss;
    }

is functionally equivalent to:

    Person newGuy = new Person();
    newGuy.Boss = boss ?? GetDefaultBoss();

but the first version is obviously a lot more verbose. Where do you draw the line when it comes to conciseness vs. readability?
Both. Your first example is certainly more verbose, and arguably more explicit... but it also requires me to scan five lines instead of one. Worse, it de-emphasizes its purpose: assigning a value to newGuy.Boss. Your second example may cost me a second if I'm unfamiliar with the null-coalescing operator, but there can be no doubt as to its purpose, and if I'm scanning through a larger routine looking for the source of a value, it will be much easier for me to pick this one out. Now, contrast this:

    if (boss == null) {
        newGuy.Boss = GetDefaultBoss();
        newGuy.IsTemp = true;
        newGuy.AddTask("orientation");
    } else {
        newGuy.Boss = boss;
        newGuy.IsTemp = false;
    }

...with:

    newGuy.Boss = boss ?? GetDefaultBoss();
    newGuy.IsTemp = boss == null;
    if (boss == null) newGuy.AddTask("orientation");

The latter example is again much shorter, but now it obscures its purpose by making tasks triggered by the same test appear to be distinct. Here, I feel the verbosity of the former is justified.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/1090", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/91/" ] }
1,095
I've used ad-hoc MUML (made-up modeling language) to design and explain systems fairly frequently. It looks similar to UML and tends to be pretty well understood. However, I've had a professor or two who harped on the use of strict, formal UML, as close to the spec as possible. I always suspected that strict UML isn't really as common as they claimed. So, how about it: how often do you actually draw complete diagrams that use all the proper line endings, multiplicity, member type symbols, etc.?
Never. Heck, it's been years since I last created any UML. Line diagrams on whiteboards and scraps of paper don't count. In fact, we just removed the sole UML question from the guide we use during interviews, because none of us really cared about the answers.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/1095", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/6/" ] }
1,189
I currently work with ASP.NET and C#, and I have done a decent amount of work in Java as well. I am planning my career in such a way that I should be language-agnostic someday. What are the things that I need to learn? The first would be OOP paradigms, since they speak to class design. Are there any others?
To be language-agnostic you need to have experience in all of the common styles and types of languages:

- An imperative language (you tell it what to do, step by step, e.g. C)
- A declarative language (you tell it your goal and it figures out what to do, e.g. SQL/HTML/Prolog)

Also:

- A functional language (functions are key; avoiding state and side effects is the goal, e.g. Haskell/OCaml/Lisp/F#)
- An object-oriented language (an architecture where objects encapsulate related data and the methods that act on them, e.g. Java/C#)

Some typing styles:

- A statically typed language (data types are defined and checked at compile time, e.g. C#)
- A dynamically typed language (data types are checked at runtime, e.g. Python/JavaScript)

Experience of strong vs. weak typing is also useful.

Some different runtime styles:

- Something compiled (e.g. C++)
- Something interpreted (e.g. PHP)
- Something managed (e.g. C#/Java)

Lower-level stuff:

- Something fairly low level (e.g. C)
- Some dialect of assembly (e.g. NASM)

On top of that, I would say you need experience of some concurrent programming and something event-driven. You should probably also make sure you know something about the various domains, such as web programming (client and server), rich client/desktop development, and games. You might also want to learn about embedded programming or dedicated hardware (like games consoles), and mobile development is becoming an increasingly relevant domain. Others have also mentioned that it's worth getting some experience of generic programming and metaprogramming approaches. When you learn these paradigms, avoid just learning the syntax and writing in your old style; I've seen many C# devs write JavaScript as if it's statically typed. Don't do this; try to learn the language paradigms and embrace them, as in the sketch below. If you've done all of this, the differences between languages will become largely syntactical, so switching will become a fairly simple exercise of learning some new syntax. Don't forget, though, that modern programming is almost always dependent on a framework, so familiarising yourself with the common and popular frameworks for each language you learn is also critical. Knowing C# is irrelevant without .NET.
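As a rough illustration that the paradigm, not the syntax, is the thing to learn, here is a small sketch in C++ (chosen only because it supports both styles; the numbers are made up). It computes the same sum imperatively and then declaratively with a standard algorithm:

```cpp
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    const std::vector<int> values{3, 1, 4, 1, 5, 9};

    // Imperative style: spell out how to do it, step by step.
    int imperative_sum = 0;
    for (int v : values) {
        imperative_sum += v;
    }

    // Declarative/functional style: state what you want; the standard
    // algorithm handles the iteration and accumulation.
    const int declarative_sum = std::accumulate(values.begin(), values.end(), 0);

    std::cout << imperative_sum << ' ' << declarative_sum << '\n';  // 23 23
    return 0;
}
```

The point is not that one style is better, but that a developer who only ever writes the first form will keep writing it in every language, even in languages where the second form is the idiomatic one.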
{ "source": [ "https://softwareengineering.stackexchange.com/questions/1189", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/156/" ] }
1,262
I'm considering whether I should start using VIM again instead of an IDE. What are the most useful features of VIM that aren't standard in an IDE?
I don't think it's necessarily the advanced features of VIM that make it so powerful. It's the fact that you never have to take your hands off the keyboard to do anything. Finding something in a huge file is as simple as a couple of keystrokes. Opening and closing multiple files in the same window is incredibly fast as well. While it may not seem intuitive at first, it's well worth your time. Even if you don't use it as your standard IDE (I generally use Visual Studio or Eclipse, for example), you'll find yourself using VIM to quickly open and edit files because it becomes far faster than waiting for the IDE to load. Invest the time to learn how to use VIM well and you'll never regret it. I'd say it's comparable to learning to touch-type.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/1262", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/86/" ] }
1,323
I recently saw that Microsoft released a coding standards document (All-In-One Code Framework Coding Standards) and it got me thinking... The company that I work for has no formal coding standards at all. There are only a few developers, and we have been together long enough to have evolved similar styles, so it has never been an issue. Does the company you work for have documented coding standards? If not, why not? Does having a standard make a difference? Is it worth writing a standard from scratch, or should you adopt another standard as your own (i.e., make Microsoft's standards yours)?
It's important for a team to have a single coding standard for each language to avoid several problems:

- A lack of standards can make your code unreadable.
- Disagreement over standards can cause check-in wars between developers.
- Seeing different standards in the same class can be extremely irritating.

I'm a big fan of what Uncle Bob has to say about standards:

- Let them evolve during the first few iterations.
- Let them be team-specific instead of company-specific.
- Don't write them down if you can avoid it. Rather, let the code be the way the standards are captured.
- Don't legislate good design (e.g. don't tell people not to use goto).
- Make sure everyone knows that the standard is about communication, and nothing else.
- After the first few iterations, get the team together to decide.
{ "source": [ "https://softwareengineering.stackexchange.com/questions/1323", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/101/" ] }
1,338
Have you ever had to work to coding standards that:

- greatly decreased your productivity?
- were originally included for good reasons but were kept long after the original concern became irrelevant?
- were in a list so long that it was impossible to remember them all?
- made you think the author was just trying to leave their mark rather than encouraging good coding practice?
- you had no idea why they were included?

If so, what is your least favorite rule and why? Some examples here
This may ruffle a few feathers, but standards that mandate templated block comments at the top of each method always bug the crap out of me.

1) They are always out of date, since they are too far from the code that does the actual work for anyone to notice them when updating things. Bad comments are worse than no comments.

2) They often just repeat information that is already contained in the source control tool, only less accurately. For example: "Last modified by" and a list of modification dates/reasons.
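A contrived C++ sketch of the kind of rot being described (every name, date, and detail here is invented for illustration); the header block confidently states things the code no longer does:

```cpp
/*------------------------------------------------------------------
 * Function:      CalculateTotal
 * Author:        J. Smith
 * Created:       2004-02-17
 * Last Modified: 2006-11-03 (added tax handling)
 * Description:   Returns the order total including tax.
 *-----------------------------------------------------------------*/
double CalculateTotal(double subtotal) {
    // Tax handling was moved to the caller years ago, but nobody
    // remembered to update the block above; the version control log
    // is the only record that stays accurate without extra effort.
    return subtotal;
}
```

The stale "Description" and "Last Modified" lines are precisely the duplicated, less accurate metadata that source control already tracks.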
{ "source": [ "https://softwareengineering.stackexchange.com/questions/1338", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/113/" ] }