
What does a TRUMP presidency look like?

Started by FunkMonk, November 08, 2016, 11:02:57 PM


jimmy olsen

Quote from: alfred russel on December 10, 2017, 10:48:15 PM
Quote from: jimmy olsen on December 10, 2017, 10:08:19 PM
If someone says the test/poll results were skewed, they mean someone put their thumb on the scale to change the result.

You are an idiot. There is no point to a discussion with you, because you are just dumb.

I'm sorry that the way language is used in everyday conversation upsets you.
It is far better for the truth to tear my flesh to pieces, than for my soul to wander through darkness in eternal damnation.

Jet: So what kind of woman is she? What's Julia like?
Faye: Ordinary. The kind of beautiful, dangerous ordinary that you just can't leave alone.
Jet: I see.
Faye: Like an angel from the underworld. Or a devil from Paradise.
--------------------------------------------
1 Karma Chameleon point

CountDeMoney

Quote from: Razgovory on December 10, 2017, 10:57:00 PM
Quote from: alfred russel on December 10, 2017, 10:48:15 PM
Quote from: jimmy olsen on December 10, 2017, 10:08:19 PM
If someone says the test/poll results were skewed, they mean someone put their thumb on the scale to change the result.

You are an idiot. There is no point to a discussion with you, because you are just dumb.


You can talk to me.  I'm also an idiot, but not quite as dumb.  What's Hat Vocabulary?


Stop being Razzy, Raz.  Nobody likes it when you get all twitchy and shit.

dps

Quote from: jimmy olsen on December 10, 2017, 11:00:55 PM
Quote from: alfred russel on December 10, 2017, 10:48:15 PM
Quote from: jimmy olsen on December 10, 2017, 10:08:19 PM
If someone says the test/poll results were skewed, they mean someone put their thumb on the scale to change the result.

You are an idiot. There is no point to a discussion with you, because you are just dumb.

I'm sorry that the way language is used in everyday conversation upsets you.

People who use technical terms a lot in their work tend to get bitchy when the general populace uses those technical terms in an informal manner. 

That said, your supposed way that people use the term "skewed" in everyday conversation isn't merely imprecise, it's wrong.  "Skewed" in everyday conversation just means that something is throwing the results off.  Putting your thumb on a scale is just called "fraud" or "cheating".  Granted, that's a subset of things that can skew results, but only a relatively small subset.

DGuller

Quote from: jimmy olsen on December 10, 2017, 11:00:55 PM
Quote from: alfred russel on December 10, 2017, 10:48:15 PM
Quote from: jimmy olsen on December 10, 2017, 10:08:19 PM
If someone says the test/poll results were skewed, they mean someone put their thumb on the scale to change the result.

You are an idiot. There is no point to a discussion with you, because you are just dumb.

I'm sorry that the way language is used in everyday conversation upsets you.
This is all because you people use words like "bias" and "skew" in randomly imprecise ways.  The world would be a much better place if everyone spoke like a statistician.

Eddie Teach

Quote from: Razgovory on December 10, 2017, 10:21:52 PM
Seems Roy Moore has a problem with every amendment of the US Constitution after the 10th.

Not just 13-15? And 20, of course.
To sleep, perchance to dream. But in that sleep of death, what dreams may come?

dps

Quote from: Eddie Teach on December 10, 2017, 11:50:00 PM
Quote from: Razgovory on December 10, 2017, 10:21:52 PM
Seems Roy Moore has a problem with every amendment of the US Constitution after the 10th.

Not just 13-15? And 20, of course.

I know you're at least partly being sarcastic, but what problem exactly does Moore or anyone have with the 20th?  It's mostly just procedural.

Eddie Teach

To sleep, perchance to dream. But in that sleep of death, what dreams may come?

The Minsky Moment

Quote from: Oexmelin on December 09, 2017, 09:56:59 PM
Just like Constitution. I imagine Trump spends the same amount of time thinking about both.

His staff gave him a 1 pager with bullet points:

+ House of Reps - can't appoint Baron yet, need to be 25

+ Senate - Mike P gets to be Prez there!  Senators have to be 30 so sh/be no new problems with Moore

+ Sorry - no titles of nobility

+ Art II - work on pronouncing "emoluments"

+ Art III - need to amend Section 3 to omit Russia

+ Bill of Rights - right to bear arms, states rights, other stuff

+ 25th Amendment - tl;dr



The purpose of studying economics is not to acquire a set of ready-made answers to economic questions, but to learn how to avoid being deceived by economists.
--Joan Robinson

The Larch

Quote from: CountDeMoney on December 10, 2017, 11:51:13 AM
Quote from: The Larch on December 10, 2017, 11:32:12 AM
Quote from: CountDeMoney on December 09, 2017, 09:13:16 PM
"The fight to end slavery, to break down Jim Crow, to end segregation, to gain the right to vote, and to achieve the sacred birthright of equality — that's big stuff.  Those are very big phrases, very big words."
--Donald J. Trump, Civil Rights Warrior

It's so funny to see exactly where he stops reading the prepared speech and where he starts rambling.  :lol:

Isn't it, though?  That's how you can tell he's reading it for the first time, too.

Yeah, there's no way he'd say "sacred birthright of equality" on his own, that's when the attention span ended. His delivery must have been painful.

HisMajestyBOB

Quote from: Eddie Teach on December 10, 2017, 11:50:00 PM
Quote from: Razgovory on December 10, 2017, 10:21:52 PM
Seems Roy Moore has a problem with every amendment of the US Constitution after the 10th.

Not just 13-15? And 20, of course.

Moore likes 14-16.
Three lovely Prada points for HoI2 help

crazy canuck

Quote from: The Larch on December 11, 2017, 06:14:33 AM
Yeah, there's no way he'd say "sacred birthright of equality" on his own, that's when the attention span ended. His delivery must have been painful.

He was concentrating very hard on reading and then he stopped reading but kept talking.

jimmy olsen

Alabama polls are all over the place.


Dozens of embedded links here
https://fivethirtyeight.com/features/what-the-hell-is-happening-with-these-alabama-polls/

Quote
What The Hell Is Happening With These Alabama Polls?

By Nate Silver

Filed under Special Elections

Somebody's going to be wrong in Alabama.

We've already urged caution when interpreting polls of Alabama's special election to the U.S. Senate, which will be held on Tuesday. Some of that is because of the media's usual tendency to demand certainty from the polls when the polls can't provide it. And some of it is because of the circumstances of this particular race: a special election in mid-December in a state where Republicans almost never lose but where the Republican candidate, Roy Moore, has been accused of sexual misconduct toward multiple underaged women.

What we're seeing in Alabama goes beyond the usual warnings about minding the margin of error, however. There's a massive spread in results from poll to poll — with surveys on Monday morning showing everything from a 9-point lead for Moore to a 10-point advantage for Democrat Doug Jones — and they reflect two highly different approaches to polling.

Most polls of the state have been made using automated scripts (these are sometimes also called IVR or "robopolls"). These polls have generally shown Moore ahead and closing strongly toward the end of the campaign, such as the Emerson College poll on Monday that showed Moore leading by 9 points. Recent automated polls from Trafalgar Group, JMC Analytics and Polling, Gravis Marketing and Strategy Research have also shown Moore with the lead.

But when traditional, live-caller polls have weighed in — although these polls have been few and far between — they've shown a much different result. A Monmouth University survey released on Monday showed a tied race. Fox News's final poll of the race, also released on Monday, showed Jones ahead by 10 percentage points. An earlier Fox News survey also had Jones comfortably ahead, while a Washington Post poll from late November had Jones up 3 points at a time when most other polls showed the race swinging back to Moore. And a poll conducted for the National Republican Senatorial Committee in mid-November — possibly released to the public in an effort to get Moore to withdraw from the race — also showed Jones well ahead.

What accounts for the differences between live-caller and automated polls? There are several factors, all of which are potentially relevant to the race in Alabama:
1. Automated polls are prohibited by law from calling voters on cellphones.
2. Automated polls get lower response rates and therefore may have less representative samples.
3. Automated polls may have fewer problems with "shy" voters who are reluctant to disclose their true voting intentions.
4. Automated pollsters (in part to compensate for issues No. 1 and 2 above) generally make more assumptions when modeling turnout, whereas traditional pollsters prefer to let the voters "speak for themselves" and take the results they obtain more at face value.

Issue No. 1, not calling cellphones, is potentially a major problem: The Fox News poll found Jones leading by 30 points among people who were interviewed by cellphone. Slightly more than half of American adults don't have access to a landline, according to recent estimates by the federal Centers for Disease Control and Prevention, which also found a higher share of mobile-only households in the South than in other parts of the country. Moreover, voters with landline service are older than the voting population as a whole and are more likely to be white — characteristics that correlate strongly with voting Republican, especially in states such as Alabama.

Pollsters are aware of these problems, so they use demographic weighting to try to compensate. Even if you can't get enough black voters on a (landline) phone, for instance, you may have some reasonable way to estimate how many black voters there "should" be in the electorate, based on Census Bureau data or turnout in previous elections — so you can weight the black voters you do get on the phone more heavily until you get the "right" demographic mix.
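The weighting scheme described above can be sketched in a few lines. This is a toy post-stratification example with made-up numbers (a hypothetical 100-person landline sample where black voters are underrepresented relative to an assumed 27%/73% target electorate), not real poll data:

```python
# Toy post-stratification sketch: upweight an underrepresented group so the
# weighted sample matches a population target, then recompute the topline.
# All respondents and target shares below are invented for illustration.

def poststratify(sample, targets):
    """Give each respondent weight = target share / raw sample share of their group."""
    n = len(sample)
    shares = {}
    for r in sample:
        shares[r["group"]] = shares.get(r["group"], 0) + 1 / n
    return [dict(r, weight=targets[r["group"]] / shares[r["group"]]) for r in sample]

def topline(rows, cand):
    """Weighted share of respondents voting for cand (unweighted rows count as 1)."""
    total = sum(r.get("weight", 1) for r in rows)
    return sum(r.get("weight", 1) for r in rows if r["vote"] == cand) / total

# 100 respondents: black voters are 10% of the raw sample but 27% of the
# (hypothetical) target electorate.
sample = (
    [{"group": "black", "vote": "Jones"}] * 9 + [{"group": "black", "vote": "Moore"}] * 1
    + [{"group": "white", "vote": "Jones"}] * 30 + [{"group": "white", "vote": "Moore"}] * 60
)
weighted = poststratify(sample, {"black": 0.27, "white": 0.73})

print(f"raw Jones share:      {topline(sample, 'Jones'):.1%}")    # 39.0%
print(f"weighted Jones share: {topline(weighted, 'Jones'):.1%}")  # 48.6%
```

With these invented splits, upweighting the missing black voters moves Jones from 39% to roughly 49% — the same mechanism, at smaller scale, that separates the real polls discussed here.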

This sounds dubious — and there are better and worse ways to conduct demographic weighting — but it's a well-accepted practice. (Almost all pollsters use demographic weighting in some form.) And sometimes everything turns out just fine — automated polls don't have a great track record, but firms such as Trafalgar Group that do automated polling generally performed pretty well in 2016, for example. Some automated firms have also begun to supplement their landline samples with online panels in an effort to get a more representative sample. Still, cell-only and landline voters may be differentiated from one another in ways that are relevant for voting behavior but which don't fall into traditional demographic categories — cell-only voters may have different media consumption habits, for instance. If nothing else, failing to call cellphones adds an additional layer of unpredictability to the results.

Apart from their failure to call mobile phones, automated polls have lower response rates (issue No. 2) — often in the low single digits. This is because voters are more likely to hang up when there isn't a human on the other end of the line nudging them to complete a survey. Also, many automated polls call each household only once, whereas pollsters conducting traditional surveys often make several attempts to reach the same household. Calling a household only once could bias the sample in various ways — for instance, toward whichever party's voters are more enthusiastic (probably Democrats in the Alabama race) or toward whoever tends to pick up the phone in a particular household (often older voters, rather than younger ones).

As for issue No. 3, proponents of automated polls — and online polls — sometimes claim that they yield more honest responses from voters than traditional polls do. Respondents may be less concerned about social desirability bias when pushing numbers on their phone or clicking on an online menu as opposed to talking to another human being. That could be particularly relevant in the case of Alabama if some voters are ashamed to admit that they plan to vote for Moore, a man accused of molesting teenagers.

With that said, while there's a rich theoretical literature on social desirability bias, the empirical evidence for it affecting election polls is somewhat flimsy. The Bradley Effect (the supposed tendency for polls to overestimate support for minority candidates) has pretty much gone away, for instance. There's been no tendency for nationalist parties to outperform their polls in Europe. And so-called "shy Trump" voters do not appear to have been the reason that Trump outperformed his polls last year.

Finally (No. 4), automated and traditional pollsters often take different philosophies toward working with their data. Although they probably wouldn't put it this way themselves, automated pollsters know that their raw data is somewhat crappy — so they rely more heavily on complicated types of turnout and demographic weighting to make up for it. Automated pollsters are more likely to weight their results by party identification, for instance — by how many Republicans, Democrats and independents are in their sample — whereas traditional pollsters usually don't do this because partisan identification is a fluid, rather than a fixed, characteristic.

Although I don't conduct polls myself, I generally side with the traditional pollsters on this philosophical question. I don't like polls that impose too many assumptions on their data; instead, I prefer an Ann Selzer-ish approach of trusting one's data, even when it shows an "unusual" turnout pattern or produces a result that initially appears to be an outlier. Sometimes what initially appears to be an outlier turns out to have been right all along.

With that said, automated pollsters can make a few good counterarguments. Traditional polls also have fairly low response rates — generally around 10 percent — and potentially introduce their own demographic biases, such as winding up with electorates that are more educated than the actual electorate. Partisan non-response bias may also be a problem — if the supporters of one candidate see him or her get a string of bad news (such as Moore in the Alabama race), they may be less likely to respond to surveys ... but they may still turn up to vote.

Essentially, the automated pollsters would argue that nobody's raw data approximates a truly random sample anymore — and that even though it can be dangerous to impose too many assumptions on one's data, the classical assumptions made by traditional pollsters aren't working very well, either. (Traditional pollsters have had a better track record over the long run, but they also overestimated Democrats' performance in 2014 and 2016.)

So, who's right? There's a potential tiebreaker of sorts, which is online polls. Online polls potentially have better raw data than automated polls — they get higher response rates, and there are more households without landline access than without internet access. However, because there's no way to randomly "ping" people online in the same way that you'd randomly call their phone, online surveys have no way to ensure a truly random probability sample.

To generalize a bit, online polls therefore tend to do a lot of turnout weighting and modeling instead of letting their data stand "as is." But their raw data is usually more comprehensive and representative than automated polls, so they have better material to work with.

The online polls also come out somewhat in Moore's favor. Recent polls from YouGov and Change Research show him ahead by 6 points and 7 points, respectively; in the case of the Change Research poll, this reflects a reversal from a mid-November poll that had Jones ahead.

But perhaps the most interesting poll of all is from the online firm SurveyMonkey. It released 10 different versions (!) of its recent survey, showing everything from a 9-point Jones lead to a 10-point Moore lead, depending on various assumptions — all with the same underlying data.



Although releasing 10 different versions of the same poll may be overkill, it illustrates the extent to which polling can be an assumption-driven exercise, especially in an unusual race such as Alabama's Senate contest. Perhaps the most interesting thing SurveyMonkey found is that there may be substantial partisan non-response bias in the polling — that Democrats were more likely to take the survey than Republicans. "The Alabama registered voters who reported voting in 2016 favored Donald Trump over Hillary Clinton by a 50 to 39 percentage point margin," SurveyMonkey's Mark Blumenthal wrote. "Trump's actual margin was significantly larger (62 to 34 percent)."

In other words, SurveyMonkey's raw data was showing a much more purple electorate than the solid-red one that you usually get in Alabama. If that manifests in actual turnout patterns — if Democrats are more likely to respond to surveys and are more likely to vote because of their greater enthusiasm — Jones will probably win. If there are some "shy Moore" voters, however, then Moore will probably win. To make another generalization, traditional pollsters usually assume that their polls don't have partisan non-response bias, while automated polls (and some online polls such as YouGov) generally assume that they do have it, which is part of why they're showing such different results.
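The recall-vote adjustment described in the last two paragraphs can be sketched the same way. The 50/39 raw-recall versus 62/34 actual 2016 figures come from the SurveyMonkey discussion above; the 100 respondents and their current-vote splits are entirely made up to show how one data set yields two very different toplines:

```python
# Toy sketch of a partisan non-response adjustment: reweight respondents so
# their recalled 2016 vote matches the actual 2016 result, then compare the
# face-value and adjusted toplines. Respondent counts are invented.

def weight_by_recall(sample, target):
    """Weight respondents so the recalled-2016-vote mix matches the target mix."""
    n = len(sample)
    raw = {}
    for r in sample:
        raw[r["recall"]] = raw.get(r["recall"], 0) + 1 / n
    return [dict(r, weight=target[r["recall"]] / raw[r["recall"]]) for r in sample]

def share(rows, cand):
    total = sum(r.get("weight", 1) for r in rows)
    return sum(r.get("weight", 1) for r in rows if r["vote"] == cand) / total

# Raw sample mirrors the too-purple 50 Trump / 39 Clinton / 11 other recall mix:
sample = (
    [{"recall": "Trump", "vote": "Moore"}] * 42 + [{"recall": "Trump", "vote": "Jones"}] * 8
    + [{"recall": "Clinton", "vote": "Jones"}] * 37 + [{"recall": "Clinton", "vote": "Moore"}] * 2
    + [{"recall": "other", "vote": "Jones"}] * 6 + [{"recall": "other", "vote": "Moore"}] * 5
)
# Adjust toward Trump's actual 62-34 Alabama margin:
adjusted = weight_by_recall(sample, {"Trump": 0.62, "Clinton": 0.34, "other": 0.04})

print(f"face value:      Jones {share(sample, 'Jones'):.0%}, Moore {share(sample, 'Moore'):.0%}")
print(f"recall-weighted: Jones {share(adjusted, 'Jones'):.0%}, Moore {share(adjusted, 'Moore'):.0%}")
```

With these made-up splits, a face-value Jones +2 flips to roughly Moore +11 after the adjustment — which is why polls that assume partisan non-response bias and polls that don't are landing so far apart.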

Because you've read so much detail about the polls, I don't want to leave you without some characterization of the race. I still think Moore is favored, although not by much; Jones's chances are probably somewhere in the same ballpark as Trump's were of winning the Electoral College last November (about 30 percent).

The reason I say that is because in a state as red as Alabama, Jones needs two things to go right for him: He needs a lopsided turnout in his favor, and he needs pretty much all of the swing voters in Alabama (and there aren't all that many of them) to vote for him. Neither of these is all that implausible. But if either one goes wrong for Jones, Moore will probably win narrowly (and if both go wrong, Moore could still win in a landslide). The stakes couldn't be much higher for the candidates — or for the pollsters who surveyed the race.
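A probability like the one Silver quotes can come out of a very simple model: take a polling-average margin and an assumed error standard deviation, and treat the polling error as normal. The ~30% figure above is Silver's own; the inputs below (Moore +3 average, a large 6-point error SD reflecting the huge poll-to-poll spread) are illustrative guesses, not his model:

```python
# Back-of-envelope win probability: if the true margin is normal around the
# polling average with standard deviation sd, what's P(Jones margin > 0)?
from math import erf, sqrt

def win_prob(margin, sd):
    """Normal CDF evaluated at margin/sd: P(actual margin ends up above zero)."""
    return 0.5 * (1 + erf(margin / (sd * sqrt(2))))

# Illustrative inputs: Jones down ~3 in the average, with a 6-point error SD
# because the polls disagree so wildly.
print(f"Jones win probability: {win_prob(-3, 6):.0%}")  # prints "31%"
```

A 3-point deficit with a 6-point error SD lands at about 31% — right in the ballpark of the "about 30 percent" in the text. The interesting knob is the SD: the wilder the poll-to-poll spread, the closer the trailing candidate's chances get to a coin flip.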
It is far better for the truth to tear my flesh to pieces, than for my soul to wander through darkness in eternal damnation.

Jet: So what kind of woman is she? What's Julia like?
Faye: Ordinary. The kind of beautiful, dangerous ordinary that you just can't leave alone.
Jet: I see.
Faye: Like an angel from the underworld. Or a devil from Paradise.
--------------------------------------------
1 Karma Chameleon point

Admiral Yi

Read in The Economist that 73% of Alabama Republicans disbelieve Moore's accusers, compared to 37% of Republicans nationwide.

My apologies if this fact shows up in Timmy's wall of text.

Eddie Teach

I agree with SNL's Cathy Anne, buncha Moore voters too embarrassed to admit it.
To sleep, perchance to dream. But in that sleep of death, what dreams may come?

CountDeMoney

Quote from: jimmy olsen on December 11, 2017, 06:45:52 PM
Issue No. 1, not calling cellphones, is potentially a major problem: The Fox News poll found Jones leading by 30 points among people who were interviewed by cellphone. Slightly more than half of American adults don't have access to a landline, according to recent estimates by the federal Centers for Disease Control and Prevention, which also found a higher share of mobile-only households in the South than in other parts of the country. Moreover, voters with landline service are older than the voting population as a whole and are more likely to be white — characteristics that correlate strongly with voting Republican, especially in states such as Alabama.

Quote from: CountDeMoney on November 08, 2016, 10:59:33 PM
Traditional unsolicited polling has been dead for a while, what with the disappearance of landline phones.

Yeah.