Phone

An “outside, independent firm” will soon be calling residents in the Killeen Independent School District, asking how they feel about the proposed bond issue, according to Terry Abbott, KISD chief communications officer.

While Abbott does not know exactly who will be contacted to participate in the survey, “It will be a random, stratified sample representative of the KISD community demographics, to guarantee scientifically valid results within a margin of error, which is standard in public opinion polling,” he said.
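For context only: the margin of error Abbott mentions is driven almost entirely by sample size, and since the district has not said how many residents will be called, the figures below are hypothetical. This minimal Python sketch shows the standard 95 percent confidence calculation pollsters use.

import math

def margin_of_error(n, p=0.5, z=1.96):
    # Half-width of a 95% confidence interval for a proportion;
    # p = 0.5 is the worst case and gives the widest interval.
    return z * math.sqrt(p * (1 - p) / n)

for n in (400, 600, 1000):  # hypothetical sample sizes, not KISD's
    print(f"n={n}: +/-{margin_of_error(n):.1%}")
# n=400: +/-4.9%   n=600: +/-4.0%   n=1000: +/-3.1%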

Participants will be contacted via landline phones and cellphones, Abbott said. “Modern polling methods always make use of cellphone numbers.”

Abbott said the survey calls should be done by Thanksgiving.

The questions to be asked as part of the survey, the number of participants to be called, and the name of the firm hired to conduct the survey were all details Abbott did not have available when contacted Wednesday.

Survey questions could include which projects should be part of the district’s proposed bond issue, which is tentatively scheduled for a May 5, 2018, ballot.

The bond amount could be as high as $500 million if all projects under consideration by the district and the bond steering committee were included. However, the district has not said what the bond amount will be.

Three new elementary schools, a middle school and a high school are listed on KISD’s Strategic Facility Plan, and could be included in the bond issue. Also on that list is a stadium, estimated to cost $50 million.

Much-needed renovations to 12 older school buildings in the district are under consideration for the bond issue. Among those renovations would be replacing 239 aging, inefficient heating and air conditioning units.

Purchasing new school buses, or retrofitting older buses with seat belts, has been mentioned by Superintendent John Craft as another potential inclusion in the bond issue.

Abbott said the survey results will be made available to the KISD bond steering committee and the public when they are complete.

The bond steering committee’s next meeting is at 6 p.m. Thursday at East Ward Elementary School, 1608 E. Rancier Ave. It is open to the public.

254-501-7568 | jferraro@kdhnews.com

(1) comment

eyewatchingu

I wonder how much this will cost us, the homeowners, taxpayers and ratepayers?
Is this going to be funded with money that should be used for the schools, educational tools, and pay raises for good teachers?
Now let's explain to people the truth on polls. I will post the link and then the info, so citizens can base their opinions on facts and not on untruths. Anyone who went to a good school or college understands the difference: a good school teaches one to do research and come up with the answer, while a failing school only teaches how to answer a question on a test, not how to apply the theory.
https://www.usnews.com/news/the-report/articles/2015/09/28/why-public-opinion-polls-are-increasingly-inaccurate
In 2012, Mitt Romney's own campaign polls, among others, predicted the Republicans' nominee would defeat President Barack Obama for the presidency, but just barely. Two years later, surveys in Kentucky strongly suggested then-Senate Minority Leader Mitch McConnell, perhaps the shrewdest, most powerful Republican in Washington, could lose his seat to an upstart rookie Democrat.
Overseas, meanwhile, multiple public-opinion experts in 2014 said Scottish voters were deadlocked on whether to scrap one of the oldest relationships in European history and choose full independence from Great Britain – and underdog nationalists' hopes soared.
Three different public-opinion polls, three important elections, three decisively erroneous results: President Barack Obama blindsided Romney in the 2012 presidential elections, winning a second term by five points; McConnell crushed Alison Lundergan Grimes en route to becoming Senate majority leader in 2014; and Scots last year overwhelmingly chose to keep ties with the United Kingdom, an outcome that stunned the polling establishment.
Once a seemingly infallible cornerstone of the political system, public opinion polls have racked up a few big-time fails in recent years, embarrassments that compelled a leading firm to conduct an internal audit to find out what went wrong. Analysts are also openly questioning whether the industry, whose leaders were household names in the 60s and 70s, has kept up with a rapidly transforming, highly-mobile electorate – one that's relying on everyday technology to opt out of the public discourse.
Others wonder if those factors combined to create the unlikely Summer of Trump, in which a boorish reality-TV billionaire with zero political experience and no apparent verbal filter shot past a dozen experienced politicians in 2015 presidential opinion polls to become the presidential front-runner, defying political gravity in the process. Or if skewed polls triggered the Hillary Slide, in which the likeability numbers of the Democrats' perpetual 2016 front-runner nosedived over several months, leading her campaign to plan more moments of spontaneity and reboot her warm, human side.


"The science of public surveying is in something of a crisis right now," says Geoffrey Skelley, a political analyst at the University of Virginia's Center for Politics.

And it matters because "polling is a very important element of democracy," said Michael Traugott, a University of Michigan political science professor who specializes in polling and opinion surveys. Traugott also helped prepare a groundbreaking report on how Gallup, a public-opinion titan, erroneously predicted Romney would defeat Obama in 2012.
Polls "give the public an independent voice that's not generally present" otherwise in politics and political news coverage, Traugott said. But he says the recent errors, and a steep decline in the number of people responding to opinion surveys, is "a worrisome trend because one of the main claims of polling is that it represents the people's views."
"I used to remember when survey conductors were celebrities," says Roger Tourangeau, an esteemed survey methodologist and vice president of the research firm Westat. "George Gallup and Lou Harris [of Harris Research Associates, an early industry giant] had columns in the newspaper" and would regularly appear on national TV.
"It's a different world now," he says.

In an unprecedented internal report on its 2012 Romney blunder, Gallup says it made mistakes in its core samples, including its racial makeup and political ideology, as well as its overall methodology.
Gallup's audit, however, also says the entire industry is due for an overhaul, with some of the leading firms using analog, black-and-white methods in a digital, multicultural world. Case in point: the rise of the cell phone and the fall of public engagement in opinion surveys.
Besides not being tied to a fixed address – it's not unusual for owners to have a different area code than where they actually live, and the numbers usually aren't listed in the white pages – cell phones give their users more control over their privacy than a landline. It's likely, analysts say, that the ability to screen or block incoming calls has accelerated the public's unwillingness to take part in what used to be considered a civic duty.
"Everyone in the industry is worried about the falling response rate," Tourangeau says.
Traugott says the percentage at which people participate in opinion polls has bottomed out in the past few decades, from more than half in the 1980s to the single digits today, and most experts believe cellphone use is the reason.
Tourangeau says the technology factor likely slanted 2015 polls in Israel indicating embattled Israeli Prime Minister Benjamin Netanyahu's Likud Party was locked in a dead heat with the opposition just days before high-stakes parliamentary elections. Likud, however, won in a blowout, handing Netanyahu another term as Israel's leader and giving the polling industry another headache.
"It's not just an American problem. it's a worldwide problem," Tourangeau says.
At the same time in the U.S., a federal law designed to protect consumers from aggressive debt collectors or telemarketers bans pollsters from using automated calls to get opinions, even if it's on important issues like the presidential election or whether the nation's on the right track.
"People are leading more active lives, and they're harder to locate," Traugott says. People would rather text, make calls or perhaps play another round of Candy Crush Saga than spend up ten minutes or longer "for an interview with an organization they they might not know and a survey whose content might be unclear," he says.
Other factors, Tourangeau says: Gated, private communities that door-to-door surveyors can't reach, and more survey subjects who don't speak English as a first language.
Celinda Lake, a pollster, political consultant and president of Lake Research Partners, a Washington, D.C.-based polling firm, says polling has seen "kind of a steady decline. It's getting harder to reach people. It's also harder to get them to cooperate."
That means Lake Research, Gallup and others spend far more time, effort and money than ever before, trying to get solid opinions and deliver an accurate snapshot of the public mind.
"You try them more often. We've upped the number of callbacks; we used to do two or three. We now do three, four and five," Lake says. Sometimes, she adds, they even make appointments with survey subjects to get their participation.
But not every pollster has the time, the money, or a staff big enough, to up their game and dig into a major opinion survey with that level of commitment. That's especially true when campaigns, polling firms and news organizations are competing for attention in a hyper-speed, social media-fueled, 24-hour news environment, Lake says.
"That means there's getting to be a broader and broader range of quality of polls, where some people have the resources and some don't," Lake says. "And it makes for more variability" in across-the-board quality of results on a particular issue or political campaign.
Another key factor is polling firms' methodologies, the "secret sauce" of the industry, Traugott says. Put simply, if you ask the right sampling of people the wrong thing – questioning disengaged Virginia residents in 2014, for example, whether they prefer Rep. Eric Cantor, a veteran politician with wide name recognition, or Dave Brat, his unknown but more radical challenger, without determining if they'll actually show up to vote – you'll get a bad result.
Cantor, who had been one of House Speaker John Boehner's top lieutenants, learned that the hard way: he overestimated the polls, including his own campaign's, and didn't factor in a minimal primary turnout that gave Brat the slingshot he needed to slay a political Goliath. Instead of cruising to an easy win, Cantor fell hard, and Brat seized the seat Cantor held for 13 years.
Ultimately, Traugott and others say, the industry is responding with better methodology, improved modeling of public behavior, smarter ways to reach people (including Internet solicitations and small amounts of cash) and a commitment to learn from its mistakes. And they each said the industry as a whole nearly always gets it right.
"The trends are tough, Tourangeau says. "But it's not like the whole field has collapsed either. I'm not ready to give up on surveys at all."
"It's an essential tool for the government and the politicians," he says, "and it's not going away."

Hmm, seems facts are facts: the chances of getting an accurate poll are next to impossible for a number of reasons. Let's go to the next one.

https://www.acsh.org/news/2016/10/19/polls-are-not-rigged-they-also-arent-scientific-10329

Still, as Brexit proved, skewing can happen due to faulty sampling or weighting. That is why there is no such thing as a "scientific poll." Tweaking turnout models is more akin to refining a cake recipe than doing a science experiment. Therefore, polls should be referred to as "statistically correct," not "scientific."
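To make the weighting point above concrete: a pollster reweights raw responses so each demographic group counts in proportion to its assumed share of the electorate, and changing that assumed turnout mix moves the headline number. The Python sketch below uses invented figures; no group, share or result here comes from any real poll.

# Raw responses: group -> (respondents, share answering "yes")
raw = {
    "under_40": (150, 0.70),
    "40_plus":  (350, 0.45),
}

def weighted_yes(assumed_turnout):
    # Weight each group's "yes" share by its assumed share of turnout.
    return sum(assumed_turnout[g] * yes for g, (_, yes) in raw.items())

print(weighted_yes({"under_40": 0.30, "40_plus": 0.70}))  # 0.525
print(weighted_yes({"under_40": 0.45, "40_plus": 0.55}))  # 0.5625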

More food for thought. Let's go to the next one. You will love this one; it proves once more that Mr. Abbott is trying to feed the citizens an obtuse view, to only promote his career instead of actually telling the truth.
https://soonerpoll.com/questionable-polling/
What to Look Out For
At SoonerPoll, we think it is important to provide the public with information about polls they may encounter that are not based on solid, scientific methods. An informed public should have the information it needs to distinguish between questionable polls and polls they can rely on.
We often receive calls and emails about the results of various polls reported by the media. People want to know if they can take the results of these polls seriously. Were they really based upon solid scientific research methods? With the help of several research associations, we have compiled information about some non-scientific polls of which an informed public should be wary.
Science or Entertainment?
Many people these days are using the term “poll” or “survey,” but these terms may not always indicate a scientifically conducted poll or survey.
Do not be fooled.
One good indication: if a newspaper article doesn’t report the sample size and the margin of error of the poll, then it was NOT conducted scientifically and was printed for “entertainment” purposes only. Many times the sample sizes are very small, and any conclusions should be highly questionable.
This is an injustice to the typical reader who might believe the poll was a scientific poll, but nonetheless some disreputable pollsters will try and make it look that way.
Internet Surveys
A growing trend in public opinion polls has been that of Internet surveys. For the most part, these surveys are not conducted in a scientific manner. Results of such polls are often misleading.
Because there are many Americans who do not have access to or are not regular users of the Internet, online polls that claim to measure public opinion in a scientific manner can hardly be expected to do so with accuracy.
Another problem with online polling is the fact that respondents decide for themselves whether or not to participate. In a scientifically valid poll, respondents are targeted by a carefully designed sampling process. Also, people may choose to respond more than once to an Internet poll, further compromising its validity.
Read the American Association of Public Opinion Research’s statement about online polling.
Push Polls
A push poll is where, using the guise of opinion polling, disinformation about a candidate or issue is planted in the minds of those being ‘surveyed’. Push polls are designed to shape, rather than measure, public opinion.  A type of political telemarketing, push polls are only effective if they can call more voters than just a representative sample (the goal of legitimate pollsters).
But, not all questions that seem negative are part of push polls. As Kathy Frankovic, Director of Surveys for CBS News noted, “Candidate organizations sometimes do actual polls that contain negative information about the opposing candidate. These polls, which are not push polls, are conducted for the same reasons market and advertising researchers do their work: to see what kinds of themes and packages move the public”.
In the advertisers’ case, they want to figure out the best way to reach buyers; candidate pollsters need to motivate voters. Polls done for campaign research are full-length, with more topics than just questions about the opponent, and include demographic questions that allow researchers to categorize respondents. Interviewers will not ask to speak to anyone by name, but are calling a sample of randomly selected telephone numbers.
How do you distinguish a push poll from the legitimate poll?  According to Mark Blumenthal, the Mystery Pollster, the proof is in the intent: If the sponsor intends to communicate a message to as many voters as possible rather than measure opinions or test messages among a sample of voters, it qualifies as a “push poll”.
We can usually identify a true push poll by a few characteristics that serve as evidence of that intent. “Push pollsters” (and Mystery Pollster hates that term) aim to reach as many voters as possible, so they typically make tens or even hundreds of thousands of calls. Real surveys usually attempt to interview only a few hundred or perhaps a few thousand respondents (though not always).
Push polls typically ask just a question or two, while real surveys are almost always much longer and typically conclude with demographic questions about the respondent (such as age, race, education, income).
The information presented in a true push poll is usually false or highly distorted, but not always.  A call made for the purposes of disseminating information under the guise of survey is still a fraud – and thus still a “push poll” – even if the facts of the “questions” are technically true or defensible.
Accusations of push polling have become a political attack in themselves in recent elections, with candidates attacking each other over polls their opponent’s pollster conducted that asked negative questions. In most cases the polls were legitimate, but each side decided to take advantage of the situation to merely attack their opponent. In the end, the listening public is typically left in doubt about what was fraud and what was legitimate, and more skeptical and less willing to participate the next time legitimate pollsters, like SoonerPoll, come calling.
For the record, SoonerPoll is a public opinion research firm and does NOT engage in push polling of any type.
University-Organized Surveys for Class Credit
Surveys conducted by university students as part of a class assignment can be flawed for a number of reasons.
For example, in January 2004, the Modesto Bee reported that a professor at California State University, Stanislaus, required students in his class to conduct public opinion interviews by telephone to receive credit for a course, but failed to provide resources and oversight or to validate that the students carried out the interviews.
The Bee reported that student interviewers received only an hour of training and were expected to pay for long-distance charges. Several students came forward to say they falsified their interviews because they were pressed for time and would have to make lengthy long-distance telephone calls at their own expense, according to the story. The falsified results of the public opinion poll were introduced as evidence in the Scott Peterson murder trial in support of a possible change of venue.

So now that I have shown my work and both sides of the polls, it is time that Mr. Abbott speak the truth and not use his name or his personal political career agenda to try and win over the voters he feels are not smart enough to think and learn the facts on their own.

Once again, I ask all Killeen citizens to vote no clear across the ballot if this bond is in any way on the ballot.
Vote no clear across the ballot to inform our city leaders that we will no longer be misled, we will no longer allow them to waste our money, misuse public property, or misuse their offices for their own profit, and most of all we will not allow them to get away with not being transparent and using the kids as weapons while they hold our kids back from an actual education.


VOTE NO CLEAR ACROSS THE BALLOT IF THIS BOND AND $50 MILLION STADIUM IS ON THE BALLOT.
