A short list of opposites can help us make a major point about decisioning. These are, in effect, decisions on auto-pilot.
Opposites
War - Peace
Black - White
Start - Stop
Good - Evil
Begin - End
Life - Death
Bondage - Freedom
Hot - Cold
Night - Day
Finite - Infinite
Even though these are polar opposites, they can become choices that, in the extreme, are easy decisions. But, what about the in-betweens?
It’s the gray (B/W), the lukewarm (H/C), the afternoon (AM/PM)…the somewhere in the middle ones that catch us.
Point is, when it’s not as stark in definition (as above), how do you come to a conclusion?
We are, and will remain, consistent about the answer. It is a model, system, formula or track.
The Zillion Dollar Thinking MODEL:
One. Discovery
Two. Commitment
Three. Solution
Four. Action
So, the next time you get into the gray area of decisioning, please consider the ZDT track. If not this one, cyberspace offers plenty of options (there are now 700,000 apps for the iPhone alone). At a minimum, please download the complimentary eBook at our ZDT site.
Point is that a model is the antidote to staying endlessly stuck. As one author framed it…“better to be consistently decisive than to be completely right.”
As always…you decide.
Wednesday, September 26, 2012
Tuesday, September 18, 2012
The Latest Polls Say...
How many times a day do we see this headline?
How many times is there full disclosure that describes the details of how a poll was structured and framed?
One site that you can use as a benchmark:
“20 Questions A Journalist Should Ask About Poll Results”
And here is a mantra you can repeat to yourself: “Do Not Trust Undocumented Polls”
And, maybe you can say it (and write it on the whiteboard) to this compounded degree:
2 X 2 = 4
4 X 4 = 16
16 X 16 = 256
256 X 256 = 65,536
65,536 X 65,536 = 4,294,967,296
…and so on to the zillioneth power.
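For the numerically curious, here is a minimal sketch (in Python, purely for illustration) of that repeated squaring: after n squarings the running value is 2 raised to the power 2^n, which is already past four billion by the fifth step.

```python
# Illustrative only: repeated squaring, starting from 2.
# After n squarings the running value equals 2 ** (2 ** n).
value = 2
for step in range(1, 6):
    value = value * value  # square the previous result
    print(f"step {step}: {value:,}")
# step 1: 4
# step 2: 16
# step 3: 256
# step 4: 65,536
# step 5: 4,294,967,296
```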
Exaggeration…yes…but here’s the point:
Please do not trust any poll without considering at least some of the following criteria (from the article):
1. Who did the poll?
2. Who paid for the poll and why was it done?
3. How many people were interviewed for the survey?
4. How were those people chosen?
5. What area (nation, state, or region) or what group (teachers, lawyers, voters, etc.) were these people chosen from?
6. Are the results based on the answers of all the people interviewed?
7. Who should have been interviewed and was not? Or do response rates matter?
8. When was the poll done?
9. How were the interviews conducted?
10. What about polls on the Internet or World Wide Web?
11. What is the sampling error for the poll results?
12. Who’s on first?
13. What other kinds of factors can skew the poll results?
14. What questions were asked?
15. In what order were the questions asked?
16. What about "push polls"?
17. What other polls have been done on this topic? Do they say the same thing? If they are different, why are they different?
18. What about exit polls?
19. What else needs to be included in the report of the poll?
20. So I've asked all the questions. The answers sound good. Should we report the results?
So, you don’t know the answers to all these questions? That is the point. If you don’t have enough of the answers…don’t trust the poll result…especially one with a +/- 7-point differential (as many now hide).
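To put that +/- 7-point figure in perspective, here is a minimal sketch (in Python, using the standard textbook margin-of-error formula at 95% confidence and the worst-case 50/50 split; the sample sizes are illustrative, not taken from any particular poll) of how sample size drives the margin of error:

```python
import math

def margin_of_error(sample_size: int, z: float = 1.96, p: float = 0.5) -> float:
    """Approximate margin of error for a simple random sample at 95% confidence,
    using the worst-case proportion p = 0.5."""
    return z * math.sqrt(p * (1 - p) / sample_size)

# A +/- 7-point margin implies a surprisingly small sample.
for n in (196, 500, 1000, 1500):
    print(f"n = {n:>4}: +/- {margin_of_error(n) * 100:.1f} points")
# n =  196: +/- 7.0 points
# n =  500: +/- 4.4 points
# n = 1000: +/- 3.1 points
# n = 1500: +/- 2.5 points
```

Roughly speaking, a poll quoting a +/- 7-point margin may have reached only a couple of hundred people, which is exactly the kind of detail question 11 above is meant to surface.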
Simple as that.
As always…you decide.