Mathematics, 06.12.2021 22:50 dorianhenderson987

The Nielsen Company conducts a survey each year about the use of various media, such as television. A representative sample of adult Americans was surveyed in 2010; the mean number of minutes per week spent watching television was 576 minutes, and the sample standard deviation was 60 minutes. An independently selected representative sample of adults was surveyed in 2011; the mean was 646 minutes, and the sample standard deviation was 80 minutes. Suppose that the sample size in each year was 1000. Estimate the difference between the mean time spent watching television in 2010 and the mean time spent in 2011 using a 99% confidence interval. Interpret the interval in context.
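One way to work this out is a two-sample z-interval, which is a reasonable approximation here since both samples are large (n = 1000 each). The sketch below uses only the summary statistics given in the question; the variable names are illustrative.

```python
from math import sqrt
from statistics import NormalDist

# Summary statistics from the problem statement
mean_2010, sd_2010, n_2010 = 576, 60, 1000
mean_2011, sd_2011, n_2011 = 646, 80, 1000

# Point estimate of the difference (2011 minus 2010)
diff = mean_2011 - mean_2010                            # 70 minutes

# Standard error of the difference of two independent sample means
se = sqrt(sd_2010**2 / n_2010 + sd_2011**2 / n_2011)    # ~3.162

# Two-sided 99% critical value from the standard normal (~2.576)
z = NormalDist().inv_cdf(0.995)

lower, upper = diff - z * se, diff + z * se
print(f"99% CI for (2011 mean - 2010 mean): ({lower:.2f}, {upper:.2f})")
```

This gives an interval of roughly (61.85, 78.15) minutes, suggesting we can be 99% confident that the mean weekly viewing time in 2011 exceeded the 2010 mean by between about 62 and 78 minutes.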

