Tuesday, March 8, 2022

Fake Data?

Back in 2017 my youngest and I were in a children's museum when a woman approached and asked if William could participate in the research she was conducting. I agreed, and after signing a bunch of consent forms, the two of us were ushered into a small room. On the table were a few shapes of different colors and a wooden box. The woman mixed the shapes up like a shell game. William was instructed to select a shape and put it atop the box. He chose a yellow circle. The woman said,

"Ding!"

She shuffled the shapes, then he selected a blue circle. The woman made a buzzer sound. In between selections, she feverishly shuffled the shapes. She administered the test so rapidly that I didn't immediately pick up what she wanted: a shape or a color. Then she said, as she produced a new box,

"The first box is pretend. This new box is real."

She shuffled the pieces, and William selected a blue square. When he put it on the box, a real buzzer sounded. William selected a red circle, and the buzzer sounded again. Eventually he stumbled upon a yellow triangle and was rewarded with a bell. The woman shuffled the shapes, then William selected a blue rectangle. The researcher looked at me and exclaimed,

"It happens every time!"

Still unsure, I asked, "What happens?"

"They select the wrong color during pretend and real."

I hadn't picked up what she was looking for, given the speed at which she executed the test, so it was doubtful that William had either. She added,

"It's to prove that play is a poor way to learn."

Now, if you are doing research, ideally you want to break new ground, refute long-standing beliefs, or espouse something controversial. We've heard for years that kids learn through play, but this research purported to refute that.

When we got home, I fashioned a "pretend" box and a "real" box along with an assortment of colored shapes. I had William select a shape. I made the pretend bell sound for blue shapes and the buzzer sound when William selected any other color. I reinforced this through three runs with the pretend box because, while I'm no child education specialist, I do know that at his age you have to reinforce behavior with multiple examples. Any parent who is teaching their child to say "thank you" and "please" knows this. When I exchanged the pretend box for the real box, which used sound effects from my smartphone, William selected the blue shapes right off. Imagine that: reinforcement during play did, in fact, teach William what he needed to do in the real world.

In the early '80s, I read my first article suggesting that pets make people live longer. The health benefits were deduced from a study showing that, on average, the blood pressure of pet owners was six points lower than that of non-pet owners. Although the authors declared the result "statistically significant," an average alone is a poor measure from which to draw conclusions. Depending on the sample size, all you need is a few outliers to skew the results. My wife's blood pressure is low almost all the time, with or without Dinckes the Dog in her lap. Mine reads like a basketball score. It would be easy to bias an average by giving people like Christine a pet and letting me fly solo.
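To make the point about averages concrete, here's a minimal sketch in Python using made-up numbers (they are not from any study): two groups of ten blood-pressure readings that are identical except for two outliers, yet the group averages end up nearly ten points apart.

# Purely hypothetical numbers to illustrate the arithmetic; nothing here comes from the pet study.
pet_owners = [118, 120, 122, 119, 121, 120, 118, 122, 119, 121]  # systolic readings, mmHg
non_owners = [118, 120, 122, 119, 121, 120, 118, 122, 165, 170]  # same readings, plus two outliers

def mean(values):
    return sum(values) / len(values)

print(f"pet owners: {mean(pet_owners):.1f} mmHg")                     # 120.0
print(f"non-owners: {mean(non_owners):.1f} mmHg")                     # 129.5
print(f"difference: {mean(non_owners) - mean(pet_owners):.1f} mmHg")  # 9.5

# Eight of the ten "non-owner" readings match the pet owners exactly, yet the two
# outliers drag that group's average almost ten points higher. The average alone
# can't tell you whether the pets had anything to do with it.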

Some of the subsequent studies that drew the favorable pet conclusion were performed in such a way that the researchers knew which subjects were exposed to pets and which were not. In some cases, the blood pressure was taken with the pet in the subject's lap. Non-blind studies are rife with skewed data. Since the researchers were likely pet owners themselves, they may well have sought a favorable result. Many of these pet studies were funded by advocacy groups like the Pet Food Institute (PFI), which clearly has a vested interest in more pet ownership.

There are some blind, longitudinal studies performed in Australia that gathered medical information from surveys and from health-service usage records supplied by medical insurance companies. The results showed that pet owners reported more depression, poorer physical health, and higher rates of pain-medication usage. There was also a higher level of psychoticism associated with pet ownership.

For years, doctors have been telling me that my sinus problems are due to limited exposure to pets, although my family had cats throughout my childhood. About half the research says that exposure to pet dander and fur helps with allergies, and half says it exacerbates them. With such diametrically opposed results, how does anyone know what is real research and what is fake data?

A relative of mine returned from a company sponsored seminar on diversity in the workplace. She reported,

"They presented results that concluded companies with a diverse workforce make more money."

I tend to believe that diverse companies give opportunities to the most qualified people, leading to a more innovative workforce. It got me thinking, though. What if the data showed that the most profitable companies were staffed by a specific ethnic group? Would they publish that result?

Back in 1994, psychologist Richard J. Herrnstein and political scientist Charles Murray published The Bell Curve, in which they tried to explain the differences in intelligence in American society. The authors claimed that inherited and environmental factors were more important than socioeconomic status when predicting success. They drew their findings from a study conducted by the Department of Labor called the National Longitudinal Survey of Youth, as well as from the Armed Services Vocational Aptitude Battery (ASVAB), which comprises ten tests taken by military applicants.

The book was controversial because the authors stated that a possible reason for low test scores among certain minority groups might be genetics. The authors did not say that minority groups who performed poorly on standardized tests were genetically inferior. They said that as scientists they were obliged to study every hypothesis in order to systematically rule out those that were incorrect. The book was widely criticized in the media, and it's likely that most of us wouldn't even know about it today, let alone have read it, without all the negative publicity it received. Some of the criticism at the time claimed that the authors cherry-picked the data that supported their hypothesis. Even so, it was published. I doubt that today such a book would ever make it into print.

In March of 2017, Charles Murray was invited to speak at Middlebury College in Vermont on his latest book, Coming Apart, and how it related to the 2016 presidential election. He was shouted down by protesters. One student described the protest as,

"Democracy in action."

Unfortunately, he was wrong. Censorship due to disagreement is not democracy. People today are fond of saying that they "follow the science," but when societal pressure cancels an idea, we are no longer subscribing to the rigors of science. Instead, we're following the mob.

I was once discussing a discrepancy between an analytic solution and the data with a professor I worked with at an engineering firm. I was certain that my analysis was correct. The professor, a brilliant educator who taught a graduate-level class in fluid mechanics, said,

"The data is always right."

Given the set of conditions under which the data was taken, that assertion is most definitely true. If we reject a hypothesis based on how it makes us feel, we can't possibly draw any meaningful conclusions, and as such we won't solve any real problems.

Editor's Note: Originally posted on March 23, 2017.
