# The Moral Obligation of Teaching

Anyone who teaches in the painting industry assumes a moral responsibility: to engage with the best available field evidence before giving advice that influences someone's livelihood. This article explains why that obligation exists, how the industry has failed to meet it, and why conventional practice is no longer acceptable, given decades of research that have established a clear baseline.

When someone positions themselves as a teacher and an industry leader, and people make career decisions and business investments based on their guidance, they assume a moral obligation. Not a legal one. Not a contractual one. A fundamental ethical responsibility that comes with claiming authority: **to engage with the best available data in your field.**

This isn't about being right. It's about being honest about what you know, what you don't, and whether you've done the work to find out.

#### The Professional Standard Across Domains

In medicine, a doctor who treats patients without reviewing current research is negligent. In engineering, designing structures while ignoring testing data is malpractice. In finance, managing portfolios without analyzing market data is fraudulent.

In every field claiming professional status, there's a baseline ethical requirement: **engage with proven field data, or acknowledge that you haven't.**

The painting industry has no such standard.

That void is the reason Jack Pauhl was established. For 100 years, the industry operated purely on anecdotal experience—every painter had a different "opinion" or "best practice," manufacturers couldn't agree on application methods, and six different Sherwin-Williams stores gave six different answers to the same question.

The confusion reaches every level—job sites, retail, manufacturing, and so-called experts.

The 200,000 pages of comprehensive field research, developed over 40 years, are the direct result of recognizing that failure and compiling the data necessary to move the industry forward.

This is arguably the longest continuous study of industry knowledge transmission ever conducted—35 years of engagement with painter groups, forums, and platforms, documenting every question asked and every answer given, then verifying accuracy against actual field performance. Most researchers study industries from the outside. This research was conducted from within, as both practitioner and observer, which makes it possible to distinguish what actually works from what the industry teaches.

The goal isn't to convince anyone of anything—it's to document what is demonstrably incomplete, harmful, false, or misleading so it can be removed from practice, not to add more opinions.

Why harvest industry data at this scale? Because if you really want to know the best course of action with a high degree of certainty, that's what it takes. Otherwise, it's just speculation under the guise of business advice.

The absence of that standard has produced measurable harm.

#### What Happens When Leaders Don't Keep Up

When 200,000 pages of documentation exist—**research that didn't exist before**, covering every imaginable aspect of running a painting business across four decades—and every "leader" in the industry maintains complete silence about it, that's not an accident. **It's a choice.**

It's the equivalent of writing a book review without having read the book. Worse, many of these leaders don't even know the book exists while continuing to teach from their old, outdated practices and 'personal' experiences.

That choice has consequences.

#### Painters Invest Based on Incomplete Information

When a painter chooses a business model, purchases equipment, selects products, or develops methods based on what you're teaching—and you haven't verified your guidance against the best available data—you're treating their livelihood carelessly.

Business owners run high-overhead volume operations because that's what industry leaders model. They purchase products and consulting because that's what industry leaders promote. They follow strategies because that's what industry leaders teach.

If those leaders haven't examined the data showing that more efficient approaches exist, they're not just uninformed; they're actively misleading. They're causing measurable harm through **willful ignorance**.

#### Inferior Practices Perpetuate Across the Industry

By continuing to teach conventional methods without checking whether research has identified better approaches, leaders are actively perpetuating costly and inefficient outcomes across the industry.

The B Test, showing that 32 commercial primers failed to outperform self-priming paint, isn't just interesting data. It represents potential six-figure savings in product and labor costs. It's information that affects purchasing decisions, project costs, and job outcomes. When industry leaders never reference it, never test against it, and never acknowledge it, painters keep buying products that don't deliver the claimed advantage.

What's equally damaging is what they don't disclose. When teaching a business model, recommending a product, or promoting a method, professional responsibility requires acknowledging known problems, limitations, and failures. These leaders rarely do. It's unclear whether this omission is intentional—avoiding information that might undermine their authority or their sponsors—or whether they're unaware because their oversimplified view of the industry never forced them to investigate deeply enough to recognize these problems are the result of their own advice. Either way, the result is the same: painters implement strategies without understanding the risks, then struggle with predictable problems that were never mentioned.

That's systematic misdirection through omission.

#### The Economic Incentive Problem

Many industry leaders can't afford to examine the research because their business model **depends on keeping current practices intact.**

If you're partnered with manufacturers, running a volume operation, teaching methods that justify high overhead, or building a platform on being relatable rather than thorough, then confronting data that contradicts any of that threatens income.

I've turned down multiple product-endorsement offers worth $10,000+ because the products didn't meet field requirements. That's not virtue—that's the minimum standard when you've positioned yourself as a source of reliable information. If you can't say no to income that contradicts your research, you're not a teacher with a business model—you're a salesperson with an audience.

Someone without decades of field research might not recognize that these products lack what it takes to be useful in the field—they might genuinely believe the manufacturer's claims and accept the offer in good faith. That's precisely the problem: these companies get their product feedback from people who don't have the knowledge to know any better. One manufacturer, after I declined their offer, responded, "We appreciate your integrity." Even they recognized what professional responsibility looks like.

But here's the ethical reality: **Your business model doesn't override your moral obligation to the people trusting your guidance.** If staying profitable means avoiding information that might improve your followers' outcomes, you're not in education—you're in exploitation. You've prioritized your profit model over your moral contract.

#### What This Means for Followers

If you're a business owner learning from someone positioned as an industry leader, ask some basic questions. Do they cite research or just personal experience? Do they reference comprehensive field data or avoid it? Do they test their methods against alternatives or teach what they've always done? Do they acknowledge limitations in their knowledge or present everything with certainty?

People trust these leaders because they believe they've done the work to stay informed. If they haven't, that trust is being exploited.

**Real expertise welcomes examination.** Performative expertise avoids it.

If someone has been teaching for years without ever referencing the most comprehensive field research in the industry, that tells you something about whether they're truth-seeking or gatekeeping.

Many of these leaders frequently talk about "growth mindset"—yet if they had one, they'd engage with the research that challenges their assumptions. That's not a growth mindset; that's "do as I say, not as I do." A growth mindset would compel you to examine the evidence, especially when it contradicts what you've been teaching. Their avoidance reveals a fixed mindset disguised as buzzwords.

#### The Standard That Should Exist

In any field claiming professional status, there should be a baseline expectation: if you're teaching, you're responsible for engaging with the best available data in your domain.

Not agreeing with all of it. Not changing everything based on it. But at minimum: acknowledging it exists, evaluating it honestly, and either integrating it or explaining why you're not.

The painting industry doesn't meet that standard. Which means it doesn't meet professional standards at all—**it pretends to be professional without practicing it.**

#### The Blind Leading the Blind

This isn't just individual incompetence—it's industry-wide blindness. When the "leaders" don't know what they don't know, and the people following them don't know to ask for evidence, you get an entire industry operating on confidence and consensus instead of verified knowledge.

The blind leading the blind means painters trust "leaders" who've never done thorough research. Manufacturers trust "influencers" who can't distinguish between anecdote and evidence. Industry associations set standards based on what everyone does, not what actually works best. Retail stores parrot manufacturers' claims without field validation. Everyone's teaching; nobody's researching beyond their "personal" experiences or immediate circles.

<figure><img src="https://474306782-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F3YVknxQjTY2AXSlwtWgR%2Fuploads%2F7s5qz7Q6sUgH23J6Gqfk%2Fimage.png?alt=media&#x26;token=cfa49cf8-74fa-4d7c-a977-c82105cf184c" alt=""><figcaption></figcaption></figure>

A significant portion of the 35-year data set captures this dysfunction directly: tracking questions, verifying answers, and cataloging misinformation. A painter posts a legitimate technical question—305 responses. 34 correct. 271 wrong. The person leaves more confused than when they arrived. Sometimes, there are 500 comments with no correct answer. This pattern repeats across every topic, every platform, every year, decade after decade. The field data isn't just about what works—it's comprehensive documentation of what the industry teaches with confidence, even when it's wrong.

What these leaders fail to understand is that their advice and "best practices" can be traced directly back to the same painter forums where business owners discuss problems they are having by following online advice. The cycle is self-reinforcing: forum discussions produce consensus, leaders observe and amplify that consensus as "best practices," new painters follow that advice and encounter problems, they ask about those problems in forums, and the cycle continues. By the time painters are struggling, most can't even remember who gave the advice or where they found it. The industry isn't building knowledge—it's circulating bad advice.

The system perpetuates because no feedback loop exposes the blindness. People following bad advice don't know it's bad—they assume their outcome is normal because everyone around them achieves the same. When everyone operates at roughly the same level of competence—good enough to function, not thorough enough to know what they're missing—they all confirm each other's authority. They mistake years of repetition for expertise, volume for validation, and social proof for evidence.

If you're teaching and you haven't engaged with the most comprehensive research on painting business practices, you have no way to assess whether your teaching is accurate. You're comparing your knowledge against what? Other people who also haven't done research? Your own limited experience? Manufacturer claims? You have no baseline. You're operating blind.

This is why structured field research is so threatening. It's not just about being informed—it's about proving that an entire industry has been operating in the dark, and that most of its "leaders" still can't see it. That's what happens when an industry has no epistemic standards and confuses consensus for knowledge.

#### Closing

When you teach without examining the data, you're not leading—**you're guessing while pretending to know.** And the people who trust you are paying the cost of that pretense with their businesses, their efficiency, and their outcomes. That's not a difference of opinion. **That's an ethical failure.** And decent human beings don't engage in that kind of exploitation.

Until the industry recognizes it, "industry leader" will continue to mean "person with a mic" rather than "person with responsibility and accountability."

The 200,000 pages of field research fill an entire pallet. When you've spent decades documenting, testing, and verifying every claim the industry makes—**are they really going to tell me I need to use a drywall primer? Or that we can apply latex over oil? Or that I need more leads and marketing? Or reference the PCA's tiny sample of industry data as if it constitutes a clear industry benchmark?**

That's what decades of thorough investigation produce—**a baseline that makes conventional practice obsolete**. Unless they have their own pallet of documented research, what do they have to offer besides repetition of what's already been proven problematic, costly, or insufficient?

If you're genuinely committed to professionalizing the industry, **this is where it starts**.
