What you see is all there is

Once you know what it is, everything changes.

In his book Thinking, Fast and Slow, Daniel Kahneman illustrates an idea we are all too familiar with – but one we ignore far too often.

The first part of Kahneman’s work focuses on the wide array of intuitive biases we possess as a side effect of how our brains work. These include biases such as the Halo Effect, the Priming Effect, and the Law of Small Numbers.

The one I want to focus on in this article is as imperceptible as any named effect. It causes us to jump to conclusions. It limits our knowledge while giving us a false sense of security in what we think we know. It’s responsible for overconfidence, ignorance, and poor decision-making.

Today’s topic is the fallacy of what you see is all there is (WYSIATI).

Specifically, I want to cover how this mind trick limits our options in the new digital economy. Why does it matter that we make decisions based on incomplete information? What would we (people looking to do something new or interesting or more financially rewarding) do differently if this bias didn’t exist?

How can we learn to see beyond what’s in front of us, so that we make the best overall decision, and not just choose from the best options currently available?

Near the end of this article, I will introduce you to an experiment I am tinkering with at the moment. An experiment that would take on the “What You See Is All There Is” fallacy head-on. The time and financial commitment would be rather steep, which is why writing about it in a public way is only step one in the process of deciding whether or not to pursue this route.

By the end of this article, you will have a much clearer idea of what the WYSIATI idea is all about, how it is likely limiting your life, and what you can do today to begin breaking out of it.

What you see is all there is – explained

To begin, let’s review what our minds do on a daily basis.

Our minds are “associative machines.” We are constantly looking at the world and deciding how previous information and connections can help us understand and navigate what we see.

To accomplish this, our brain divides the work between two “systems.” System 1 could be called our intuition. It works “automatically and quickly” with “little or no effort.”

This system is useful because most of our lives are filled with routines. We wake up, go to work, eat, talk, drive, watch television, and sleep. Practicing these habits carves pathways into the connections of our brains, which makes them easy to repeat.

The problem is that because these routes are so easy to repeat, our brains sometimes choose them without considering all of the critical information.

This is where System 2 comes into play. System 2 “requires attention.” It is responsible for “choice,” “concentration,” and “complex computations.” This analytical tool helps us apply self-control when needed. It is a rational machine that enables us to learn, perform, and grow.

But System 2 comes at a cost: it requires effort. Most of the time, our brains are lazy. Or more appropriately, we are.

We neglect to pay attention to the present, or the question being asked, or the thousands of bits of information continually charging at us in our 21st-century world. And so we default to System 1 more than we should.

We think fast when we should be thinking slow.

What You See Is All There Is is what happens when we default to System 1 in moments that call for System 2. Kahneman explains the effect in the following excerpt.

“An essential design feature of the associative machine is that it represents only activated ideas. Information that is not retrieved (even unconsciously) from memory might as well not exist. System 1 excels at constructing the best possible story that incorporates ideas currently activated, but it does not (cannot) allow for information it does not have.”

— Thinking, Fast and Slow, p. 81

Kahneman shows, time and time again, throughout his book that we do not make decisions as rationally as we would like to believe. We let unconscious effects influence our decisions unknowingly.

WYSIATI often leads us to believe we have more information than we actually do, when in reality our brains “fill in the gaps” with memories, feelings, and unconscious cues. We fall for “one-sided evidence” because our brains have a bias towards belief – or, more precisely, a bias to believe what they already believe.

When it comes to the difficulty of making well-informed decisions, Kahneman captures the WYSIATI fallacy in the following sentence: “It is the consistency of the information that matters for a good story, not its completeness.”

The key word is consistency.

When making decisions, we look for patterns as opposed to exceptions. We try to make the information we have say what we want it to, rather than objectively evaluating what we really see.

In religious studies, we call this error eisegesis vs. exegesis.

There's a comic on the subject whose caption reads, "Don't bother me. I'm looking for a verse of Scripture to back up one of my preconceived notions." It's a religious example, but the same thing is practiced in every field.

Exegesis is the practice of studying a religious text using objective techniques to construct meaning. The goal is to be as objective as possible, which usually requires one to state their biases explicitly so that readers can weigh them when reading the assessment.

Eisegesis is the religious-studies version of proof-texting – the practice of taking quotes and other written work out of context so that you can shape your desired meaning. A skilled proof-texter could make the Pope look like a terrorist sympathizer. It's a dangerous tool in the wrong hands, which is why CONTEXT is one of the values I try to live by.

Bringing it back… how does this all tie into the idea of WYSIATI?

The WYSIATI fallacy is essentially a form of mental proof-texting. Because the brain looks for consistency over completeness, it prioritizes information that supports the belief it already has and neglects (or gives less weight to) information that challenges that belief.

How WYSIATI impacts your career

Most people replicate the careers they’re familiar with.

Children of chefs become chefs themselves. Students grow up with teachers, and a few positive interactions at key points in their lives will turn a number of them towards that same profession. People watch television shows about lawyers and doctors, which reinforces their prestige and cements those careers as viable options in their brains.

These are the jobs they see, so these are the jobs they can realistically choose from.

Depending on their socio-economic and geographic context, people are much less familiar with roles such as technical writers, social media managers, and astrophysicists. Therefore, because of WYSIATI, these careers likely never become viable options – even for people who might have been extremely gifted in these arenas – because their brains were simply unable to consider them.

Now, I understand that most of the evidence for this is anecdotal. But take a minute to think back… Have you ever found yourself stunned by a stranger’s career and thought to yourself, “I never knew that job even existed!”?

Then, maybe, you found yourself questioning whether it might have been a good fit for you – and went on to review how you ended up in your current job in the first place.

It's not your fault this option never came up. We're incapable of making decisions based upon information we don't have, and because of WYSIATI, we are usually satisfied with much less information than we should be.

So how does this apply to our new digital economy?

Right now, the internet appears to be filled with two extremes when it comes to the new age of alternative careers. On the one hand, you have the super-earners making seven figures in the realms of blogging, YouTube, and related social media endeavors. On the other, you have intelligent, capable people in these same positions making little to no money whatsoever.

Why the divide?

First, our brains were built to notice extremes. We are pattern-recognition machines, so when something or someone breaks a pattern we are familiar with, like an 18-year-old college dropout earning $50,000+/month on social media, we notice.

Because it's an extreme, we're able to remember this example easily. Then, due to the Law of Small Numbers, because we're able to remember that case so easily – we believe it's more prevalent than it is.

This bias works both ways. If we hear a story about a person who tried to establish a career as a freelance writer and ended up broke, burnt out, and homeless, we're going to assume that option is more dangerous than it actually is.

Second, because the digital economy is still relatively new (<50 years old), we lack a large enough sample of established routes to making these careers viable options.

We all know what it takes to become a doctor: get an undergraduate degree in biology, attend medical school, complete a residency, and pass your licensure exams.

But what about becoming a beauty vlogger? Or a food blogger?

We can find various examples, but nearly all of them took a different journey to get to where they are today. Some have degrees; some don't. Some use Facebook exclusively, while others spread their brand across every available platform.

Remember, when our brains make decisions, the consistency of the information weighs heavier than its completeness.

Therefore, because we lack a large enough set of consistent examples for how to make alternative careers work in the new digital economy, we don’t consider these roles to be viable options.

My argument is simple: they should be.

Twenty years from now, I believe it will be as logical to become a full-time blogger as it is to become a lawyer today, or it was to become a secretary 50 years ago. The internet has made countless more options available, but until those options meet certain criteria, we’ll never seriously consider them to be real options for us.

The experiment

I think one of the first criteria is that our expectations must shift.

If someone offered you an $800 course on tax law and told you that within 30 days, you would be making $1,000,000+/year as a tax lawyer – you’d laugh in their face… or report them for false advertising.

There are rules around becoming a lawyer, tests you have to take, and standards you have to follow. Becoming a lawyer is a real career decision that requires a substantial investment and offers a substantial upside (high income, prestige, job security).

So then, why do we approach digital careers with such immature expectations? I know I have.

My B.A. and M.A. degrees cost me well over $100,000 in total and netted me jobs that paid in the $40k–$60k salary range. So why do I believe that a $500 course is going to double my income in the next six weeks? And why, when it doesn’t, do I feel as though I’ve been scammed, or that the industry doesn’t work?

Insane results do happen for people, but they are the exceptions, not the rule. Many people take online courses in hopes of dramatically improving or changing their careers. Most of those people never do either.

Again, it's not (entirely) their fault. Their expectations were wrong, and so their outcomes were nearly doomed from the start. But what if we took a different approach?

What if we treated digital economy careers like other job fields?

What are you willing to invest?

I’ve thought a lot about how hard I’ve worked to get to where I am today: self-employed with passive income, a healthy financial cushion in the bank, with the opportunity to experiment.

Even if you don’t exactly love what you’re doing, chances are you’ve worked incredibly hard and invested lots of time and money to get there.

Recently, I began researching going back to school to get an additional master's degree. On the one hand, I could pursue an MFA. The degree would:

  • Help me improve my writing
  • Give me the opportunity for more college-level teaching
  • Build my writing-related network

Option two would be an MBA. This route is attractive because:

  • It would open up opportunities to consult
  • I could develop a business-niche writing brand
  • Attending a prestigious program would lead to financially rewarding prospects

Both have their pros and cons, and as I was working through my research, I started to think about how the WYSIATI bias might be unintentionally guiding my conclusions.

What if there were a third option? Something outside of the traditional route that could provide me with not only better opportunities but also a better lifestyle than the standard paths?

This was when the experiment began to take shape in my mind.

What if I pursued a personal master’s degree in blogging?

This idea owes its roots to the personal MBA experiment Tim Ferriss conducted a decade ago. Instead of attending a standard program, he took the same amount of time (approximately two years) and money (roughly $100,000) it would have cost him and shaped a syllabus around what he wanted to learn instead.

I found this fascinating because Tim was willing to invest the same resources, just in a different way, and for a potentially much better outcome.

At the end of the traditional route, he would have had a piece of paper and some great connections. By choosing the alternative path instead, he made much more money than he invested, built incredible relationships, and crafted a great story – all of which are worth more in his market than a piece of paper.

A detailed look

Here’s what an experiment like this could look like:

According to the National Center for Education Statistics (part of the research arm of the U.S. Department of Education), the average salary of a person with a master’s degree in the U.S. is $65,000. (For this experiment, let’s assume that other factors such as age, experience, and location are all equal.)

It's a healthy living wage for most people in most places in the U.S. Your monthly gross at this income level would be right around $5,417. This would put you in the 22% tax bracket (about $14,300 per year in taxes, if we apply that rate flatly), so your take-home pay would be about $4,225 per month. Here’s how this would break down into a budget using the 50/30/20 rule:
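  • Needs (50% of take-home): about $2,113 per month
  • Wants (30% of take-home): about $1,268 per month
  • Savings and debt repayment (20% of take-home): about $845 per month

(These figures simply apply the 50/30/20 percentages to the $4,225 take-home estimate above, rounded to the nearest dollar.)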

Now that we covered the outcome we would like to reach, let's talk about the investment.

According to research by Sallie Mae, the average master's degree takes two years to complete and costs $24,812 per year, which brings the total investment required to $49,624. This money would cover items like education (trainings, courses, conferences), materials (books), software, and more. It wouldn't include most ordinary living expenses.

The question

Could the average person looking to change careers from X (some unrelated field) into blogging (or some other modern, digitally-focused role) achieve that goal while holding to the same requirements – in time and money – as a traditional route?

Or more simply:

Can I create a blog that makes at least $65,000 a year in profit within two years by investing a maximum of $49,624 to get there?

Other considerations

When I was thinking through this experiment and the objections people might have, one of the most pressing ones that came to mind was the job-security factor.*

Is building a blog more or less secure than getting a graduate degree from an accredited college?

As a former higher education administrator, I know the answer is up for debate. Graduate degrees aren't what they used to be, and our modern economy is beginning to recognize this. That doesn't mean they aren't extremely valuable in many cases. The question is less about what is right in general and more about what is right for you.

Where would you like to be ten years from now? And will a large, established online presence and audience be more or less beneficial to you and to where you want to end up than an advanced degree?

Or vice versa?

You are the rule

I wrote this article primarily for this reason: to convince you that you have more options than you might think.

You’re only as stuck as you allow yourself to be.

I don't know how viable this experiment really is, and I don't know if I'm ready to take on a challenge that will require all of my resources and attention for two full years. But getting this idea out into the world is step one, and I'm proud of that.

I believe, strongly, that we are capable of so much more than we think. But to break those barriers, we must push past the limitations our minds and environments put on us. The vast majority of those limitations don't even exist.

When we take control of our expectations and intelligently apply our resources towards what we truly want, that is when things that once seemed impossible begin to appear inevitable. As Dr. Athmane Tadjine, a physicist and Medium contributor, writes,

“Yes we are all unique, but none of us is an exception. Initially, we are all the rule. Then two groups emerge: the ones who know they are the rule, and work hard to become an exception (and eventually succeed), and those who think they are the exception, and simply stay the rule forever.”

— Dr. Athmane Tadjine

As soon as something becomes visible, it becomes possible. If the What You See Is All There Is bias has kept you small, start seeing more.

You may be only one experiment away from extraordinary.

Notes: