PSYC 420 LU Psychology Epistemology and the Christian Worldview Essay

Description

Epistemology Assignment:

Define epistemology and describe the three models of how certain we can be that our perceptions mirror reality. Which position do you hold, and why? What are the different methods of knowing (see Entwistle, chapter 5)? What are the limitations of these (or any) methods of knowing? Which methods of knowing are appropriate for Christians, and why?

Unformatted Attachment Preview

5 The Pursuit of Truth: Epistemology—Ways of Knowing For I do not seek to understand that I may believe, but I believe in order to understand. For this I believe—that unless I believe, I should not understand. —Saint Anselm If a man will begin with certainties, he shall end in doubts; but if he will be content to begin with doubts, he shall end in certainties. —Francis Bacon Objective evidence and certitude are doubtless very fine ideals to play with, but where on this moonlit and dream-visited planet are they found? —William James I was giving an abnormal psychology lecture on psychosis. As I often do, I used an appropriately disguised clinical vignette to illustrate the material. I recounted the day that I had walked into the patient lounge of the psychiatric hospital, the blue haze and pungent aroma of cigarette smoke hanging heavily in the air. Jimmy sat on the couch, playing some incredibly good licks on his acoustic guitar. “Hey, Jimmy,” I said as I sat down, “How long you been playin’?” Jimmy suddenly became wide-eyed and deeply animated. “You ain’t gonna believe this, man! I was at a Cat Stevens concert once, and I was out there singin’, an’ Cat looked down at me—and I ain’t never met the guy—and he says, ‘Hey, you! Yeah, you! Come up here an’ play guitar for me!’ So I hopped up on stage and started playing guitar. I ain’t never played before! I was jammin’ with Cat, and I been playin’ ever since!” After telling my students this story (and explaining that Cat Stevens was a rock and roll star whom most of them had never heard of), I asked, “Now, how many of you really believe that Jimmy played guitar with Cat Stevens?” Rebecca’s hand shot up in the air. “I do,” she said. “You can do all kinds of things when you’re possessed by a demon!” I had planned to talk about delusional systems, but the discussion ended up taking a brief detour on the way to understanding psychosis. Jimmy was convinced that he had learned to play guitar on stage with Cat Stevens. I had read Jimmy’s chart; I was convinced that he suffered from delusions caused by a psychotic break and a long history of hallucinogenic drug abuse. Rebecca was convinced that Jimmy was demon possessed. Clearly, we could not all have been right, but we were each convinced that we were. While Jimmy, Rebecca, and I came to starkly different conclusions, our worldviews were actually similar. We diverged not so much on our understanding of the basic nature of the world as we did in our reality testing abilities and the way we went about evaluating knowledge claims. Unfortunately, drugs and disease had greatly limited Jimmy’s ability to evaluate his own claims—which is why his belief system is labeled “delusional.” Rebecca and I differed regarding assumptions that we made about how to evaluate and classify extraordinary claims. Rebecca’s lack of experience with the mentally ill, coupled with the beliefs espoused by her church, led her to discount my explanation and to offer the alternate conclusion that Jimmy was demon possessed. My own worldview, shaped by my denominational background, my education, and my experience with the mentally ill, added to the information contained on Jimmy’s chart, led me to conclude that Jimmy was delusional and Rebecca was wrong. Despite our conflicting points of view, Rebecca and I shared a common Christian faith, and all three of us possessed worldviews shaped to a large degree by our common culture. 
Since our worldviews were similar, we must turn in a different direction to explain why our conclusions differed so starkly. It is at this point that we must narrow our focus from broad questions about worldviews, to more specific questions about how we can evaluate knowledge claims.

• The Enlightenment—a period of Western thought and culture, stretching roughly from the mid-seventeenth century through the eighteenth century. Rationalism and empiricism became its major epistemic frameworks, with doubt cast upon reliance on authority and church dogma as sources of truth. It was accompanied by revolutions in science, philosophy, culture, and politics.
• Positivism—a philosophical system that emerged during the Enlightenment which rejects metaphysics and seeks rational or empirical justification of truth claims.
• Empiricism—a method for seeking knowledge based on sense experience, which gave rise to the scientific method. It is a result of positivism.
• Modernism—a philosophical movement in Western society spanning the late nineteenth and early twentieth centuries which tended to see knowledge as based on human objectivity and which optimistically expected that technological advances would lead to social progress. Modernism was often accompanied by a shift away from religious belief.
• Post-modernism—although difficult to define, post-modernism is a late-twentieth-century Western movement that is generally suspicious of the "grand narratives" of modernism, claims of objectivity, and claims of inevitable progress.

Situating Ourselves in History

For many readers, this chapter will contain words that they may be unfamiliar with, words such as epistemology, Enlightenment, positivism, empiricism, modernism, and postmodernism. All of these words relate in some way to the possibilities and difficulties of obtaining knowledge, so it will be instructive to take a few moments to look at our place in the history of ideas. Epistemology is the branch of philosophy that deals with the grounds and nature of knowledge. It would be nice if we could proceed straightaway to cataloging a handful of epistemic methods and—voila!—we could use them to obtain knowledge and be sure of our findings, but the task ahead is a bit more complicated than that.

The Enlightenment was a period in Europe during which the authority of the church had come into question. People began seeking other sources of authority. Gradually, the belief emerged that we could use precise epistemic methods to vouchsafe our certainty. Chief among these epistemic methods were logical reasoning and empiricism. It was thought that we could essentially become our own authorities, knowing with absolute certainty what was right and what was wrong. This became the hallmark of modernism. The belief in the human capacity to function as an independent authority gave rise to another aspect of modernism: the myth of progress. People began to believe that we could know things with god-like certainty, and that we could solve all human problems. We could solve the problems of starvation and disease. An endless movement toward progress would do away with war and conflict. Modernism was an incredible expression of pride and self-aggrandizement. True, modernism did have many beneficial results, not the least of which was the development of the scientific method and the many things that it provided. But eventually, the naïve belief in inevitable and eternal progress collapsed. World War I and World War II dealt stunning blows to the belief in progress. 
It also devastated the belief that human nature is essentially good. The deaths of millions of people in combat, concentration camps, and bombing raids stood as a monument to how “progress” could lead to “better” weapons but not necessarily better people. As modernism collapsed, it did so incompletely and unevenly. Existentialism emerged (or reemerged) as a challenge to objectivism. Postmodernism evolved as a rejection of modernity’s epistemic structures. As we will see, the pendulum swung from the modernist belief in authority centered on self to postmodern doubt of all authority. Some postmodernists believe that these vestiges of modernism ought to be dispensed with altogether. They argue that people use knowledge claims to wield power, and that the epistemic methods of modernism have become tools to maintain the status quo. By way of example, it is easy to document “scientific” claims and rational arguments that have been used to support slavery, racism, sexism, and other social inequalities. Postmodernism has offered a stunning critique of modernism, especially its assumption that we can be completely objective and that we are marching inevitably toward progress. Postmodernism underscores the degree to which knowledge claims can be abused, as well as the many obstacles that militate against epistemic certainty. Despite the collapse of the modern myth of progress, the epistemic methods of modernism continue to be very influential, and are still the mainstay of most quantitative social- science research. However, faith in modernist epistemic methods as mechanisms to uncover truth with complete objective certainty is in shambles. Elizabeth Lewis Hall provides a good description of how one can respect many of the valid criticisms offered by postmodernism without completely abandoning the useful methodologies that came from modernism. “The vision presented here begins with … [the assumption of] ontological realism, but with epistemological modesty. In other words, this approach will appeal to those who acknowledge the existence of objective truth, but recognize some limitations in our ability to apprehend it. In this way, it differs both from a traditional modernist view, and from a radical postmodern approach which would question the existence of objective truth.” We can no longer sustain the overconfidence promised by modernism, but its methods are still useful. A more humble claim seems warranted, namely, that some of the epistemic methods of modernism may be useful, imperfect tools that can aid in our search for understanding. Tentative Certainty One of the assumptions that we make about the world involves how well our perception of the world mirrors reality. At one extreme are naive realists, who believe that there is a direct one-to-one correspondence between perception and reality. At the opposite extreme are radical antirealists, who believe that there is no necessary correspondence between perception and reality; the biases of the individual and community, along with the assumptions of our theories, determine what we see, since “all observation is theory-laden.”Critical realists take a middle ground, believing that while assumptions and biases color perception, reality imposes some limitations on interpretation. Critical realists recognize that assumptions and biases affect data interpretation, but they also believe that assumptions and biases can be evaluated (at least to some degree), and that interpretations can be judged by their fitness with the data. 
From a critical realist perspective, what we see depends, to some degree, on what we expect and are predisposed to see. As we noted in the last chapter, the window through which we view the world both frames and obscures what we can see. “Our thinking starts from somewhere, and not from absolute objectivity.” Our thinking is inevitably shaped by cultural and historical contexts; “all human understanding, including the understanding that engenders the specific social sciences, is based on historically situated and tradition-saturated beginning points.” Our ability to know is both dependent upon, and limited by, the assumptions of our worldviews. This being the case, we would be well advised to discern how to evaluate whether or not our conclusions are accurate. The first thing we might note is that any of us can have errant thinking—whether due to the effects of drugs, delusions, dementia, or deceptive reasoning. Most people would agree that Jimmy’s reality-testing abilities were severely impaired by his inability to separate reality from his own delusions. Attempts to convince him that his perceptions were hallucinatory were futile. It rarely makes sense to argue with people who are psychotic or drugged about the nature of reality, because their perceptions seem accurate to them. While medications can sometimes help individuals who are cognitively impaired, regrettably, there are many cases in which the future simply becomes increasingly bleak as a person’s cognitive faculties fade. Fortunately, these conditions are the exception rather than the rule, but they must be taken into account. To a certain degree, we are left with the conclusion that unusual claims made by a small percentage of the population are suspect unless sufficient support for those claims exists. Unfortunately, while there is safety in numbers, the crowd can be wrong; things that are commonly believed to be true can be false. Human cognition is amazing, complex, and usually dependable. While our mental processes typically allow us to navigate our world successfully, they are imperfect. Few of us suffer from severe delusions, but we all experience limitations that affect our reasoning abilities. Human memory is imperfect. Reasoning can be flawed. Our experiences prime us to think about the world in ways that influence our conclusions. The frailty of our thinking imposes limitations on our quest for certainty. The assumptions and beliefs that are widely held in our cultures and subcultures can also impede our attempts to gain knowledge. Rebecca, for instance, was primed by the teachings of her religious group to assume that unusual abilities could be imparted by demons. All of us are affected by assumptions that are entrenched within our cultures. Consider, for example, that prejudice is typically learned from culturally embedded attitudes. In relatively recent times, in the Deep South of the United States, white supremacists concluded that white people were superior to black people, often using a perversion of Christianity to support their conclusions, their message, and their actions. Internet sites still spew their message of hate, often framed in “Christian” language. Black churches were firebombed and burned by churchgoing whites. African Americans were lynched and shot by white-hooded mobs and beaten by police in the streets. Even the white churches largely failed to stand up for justice. Martin Luther King Jr. 
once remarked, “Well, the most pervasive mistake I have made was in believing that because our cause was just, we could be sure that the white ministers of the South, once their Christian consciences were challenged, would rise to our aid … As our movement unfolded, and direct appeals were made to white ministers, most folded their hands—and some even took stands against us.” The noetic effects of sin—distortions of human reasoning caused by individual or corporate sin. Classic treatments of this phenomenon assume that sin affects some things less (such as mathematics or chemistry) and others more (such as ethics and theology). The sins and assumptions of a culture surely affect the sins and the thinking of its resident members. Phillip Yancey, in a remarkably transparent confession of the racism that he was raised with and once practiced, provided this sobering reflection: “Only one thing haunts me more than the sins of my past: What sins am I blind to today?” The sins and assumptions that pervade our culture are blindspots in our moral and epistemic frameworks. Sadly, we must conclude that culture can adversely affect our thinking. We must also acknowledge that our own human sinfulness affects human thinking. Lack of virtue may be expressed in intentional fabrication. Personal sin can warp our thinking about our own culpability or the effects of our behavior on others. Rationalization, and a host of defense mechanisms, can distort our perception of truth. Theologians use the phrase the noetic effects of sin to describe the ways that sin distorts human thinking. Emil Brunner, for instance, noted that sin affects thinking to greater or lesser degrees. For instance, mathematics and science are less affected by sinful human thinking than are ethics, theology, and the humanities. Sin has a tremendous ability to skew our thinking about our relationship to God and about how we ought to live. Because the social sciences involve our personal and social lives, Brunner believed that we must explicitly consider how the noetic effects of sin can warp our thinking in these areas. This is especially important when we consider the goals toward which we strive. Should we, for instance, encourage people to seek their own happiness above all else? Alternatively, should we encourage them to seek to be faithful to God’s call and commands? The purpose of human life is not morally neutral, and personal sin can skew the goals toward which we strive. It would be bad enough if we only had to concern ourselves with the detrimental effects of personal sin, but the effects of sins that are culturally imbedded also create epistemic problems. The noetic effects of sin are expressed in corporate sins, such as racism, sexism, failure to minister to the plight of the orphan and the widow, and moral laxity. Contemporary American culture may promote materialism, narcissism, and self-aggrandizement to a degree to which we are largely unaware, but which nonetheless affects us in profound ways. Christians in the social sciences need to think carefully about how sin affects our thinking about human individuals and social systems. The detrimental effects of drugs, disease, cultural bias, and personal sin can distort human thinking. Human thinking is also limited because it is the product of imperfect, finite creatures. We never have access to all the data, and even if we did, we could never comprehend it in its entirety. 
While our thinking is a remarkable and powerful phenomenon, we must remember that from a Christian perspective, we are frail, fallen, and finite creatures. How can we know anything? Since all of us begin with worldview beliefs that shape the way we respond to the world, are raised in cultures that mold our thinking through education and everyday experience, and suffer the effects of minds that are finite, frail, and fallen, we must conclude that human beings can never eliminate all error from our thinking. As James Guy concluded, “To recognize that man is unable to fully know the truth is to risk some degree of skepticism. However, in the search for truth such a realization is necessary in order to avoid illusion and fantasy. Dialogue amidst this uncertainty serves as a catalyst for greater accuracy and understanding in conceptualizing truth.” We should react with humility as we grasp the significant limitations placed on our epistemic efforts by our finitude, the frailties of our minds and bodies, and the fallenness of our moral reasoning. This state of affairs should also prompt a deep appreciation that God loves us despite our failings, longs to redeem us, desires to use us to bring restoration to the brokenness of the world, and is Himself actively at work in the world and sovereign over it. As the old hymn proclaims, Oh, worship the King, all glorious above. Oh, gratefully sing his power and his love; Our shield and defender, the Ancient of Days, Pavilioned in splendor and girded with praise. Frail children of dust, and feeble as frail, In you do we trust, nor find you to fail; Your mercies, how tender, how firm to the end, Our maker, defender, redeemer, and friend! The hymn proclaims human limitations and God’s goodness, leading naturally to worship. So, too, our recognition of our limitations should lead to grateful acknowledgement of our dependence upon God and the majesty of His power and His love. At best, we can humbly try to evaluate our beliefs carefully enough to arrive at a contingent certainty; that is, if our assumptions are correct, and if we discern a coherent epistemology, and if we apply our epistemic methodologies consistently, then we can be tentatively certain about our conclusions. To hope for (or worse, to claim) more than that is to assert a god-like quality which frail, fallen, and finite creatures cannot attain. Epistemology and Its Necessary Virtues Epistemology is the branch of philosophy that considers the nature, possibilities, and limitations of knowledge. Epistemology is concerned, in part, with whether or not our knowledge claims can stand up to scrutiny in such a way that we can separate mere opinion from justified belief. If our efforts to obtain knowledge are to be fruitful, we will use methods that are appropriate to the type of questions that we ask. For instance, if we want to know if a new medication will decrease the symptoms of dementia, we will not rely exclusively on a rational argument—we will look for empirical data. Likewise, if we want to know about the nature of God, we will turn to methods other than empirical ones. Before we can discern and apply relevant epistemic methods, though, we must first possess certain cognitive qualities, such as the ability to reason well, and certain virtues, such as perseverance at the difficult tasks of doing empirical research or constructing careful arguments. Additionally, once we have obtained knowledge, we need to know how to use it for appropriate ends. 
In summary, then, the minimum criteria for reasoning accurately and for obtaining and making appropriate use of knowledge are: a) that we possess the necessary intellectual qualities, b) that we exercise the crucial virtues, and c) that we competently make use of relevant epistemic methods.

Most institutions of higher education address the requirement of intellectual aptitude by setting admission requirements, such as a minimum high school GPA and SAT scores. We may well question how effectively these requirements function as measures of intellectual aptitude, but let us assume, for the sake of argument, that our college classes are filled with cognitively competent students. If this were the case, could we then simply dispense information and instruct students in correct epistemic methods and be sure that they would be able to obtain and apply knowledge well? Clearly, the answer is no. Most professors are aware that students possess character traits that either facilitate or inhibit their learning. As one professor laments, "Our frustrations often concern not students' lack of intellectual aptitude … but their unwillingness to exercise that aptitude well." Some students have cultivated virtues that make it more likely that they will drink deeply from the academic well; others have cultivated vices that make it likely that they will drop out of school or earn a diploma without actually receiving an education.

If we want to avoid such undesirable outcomes, we need to exercise stewardship of our own minds. I often remind my students that they have been given a rare and lavish gift. This gift requires stewardship on our part if we are to make the most of it. To maximally profit from opportunities to obtain knowledge and to use information for appropriate ends, we need to be both morally virtuous and intellectually virtuous. Moral virtues help us to moderate our passions rightly and to discern what is right and just, so that we aim towards the right ends. Intellectual virtues are character traits and habits that allow us to think well and, in turn, to employ knowledge well. So far, we have argued that in order to obtain knowledge and use it for desirable ends, we first need to be a certain kind of person, one who possesses particular habits of thought. Jay Wood highlights the importance of being intellectually virtuous: Thinking about epistemology as encompassing the pursuit of intellectual virtue … is important … for the simple reason that your very character, the kind of person you are and are becoming, is at stake. Careful oversight of our intellectual lives is imperative if we are to think well, and thinking well is an indispensable ingredient to living well … If we fail to oversee our intellectual life and cultivate virtue, the likely consequence will be a maimed and stunted mind that thwarts our prospects for living a flourishing life.

According to Wood, "Seeking truth appropriately is a matter of seeking it in the right way, for the right reason, using the right methods and for the right purposes." So, what are the intellectual virtues that will allow us to "seek truth correctly"? We will consider a few of these virtues: studiositas, intellectual humility, intellectual caution, intellectual courage, intellectual integrity, and intellectual perseverance. 
Studiositas Medieval scholastics made a distinction between the vice of curiositas, a vicious intellectual appetite that aimed to privatize and possess knowledge, and studiositas, an intellectual virtue that was similar to curiosity in the sense of having a desire to know or understand the world, but framed within the context of seeing knowledge as something that was a gift to be used towards right ends. Griffiths points out that curiosity was considered to be a vice by Augustine and the early church, whereas today we tend to blindly assume that curiosity is a virtue. Augustine compares curiosity to sexual desire outside of a committed marital relationship, and contends that studiousness, in contrast, is an “appetite for participation in what is given” by God. According to Griffiths, curiosity as a vice seeks possession and ownership of knowledge, much as one buys sex from a prostitute. It is ultimately arrogant and selfish. The studious person, in contrast, seeks knowledge not just to obtain a body of information, but to understand all things as gifts to be explored in appropriate ways and to be used towards godly ends. We are stewards of this gift, not possessors of it. Selfcentered curiosity stems from and exacerbates selfishness, arrogance, and possessiveness. On the other hand, if we are studious, we seek knowledge and its connection to God as stewards, and we find our diligence expressed in gratitude, worship, and generosity. Intellectual humility Intellectual humility is concerned with the ability to judge oneself accurately and to be open to correction and to the insights of others. Humility, as all of the virtues, stands in opposition to one or more vices. In this case, humility is opposed to such things as arrogance, conceit, selfrighteousness, and selfish ambition. Robert Roberts describes humility as “the ability, without prejudice to one’s self-comfort, to admit one’s inferiority … [or] superiority” in some way. An elite athlete could humbly and accurately say, “I am a very talented basketball player,” without being false or self-aggrandizing. Likewise, for a mediocre player to admit that he isn’t all that good isn’t just false modesty, it’s the truth. Humility as an intellectual virtue involves our recognition of our intellectual abilities and liabilities. It allows us to receive correction, feedback, or affirmation appropriately, and to develop and exercise our abilities unencumbered by arrogance or self-deprecation. Without intellectual humility, our pursuit of knowledge will be seriously impaired. Imagine that you are involved in a class discussion about an assigned reading. One of your peers condescendingly rejects the author’s position and claims that it has no merit or evidence in its favor. He confidently proclaims that the author is wrong. Having read the article carefully, you question your peer, thinking that he must have missed something in the article, only to discover that he never read the assignment! Clearly, there are numerous intellectual vices at play here, but surely one of them is a lack of humility, expressed in the belief that he could substantively critique a position that he had never actually considered. While this vice is clear, many of us are guilty of a lack of intellectual humility in less obvious ways. We do not read or listen to other points of view carefully. In subtle or overtly arrogant ways, we fail to respect the cultures, experiences, and education that have shaped other people’s perspectives. We privilege our own unconsidered opinions. 
We fail to respect the accumulated wisdom of others. And the list could go on. To guard against these types of errors, we need to cultivate intellectual humility by submitting to instruction. We need to recognize the limits of our intellectual abilities and our knowledge base. We need to admit that our own biases and fallibility can skew our conclusions. These are the first steps in creating the framework of an intellectually virtuous life. Intellectual caution and intellectual courage Intellectual humility may lay the groundwork for our ability to profit from instruction and to seek knowledge, but to it we must add several other virtues. Intellectual caution and intellectual courage, our next virtues, are concerned with how we evaluate and respond to the real and imagined threats involved in intellectual pursuits. Allow me to introduce this with an example. I recently attended a conference where I anticipated that the viewpoints I would encounter would be strongly opposed to some of my own beliefs; I was not disappointed. So why did I go to this conference? I did not go to courageously fight for my beliefs. I went to allow my own beliefs to be challenged, so that I could give an honest hearing to viewpoints that I would otherwise tend to avoid. Doing so was unsettling but good. I found that I understood the beliefs of some people with whom I disagree much better for having listened to them. In fact, I found myself having to reconsider some of my own beliefs and reasoning. Intellectual courage is a virtue that allows us to face situations where knowledge may be unsettling. Clients in therapy and parishioners in confessionals sometimes need intellectual courage to come to know uncomfortable truths about themselves. Education, too, can be unsettling as we face viewpoints that challenge beliefs that we may have never questioned. On the other hand, one should not rush in where angels fear to tread: there are times when intellectual caution is advised. Intellectual caution requires that we be careful and circumspect in forming judgments. Perhaps in our excitement, we might want to brashly assert more than is justified by the data. In such cases, intellectual caution counsels restraint and discretion. While intellectual courage calls us to face or overcome fears that ought to be confronted, intellectual caution calls us to appropriately recognize dangers that ought to be avoided or carefully negotiated. Intellectual caution might appropriately cause us to avoid certain quests for knowledge. For instance, given that we have limited intellectual and financial resources, we might appropriately decide that pursuing a particular intellectual path is not worth the costs. Some quests for knowledge could be damaging, for example, if it is likely to be used in ways that are harmful. So intellectual courage will appropriately push us forward, and intellectual caution will appropriately hold us back. To these, we will now add another virtue which is necessary if we are to think well. Intellectual integrity Intellectual integrity or honesty is related to the pursuit of truth in at least two ways—the potential for self-deception, and the intent to deceive others. That we can deceive ourselves is one of the hallmarks of Freud’s theory of defense mechanisms. For instance, when we do not want to admit that something is morally wrong, we may use rationalization to justify our errant thoughts, feelings, or behaviors. 
What is particularly damning in Freud’s observation is how easily self-deceit takes place automatically and unconsciously. Our own self-deceit naturally leads us to mislead others, but we can also do so quite consciously. Why might we want to deceive others? Pride and greed are notable vices that come into play in this respect. Pride, of course, can cause us to exaggerate the merits of our own opinions. Greed has certainly been a motive for falsifying data in order to increase the profits of a person or a corporation. A related problem is that we must often rely on the credibility of others for information, and if their intellectual integrity is impaired, our faith in their credibility may be misplaced. When people “lie, exaggerate, or withhold evidence” we are left to doubt the credibility of their truth claims. Intellectual honesty is one of the most obvious intellectual virtues. We now turn our attention to one more intellectual virtue, which while perhaps not as obvious, is indispensable and cultivated only with great effort. “If one has the answers to all the questions—that is the proof that God is not with him. It means that he is a false prophet using religion for himself. The great leaders of the people of God, like Moses, have always left room for doubt. You must leave room for the Lord, not for our certainties; we must be humble.” — Pope Francis Intellectual perseverance Intellectual perseverance or discipline is another trait that facilitates the attainment of knowledge. Discipline of the mind is much like exercise—it takes time to strengthen muscles, to develop coordination, and to build up endurance. People who give in to the vice of sloth fail to extend adequate energy to attaining goals. By contrast, the disciplined person will put the necessary effort and resources towards developing habits and skills that are necessary to accomplishing worthy objectives. Academia, again, holds out examples of the difficulties of intellectual perseverance. When I collect term papers, I often ask students what time they finished writing. I suppose you can guess what happens: a few rare souls will have finished a day or two before, while most students finished in the wee hours of the morning. Procrastination is a vice that affects most of us, and to combat it, we need perseverance. But limitations of time are only one of the obstacles with which we must contend, as we will see. The quest for knowledge requires that we must be disciplined in applying ourselves to the task of learning. This means, for instance, reading a textbook when you might prefer to play tennis. It may mean needing to read an article over and over and over again to begin to understand the nuances and flow of the argument. It may mean having to struggle with uncertainty and confusion over a long period of time as you attempt to understand complex issues. There are a number of other moral and intellectual virtues that we need to cultivate and to possess if we are to think well and if we are to make good use of the knowledge that we obtain. The central argument of this section, though, is that the first step in seeking knowledge is to become the kind of people who have the characteristics and dispositions that will allow us to learn and to use knowledge wisely. Without such virtues, the quest for knowledge will be unproductive intellectually and damaging personally. 
As we turn our attention to look at several epistemic strategies, let us remember that we must cultivate intellectual virtue if our pursuits are to be worthwhile and maximally effective. Of Madness and Methodologies Imagine that your ability to test reality was grossly impaired, as Jimmy’s was. You would be what was once called mad. The word mad comes to us from Old English and Old Saxon words: the Old English gemad (“insane”) and the Old Saxon gimed (“foolish”). While most of us are not mad in the delusional sense, we are all quite familiar with being mad in the foolish sense. We make mistakes, draw incorrect inferences, and come to wrong conclusions. Several methodologies can be employed to help us be less foolish. These methodologies provide us with rules to follow so that we can evaluate knowledge claims carefully. An exhaustive discussion of epistemic strategies is beyond the scope of our present concerns, so we will limit ourselves to four that have specific relevance for the integration of psychology and Christianity: authority, logic, empiricism, and hermeneutics. By learning about these methodologies and applying them well, we can avoid being unduly simplistic or excessively foolish in our integrative efforts. Appeals to Authority The vast majority of what we know has been handed down to us from authorities: parents, teachers, doctors, scientists, theologians, translators, and mechanics, among others. Unfortunately, many falsehoods are also passed down through authority. That any method can go wrong, though, is no reason to get rid of it. Instead, we need to look at how to increase the likelihood that using the method will yield the desired result. Since the Enlightenment, appeals to authority have been treated with great suspicion. Nevertheless, authority is a powerful means of transferring knowledge from person to person. When a doctor provides a diagnosis and treatment recommendation, she is providing her knowledge (gained in part from the authority of other experts who published their own empirical work, and others who ran lab tests on the patient) to her patient. The patient could accept the diagnosis and the recommendation, or decide to consult his pastor, his plumber, his attorney, his Aunt Irene, or google. Notice, though, that these “authorities” are of vastly different quality. For an authority to be credible the knowledge base of the authority must be relevant; the plumber’s knowledge base for designing the water supply and drainage system for a new bathroom is clearly irrelevant to the medical condition of the patient, whereas the doctor’s knowledge base is more relevant. For a relevant authority to be reliable, we need at least three more conditions to be met. First, we need to be able to have a reasonable degree of assurance that the authority accurately discerned the situation (e.g., that the doctor gave the correct diagnosis). This means that the authority must occupy a privileged position from which she knows things that we do not—she went to medical school, she saw the test results, she is competent in her judgments, and so forth. Of course, the authority can be wrong— misdiagnosis can happen, especially when dealing with probabilities rather than certainties. The second criteria for authority to be credible involves the character of the authority figure, for instance, we need to have sufficient reason to believe that she is honestly representing her knowledge (honesty) and providing it for our benefit and not for personal gain (beneficence). 
Finally, the usefulness of appeals to authority requires that we accurately understand what the authority is communicating. In the foregoing example, we may need the authority to break down technical terms into language that we can understand. We also might need to make sure that our initial emotional state does not keep us from accurately processing the information given to us. Although appeals to authority are much maligned, without them we would be utterly lost. There is a social dimension to knowledge that we cannot do without. Of course, we should not blindly trust authority, and authorities can and do err, but authority nevertheless remains one of our most powerful means of seeking knowledge. The social fabric of knowledge requires trust and—under some circumstances—verification. One epistemic method that we can use in verifying some types of truth claims is logic, and it is this method to which we now turn our attention.

"Even in the valley of the shadow of death, two and two do not make six." —Leo Tolstoy

Logic

The logical approach to epistemology begins from the assumption that we can separate fact from belief by looking at the rational consistency of one's reasoning. Logic can be separated into deductive and inductive types. Deductive logic is used to establish truth by combining and evaluating premises based on standard rules and axioms. Similar to geometric proofs, deduction begins with premises that are asserted and then asks what can be concluded if the premises are accepted and combined together. A simple example is the argument, if A = B, and if B = C, then A = C.

Deductive arguments contain three elements: premises, inferences, and conclusions. A premise is a statement that must be true or false. In the foregoing example, the first premise was A = B. Note that a premise is a statement that either is or is not true. If a premise is wrong, the argument is flawed. In some cases not everyone will agree that the proposition is true, which highlights the importance of coming up with agreed-upon definitions and propositions if one's arguments are to be compelling. If you have ever read Socratic dialogues, you know just how much effort can be put into this phase of the exercise! The second element of a deductive argument is the inference. Once the propositions have been agreed upon, a new proposition is created that states an implication that should logically follow from the accepted propositions. It is usually the part of the argument that proceeds from if to therefore. If the inference (the if–then connection) is wrong, the argument is invalid. The third part of a deductive argument is the final inference, which is called the conclusion. The conclusion is a new statement that is logically connected to the propositional statements. Conclusions can be true or false based on the validity of the premises and the inferences drawn from them.

In order to show how deductive arguments can go wrong and to help us construct them correctly, we make use of a truth table. A simple truth table is included below.

TABLE 5.1 Truth Table

Premise   Inference   Conclusion
False     Invalid     True or false
False     Valid       True or false
True      Invalid     True or false
True      Valid       True

In order to be sound, the argument must start from true premises and include valid inferences. If those conditions are met, then we can be theoretically certain that the conclusion is true. 
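To make the logic of Table 5.1 concrete, here is a minimal sketch in Python (not from Entwistle's text; the function name and example values are illustrative assumptions) showing that certainty about a conclusion requires both true premises and a valid inference:

```python
# Minimal illustrative sketch of Table 5.1 (not from Entwistle's text):
# only true premises combined with a valid inference yield a conclusion
# we can be (theoretically) certain of.

def evaluate_argument(premises_true: bool, inference_valid: bool) -> str:
    """Return what the truth table lets us say about the conclusion."""
    if premises_true and inference_valid:
        return "sound argument: the conclusion is true"
    # Every other row of the table leaves the conclusion undetermined:
    # it may happen to be true or false, but the argument does not establish it.
    return "unsound argument: the conclusion may be true or false"

# Example premises: A = B and B = C; the inference (transitivity of equality) is valid.
a, b, c = 3, 3, 3
print(evaluate_argument(premises_true=(a == b and b == c), inference_valid=True))
# -> sound argument: the conclusion is true

# An argument with a valid-looking form but a doubtful premise is not sound.
print(evaluate_argument(premises_true=False, inference_valid=True))
# -> unsound argument: the conclusion may be true or false
```

The sketch simply restates the table: a flaw in either the premises or the inference leaves the conclusion unestablished, which is why arguments like the one considered next can fail even when their form looks orderly.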
Unfortunately, the triple limitations of finitude, frailty, and fallenness restrict how sure we can be of the veracity of our premises and the correct application of the method. Still, logical deduction is a very useful method. Rebecca’s conclusion that Jimmy was demon possessed was the result of logical deduction. The following propositions represent the kind of argument she might have made: • Rock-and-roll music is often a product of demonic influence. • People can make Faustian bargains whereby they give their soul to a demonic force in exchange for a supernatural ability. • Jimmy claims that his guitar-playing ability suddenly emerged during a rock-and-roll concert. • Jimmy’s account is accurate. • Therefore, Jimmy must have made a deal with a demon to acquire his guitar playing ability. While Rebecca used logical deduction to conclude that Jimmy was demon possessed, it is obvious that there are numerous flaws in her argument. Logical deduction is extremely important to experimental psychology, because hypothesis testing involves the application of deductive logic. We evaluate the statistical significance of data by determining whether or not they support the null hypothesis or the experimental hypothesis. Hypothesis testing using statistical analysis would not be possible without deductive logic. Inductive logic, in contrast to deductive logic, attempts to develop generalizations based on isolated observations. As such, it is probabilistic, inferring how things usually work. Inductive logic is often the initial source of psychological theories. For example, one might notice that a significant portion of special-education students come from economically disadvantaged homes and surmise that poverty contributes to educational problems. At this point the age-old statistical axiom comes into play: correlation does not imply causation. While we might observe the correlation of events, we cannot demonstrate cause without manipulating the variables experimentally. It is thus to experimentation that we now turn our attention. Empiricism The words empiricism and empirical come to us from the ancient Greek root empiric, which involved reliance on experience to evaluate knowledge claims. The origin of the word referred to “an ancient sect of physicians who based their practice on experience alone disregarding all theoretical and philosophical considerations.” Saint Luke is widely believed to have been a physician, and he exemplified this experiential approach in the opening of his gospel: “Therefore, since I myself have carefully investigated everything from the beginning, it seemed good also to me to write an orderly account … so that you may know the certainty of the things you have been taught” (Luke 1:3–4, NIV, italics added). Luke was unwilling to trust philosophical speculation alone—he checked out the sources to the best of his ability in order to be sure that what he reported was accurately conveyed and confirmed by investigation. Contemporary empiricism as an epistemic method is derived from logical positivism, a view in which belief is validated or invalidated solely by the evidence of experience. In the sciences, empiricism proceeds along the familiar lines of the scientific method. One observes a phenomenon, creates a theory about the nature or cause of the phenomenon, and then generates a hypothesis—a prediction based on the theory about what would happen if certain elements of the phenomenon are altered. An experiment is then created to test the hypothesis. 
Finally, one observes the results of the experiment and evaluates whether the evidence supports or refutes the theory. Note that theories can never be proved, only disproved or supported. Empirical methods have been central to the development of psychology as a science. By applying the scientific method to human beings, we have increased our understanding of the biological, psychological, and social determinants of human behavior and mental processes. For example, empiricism has drastically improved our understanding of schizophrenia, the disorder that Jimmy suffered from. Inheritance patterns have been determined that may ultimately lead to identification of genes that are implicated in schizophrenia. We have uncovered social factors, such as expressed emotion (characterized by criticism, hostility, and emotionality) that are associated with increased risk of relapse. If we were able to study Jimmy’s brain, we would likely see characteristic reduction of gray matter in his temporal and frontal lobes. We have greater understanding of the biological changes that take place in the brains of people who have schizophrenia, and of pharmaceutical and psychological interventions that can help them manage their symptoms. Although Jimmy was not cured by his treatments, he was treated more humanely, and he likely had a much better outcome than would have been the case without those treatments. Since psychology adopted a scientific approach in the late nineteenth century, our understanding of human behavior has advanced considerably. However, psychology, and science in general, have been forced to recognize the limitations of the modernist framework that is usually associated with empiricism. The goal of positivism was irrefutable evidence upon which to base beliefs, but such precision and certainty are never possible. Additionally, scientists do not rely on empiricism alone; for instance, they make use of “the testimony of past witnesses” (appeals to authority), intuition, and reasoning, as well as observation based on sense perception. While we like to think of science as a way of gathering “facts” about the world, it is better to think of it as an epistemic framework that allows us to view the world through a specific theoretical framework. Each theoretical framework allows us to understand the world through a particular paradigm (e.g., Aristotelian, Neoplatonic, and mechanistic). Each of these paradigms is based on a set of metaphysical presuppositions that “primes its adherents to look for certain kinds of facts and to apply certain [kinds of] interpretations.” The most pervasive scientific paradigm today is that of the world-asmachine, yet that paradigm reflects not just brute facts, but philosophical assumptions about the nature of the world (e.g., that it functions like a machine, that it can be calibrated and controlled, and the like.). Although empirical methods have given us a unique and powerful lens through which to view natural phenomena, we must always remember that they provide a constricted view. Science focuses its attention on a limited domain (the physical world). This intense focus allows it to rigorously investigate material aspects of the world, but it necessarily neglects many other aspects of reality. 
As theologian Hans Küng noted, “It has become more clear that all the measuring, experimenting, and extrapolating of the highly developed behavioral sciences—as terribly important as it is—rests on a preliminary understanding of human reality that encompasses only certain aspects and dimensions of it.” The limited scope of psychology and other sciences allows them to be very productive, but in a very limited domain. Complicating this picture is the fact that no one develops a theory without first making a series of philosophical assumptions, and in the case of psychology these assumptions go to the very heart of what it means to be human. Sigmund Koch cautioned us about our tendency to overlook these philosophical assumptions. Psychology is necessarily the most philosophy-sensitive discipline in the entire gamut of disciplines that claim empirical status. We cannot discriminate a so-called variable, pose a research question, choose or invent a method, project a theory, [or] stipulate a psychotechnology, without making strong presumptions of philosophical cast about the nature of our human subject matter—presumptions that can be ordered to age-old contexts of philosophical discussion. Furthermore, especially when theories involve human beings, they inevitably involve value judgments. Gordon Allport made this point fifty years ago as he struggled to define what it means to be a “healthy” human being. We cannot answer this question solely in terms of pure psychology. In order to say that a person is mentally healthy, sound, [or] mature, we need to know what health, soundness, and maturity are. Psychology alone cannot tell us. To some degree ethical judgment is involved … Science alone can never tell us what is sound, healthy, or good. For this reason, we cannot simply assume that psychology as a science will be “objective” or “self-correcting”—it will always reflect the worldviews and values of its theorists. To these concerns, some people object that science is a self-correcting discipline because the data we collect force us to reconsider our theories. While there is some truth in this, the degree to which we can be objective is limited: our biases predictably affect our interpretation. Van Leeuwen rightly cautions that recent critiques “have demonstrated the degree to which scientific data are theory-laden, and theories are underdetermined by facts.” Despite these shortcomings, empirical methods can be very useful when they are applied to observable phenomena. However, we need to exercise cautious oversight of our own biases and worldviews. Science has advanced our knowledge of the physical world, but it can only be applied to observable phenomena, and its claims are limited to that domain. Correctly used, empiricism can be a very powerful means of evaluating limited types of claims, notably those which can be observed and tested experientially. To obtain knowledge about non-physical phenomena we need to use other epistemic methods. Before leaving our discussion of empiricism, we must note that a tumultuous shift has been taking place within the social sciences over the past several decades. Postmodern critiques have resulted in two versions of psychology: one that cherishes empirical research and one that sees empiricism as damaging on political, social, and ethical grounds. One of the first attempts to posit a middle ground was the development of grounded theory by Glaser and Strauss. 
Their approach established the use of qualitative research as a means of giving more latitude and input to research participants, while still yielding quantifiable data. Objectivist grounded theory still maintained “positivistic assumptions of an external world that can be described, analyzed, explained, and predicted.” More recently, constructivist grounded theory has rejected these positivistic assumptions. “Constructivism assumes the relativism of multiple social realities, recognizes the mutual creation of knowledge by the viewer and the viewed, and aims toward interpretive understanding of subjects’ meanings.” “Science can purify religion from error and superstition. Religion can purify science from idolatry and false absolutes.” — Pope John Paul II Constructivist-grounded theory and other postempirical approaches to psychology will continue to present epistemic challenges to the field, and by extension, to integration. For the time being, empirical and nonempirical methodologies continue to be influential, competing paradigms within the social sciences. Revelation and Interpretation Christians who are committed to historic, orthodox expressions of the faith believe that God has revealed Himself indirectly, through general revelation, and directly, through special revelation. General revelation is invoked by the apostle Paul when he wrote, “For since the creation of the world God’s invisible qualities—his eternal power and divine nature—have been clearly seen, being understood from what has been made, so that men are without excuse” (Rom 1:20, NIV). Special revelation refers to unique acts whereby God discloses Himself in word or deed, such as in theophanies, miracles, dreams, and visions. Orthodox Christians accept the Old and New Testament books as God’s special revelation, preserved as His written Word. While all of God’s Word is true, Christians do not see Scripture as containing all truth. For instance, a car-repair manual will be more useful than the Gospel of John if you need instructions about changing the oil in your car. As Arthur Holmes remarked, “The Christian regards the biblical revelation as the final rule of faith and conduct, but he does not think of it as an exhaustive source of all truth … Moreover, if all truth is God’s truth and truth is one, then God does not contradict himself, and in the final analysis there will be no conflict between the truth taught in scripture and truth available from other sources.” Harmony must thus exist between truths obtained from the biblical message and truths that God has made known through other channels. Scripture contains numerous accounts of people who experienced psychological distress. These accounts help us understand the experience of human anguish and they remind us of our responsibility to help those who suffer. However, the Bible does not provide extensive information about the etiology and treatment of mental disorders. Failing to understand the natural causes of psychotic behavior, some Christians, especially in the Middle Ages, misinterpreted delusions and hallucinations as having demonic origins. Some contemporary Christians, such as Rebecca, continue to misconstrue the causes of psychosis because they have not adequately considered alternative explanations. While God’s Word is without fault, human understanding and interpretation of the Scriptures are fallible. Our theologies and Christian beliefs are shaped by human thinking and human reflection on the Word of God. 
We must be careful to distinguish between what Scripture says and what we think it says (or worse, what we want to make it say). To aid us in this task, we make use of hermeneutics. Hermeneutics are rules of interpretation. Once again, we are indebted to the ancient Greeks for an English word. The word “hermeneutics” originates from the name of the Greek god Hermes, whose job was to deliver messages between the gods or from the gods to human beings. The task of the messenger is to make sure that the plain meaning of the message is accurately communicated. Hence, hermeneutics is the science and art of making sure that the message is accurately understood. It is concerned with how we derive meaning from spoken or written words. For help in elucidating some hermeneutical principles, we will turn to what may seem an unlikely source at first: William Shakespeare. Hermeneutics: An Introduction in Two Acts A Shakespearean Example In the English language perhaps one of the most commonly quoted lines—and one of the most commonly misunderstood—is this: “Romeo, Romeo, Wherefore art thou Romeo?” It is often construed as if a love-struck teenager is either waiting for a hot date or wishing she had one. Just where is this lover boy anyway? Perhaps, if we take a look at the original source, we might learn what William Shakespeare really meant, and it might help us learn to read our Bibles more carefully too. To accomplish this, we have to set the scene. Romeo and Juliet have been smitten with each other and only learn as the story unfolds that they are members of rival families whose quarrel threatens any relationship they might wish to have. It is shortly after this that we pick up the story. Romeo (a Montague) has just scaled the wall surrounding the Capulets’ home, seeking to catch a glimpse of Juliet. Juliet (a Capulet) appears above at a window. After brief monologues, Romeo overhears Juliet, and a dialogue breaks out. Romeo and Juliet, Act 2, Scene 2: Capulet’s orchard JULIET: O Romeo, Romeo! wherefore art thou Romeo? Deny thy father and refuse thy name; Or, if thou wilt not, be but sworn my love, And I’ll no longer be a Capulet. ROMEO [Aside]: Shall I hear more, or shall I speak at this? JULIET: ’Tis but thy name that is my enemy; Thou art thyself, though not a Montague. What’s Montague? it is nor hand, nor foot, Nor arm, nor face, nor any other part Belonging to a man. O, be some other name! What’s in a name? that which we call a rose By any other name would smell as sweet; So Romeo would, were he not Romeo call’d, Retain that dear perfection which he owes Without that title. Romeo, doff thy name, And for that name which is no part of thee Take all myself. ROMEO: I take thee at thy word: Call me but love, and I’ll be new baptized; Henceforth I never will be Romeo. “Romeo, Romeo! wherefore art thou Romeo?” To understand the line, we need to ask several questions. What is the immediate context of the line? What is the overall context of the passage? Are there any words whose meanings are unclear? First, the immediate context: The line we are considering occurs while Juliet is on the balcony, unaware that Romeo is listening. (Perhaps we should avoid giving Romeo any psychiatric labels at this point!) The surrounding lines seem to suggest that more is going on here than the longings of a love-struck adolescent. She dwells on the meaning of names, calls for Romeo to deny his name, and offers to give up her own name. 
When Romeo speaks, he offers to be "new baptized," an allusion to christening, the ceremony at which a child is given his or her new or "Christian" name. Whatever Juliet is lamenting, it would seem to have more to do with names than with the absence of her lover.

Second, the overall context: Romeo and Juliet come from rival families, the Montagues and the Capulets. Their family rivalry runs so deep that neither can see any way of pursuing their relationship in the open, so Romeo and Juliet decide to run away together. Through a tragic set of circumstances, both end up committing suicide, and their families are left to confront the feuding hatred that Romeo and Juliet had overcome through their love, but which then contributed to their deaths. Again, the overall context makes untenable the neophyte's reading that this line simply expresses Juliet's pining for her absent Romeo.

Third, the question of language: Looking at Juliet's line, the one word that seems most awkward to the modern reader is "wherefore." It is a word that was common in Shakespeare's day but is not in ours. Rather than meaning where, it actually means why. So, if we were to update the language, the line would read, "Romeo, Romeo, why are you Romeo?"

Putting the clues together helps us uncover an utterly different meaning than we might have expected. Far from listlessly waiting for the absent Romeo, Juliet is asking why Romeo, the man she loves, is, by virtue of his name, the enemy of her family. The common misinterpretation of Juliet's soliloquy is due to careless analysis. In the same way, carelessness in our approach to Scripture can lead to misunderstanding and misapplication. To avoid such problems, we need to try to understand the author's intent, and to aid us in this task we can utilize a variety of hermeneutical principles.

An Evangelical Example

The hermeneutical approach adopted by most evangelicals rests on the belief that the Scriptures are God-given and are trustworthy as a guide for faith and practice (2 Tim 3:16). With a strong belief in the value of Scripture for personal application, evangelicals are—at least ideally—committed to understanding Scripture as fully as possible. In this ideal, they seek to discern the meaning of Scripture in the context of its original audience, culture, language, and history. From this perspective, it is best to try to understand what the biblical author intended to say to the original audience as the first step in determining how Scripture might apply to our contemporary situation. Several hermeneutical principles are utilized in the quest to discern the intent of the author. A few of these principles follow in the form of questions that may be asked as we look at Scripture.

• What does the Scripture say? Approach the study of Scripture with an open mind in an attempt to see what it says rather than trying to make it agree with your own preconceptions.
• Who wrote the book?
• To whom was it written?
• Why was it written? What was the stated or implied purpose?
• What was the current situation of author and audience (culturally, historically, etc.)?
• In what genre is it written (apocalyptic, historical, narrative, poetic, prophetic, etc.)? The genre will influence how interpretation should proceed.
• Are there words that are unclear to us, or that may carry a different meaning in today's language? (This is especially important if you are using a translation with archaic language. Even when using modern translations, it is often helpful to look at the Scripture in the original language, if possible.)
• What is the immediate context of a passage of Scripture? How does the context of a verse make sense of the passage under consideration?
• What is the overall context of a passage? How does it fit in with the argument or teaching of the whole book?
• How does it relate to other Scriptures? "Scripture interprets Scripture." (Less clear Scriptures must be understood in the light of clearer Scriptures; uncommon themes are secondary to primary scriptural themes.)
• How is the passage understood by other credible sources? (Church tradition and biblical commentaries, while not infallible, are helpful interpretive aids. Novel interpretations that depart from those of reliable sources must meet a higher burden of proof before they can be accepted.)

Many more principles could be elucidated, but the foregoing questions provide a good basis for Scripture study by the average layperson. The danger of not being rigorous in applying hermeneutical principles is that misunderstanding and misapplication may result. Admittedly, this makes study a bit more work than simply opening a Bible and haphazardly reading whatever is there, but it also opens up the possibility of gaining a deeper understanding and appreciation of Scripture as one attempts to grow as a Christian. Only after the hermeneutical task is accomplished can one effectively ask the interpretive question of how Scripture applies to our contemporary situation.

A significant difficulty in applying Scripture is how to deal with culturally embedded aspects of the text. All of Scripture is shaped by the culture and experience of the original audience. Every Scripture was communicated through the language, culture, worldview, and knowledge base of the authors through whom God spoke and the people to whom His message was addressed. The message of Scripture, though, provides an unfolding picture of God's redemptive acts in human history and expresses timeless truths that are not culturally relative. Application of the text to our cultural situation requires that we struggle to determine not just what the text meant to the original audience, but how it should be understood and applied to our contemporary situation, given an understanding of cultural matters and the flow of redemptive history.

A Theocentric Unified Model of Knowing

After exploring worldviews, human finitude and frailty, the place of virtues in seeking knowledge, and various means of seeking knowledge, we must ask ourselves how these all fit together. Listed below, in outline form, are several propositions that bring together a theory of knowing based on a Christian worldview. (Figure 5.1 presents these propositions schematically.)

Figure 5.1: A Theocentric Unified Model of Knowing
David N. Entwistle, Integrative Approaches to Psychology and Christianity: An Introduction to Worldview Issues, Philosophical Foundations, and Models of Integration, Third Edition (Cascade, 2015), 89–116.

1. There is a real world, whose normal workings ("scientific laws") human beings can perceive with some reasonable degree of correspondence to the external world.
2. God is the transcendent creator, Who not only created the world, but also upholds and sustains it, and is capable of intervening within it (i.e., miracles, revelation, Scripture).
3. We are situated within a context of time, physical locale, and culture. These contextual realities shape our perceptions, limit our access to information, and thereby limit our objectivity.
4. Human beings are rational, relational, spiritual, and biological beings (created as good, though finite, frail, and now fallen).
5. A Christian worldview recognizes that
   a. all truth is grounded in the transcendent God who created an orderly world, and whose truths can often be known through rational, experimental, or revelational means;
   b. the creative and intellectual abilities of human beings are God-given, to be guided by intellectual and moral virtues and exercised in adoration of God;
   c. reality is physical and spiritual, and our experience is social and psychological in nature;
   d. our ability to know is limited both by our finitude and by the effects of personal and corporate sin (e.g., lack of personal virtue, cultural bias), that is, by the noetic effects of sin;
   e. various means of epistemic inquiry can be used to evaluate truth claims, though each has unique strengths, limitations, and areas of epistemic application. These methods include (but are not limited to)
      i. Rational discourse
      ii. Experimentation
      iii. Hermeneutics
6. Historical events can be explored using a variety of methods and sources (e.g., archeology, geology, written records, oral tradition), with a recognition that our investigations are guided and limited by the assumptions made and the materials found.
7. The future is not completely determined, but a limited degree of predictive accuracy is possible based on past trends and reasoned outcomes.

A major implication of this model is that psychology and theology can be seen as united under a common set of assumptions about the world. Given that their sources are rooted in the truths of God's world, the fundamental unity of their domains is assumed. The limitations of their respective methodologies, and of human thought in general, guarantee that the conclusions in each field will not always fit nicely together, but the framework itself suggests that harmony is conceptually possible. Moreover, an approach informed by both psychology and Christian theology is desirable if we seek a more complete picture of human nature and functioning. In order to continue developing this picture, it is necessary for us to ask about the nature of the world in which human beings reside, and that is the topic of our next chapter.

To bring this chapter to a close, recall the discussion about Jimmy (the man who claimed to have learned to play guitar on stage with Cat Stevens). Jimmy's story highlights the importance of epistemic justification. If we are to evaluate whether Jimmy was sane, demon possessed, or delusional, we must look at the quality and consistency of the evidence used to justify each of those conclusions. Jimmy's conclusion ("I really did learn to play guitar on stage with Cat Stevens") seems hard to believe because it is such an extraordinary claim: people just don't learn to play guitar on the spot; rock-and-roll stars rarely invite people on stage whom they don't know, and they certainly don't let people whose musical skills are unknown play with the band! Rebecca's conclusion ("Jimmy had a demon who gave him supernatural guitar-playing ability") is internally consistent but is advanced with no corroborating evidence, and it fails to take into account other possible explanations or the evidence in favor of those explanations.
My explanation ("Jimmy was delusional, and his guitar-playing abilities were learned through years of practice") is internally consistent, fits the data on his chart (the history of drug abuse, previous psychotic diagnoses, and a history of favorable response to neuroleptic medications), and is consistent with a theology that understands disease and infirmity as a result of the Fall. Whether evaluated on the basis of logic, empirical evidence, or scriptural evidence, my position offers an explanation that is consistent, comprehensive, and able to stand up well against alternative hypotheses. While I could still be wrong, I'll stick with my hypothesis, because I haven't been confronted with sufficient evidence to compel me to replace it with an alternative one.

Summary

The desire to know and understand God's world is a God-given capacity, but the pursuit of knowledge is affected by worldviews, human finitude, human frailty, individual and communal sin, assumptions, methodological limitations, and the availability of data, among other things. Our ability to know is contingent, limited, and fallible. If we approach the quest for knowledge with an awareness of our limitations, guided by intellectual virtues toward appropriate ends, this quest is worthwhile and partly attainable. Given this epistemic framework, it is possible to use various methods to understand human experience, including psychological theory, psychological science, and Christian theology. While our findings must be held tentatively and humbly, this framework makes integration conceptually possible and intellectually appealing.

David N. Entwistle, Integrative Approaches to Psychology and Christianity: An Introduction to Worldview Issues, Philosophical Foundations, and Models of Integration, Third Edition (Cascade, 2015), 115–119.