Thursday, March 6, 2014

Humanistic Psychology





Humanism, putting “I” or “me” at the center of the universe, has been an insidious problem for humanity from the very beginning of man’s existence. We could begin with Adam and Eve, whose encounter with the serpent reveals more about the human condition than we could ever hope to learn from all that the history of psychology might want to teach us. In the garden, the following scene plays out:


 1Now the serpent was more crafty than any beast of the field which the LORD God had made. And he said to the woman, "Indeed, has God said, 'You shall not eat from any tree of the garden'?"

 2The woman said to the serpent, "From the fruit of the trees of the garden we may eat;

 3but from the fruit of the tree which is in the middle of the garden, God has said, 'You shall not eat from it or touch it, or you will die.'"

 4 The serpent said to the woman, "You surely will not die!

 5"For God knows that in the day you eat from it your eyes will be opened, and you will be like God, knowing good and evil."

 6When the woman saw that the tree was good for food, and that it was a delight to the eyes, and that the tree was desirable to make one wise, she took from its fruit and ate; and she gave also to her husband with her, and he ate.  (Emphasis mine)

 7Then the eyes of both of them were opened, and they knew that they were naked; and they sewed fig leaves together and made themselves loin coverings.

Genesis 3:1-7 (NASB)


As these events play out, we get the first glimpse of person-centered behavior ever recorded, and of what eventually proves to be the fall of mankind. The thought processes exhibited by Eve (and Adam) continue to be evidenced throughout history. 


        In the ancient Greek world, early humanistic beliefs were propagated by Anaxagoras, who proposed the idea of “Free Thinking.” This idea allowed Epicurus to develop his human-centered approach to achieving a state of “eudaimonia,” or what we would know as happiness achieved through self-actualization. However, on the horizon, we see the mechanistic view marching forward. 


Beginning in the 14th and 15th centuries, the mechanistic view of humanity was taking shape, and the idea of self-determinism was on the rise. Thomas Hobbes recognized how society was wandering down the path of humanistic thought. One feature of Hobbes’s theory of why people behaved as they did was that he considered human nature to be absolutely egoistic. He describes people as being, by nature, selfish and not in fact considering others. “In his psychological analysis he finds naught but self-regarding feelings impelling man’s activity” (Hobbes, 1898). Although Hobbes himself was a proponent of a Christian worldview, the reference here is meant to reveal the prevailing zeitgeist of his day. His words show the pervasiveness of humanistic thought processes during the late 16th and 17th centuries. 

Humanistic thought continued to grow, finding a foothold in the heart of Germany, the home of the beginnings of modern psychology.

One of the more ardent proponents of humanistic psychology was Charlotte Buhler. She conveys that her clients often say they do not know what they want and do not know what they believe in. Buhler subsequently states, “…this is a call for humanistic psychology, a psychology that guides people in defining what they think is healthy and meaningful living. It is through this clarification of goals that people become fulfilled” (Buhler, 1972). (Emphasis mine) Again we see the same type of thinking that caused Adam and Eve so much pain, and that Thomas Hobbes saw as he observed the behavior of his day. Humanistic thinking focuses on “I” and “me.” It focuses on the self-_________    (fill in the blank). The people mentioned thus far reveal to us the anthropocentric nature of humanism. Yet not all thought from the 15th through the 19th centuries was humanistic. Blaise Pascal, Francis Bacon, Sir Isaac Newton, Michael Faraday, James C. Maxwell, and J. Robert Oppenheimer represented the best that science had to offer. All based their science on the conviction that there is a Creator who is also a lawgiver, and none of them believed that man, starting from himself, could understand anything. None of them had a humanistic outlook. 


Non-Christian philosophers from the time of Plato to Sartre had some common ground. They were rationalists. They assumed that man, starting with himself, could gather enough information to form logical conclusions (understanding) about who he was and why he behaved as he did. They believed all knowledge comes from within, leaving out the possibility of God. They also took reasoned thought very seriously. Their logical conclusion was that they could achieve true knowledge through reason alone. These older views, albeit faulty, show an optimistic outlook on life and knowledge. But a shift occurred that moved men from an optimistic view to a view that all is lost. The humanistic expectation that autonomous man would be able to bring together a unified view of human nature stalled. The pessimistic view of man was ushered in by men like Jean-Jacques Rousseau, David Hume, Immanuel Kant, Georg Hegel, and Soren Kierkegaard. The main idea for these men could be summed up this way: autonomous freedom, meaning freedom from any kind of restraint, and truth sought in the synthesis of ideas instead of in absolutes or antithesis. Some went as far as to say that meaning is found through a “leap of faith.” Without absolutes, the door was left wide open for humanistic thought to inflict more damage. The age of psychology was beginning, and the men of the 20th century would take humanistic ideas to the edge of reasoned thought. Thus, Humanistic Psychology would be born. Abraham Maslow, born in 1908, believed that psychoanalysis as posited by Freud was somewhat useful, but only useful for studying the sick. He placed his emphasis on studying the non-sick. 

Maslow studied those who had achieved higher levels of satisfaction with life. He wanted to understand what motivated the thought processes of successful and well-adjusted people. This marks the beginning of Humanistic Psychology, which teaches that every person has a strong desire to realize his or her full potential, to reach a level of self-actualization. Maslow used a visual aid to represent his idea of a Hierarchy of Needs, which shows how people progress from the most basic needs to the pinnacle of self-actualization. The system proceeds as follows: 


“By satisfying basic needs such as food, water, sex, exercise, and recreation, and feeling safe, we can progress to higher order, psychological needs such as love, needs for belonging, and self-esteem. When these are met, it provides the confidence and focus to reach the pinnacle of psychological integration, or self-actualization.”  (Jacobs, 2002)


Basically, Maslow states that once we satisfy our basic needs we, in humanistic fashion, continue to put ourselves first. With that understanding, human behavior is seen as based on a perception of reality that causes individuals to act accordingly and satisfy their needs in light of those perceptions. Maslow took this idea a step further by stating that the way the needs are filled is just as important as the needs themselves. He said that filling the needs and the way they are filled combine to make up the human experience. In Maslow’s thought, a meaningful level of self-actualization is achieved when a person establishes meaningful connections to an external reality. Establishing such an external connection is the goal of Carl Rogers’s client-centered therapy. 


        Client-centered or person-centered psychology is probably the biggest perpetuator of humanistic psychology, and of the dangerous ideas that flow from it, to have come about in our lifetime. Carl Rogers was influenced by the views of John Dewey, Sigmund Freud, and Soren Kierkegaard. His brand of psychology is humanistic, but it is existentialist as well. Where Rogers’s ideas begin to break down is in the area of human nature. He believes that people are intrinsically good. He also believes they are rational and trustworthy. From these basic beliefs he constructs his entire premise, which states that people have an inherent tendency toward actualization, growth, health, independence, and autonomy. His theory is not without many shortcomings: “The person-centered counseling perspective in its ‘classic’ form possesses nearly insurmountable obstacles for rehabilitation practitioners” (See, 1986). Rogers offers a statement in his book, A Way of Being, which is true to humanistic thinking but causes problems for those trying to engage his theory: “individuals have within themselves vast resources for self-understanding and for altering their self-concepts, basic attitudes, and self-directed behavior” (Rogers, 1980). The theories of Maslow and Rogers seem, on the surface, to be reasonable, but are they really?


The Christian view of humanity contradicts the Rogerian belief in man’s natural goodness. The Bible teaches that,


“23for all have sinned and fall short of the glory of God,”

                                                        Romans 3:23 (NASB)


Man is not intrinsically good; his nature was altered all the way back at the very beginning of human existence, as stated in the verses of Genesis that began this writing. Apart from God’s provision through His Son, Jesus Christ, and His finished work on the cross, man cannot overcome his fallen nature. The ultimate problem for mankind is not our childhood, our past, or the thwarting of our actualizing potential. Our problem is sin, and we cannot save ourselves. Taking the correspondence view of truth, the Christian faith has the only real answers to our problems. If that is the case, and the case can be made with very strong evidence, then the humanistic view that we can find answers to our problems within ourselves and without God is a bankrupt idea. So is there any hope for utilizing any of the ideas contained within humanistic psychology?


        Humanistic thought should have very little use within the Christian counseling setting. For counselors to be true to Biblical teaching, they must jettison humanistic practices such as the absence of boundaries, a permissive therapy atmosphere, and nondirective therapy. The logical outcome of using these practices is a counseling ministry that bears little resemblance to anything Biblical.

Works Cited

Buhler, C. (1972). Introduction to Humanistic Psychology. Belmont, California: Wadsworth Publishing Co.

Hobbes, T. (1898). The Ethics of Thomas Hobbes as Contained in Selections from His Works. Boston, MA, U.S.A.: Ginn & Company.

Jacobs, D. (2002). Psychology (Brain, Behavior, and Popular Culture) (4th ed.). Dubuque, Iowa, U.S.A.: Kendall/Hunt Publishing Co.

Rogers, C. (1980). A Way of Being. Boston: Houghton Mifflin.

See, J. (1986). Applied Rehabilitation Counseling. (M. W. Riggar, Ed.) New York, NY, U.S.A.: Springer.



Wednesday, March 5, 2014

Cognitive Development



          Cognitive abilities provide a way for us to handle the information that we receive from our senses. These processing abilities include how we determine what our actions should be, compare one thing to another, remember past events, keep what we learn in memory, evaluate things and situations, and analyze information in order to make decisions. There appears to be some instinctive element present within our cognitive functioning. The strength of the heredity argument is considered unsettled science at this time. However, Robert Plomin states, "The evidence for a strong genetic contribution to general cognitive ability is clearer than for any other area of psychology" (Davis, 2004). Genetics notwithstanding, most of the research available considers cognitive ability to be a learned skill, or set of skills.

            One of the key issues with cognitive development is when it does not occur as it should within the individual. Most developmental issues have a direct impact on a person's ability to learn, and they more often than not need to be addressed and acted upon to correct them to any degree. Cognitive functioning is not only sidetracked by developmental issues; it can also be affected by physical injury to the brain. No matter what the situation might be for an individual, cognitive functioning, when not up to speed, can in many cases be brought closer to where it should be, and those with "normal" functioning can be taken to higher levels of learning. 

Childhood Cognitive Development
            Many opinions have been offered, and much research has been done, in relation to cognitive development during the childhood years. Jean Piaget (1896-1980) posited that children "are actively involved in their own cognitive development" (Jacobs, 2002). Piaget's theory of cognitive development was based on the idea that "the mind, through its interaction with the environment, undergoes a series of metamorphoses...as children grow their brains achieve new and more advanced connections for different kinds of thinking" (Jacobs, 2002). Piaget took his theory, based on assimilation and accommodation, and delineated it into stages. He referred to the childhood years (from birth to 12 years old) as the sensorimotor stage, preoperational stage, and concrete operational stage. His final stage, formal operations, applied to those from 12 years old through adulthood. However, Piaget did not "corner the market" on childhood cognitive development. Others would follow him with ideas that were similar and ideas that were somewhat contrary.

            The social-cognitive perspective is a type of hybrid approach to understanding childhood development that includes cognitive functioning. In more recent times, "Albert Bandura and others have demonstrated that much of what we know and do is acquired through the process of observational learning" (Duffy & Atwater, 2005). What they mean here is that large portions of what is learned throughout childhood in particular, and life in general, are learned by watching other people model a behavior or activity without the learner receiving any reinforcer at the time the activity is observed. This makes the environment in which a child is raised a very important aspect of cognitive development. As children grow and use the knowledge acquired from observational learning, the reinforcer not received at the point of learning will be present at the time of acting on what was learned. The outcome can be good or bad based on the information gathered during observation. One aspect of Bandura's idea is that repeated exposure to "violence in films, television, video games, and other media not only desensitize us to violence but also induce violence in children and adults" (Duffy & Atwater, 2005). Thinking about how behavior patterns have seemingly disintegrated over time should cause someone to pause after reading this last statement. How true does it ring?

Adolescence and Cognitive Development
            Cognitive development as defined in the opening portion of this paper could also be amended to include the ability to think and reason. During the childhood years, the cognitive development process (the ability to think) becomes, as Piaget would say, "concrete." However, when adolescence is reached, a more complex form of cognition begins, including reasoning, abstract thinking, and comparative thinking. 

            Cognitive change does not happen overnight. The changes in cognitive function that occur all along the lifespan happen slowly, but never does the process seem as slow, at least to the person in it, as it does during the adolescent years (12-18 years old). The change during these years, unlike Piaget's spurts of growth during childhood, is more along the lines of the "information-processing perspective (that) sees changes in cognitive abilities as gradual transformations in the capacity to take in, use, and store information" (Feldman, 2003). 

Education during these years (assuming the educators are proceeding properly) is vital to cognitive growth. It is during these years that "people organize their thinking about the world, develop strategies for dealing with new situations, sort facts, and achieve advances in memory capacity and perceptual abilities" (Feldman, 2003). This information should make us keenly aware of the importance of what a person is taught at this stage of life. 

            "Many developmentalists find middle schools to be developmentally regressive - they force children to step backward" (Berger, 2011). What does this mean in terms of cognitive development? "Long-term academic trajectories - the choice to stay in school or to drop out and the selection in high school of academic college-prep courses versus basic-level courses - are strongly influenced by experience in grades 6-8" (Berger, 2011).  It should be accurate to suggest that this is the point of cognitive growth that the phenomena of evaluation apprehension (Cottrell) arises. Why is this important? Evaluation apprehension, "concern for how others are evaluating us" (Myers, 2010) can help or hurt a person's cognitive progress by altering how much effort they put forth to grow cognitively based on how they feel about being evaluated, and to some extent those evaluating them. This, in my opinion, is but one point of many as to why standardized testing is not good for most people during this stage of life or anytime for that matter. 

            While addressing the education aspect of cognitive development, omitting the technological component would leave out one of the most relevant aspects of cognitive development at this stage of life. The internet, cell phones, and video games tend to eat away at the day. They are time-killers, to put it mildly. There could be the usual discussion about how to properly use the internet, what is appropriate to send in a text message, and what video game rating is too explicit for someone of this age bracket. However, these things pale in comparison to two other sinister outcomes of using these tools and games. Sleep deprivation and addiction are concerns that all parents should have when it comes to using these mediums. The Mayo Clinic posted a report about adolescent sleep issues that is very pertinent to this discussion: "Most teens need about nine hours of sleep a night — and sometimes more — to maintain optimal daytime alertness. But few teens actually get that much sleep regularly, thanks to factors such as part-time jobs, early-morning classes, homework, extracurricular activities, social demands, and use of computers and other electronic gadgets. More than 90 percent of teens in a recent study published in the Journal of School Health reported sleeping less than the recommended nine hours a night. In the same study, 10 percent of teens reported sleeping less than six hours a night.

            Although this might seem like no big deal, sleep deprivation can have serious consequences. Tired teens can find it difficult to concentrate and learn, or even stay awake in class. Too little sleep also might contribute to mood swings and behavioral problems. Another major concern is drowsy driving, which can lead to serious — even deadly — accidents" (Mayo Clinic, 2011).

Lack of sleep can inhibit information intake, which in turn can leave gaps in learning. These gaps can become stumbling blocks in the future cognitive development of anyone who allows games to keep them from sleeping enough.
            The other issue, addiction, can be just as devastating, if not more so, in terms of cognitive development. Like sleep deprivation, "Addiction of any kind limits life experience, with the harm especially severe when the brain is still growing" (Berger, 2011). Does the brain grow throughout adolescence? It does continue to grow, and a quick glance at the physiological aspects of brain development reinforces Berger's warning: "The final stage is synaptogenesis, or the formation of synapses. Although the process (brain growth) begins before birth, it continues throughout life as neurons form new synapses and discard old ones" (Kalat, 2009). With limits on life experiences and improper brain functioning (due to the fact that the prefrontal cortex is not fully developed until a person is in their early 20s), cognitive development becomes much more difficult at best and non-existent at worst.
            Leaving the negative side of adolescence, are there healthy ways to address cognitive development? Robert J. Sternberg, in his résumé posted on the Yale website, recorded one of the best short descriptions not only of his study of intelligence, but also of a map for understanding how to maximize cognitive growth:
" My research is motivated primarily by a theory of successful intelligence, which attempts to account for the intellectual sources of individual differences that enable people to achieve success in their lives, given the socio-cultural context in which they live. Successfully intelligent people discern their strengths and weaknesses, and then figure out how to capitalize on their strengths, and to compensate for or remediate their weaknesses. Successfully intelligent individuals succeed in part because they achieve a functional balance among a "triarchy" of abilities: analytical abilities, which are used to analyze, evaluate, judge, compare and contrast; creative abilities, which are used to create, invent, discover, imagine; practical abilities, which are used to apply, utilize, implement, and activate. Successfully intelligent people are not necessarily high in all three of these abilities, but find a way effectively to exploit whatever pattern of abilities they may have. Moreover, all of these abilities can be further developed. A fundamental idea underlying this research is that conventional notions of intelligence and tests of intelligence miss important kinds of intellectual talent, and overweigh what are sometimes less important kinds of intellectual talent" (University of Oregon, 2001).
Sternberg is quite an intelligent man, and his contributions to the inner workings of cognitive functioning are very helpful. His insights can help parents understand the role "true" intelligence plays in the life of their teenager, and from there the parents should be able to help foster cognitive growth in their teenager. Although intelligence and cognitive ability are not exactly the same, they are interrelated and intertwined to the point that improvement in one should have a positive effect on the other. Cognitive ability can be, in most people, improved (fluidly) to a mature point (of crystallized intelligence); therefore, a discussion of intelligence is relevant to the overall idea of this paper. So, in transition to adulthood, consider the words of Sternberg from an audio transcript of an interview in which he "defines intelligence" in a most interesting way:
            "I prefer to refer to it as "successful intelligence." And the reason is that the emphasis is    on the use of your intelligence to achieve success in your life.   So I define it as your skill        in achieving whatever it is you want to attain in your life within your socio-cultural   context-Meaning that people have different goals for themselves, and for some it's to get      very good grades in school and to do well on tests, and for others it might be to become a             very good basketball player or actress or musician. So, it's your skill in obtaining what       you want in life within your socio-cultural context [which] means that if you want to be      an axe murderer it wouldn't count--by capitalizing on your strengths and compensating       for, or correcting, your weaknesses. And what that means is that people differ in their   personal profiles. Some people are good at one thing; some are good at another thing.               And if you look at people who attain success by their own standards, they're generally      people who found something they do really well. And it can be very different things for         different people. For some, it's doing well on IQ tests, or SATs. For others it might be   playing basketball. For others it might be being a politician.   So they capitalize on their        strengths. And the things that they don't do so well-They find ways either to             compensate, meaning that they perhaps have someone else do the things they don't do      well, or they have them done by electronic means or whatever. Or correcting their  weaknesses- They make themselves better at whatever it is they didn't do so well, by  adapting to, shaping and selecting environments, which means that some of the time you change yourself to fit the environment. That's adaptation. So if you started a new job or a   new relationship, part of the time you change yourself to fit the job or to fit the relationship. And shaping means that part of the time you change the environment that is. So you might try to modify the job to make it a better fit to you, or you might try to change the relationship to make it more what you hoped it would be. And selection means that sometimes you just get out.   Being successfully intelligent means knowing when you're in the wrong place at the wrong time--the wrong job, the wrong relationship,  the wrong place to live---Um, through a combination of analytical, creative and practical abilities. You need creative skills to come up with ideas; you need analytical abilities to know whether they're good ideas-to evaluate the ideas-and you need practical abilities to make your ideas work and to persuade other people that your ideas are worth listening to. So that's the definition" (Plucker, 2003).

Adult Cognitive Development
Emerging Adults
            Often defined as spanning ages 18-25, this period of life seems to have as much cognitive development potential as any other, if not more. This is also the beginning of the "postformal thought" stage of life that "originated because several developmentalists agreed that Piaget's fourth stage, formal operational thought, was inadequate to describe adult thinking" (Berger, 2011). However, when discussing cognitive development in adulthood, is there any real development? Do adults acquire "new" intellectual capabilities? The answer could be no. If that is the case, then "adulthood has no stages" (Berger, 2011). Maybe, maybe not?
            Emerging adulthood reveals quite a bit of evidence for continued cognitive growth. One of the first things one might notice is the ability to combine the subjective with the objective. Subjectivity, which is emotionally driven and based on perceptions of life events, and objectivity, the way things really are (intellectual and logical in nature), sometimes need to be combined to make informed decisions in adulthood. "Without this consolidation of intellect and emotion, behavioral extremes...or cognitive extremes (such as believing that one is the best or the worst person on Earth) are common" (Berger, 2011).
            The most important cognitive advancement (as some consider it) usually comes into play during this stage of life. The ability to "consider a thesis and its antithesis simultaneously and arrive at a synthesis" (Berger, 2011) is called dialectical thought, and it is often considered the pinnacle of all cognitive development. There is one caveat to the synthesis arrived at by employing this type of thought process: sometimes one should not strive to create a synthesis. Some things in life are separated into thesis/antithesis for a reason. Truth and non-truth cannot be synthesized into something that has objective meaning. If something is objectively true, it implies that anything opposite (antithetical) to it is false, and no amount of synthesis can blend the two. However, not to take the discussion too deep: if I say chocolate ice cream is the best, and you say vanilla ice cream is the best, I have formed a thesis and you have formed an antithesis. Who is right? Dialectical thought would enable us to agree that both are very good in spite of the fact that I prefer chocolate and you prefer vanilla.
            There are other areas that come into play in emerging adult cognitive development. Gender differences tend to become clearer based on how each processes role ideas. Faith also becomes something to struggle with during this time for many people.
Middle Adulthood
            Basic adulthood spans the ages of 25-40 (or even up to 65 in some opinions). The references from Sternberg are just as applicable here as they were in the section on adolescence. IQ tends to be the big focus of those who study cognitive development during these years.
            The most notable part of cognitive development during these years is not the new information that these people will acquire, but utilization of what they have learned. "People in their forties and fifties are better at solving problems that require the use of a store of practical knowledge" (Davis, 2004). The reason for this is that, "when middle adulthood is reached, considerable information concerning everyday problems and ways to solve them has been accumulated" (Davis, 2004).
            It is often cited that as people age their IQ declines. Nettelbeck and Rabbitt (1992) suggested that "poorer performance on IQ tests may be the result of physical rather than cognitive changes" (Feldman, 2003). Why was the decline determined to be physical and not cognitive? It was suggested as such because of the slower reaction times present in people as they age. So, just because someone is older and slower does not necessarily mean they have declined in cognitive functioning.
Late Adulthood
            From a cognitive point of view, at this stage of life "fluid intelligence begins a gradual decline...whereas crystallized intelligence continues to increase gradually" (Davis, 2004). Why is this the case when people reach late adulthood? The evidence suggests that "as we grow older, we begin to experience difficulties in successfully encoding new material" (Davis, 2004). More recent studies show that there is a decline in memory as well as in encoding. Mather and Carstensen (2005) suggest that if a motivating cognitive factor is present, it will often have a positive effect on attention and memory. They begin the study by confirming how control over cognitive processes declines with age:
            "Perhaps the most widely acknowledged psychological change with age is the decline in cognitive processes, especially memory. However, not all cognitive processes decline with age – not even all types of memory. One general characterization is that older adults have impaired cognitive control that is associated with deterioration in prefrontal brain regions. Thus, older adults show deficits on attention and memory tasks that require the generation and maintenance of internal strategies rather than just reliance on external cues. For example, explicit recall of words studied a few minutes previously was shown to decline across a four-year period whereas implicit memory of recently studied words did not show a decline with age" (Mather & Carstensen, 2005). The up-side to this study was that older adults improve in the area of emotional control: "In contrast with the declines seen in cognitive control, age does not impair emotional control. Compared with younger adults, older adults report that they focus more on self-control of their emotions and rate their emotion regulation skills as better" (Mather & Carstensen, 2005).
            The down-side to the whole discussion of aging and cognitive ability lies in the ailments that arise as people age. The most common issue older adults deal with is depression, usually brought about by the "death of spouses and friends...declining health...which may rob older people of their sense of control" (Feldman, 2003).
            Other issues bearing on cognition that arise at this stage of life are dementia, "the most common mental disorder of the elderly" (Feldman, 2003), and Alzheimer's disease, "a progressive brain disorder that produces loss of memory and confusion" (Feldman, 2003). It has been published that "almost 50% of people over the age of 85 are affected by Alzheimer's disease" (Feldman, 2003).
            Cognitive ability is a fleeting thing as we age. No matter what type of spin the researchers might put on it, cognitive development builds from birth to middle adulthood and then begins its descent. Does life really matter after age 85? If we pass 85 years old, will we be cognizant enough to know or even care? The moral of the cognitive process is for each person to maximize their abilities while they have the ability to do so.
            At this point, the words of Solomon seem apropos,
            Remember also your Creator in the days of your youth, before the evil days come and the years draw near when you will say, "I have no delight in them"; before the sun and the light, the moon and the stars are darkened, and clouds return after the rain; in the day that the watchmen of the house tremble, and mighty men stoop, the grinding ones stand idle because they are few, and those who look through windows grow dim; and the doors on the street are shut as the sound of the grinding mill is low, and one will arise at the sound of the bird, and all the daughters of song will sing softly. Furthermore, men are afraid of a high place and of terrors on the road; the almond tree blossoms, the grasshopper drags himself along, and the caperberry is ineffective. For man goes to his eternal home while mourners go about in the street. Remember Him before the silver cord is broken and the golden bowl is crushed, the pitcher by the well is shattered and the wheel at the cistern is crushed; then the dust will return to the earth as it was, and the spirit will return to God who gave it. "Vanity of vanities," says the Preacher, "all is vanity!" (Ecclesiastes 12:1-8, NASB).

Conclusion
            The information contained within this paper is and will be useful in any endeavor. For me, it is useful in my current situation as someone who teaches, shares, preaches, and sings about the Good News of Jesus Christ. How, you may ask? It is very simple to explain. When it comes to helping others come to know Jesus Christ, I find wisdom in the words of the man I consider to be the "last great theologian":
" Every generation of Christians has this problem of learning how to speak meaningfully to its own age. It cannot be solved without an understanding of the changing existential situation which it faces. If we are to communicate the Christian faith effectively, therefore, we must know and understand the thought forms of our own generation. These will differ slightly from place to place, and more so from nation to nation. Nevertheless there are characteristics of an age such as ours which are the same wherever we happen to be" (Schaeffer, 1968).


Works Cited


Berger, K. S. (2011). The Developing Person Through the Life Span. New York, NY: Worth Publishers.

Davis, S. F. (2004). Psychology (4th ed.). Upper Saddle River, New Jersey: Pearson Education.

Duffy, K. G., & Atwater, E. (2005). Psychology for Living: Adjustment, Growth, and Behavior Today (8th ed.). Upper Saddle River, New Jersey: Pearson.

Feldman, R. S. (2003). Development Across the Life Span (3rd ed.). Upper Saddle River, New Jersey: Pearson Education.

Jacobs, D. (2002). Psychology: Brain Behavior and Popular Culture (4th ed.). Dubuque, Iowa: Kendall/Hunt Publishing.

Kalat, J. W. (2009). Biological Psychology (10th ed.). Belmont, California: Wadsworth.

Mather, M., & Carstensen, L. L. (2005). Aging and motivated cognition: The positivity effect in attention and memory. Trends in Cognitive Sciences, 9(10), 496-502.

Mayo Clinic Staff. (2011, August 4). Teen sleep: Why is your teen so tired? Retrieved April 30, 2012, from Mayo Clinic Health Information: http://www.mayoclinic.com/health/teens-health/CC00019

Myers, D. G. (2010). Social Psychology (3rd ed.). New York: McGraw-Hill Publishing.

Plucker, J. A. (2003). Human intelligence: Historical influences, current controversies, teaching resources. Retrieved April 27, 2012, from Human intelligence: http://www.indiana.edu/~intell/sternberg_interview.shtml

Schaeffer, F. A. (1968). Escape From Reason. Downers Grove, Illinois, USA: InterVarsity Press.

University of Oregon, S. (2001, April 17). Theories of Intelligence. Retrieved April 28, 2012, from Oregon Technology in Education Council: http://otec.uoregon.edu/intelligence.htm


Docendo Discimus - We Learn By Teaching



There are many theories of how people learn. Almost everyone who has written or spoken about learning has a different opinion of how it takes place and how the brain processes information. From an experiential point of view, learning that can be accurately retrieved is a direct result of how, and under what circumstances, specific information is encoded and stored in long-term memory. One of the most effective ways this is done is through learning material with a view to teaching that same material. In the academic as well as the personal realm, being able to teach someone else what has been learned serves as proof that one has learned the specific information.

Clearer Knowledge
The foundation of learning is thinking about the facts and being able to restate those ideas in a truthful manner with objectivity. From a teaching perspective, it is imperative that we understand what is fundamental to all teaching:
            “No progress can be made in teaching any subject until the facts, the truth about it, are imparted. All systems of education begin here” (Marquis, 1917).
The use of questions through elaborative interrogation, whether internally or externally initiated, forces one to combine the new information with existing knowledge for what one hopes will be a clearer and more enduring store of knowledge. Donald K. Adams adds to this thought with a poignant phrase when he says, “Any constructs that require us to close our eyes to any of the phenomena of experience are bad constructs” (Adams, 1954). Along with clearer knowledge about ideas, the information to be learned has to be meaningful.
Better Remembered
Meaningfulness is probably the most important element in the learning environment, and it is usually dependent on felt needs. This thought can be closely tied to Maslow’s Hierarchy of Needs from his 1943 paper, A Theory of Human Motivation, because instinctual needs are felt needs whether or not they reside in consciousness. Meaningfulness is also an important component of behavioral and cognitive theories, in that people most often respond to and process information in ways that meet their felt needs. Therefore, in light of the aspect of learning addressed in this paper, finding meaning is paramount to better encoding through connecting new ideas to current thought structures. To be useful, learning itself must “imply at least some permanence in connections” (Thorndike, 1913). However, sometimes people do get in their own way on the road to understanding.
Variables
Many things can derail the learning process. Within the field of psychology, many people miss an important fact: “In order to work, psychological laws have to use psychological variables: i.e. acts rather than responses, organisms as personalities, rather than proton-electron aggregates, pieces of protoplasm, or cell assemblies, and objects rather than stimuli” (Adams, 1954). First, does the individual have the intention to learn? Learning has many facets or acts that can be visible, and they can be products of trial and error based on stimulus-response conditioning, often taught through the implementation of punishments and rewards, or they can be self-directed efforts to improve understanding of an idea or set of ideas. Second, is the individual interested in the topic, and if not, can it be incentivized? Incentive is a difficult target to hit, and remembering that people have personalities as varied as the people themselves is key to finding the bull’s-eye. Positive reinforcement in an educational setting can be very effective. When someone is able to use the information learned in a positive way, it elevates their self-confidence and enhances their self-image. Other learning environments require other motivators. Some may require negative reinforcement to strengthen the desired behavior. How do we know we have cleared the variable hurdles?
Proof
Why is teaching possibly the best way to demonstrate what has been learned? The connection between learning and memory is best exemplified by restating, in a coherent way, the knowledge acquired through learning. A critical error is often perpetuated by instructors who think testing is a good measure of what has been learned. It could be expressed this way: “The average examination tests very little more than memory” (Schaeffer, 1901). Testing, for the most part, is akin to teaching a parrot to speak. We can teach the bird to repeat a word, but the bird cannot explain the meaning of the word; therefore, the bird has not learned how to understand the word, only how to say it. Suffice it to say, “In studying his lessons the average schoolboy’s sole aim is to be able to repeat once, and only once, the knowledge before him” (Lyon, 1917). It is not difficult to memorize lists of names, words, and numbers, but attributing meaning to each of these is proof of learning. Regurgitation of facts is not necessarily indicative of true knowledge of a subject. Adding meaning to what is regurgitated is evidence that an idea has been learned. What is gained by learning with a view to teaching?
Final Words
When we learn with a view to being able to teach the material, we become motivated during the encoding process. Our motivation can produce openness to the ideas studied while our schemas keep our storage of the information in perspective. This format retains much in common with narrative theory, involving sequential, action-oriented, and detailed thought. Truth and reality must come first in the learning process. Based on the correspondence view of truth, ideas not grounded in reality should not be considered to have been learned. This does not include ideas presented as hypothesis or theory, because those must be understood in light of what they are and not presented as facts. Teaching as a proof of learning is to exhibit comprehension of an idea, although not always exhaustively, in a truthful manner and with considerable depth of understanding.
Conclusion
Docendo discimus are not only words Seneca the Younger wrote in his letters to Lucilius Junior in the Epistulae Morales around 50 – 65 AD; they are the words that I believe best describe the learning process. Others in higher education agree with this belief, having taken these words and employed them as mottos at their various institutions. If we encode ideas, store them, and are able to retrieve them with an adequate depth of true knowledge, we can be said to have learned the material. I believe in this idea so deeply that this blog bears its name. Therefore, like Seneca, I believe we learn by teaching, or docendo discimus. 

Works Cited
Adams, D. K. (1954). Learning and Explanation. In Learning Theory, Personality Theory, and Clinical Research: the Kentucky Symposium (pp. 69-77). New York, NY, USA: Wiley.
Lyon, D. O. (1917). The Educational Value of Psychological Research. In D. O. Lyon, Memory and the Learning Process (p. 156). Baltimore, MD, USA: Warwick & York.
Marquis, J. A. (1917). Learning to Teach from The Master Teacher. Philadelphia: Westminster Press.
Schaeffer, N. C. (1901). The Materials of Thought. In Thinking and Learning To Think (p. 50). Philadelphia: Lippincott Company.
Thorndike, E. L. (1913). The Original Nature of Man (Vol. I). New York, NY, USA: Teachers College, Columbia University.