How and why do myths arise?

Thoughts on myths from a true academic:

How and why do myths arise?:

Myth: A Very Short Introduction
By Robert A. Segal

It is trite to say that one’s pet subject is interdisciplinary. These days what subject isn’t? The prostate? But myth really is interdisciplinary. For there is no study of myth as myth, the way, by contrast, there is said to be the study of literature as literature or of religion as religion. Myth is studied by other disciplines, above all by sociology, anthropology, psychology, politics, philosophy, literature, and religious studies. Each discipline applies itself to myth. For example, sociologists see myth as something belonging to a group.

Within each discipline are theories. A discipline can harbor only a few theories or scores of them. What makes theories theories is that they are generalizations. They presume to know the answers to one or more of the three main questions about myth: the origin, the function, or the subject matter.

The question of origin asks why, if not also how, myth arises. The answer is a need, which can be of any kind and on the part of an individual, such as the need to eat or to explain, or on the part of the group, such as the need to stay together. The need exists before myth, which arises to fulfill the need. Myth may be the initial or even the sole means of fulfilling the need. Or there may be other means, which compete with myth and may best it. For example, myth may be said to explain the physical world and to do so exceedingly well — until science arises and does it better. So claims the theorist E. B. Tylor, the pioneering English anthropologist.

Function is the flip side of origin. The need that causes myth to arise is the need that keeps it going. Myth functions as long as both the need continues to exist and myth continues to fulfill it at least as well as any competitor. The need for myth is always a need so basic that it itself never ceases. The need to eat, to explain the world, to express the unconscious, to give meaningfulness to life – these needs are panhuman. But the need for myth to fulfill these needs may not last forever. The need to eat can be fulfilled through hunting or farming without the involvement of myth. The need to express the unconscious can be fulfilled through therapy, which for both Sigmund Freud and his rival C. G. Jung is superior to myth. The need to find or to forge meaningfulness in life can be fulfilled without religion and therefore without myth for secular existentialists such as Albert Camus.

For some theorists, myth has always existed and will always continue to exist. For others, myth has not always existed and will not always continue to exist. For Mircea Eliade, a celebrated Romanian-born scholar of religion, religion has always existed and will always continue to exist. Because Eliade ties myth to religion, myth is safe. For not only Tylor but also J. G. Frazer, author of The Golden Bough, myth is doomed exactly because myth is tied to religion. For them science has replaced religion and as a consequence has replaced myth. “Modern myth” is a contradiction in terms.
The third main question about myth is that of subject matter. What is myth really about? There are two main answers: myth is about what it is literally about, or myth symbolizes something else. Taken literally, myth is usually about gods or heroes or physical events like rain. Tylor, Eliade, and the anthropologist Bronislaw Malinowski all read myth literally. Myth taken literally may also mean myth taken historically, especially in myths about heroes.

The subject matter of myth taken symbolically is open-ended. A myth about the Greek god Zeus can be said to symbolize one’s father (so Freud), one’s father archetype (so Jung), or the sky (so nature mythologists). The religious existentialists Rudolf Bultmann and Hans Jonas would contend that the myth of the biblical flood is to be read not as an explanation of a supposedly global event from long ago but as a description of what it is like for anyone anywhere to live in a world in which, it is believed, God exists and treats humans fairly.

To call the flood story a myth is not to spurn it. I am happy to consider any theory of myth, but not the crude dismissal of a story or a belief as a “mere myth.” True or false, myth is never “mere.” For to call even a conspicuously false story or belief a mere myth is to miss the power that that story or belief holds for those who accept it. The difficulty in persuading anyone to give up an obviously false myth attests to its allure.
Robert A. Segal is Sixth Century Chair in Religious Studies at the University of Aberdeen. He is the author of Myth: A Very Short Introduction and of Theorizing about Myth. He is presently at work editing the Oxford Handbook of Myth Theory. He directs the Centre for the Study of Myth at Aberdeen.

Who Was Who online, part of Who’s Who online, has granted free access for a limited time to the entries for the philosophers and scholars mentioned in the above article.
Image credit: Thetis and Zeus by Anton Losenko, 1769. Copy of artwork used for the purposes of illustration in a critical commentary on the work. Source: Wikimedia Commons. 

Is a degree worth it?

This post is on a topic that weighs heavily on my mind given that I am in the throes of completing my last course before being awarded a Master of Science in Management. My comments follow:

Liberating the Liberal Arts—and Making Higher Education Affordable:
By Andrew J. Coulson

If you wanted a liberal arts education in 1499, you were probably out of luck. But if you happened to be a 0.01 percenter, you might have been able to saddle up the horse and ride to Oxford or Cambridge, because that’s where the books were. Books didn’t generally come to you; you had to go to them.

Today, every one of us has more works of art, philosophy, literature, and history at our fingertips than existed, worldwide, half-a-millennium ago. We can call them up, free or for a nominal charge, on electronic gadgets that cost little to own and operate. Despite that fact, we’re still captives to the idea that a liberal arts education must be dispensed by colleges and must be acquired between the ages of 19 and 22.

But the liberal arts can be studied without granite buildings, frat houses, or sports venues. Discussions about great works of literature can be held just as easily in coffee shops as in stadium-riser classrooms—perhaps more easily. Nor is there any reason to believe that there is some great advantage to concentrating the study of those works in the few years immediately after high school—or that our study of them must engage us full-time. The traditional association of liberal arts education and four-year colleges was already becoming an anachronism before the rise of the World Wide Web. It is now a crumbling fossil.

Handing colleges tens of thousands of dollars—worse yet, hundreds of thousands—for an education that can be obtained independently at little cost, would be tragically wasteful even if the college education were effective. In many cases, it is not. Research by Richard Arum and Josipa Roksa reveals that almost half of all college students make no significant gains in critical thinking, complex reasoning, or written communication after two full years of study. Those are skills that any liberal arts education should cultivate. Even among the subset of students who linger for four years at college, fully one-third make no significant gains in those areas.

And yet, instead of recognizing the incredible democratization of access to the liberal arts that modern technology has permitted, and the consequent moribundity of this role of colleges, public policy remains mired in a medieval conception of higher education. Politicians compete to promise what they think young people want to hear: more and larger subsidies for college fees. Even if perpetuating a 15th century approach to higher education were in students’ interests—which it clearly is not—increased government subsidies for college tuition would not achieve that goal. We have tripled student aid in real, inflation-adjusted dollars since 1980, to roughly $14,000 per student, and yet student debt recently hit an all-time high of roughly $1 trillion. And barely half of students at four-year public colleges even complete their studies in six years. Aid to colleges is good for colleges, but it is an outlandish waste of resources if the goal is to improve the educational options available to young people.

These same realities apply, to an only slightly lesser extent, to the sciences and engineering. In those areas, colleges sometimes have equipment and facilities of instructional value that students could not independently afford. But even in the sciences and engineering, such cases are limited. It is perfectly feasible for an avid computer geek to learn everything he or she needs to know to work in software engineering by doing individual and group projects with inexpensive consumer hardware and software. This was even possible before the rise of the Web. My first direct supervisor at Microsoft was a brilliant software architect hired right out of high school… in the 1980s.

Though most politicians have been slow on the uptake, the public seems increasingly aware of all this. A Pew Research survey finds that 57 percent of Americans no longer think college is worth the money. So why are they still sending their kids there? Habit is no doubt part of the picture, but so is signaling. People realize that colleges are instructionally inefficient, but being accepted to and graduating from an academically selective one signals ability and assiduity to potential employers.

So what’s the solution? Alternative signaling options and better hiring practices would be a good start. Anyone who studies hard for the SAT, ACT, GRE, or the like, and scores well, can send the academic ability signal. But in the end, employers want more than academic ability. What they really want are subject area expertise, a good work ethic, an ability to work smoothly with a variety of people, and, for management, leadership ability. Any institution that develops good metrics for these attributes, and issues certifications accordingly, will provide an incredibly valuable service for employers. Students would then be free to study independently, occasionally paying for instruction where necessary, and then seek a certification signaling what they’ve learned. In the meantime, job candidates can create a portfolio of work on the Web showing what they know and can do (a “savoir faire”)—which would be more useful to employers than most resumes.

What is certainly not useful is raising taxes still further in a time of economic difficulty in order to pad the budgets of colleges and encourage students to take on yet more college debt.
Liberating the Liberal Arts—and Making Higher Education Affordable is a post from Cato @ Liberty – Cato Institute Blog

If you’ve read my blog for very long, you’ve probably noticed I have an admiration for things medieval, but I have to agree in spirit with Mr. Coulson on this issue. I think the state of higher education, at least in my experience, is less than ideal.

It took me from 1996 to 2008 to earn my Bachelor’s Degree, and from then until now (or a few weeks from now) to earn my Master’s. It’s been a long ordeal and my wife and kids have patiently suffered through it.

I never had a “traditional” college experience or an education in the liberal arts. After finishing my military training so that I would have access to the GI Bill, I worked at McDonald’s as a shift manager on days (and nights) when I wasn’t attending classes. There’s only so much fast-food employment a person can take, and I moved on after that to other full-time jobs while attending school at night full time. My first roadblock was when I was discharged from the National Guard: my GI Bill benefits immediately ceased, even though I had not used up all my funds. This is a big difference between the Active Duty GI Bill and what was given to reserve components at that time. So my enrollment went from full time down to what I thought I could afford. In 2004 I received my first college degree, an Associate of Applied Science in Missile and Munitions Technology, a degree for which I took no courses in missiles or munitions technology. At the time (and it may still be now), that was the only degree toward which the community college I was attending would apply military experience listed on an AARTS transcript. I didn’t have to take any extra classes to get it, and it was at least a degree I could list on my résumé. I trudged on there for another couple of years, building up the credits I would need to transfer to a four-year school.

Around that time I had worked in IT long enough to realize two things: technicians and engineers who were promoted to management didn’t always make good managers, and managers always made more money, whether or not they understood what the techs under them were doing. Then I discovered Faulkner University’s Bachelor of Business Administration program, which compressed the last two years of undergraduate studies into a single, lockstep, regimented year. It was an interesting program. Terms were broken into eight-week sessions consisting of two one-night-a-week classes, an independent study, and a one-day seminar at the end of the eight weeks at the main campus in Montgomery, AL. I spent nearly every Friday night before the seminar up late in my hotel room putting the final touches on the paper that had to be submitted for the independent study, including hauling a laser printer and a binding machine up to my hotel room, every time. I don’t intentionally procrastinate, but I seem to think better under a little pressure. The end result of that endeavor was a Bachelor’s degree from a private “Christian” university. I don’t know if that says anything about my integrity, but I did find the courses on ethics and on marriage and family, taught from that perspective, highly valuable.

After I finished that up, I immediately entered a Master’s program. I had a couple of options: Faulkner had a Master of Science in Management program compressed into a single year, but it would have required a trek down to Montgomery every other week (at a time when the price of a gallon of gas was skyrocketing), so I went with the MSM offering that Embry-Riddle Aeronautical University had at its Redstone Arsenal satellite campus. It wasn’t lockstep like Faulkner’s, but it was local. Given Embry-Riddle’s aeronautical focus, I stuck with the general option and took courses within it that allowed me to pick up a graduate certificate in project management. I finished all of the actual classwork for the program in 2010 and then took my time working on the capstone project required to graduate. I like to think I was waiting for an epiphany so that I could write a great paper, but I think it was the lack of structure that allowed me to drag it out so long. Ultimately, some things did fall into place earlier this year that allowed me to draft a much better paper than if I had jumped on it immediately. That brings me to the present, where I am two weeks into MGMT 690, the course in which I submit my paper to my committee for a pass/fail grade. I’ve been working with them since this past spring, so I’m confident that I’ll have approving signatures in the next week or two.

All that being said, here is my take on the state of higher education in the United States: it is still a necessary evil, but my best learning has never occurred in a classroom. I learned in the classroom what I needed to obtain the credits to earn degrees. Employers look for degrees as threshold requirements. I learned the hard way that it really doesn’t matter what your skills are if you don’t have the magic key that opens the door to an interview. With so many companies using automated parsing of applications, certain buzzwords like “Bachelor” and “Master” have to be there to keep your résumé from getting thrown into the bit bucket before human eyes even have a chance to give you a second glance. Degrees are gates that one must pass through to reach one’s objectives.

However, I don’t think a traditional education is the way to go. I worked my way through college in some pretty good jobs that helped develop my skills, so I had real-world experience while I was being taught the theory behind it. Starting college with no experience would have been disastrous for me. Starting college without having first served in the military would have had the same result. I needed the discipline.

And given the apparent debauchery that goes on at many American universities, I cannot fathom being a parent and throwing away my hard-earned money that way. Like many students, I have a pretty hefty student loan debt, but since I’m the one paying for it, I’ve always had a desire to get my money’s worth in the classroom. I wasn’t in a fraternity, and the only college partying I ever did was one weekend when I was in AIT at Ft. Gordon, GA, and went with a buddy to visit his sister at the University of Georgia in Athens. We spent the Friday night searching for her at a couple of frat houses, only to give up and spend the night sleeping on the floor of one of his friends’ apartments. We met up with his sister sometime that Saturday, spent the afternoon at her sorority house (an interesting experience), I got to see the only college football game I’ve ever attended, and then we went back to her sorority house. His sister was supposed to have gone out to pick us up some dinner, but several hours later we had to go pick her up from the pokey because she’d been arrested for being drunk. That is my “college” experience.

So traditional liberal education… meh. I’d rather read the classics on my own, develop my own theories, read the blogs of intelligent people that maybe I can learn from, and have a lifelong liberal arts education – minus the degree.

Degrees are essential in the current business environment, and I wholeheartedly endorse the pursuit of an education that has a return on investment. However, many degrees are not worth the paper they are printed on, and many years (and dollars) can be wasted in pursuit of them. I’ll end with a couple of jokes to make my point.

The engineering graduate asks, “How does it work?” The science graduate asks, “Why does it work?” The accounting graduate asks, “How much will it cost?” The arts graduate asks, “Do you want fries with that?”


What do you call a liberal arts graduate? Barista.