That explains it!

I discovered Russian car cam videos a while back, and today I read an article that explains why all those crashes happen:

Russians have had trouble shaking off one stereotype for centuries: razgildyaistvo, or negligence and carelessness. Razgildyaistvo, some say, is as Russian as long, cold winters — a Russian institution of its own making.

How accurate is this stereotype?

Prominent historian and journalist Leonid Mlechin perhaps put it best: “Razgildyaistvo is part of the Russian character for the simple reason that Russians often have difficulty following rules and instructions. The Russian workplace is all too often defined by a lack of discipline and system of control.”

Now I understand the mindset that has led to some pretty mind-boggling events behind the wheel. Here is just a sampling:

A Republic if you can keep it

Following my post on “cashless communism”, I’d like to address why democracy doesn’t work, either.

There’s a story (and I blatantly paraphrase) of Ben Franklin being asked what form of government had come out of the Constitutional Convention, to which he replied, “A Republic, if you can keep it.” This sentiment was strong amongst the Founding Fathers: a democracy was sure to fail, but a republic offered more stability.

There’s a great article on the website of the journal The New American that catalogs this belief far better than I ever could. I won’t expound on this topic like I did cashless communism, but in a nutshell, democracy unrestrained is mob rule. The article mentions this quote, but it’s worth reiterating on its own:

“A democracy cannot exist as a permanent form of government. It can only exist until the majority discovers it can vote itself largess out of the public treasury. After that, the majority always votes for the candidate promising the most benefits with the result the democracy collapses because of the loose fiscal policy ensuing, always to be followed by a dictatorship, then a monarchy.” – Alexander Fraser Tytler, Lord Woodhouselee

There is debate as to whether Lord Woodhouselee actually made the statement attributed to him, but the basis of the statement is fairly sound. We’ve seen this in history, and we have seen the West attempt to proselytise democracy across the Middle East for the past decade or so. The Arab Spring was based on a premise of spreading democracy. We may have the opportunity to test this theory as current political affairs unravel, at least in the descent from democracy into dictatorship.

So is this also the fate of the United States? We’re a republic and not a democracy, right? I’m sure the U.S. will cease to exist as we know it at some point. Not too many governments can stand the test of time. The name of a nation may stand, but its underlying political structure changes. The Roman Republic fell with the ascendancy of Gaius Julius Cæsar. A couple hundred years later the whole empire collapsed. Great Britain is today by no means the same type of monarchy that was established by William the Conqueror (also known by another name by those who weren’t fond of him) when he unseated the last Anglo-Saxon kings. The Magna Carta saw Britain’s nobles force the Angevin king John Lackland to capitulate some of the powers of the throne. Parliament, and particularly the House of Commons, grew in power over the centuries under the premise of representing the people. History repeats itself, so we must understand what the possible outcomes are. A nation with that kind of insight can guide itself into calm waters.

To me this all goes back to a basic belief expressed in my post on Secession and the Christian:

13 [This is] the end of the matter; all hath been heard: fear God, and keep his commandments; for this is the whole [duty] of man. 14 For God will bring every work into judgment, with every hidden thing, whether it be good, or whether it be evil. (Ecclesiastes 12:13-14)

Cashless communism

I watch the keywords that lead people to this site, and one from this week caught my attention. Somehow through the magic of search engines, someone looking for the phrase “cashless communism” was directed here. I’m not sure how that happened, but I thought I would expound on the topic for fun.

Definitions

So first, let’s define the two words in question. Per the Macmillan Dictionary, cashless is defined simply as “done without any exchange of cash”, and communism is “a political and economic system in which individual people cannot own property or industries and in which people of all social classes are treated equally”. That sounds like a great concept, and I can think of an excellent example of where we can see this principle implemented today: a graveyard. I can’t think of any other examples of where this has ever existed. There have been cashless societies that were not made up of equals, and there have been communist societies that have always had a method of exchange.

Concepts

Cashless. There have been a few societies in history that have come close to this concept, but even then there was always a method of bartering. Hunter-gatherers don’t have a lot of need for cash, but I don’t see how cultures from the agrarian onward can function without money. If you don’t have a direct one-to-one transfer of goods, then there has to be some medium of exchange. I do not consider myself to be even a novice economist, so at this point I will direct you to the Ludwig von Mises Institute for a proper education in sound economics. Their approach may sound a little contrarian at first glance, but doesn’t simple truth usually run contrary to popular wisdom anyway?

Communism. I am unaware of any pure communist state that has ever succeeded. As I stated above, it is a great concept, but unfortunately, humanity gets in the way of all that equality. Take the Soviet Union, for example: everyone was equally poor except for the oligarchs driving the state apparatus. Equality is an idea that can only be met by the lowest common denominator. Those who want to excel will not be interested in holding themselves back for the betterment of their fellow man. I’ve heard of a concept of Christian communism that is purported to have existed in the 1st-century Church. Proponents of this concept usually quote Acts 2:44-47:

44 Now all who believed were together, and had all things in common, 45 and sold their possessions and goods, and divided them among all, as anyone had need. 46 So continuing daily with one accord in the temple, and breaking bread from house to house, they ate their food with gladness and simplicity of heart, 47 praising God and having favor with all the people. And the Lord added to the church daily those who were being saved.

First, note that while they had all things in common, they did not sell all their possessions and goods, and they did not divide them equally amongst all; they divided the goods based on need. Now note that they still broke bread from house to house. It is fair to assume that they were still working and buying bread. I think the rest of the book of Acts will bear that out. Also look at the account of Ananias and Sapphira from Acts 5:

1 But a certain man named Ananias, with Sapphira his wife, sold a possession, 2 and kept back [part] of the price, his wife also being privy to it, and brought a certain part, and laid it at the apostles’ feet. 3 But Peter said, Ananias, why hath Satan filled thy heart to lie to the Holy Spirit, and to keep back [part] of the price of the land? 4 While it remained, did it not remain thine own? and after it was sold, was it not in thy power? How is it that thou hast conceived this thing in thy heart? thou hast not lied unto men, but unto God. 5 And Ananias hearing these words fell down and gave up the ghost: and great fear came upon all that heard it. 6 And the young men arose and wrapped him round, and they carried him out and buried him.

7 And it was about the space of three hours after, when his wife, not knowing what was done, came in. 8 And Peter answered unto her, Tell me whether ye sold the land for so much. And she said, Yea, for so much. 9 But Peter [said] unto her, How is it that ye have agreed together to try the Spirit of the Lord? behold, the feet of them that have buried thy husband are at the door, and they shall carry thee out. 10 And she fell down immediately at his feet, and gave up the ghost: and the young men came in and found her dead, and they carried her out and buried her by her husband. 11 And great fear came upon the whole church, and upon all that heard these things.

I do not think that the concept that early Christianity was communist stands on its own when honestly analysed. But back to the concept of communism itself: I would direct you to the Marxists Internet Archive for a more in-depth study. Caveat emptor: I have not spent much time on this site, so don’t count this link as an endorsement of anything contained on it.

So how would a cashless communist state operate? 

First, it would have to be robust enough to sustain itself without any external commerce. All production would have to be accomplished internally. There would have to be a rigid division of labor. One could not be free to fulfil his own ambitions that might be contrary to the role that society needed him to play. This strict control could only be maintained by a central authority, and dissenters would have to be dealt with harshly. Those controlling the apparatus would of necessity not be producers: you can’t work the farm and run the country. King Saul tried this early in his career, but was pulled in full time shortly thereafter. This then gets into one of my favorite Orwellian quotes: “All animals are equal, but some animals are more equal than others”.

Second, we’ll now stick with whimsy and assume that everyone’s labor will be rewarded equally. To do this, only products that yield consistently and in great quantity can be produced. That rules out the consumption of meat for the most part: it is inefficient to produce, consumes other usable resources, and its output is variable. That leaves only the most robust and bland foodstuffs for consumption. Storage would have to be a concern, so only foods that can be stored long-term at ambient temperatures can be utilized. Man cannot live by bread alone, but grains and root crops would be the staple foods of such a society. Much entertainment would not exist; it adds no value in such a society. Religion, a core tenet of humanity, would exist either in the form of a state religion or not at all, and the latter is not atheism so much as a system in which the god is the state.

Third is how goods are distributed in this cashless society. The central authority would have to track everyone’s output to determine distribution. There would have to be 100% accountability of all citizens/serfs/miserable wretches in this culture. What would stop those controlling distribution from skewing allocations? How can you have checks and balances in favor of the workers when the oligarchs are the only ones in overhead positions? It is in their interests to serve themselves first. How would the workers mandate an accurate audit of records? I don’t know. NB: I’m a little biased against communism, as you may have ascertained.

What might work?

There is possibly a way that this could work, but it involves devolving to the most stable (in my humble estimation) form of government: small-scale tribalism. It won’t work in “advanced” societies; things just don’t scale that well. With tribalism, you revert back to extended family groups in small geographic areas that can be self-sustaining. Everyone has a vested interest in being his brother’s (or cousin’s) keeper. Those who have experienced life the most (elders) are given opportunity to lead. Honor holds ambition in check (see Brett McKay’s great series on this topic). Those who shun honor are cast out. Tribalism is the closest to a functioning cashless communism that I can perceive.

Why adults don’t like bicycle helmets

What isn’t said in the summary below is why adults don’t like to wear bicycle helmets: They make you look like a dork.

To Encourage Biking, Lose the Helmets: Hugh Pickens writes in about the detrimental effects of mandatory helmet laws (at least as applied to adults): “Elisabeth Rosenthal writes that in the United States the notion that bike helmets promote health and safety by preventing head injuries is taken as pretty near God’s truth but many European health experts have taken a very different view. ‘Yes, there are studies that show that if you fall off a bicycle at a certain speed and hit your head, a helmet can reduce your risk of serious head injury,’ writes Rosenthal. ‘But such falls off bikes are rare — exceedingly so in mature urban cycling systems.’ On the other hand, many researchers say, if you force people to wear helmets, you discourage them from riding bicycles causing more health problems like obesity, heart disease, and diabetes. Bicycling advocates say that the problem with pushing helmets isn’t practicality but that helmets make a basically safe activity seem really dangerous, which makes it harder to develop a safe bicycling network like the one in New York City, where a bike-sharing program is to open next year. The safest biking cities are places like Amsterdam and Copenhagen, where middle-aged commuters are mainstay riders and the fraction of adults in helmets is minuscule. ‘Pushing helmets really kills cycling and bike-sharing in particular because it promotes a sense of danger that just isn’t justified — in fact, cycling has many health benefits,’ says Piet de Jong. ‘Statistically, if we wear helmets for cycling, maybe we should wear helmets when we climb ladders or get into a bath, because there are lots more injuries during those activities.'”

Read more of this story at Slashdot.

How and why do myths arise?

Thoughts on myths from a true academic:

Myth: A Very Short Introduction
By Robert A. Segal

It is trite to say that one’s pet subject is interdisciplinary. These days what subject isn’t? The prostate? But myth really is interdisciplinary. For there is no study of myth as myth, the way, by contrast, there is said to be the study of literature as literature or of religion as religion. Myth is studied by other disciplines, above all by sociology, anthropology, psychology, politics, philosophy, literature, and religious studies. Each discipline applies itself to myth. For example, sociologists see myth as something belonging to a group.

Within each discipline are theories. A discipline can harbor only a few theories or scores of them.  What makes theories theories is that they are generalizations. They presume to know the answers to one or more of the three main questions about myth:  the origin, the function, or the subject matter.

The question of origin asks why, if not also how, myth arises. The answer is a need, which can be of any kind and on the part of an individual, such as the need to eat or to explain, or on the part of the group, such as the need to stay together. The need exists before myth, which arises to fulfill the need. Myth may be the initial or even the sole means of fulfilling the need. Or there may be other means, which compete with myth and may best it. For example, myth may be said to explain the physical world and to do so exceedingly well — until science arises and does it better. So claims the theorist E. B. Tylor, the pioneering English anthropologist.

Function is the flip side of origin. The need that causes myth to arise is the need that keeps it going. Myth functions as long as both the need continues to exist and myth continues to fulfill it at least as well as any competitor. The need for myth is always a need so basic that it itself never ceases. The need to eat, to explain the world, to express the unconscious, to give meaningfulness to life – these needs are panhuman. But the need for myth to fulfill these needs may not last forever. The need to eat can be fulfilled through hunting or farming without the involvement of myth. The need to express the unconscious can be fulfilled through therapy, which for both Sigmund Freud and his rival C. G. Jung is superior to myth. The need to find or to forge meaningfulness in life can be fulfilled without religion and therefore without myth for secular existentialists such as Albert Camus.

For some theorists, myth has always existed and will always continue to exist. For others, myth has not always existed and will not always continue to exist. For Mircea Eliade, a celebrated Romanian-born scholar of religion, religion has always existed and will always continue to exist. Because Eliade ties myth to religion, myth is safe. For not only Tylor but also J. G. Frazer, author of The Golden Bough, myth is doomed exactly because myth is tied to religion. For them science has replaced religion and as a consequence has replaced myth. “Modern myth” is a contradiction in terms.

The third main question about myth is that of subject matter. What is myth really about? There are two main answers: myth is about what it is literally about, or myth symbolizes something else. Taken literally, myth is usually about gods or heroes or physical events like rain. Tylor, Eliade, and the anthropologist Bronislaw Malinowski all read myth literally. Myth taken literally may also mean myth taken historically, especially in myths about heroes.

The subject matter of myth taken symbolically is open-ended. A myth about the Greek god Zeus can be said to symbolize one’s father (so Freud), one’s father archetype (so Jung), or the sky (so nature mythologists). The religious existentialists Rudolf Bultmann and Hans Jonas would contend that the myth of the biblical flood is to be read not as an explanation of a supposedly global event from long ago but as a description of what it is like for anyone anywhere to live in a world in which, it is believed, God exists and treats humans fairly.

To call the flood story a myth is not to spurn it. I am happy to consider any theory of myth, but not the crude dismissal of a story or a belief as a “mere myth.” True or false, myth is never “mere.” For to call even a conspicuously false story or belief a mere myth is to miss the power that that story or belief holds for those who accept it. The difficulty in persuading anyone to give up an obviously false myth attests to its allure.

Robert A. Segal is Sixth Century Chair in Religious Studies at the University of Aberdeen. He is the author of Myth: A Very Short Introduction and of Theorizing about Myth. He is presently at work editing the Oxford Handbook of Myth Theory. He directs the Centre for the Study of Myth at Aberdeen.

Who Was Who online, part of Who’s Who online, has granted free access for a limited time to the entries for the philosophers and scholars mentioned in the above article.

Is a degree worth it?

This post is on a topic that weighs heavily on my mind given that I am in the throes of completing my last course before being awarded a Master of Science in Management. My comments follow:

Liberating the Liberal Arts—and Making Higher Education Affordable:
By Andrew J. Coulson

If you wanted a liberal arts education in 1499, you were probably out of luck. But, if you happened to be a 0.01 percenter, you might have been able to saddle up the horse and ride to Oxford or Cambridge. Because that’s where the books were. Books didn’t generally come to you; you had to go to them.

Today, every one of us has more works of art, philosophy, literature, and history at our fingertips than existed, worldwide, half-a-millennium ago. We can call them up, free or for a nominal charge, on electronic gadgets that cost little to own and operate. Despite that fact, we’re still captives to the idea that a liberal arts education must be dispensed by colleges and must be acquired between the ages of 19 and 22.

But the liberal arts can be studied without granite buildings, frat houses, or sports venues. Discussions about great works of literature can be held just as easily in coffee shops as in stadium-riser classrooms—perhaps more easily. Nor is there any reason to believe that there is some great advantage to concentrating the study of those works in the few years immediately after high school—or that our study of them must engage us full-time. The traditional association of liberal arts education and four-year colleges was already becoming an anachronism before the rise of the World Wide Web. It is now a crumbling fossil.

Handing colleges tens of thousands of dollars—worse yet, hundreds of thousands—for an education that can be obtained independently at little cost, would be tragically wasteful even if the college education were effective. In many cases, it is not. Research by Richard Arum and Josipa Roksa reveals that almost half of all college students make no significant gains in critical thinking, complex reasoning, or written communication after two full years of study. Those are skills that any liberal arts education should cultivate. Even among the subset of students who linger for four years at college, fully one-third make no significant gains in those areas.

And yet, instead of recognizing the incredible democratization of access to the liberal arts that modern technology has permitted, and the consequent moribundity of this role of colleges, public policy remains mired in a medieval conception of higher education. Politicians compete to promise what they think young people want to hear: more and larger subsidies for college fees. Even if perpetuating a 15th century approach to higher education were in students’ interests—which it clearly is not—increased government subsidies for college tuition would not achieve that goal. We have tripled student aid in real, inflation-adjusted dollars since 1980, to roughly $14,000 per student, and yet student debt recently hit an all-time high of roughly $1 trillion. And barely half of students at four-year public colleges even complete their studies in six years. Aid to colleges is good for colleges, but it is an outlandish waste of resources if the goal is to improve the educational options available to young people.

These same realities apply, to an only slightly lesser extent, to the sciences and engineering. In those areas, colleges sometimes have equipment and facilities of instructional value that students could not independently afford. But even in the sciences and engineering, such cases are limited. It is perfectly feasible for an avid computer geek to learn everything he or she needs to know to work in software engineering by doing individual and group projects with inexpensive consumer hardware and software. This was even possible before the rise of the Web. My first direct supervisor at Microsoft was a brilliant software architect hired right out of high school… in the 1980s.

Though most politicians have been slow on the uptake, the public seems increasingly aware of all this. A Pew Research survey finds that 57 percent of Americans no longer think college is worth the money. So why are they still sending their kids there? Habit is no doubt part of the picture, but so is signaling. People realize that colleges are instructionally inefficient, but being accepted to and graduating from an academically selective one signals ability and assiduity to potential employers.

So what’s the solution? Alternative signaling options and better hiring practices would be a good start. Anyone who studies hard for the SAT, ACT, GRE, or the like, and scores well, can send the academic ability signal. But in the end, employers want more than academic ability. What they really want are subject area expertise, a good work ethic, an ability to work smoothly with a variety of people, and, for management, leadership ability. Any institution that develops good metrics for these attributes, and issues certifications accordingly, will provide an incredibly valuable service for employers. Students would then be free to study independently, occasionally paying for instruction where necessary, and then seek a certification signaling what they’ve learned. In the meantime, job candidates can create a portfolio of work on the Web showing what they know and can do (a “savoir faire”)—which would be more useful to employers than most resumes.

What is certainly not useful is raising taxes still further in a time of economic difficulty in order to pad the budgets of colleges and encourage students to take on yet more college debt.
Liberating the Liberal Arts—and Making Higher Education Affordable is a post from Cato @ Liberty – Cato Institute Blog

If you’ve read my blog for very long, you’ve probably noticed I have an admiration for things medieval, but I have to agree in spirit with Mr. Coulson on this issue. I think the state of post-secondary education, at least in my experience, is less than ideal.

It took me from 1996 to 2008 to earn my Bachelor’s Degree, and from then until now (or a few weeks from now) to earn my Master’s. It’s been a long ordeal and my wife and kids have patiently suffered through it.

I never had a “traditional” college experience, or an education in liberal arts. After finishing my military training so that I would have access to the GI Bill, I worked at McDonald’s as a shift manager on days (and nights) that I wasn’t attending classes. There’s only so much fast food employment that a person can take, and I moved on after that to other full-time jobs while attending school at night full time. My first roadblock was when I was discharged from the National Guard. My GI Bill benefits immediately ceased, even though I had not used up all my funds. This is a big difference between the Active Duty GI Bill and what was given to reserve components at that time. So, my enrollment went from full time down to what I thought I could afford. In 2004 I received my first college degree, an Associate of Applied Science in Missile and Munitions Technology, a degree for which I took no courses in missiles or munitions technology. At that time (and it may still be now) that was the only degree toward which the community college I was attending would apply military experience listed on an AARTS transcript. I didn’t have to take any extra classes to get it, and it was at least a degree that I could list on my résumé. I trudged on there for another couple years building up the credit I would need to transfer to a four-year school.

Around that time I had worked in IT enough to realize two things: technicians and engineers who were promoted to management didn’t always make good managers, and managers always made more money, whether they understood what the techs under them were doing or not. And then I discovered Faulkner University’s Bachelor of Business Administration program, which compressed the last two years of undergraduate studies into a single, lock-stepped, regimented year. This was an interesting program. Terms were broken down into eight-week sessions consisting of two one-night classes, an independent study, and a one-day seminar at the end of the eight weeks at the main Montgomery, AL campus. I spent nearly every Friday night before the seminar up late in my hotel room putting the final touches on the paper that had to be submitted for the independent study, including lugging a laser printer and binding machine up to my hotel room every time. I don’t intentionally procrastinate, but I seem to think better with a little pressure. The end result of that endeavor was a Bachelor’s degree from a private “Christian” university. I don’t know if that says anything about my integrity, but I did find the courses on ethics and marriage and family, from that perspective, highly valuable.

After I finished that up, I immediately entered a Master’s program. I had a couple options there: Faulkner had a Master of Science in Management program that was compressed into a single year, but it would require a trek down to Montgomery every other week (at a time when the price of a gallon of gas was skyrocketing), so I went with the MSM offering that Embry-Riddle Aeronautical University had at their Redstone Arsenal satellite campus. It wasn’t lockstep like Faulkner’s was, but it was local. Given Embry-Riddle’s aeronautical focus, I stuck with the general option and took courses within it that allowed me to pick up a graduate certificate in project management. I finished all of the actual classwork for the program in 2010 and then took my time working on the capstone project that was required to graduate. I like to think I was waiting for an epiphany so that I could write a great paper, but I think it was the lack of structure that allowed me to drag it out so long. Ultimately, some things did fall into place earlier this year that allowed me to draft a much better paper than if I had jumped onto it immediately. So that brings me to the present, where I am two weeks into MGMT 690, which is the course in which I submit my paper to my committee for a pass/fail. I’ve been working with them since this past Spring, so I’m confident that I’ll have approving signatures in the next week or two.

All that being said, here is my take on the state of post-secondary education in the United States: it is still a necessary evil, but my best learning has never occurred in a classroom. I learned in the classroom what I needed to obtain credits to earn degrees. Employers look for degrees as threshold requirements. I learned the hard way that it really doesn’t matter what your skills are if you don’t have the magic key that opens the door to an interview. With so many companies using automated parsing of applicants, certain buzzwords like “Bachelor” and “Master” have to be there to keep from getting thrown into the bit bucket before human eyes even have a chance to give you a second glance. Degrees are gates that one must pass through to reach one’s objectives.
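The keyword screening described above can be sketched in a few lines. This is a hypothetical illustration, not any real applicant-tracking system: the function name, keyword list, and sample résumés are all my own assumptions, but they show how a résumé lacking the magic words gets binned before a human ever sees it.

```python
# A minimal sketch of automated résumé keyword screening.
# REQUIRED_KEYWORDS and the sample résumés are illustrative only.

REQUIRED_KEYWORDS = {"bachelor", "master"}

def passes_keyword_screen(resume_text: str) -> bool:
    """Return True if the résumé mentions at least one required degree keyword."""
    text = resume_text.lower()
    return any(keyword in text for keyword in REQUIRED_KEYWORDS)

resumes = [
    "10 years of IT experience; Master of Science in Management.",
    "Self-taught engineer with a strong project portfolio.",
]

for resume in resumes:
    print(passes_keyword_screen(resume))  # prints True, then False
```

Real systems do fuzzier matching than a plain substring test, but the effect is the same: the second candidate never reaches the bit bucket's far side, regardless of skill.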

However, I don’t think a traditional education is the way to go. I worked my way through college in some pretty good jobs that helped develop my skills so that I had real world experience while I was being taught the theory behind it. Starting college with no experience would have been disastrous for me. Starting college without having first served in the military would have had the same results. I needed the discipline.

And given the apparent debauchery that goes on in many American universities, I cannot fathom being a parent and throwing away my hard-earned money that way. Like many students, I have a pretty hefty student loan debt, but since I’m the one having to pay for it, I’ve always had a desire to get my money’s worth in the classroom. I wasn’t in a fraternity, and the only college partying I ever did consisted of a weekend when I was in AIT at Ft. Gordon, GA, and went with a buddy to visit his sister at the University of Georgia in Athens. We spent the Friday night searching for her at a couple frat houses, only to give up and spend the night sleeping on the floor of one of his friends’ apartments. We met up with his sister sometime that Saturday, spent the afternoon at her sorority house (an interesting experience), took in the only college football game I’ve ever been to, and then went back to her sorority house. His sister was supposed to have gone out to pick us up some dinner, but several hours later we had to go pick her up from the pokey because she’d gotten arrested for being drunk. That is my “college” experience.

So traditional liberal education… meh. I’d rather read the classics on my own, develop my own theories, read the blogs of intelligent people that maybe I can learn from, and have a lifelong liberal arts education – minus the degree.

Degrees are essential in the current business environment, and I wholeheartedly endorse the pursuit of an education that has a return on investment. However, many degrees are not worth the paper they are written on, and many years (and dollars) can be wasted in pursuit of them. I’ll end with a joke to make my point.

The engineering graduate asks, “How does it work?” The science graduate asks, “Why does it work?” The accounting graduate asks, “How much will it cost?” The arts graduate asks, “Do you want fries with that?”

Or:

What do you call a liberal arts graduate? Barista.