Chunder [chuhn•der] - Australian Informal
verb/noun - (to) vomitus horrendous.

Tuesday, April 17, 2012

The Sun Needs to Set on the USC Empire

Not long after USC announced its plan to build a new University Village in the image of the Grove, rumors spread that they want to buy the Coliseum for $50 million.

I understand that USC’s number one drawback is its location, but why are we aiming for perfection? If building this new shopping mall and gaining ownership of the stadium is really about creating a better area and attracting brighter students, it’s sending the opposite message.

If USC wants to attract students of Ivy League caliber, they have to do things of Ivy League status. Constructing a high-end shopping center and luxury apartments isn’t going to attract the next generation of Ivy League intellects, but rather those craving the status that accompanies such prestige. If we want to attract the minds of the future, then we shouldn’t resort to methods of the past by further commercializing an area that would benefit far more from a grassroots campaign.

If USC really wants to improve the area and attract the brightest applicants, why not use that money to improve the state of education in the surrounding area? The magnet schools around campus provide a great opportunity for the inner-city Los Angeles students who attend. But the influence of USC quickly dissipates the further you get from campus, even if we’re talking 1 or 2 miles. There are many students receiving a very subpar education in zip codes reachable by foot, so why not use that money to benefit them? Institute community programs to replace the arts and music classes that have been lost to budget cuts. Offer free college-prep courses to high school students in the area. Provide them with scholarships not only to USC, but to any accredited institution. If USC has the power to take ownership of the acres of land the UV sits on and the Coliseum, I’m sure they could finagle their way into the Los Angeles Unified School District and supplement the low budgets of nearby public schools as well.

Investing in education is the big-picture solution. If we improve the foundation on which the future sits, the students in the area will have more opportunities and, inherently, more success. They will include their families in their newfound glory, advancing their socioeconomic position and naturally improving the area around USC.

It will be a slow process, but one that benefits both the university and the area in the long run. Creating a megaplex shopping center will only validate our reputation as the “University of Spoiled Children.” If we want to attract students that will one day change the world, we have to change the world now.

Saturday, April 7, 2012

Big Bad Bully America


The United States’ position at the top of the industrialized world is the very reason why it sucks. Because Americans no longer have to fight for their survival, they’ve begun to fight against it.

Although there are times when there is, like, “nothing to drink” despite the virtually endless flow of clean water from our sinks, and other times when you have, like, “literally nothing to wear” despite the 50+ articles of clothing in your closet, most of us can agree that our basic survival needs are met on a daily basis.

But we’ve had it too good for too long, and the implications are becoming hard to ignore. Rather than fending off starvation, we suffer from an obesity epidemic.
While adolescents in developing nations band together for survival, we see our youth bullying each other to the point of suicide. Prolonged prosperity has invited a whole new realm of issues that are difficult to define and even harder to fix.

As mentioned, one of these difficult-to-define and hard-to-fix problems is bullying. In a typical bully-victim relationship, the bully, usually someone higher on the social ladder (a position achieved by instilling fear, probably thanks to “advanced” physical size; ahem, obesity epidemic), preys on victims because of a perceived physical, social, or financial ineptness (or because they have pudding snacks). Why the bully engages in such aggressive behavior is not quite understood, but it certainly lowers the self-esteem of the victim, making them more vulnerable.

But in all seriousness, bullying is becoming an increasing concern. On Sunday, a 16-year-old boy from Corpus Christi, Texas, committed suicide after years of torment and no effort from the school district to stop it. The school denied that bullying was a problem, despite community parents who spoke up and claimed to have withdrawn their children from the school because of unrelenting torment. This is just one of the dozens of cases that have made the national spotlight this year, and just one of millions of instances of bullying itself.

From an evolutionary perspective, bullying could be explained by Darwin’s “survival of the fittest” hypothesis. But we don’t have to fight for our survival anymore because it is practically guaranteed by our modern society—our psyche just needs to catch up with reality.

The autonomy allowed by our society creates a false sense of independence, with people underestimating the role others play in our lives. They think independence is buying their own groceries, but where would they be without markets in the first place? Because mass collaboration is no longer needed to survive, there has been a mass disintegration in the rush to “thrive.”

Psychologists argue that humans are social beings in the first place because we needed each other to stay alive. Men hunt and protect. Women gather and make babies. But with the autonomy allowed by our current industrialized society, interdependence is less pronounced. Fueled by the idea that we no longer need each other to get by, have we subconsciously begun to divide, doing what we can to retain the illusion of power we hold over others?

What is bullying if not a magnified version of the social stratification of America—an elitist group leveraging its power over a vulnerable lower class? The recent Occupy movements come to mind. They are driven by a group of people who feel victimized and immobilized by another group whose dominance is vague yet unyielding. There have been countless acts of physical aggression against the lower class in reaction to their pleas for equality. Earlier this week, around 30 people were pepper-sprayed outside a Santa Monica College board of trustees meeting while protesting proposed course fee hikes. Two of the victims were a mother and toddler in the crowd. Like bullying in school, this is one of countless examples.

Our prolonged prosperity has us fighting to ensure individual success, not collective success. And when this is reversed, we will see an end to the complex issues our society faces. We will be able to reap the benefits of being on top when people think about how they can help our whole population, not only themselves. If we were all working together, bills for universal healthcare would pass. We would petition to have education programs expanded, not cut. The wealthy would accept their social responsibility as the breadwinners and happily pay more taxes to provide more for the millions of people who live below them.

In order to stop sucking, we have to start sharing. 

Saturday, March 31, 2012

Los Angeles Publicly Transit-ed Disease


Los Angeles public transit is quite the ride. Occasionally, I take it from downtown to my job in Venice on Fridays. I would complain more but it takes about the same amount of time as actually driving during rush hour and is much cheaper than paying $5 per gallon for gas. It also provides steady entertainment.  

Last week, I was standing near the rear exit of a bus on the Metro 733 route that runs along Venice Boulevard. A man dressed in a suit chundered on the floor for a good 30 seconds. With every stop and go, it slid closer and closer to my feet until we were all asked to evacuate the bus and get on the next one.

A week before that, a homeless man sitting across from me peed himself while drinking from a flask concealed in a brown paper bag and simultaneously attesting to his sobriety. The aroma was splendid.

Needless to say, I’m more than excited for the highly anticipated Expo Line to debut on April 28th. The first phase, running from downtown LA to Culver City, has finally been approved to open to the public, after much more time and money than anticipated. It will be the first line to connect downtown and the Westside in over 50 years, and it’s about time.

Los Angeles’s lack of cohesiveness has always been a point of criticism, and as an Angeleno immigrant from the East Coast, I couldn’t agree more. It’s the only major city I’ve been to that lacks any easy and tangible way to get from point A to point B, and I think this disconnect is the source of its fragmented feeling. After living here for 3 years, I still don’t understand its boundaries. Is Beverly Hills in Los Angeles? Is Hollywood in Los Angeles? I’ve asked people who are from those areas, and they don’t even know. It doesn’t matter how many attractions there are in the general area, because if you can’t easily get from one to the other, you can’t appreciate their existence.

Maybe the new Expo Line will ignite a sense of unity among the numerous and diverse neighborhoods that fall under the larger umbrella of “Los Angeles,” mixing the ‘elite’ and the ‘street.’ I think the line will be very successful in attracting all kinds of riders, from the people who already frequent public transit, to businessmen and women looking for a shorter commute, to the many 20-somethings who want to avoid car payments and are advocates of green living.

I’m actually almost positive the line will be a success. It will surely be used by the lower class that already depends on buses. And I think it will attract the middle class and public transit virgins, because for some reason, public transportation attached to a rail of any sort is more appealing than public transportation on an ordinary road. In the ranks of public transportation, rail lines and subways are usually top-tier, with buses on the bottom as least desirable. Taking the subway is, like, totally cool and indie, and the bus is, like, gross. Ask any girl from LA who has visited a friend in New York City for a weekend.

And for my last personal anecdote, my favorite transit experience: several weeks ago, a man sitting 2 seats away from me kept humming Britney Spears songs and laughing uncontrollably. He dropped something under his seat and tried to pick up the object for a good 2 or 3 minutes. I realized it took him so long because he actually didn’t have either of his hands. I then realized the object he so fervently tried to pick up was a switchblade. Awesome.

Okay, so the Expo Line [thankfully] won’t prevent situations like that or those mentioned earlier. But it will provide a long overdue connection between the geographically close but culturally distant regions, allowing Los Angeles to function more like, well, a city. 

Saturday, March 24, 2012

STOP KONY: A New High for Social Media, A New Low for Gen Y


What does it take to get Gen Y invested in a global crisis involving the abduction of thousands of children and their subsequent conversion into soldiers and sex slaves? Branding and celebrity endorsements, of course.

This is not an attack on the STOP KONY movement or the unprecedented support the campaign has inspired. I doubt neither the authenticity of the cause nor the sincerity of its supporters. But what presents a new high for social awareness exposes a new low for my generation: the most viral video in history owes its success to social media and celebrities.

Reaching over 100 million views in two weeks and dominating the virtual worlds of Facebook, Twitter, and YouTube, the STOP KONY campaign speaks volumes about how we communicate, but it’s not saying anything good. Although it is groundbreaking that millions of people have been united by a common cause, they weren’t led there by global awareness or empathy, but by the delivery service of social media and the presumptuous pull of celebrities.

It would have been near impossible to miss the STOP KONY video if you were anywhere near the internet during the month of March, and even harder to ignore its message.

How did this video go viral so quickly? Invisible Children selected 20 “Culture Makers” to endorse and promote the cause. The list ranges from Gates to Gaga, Bono to Bieber: a range of celebrities whose combined audience covers nearly every media niche. The Culture Makers updated their social media sites with links to the video and pleas to repost. It worked as planned, and the video had millions of views within hours, and 100 million in a few weeks. People listened.

From the get-go, the video captures its audience with slick production and a heartfelt story. The plot involves an innocent toddler slowly learning about Joseph Kony and the Lord’s Resistance Army (LRA), a group of Ugandan children telling their stories, and a bunch of young adults, like you and me, taking a stand. Emotion-provoking songs by up-and-coming artists fill the background, and it ends with a strong call to action.

But the call to action asks the viewer to contribute by buying an “Action Kit,” which contains posters, a t-shirt, stickers, and a bracelet. Everything is nicely designed, trendy, and loudly promotes the cause. Buying this kit and sporting its contents is pretty much the extent of your support. All you have to do is buy and wear, with the money going to Invisible Children to find Kony and rebuild the lives of the Ugandans affected. This message speaks loud and clear to Generation Y, who gets to feel the satisfaction of making a difference without exerting any tangible effort.

With the rate at which the video spread, it’s unlikely many people did much research before reposting, meaning they validated its truthfulness by the celebrity supporters and the sheer number of hits. The new connectivity permitted by social media allows a message to spread quickly, but without any real effort behind it. The difference between being in the know or not is a matter of clicks, not a pursuit. But more awareness, in terms of numbers, is not necessarily an advantage. With large numbers comes a diffusion of responsibility: the more people who know about a particular misfortune, the less each individual feels the need to take action. Thinking “someone else will do it,” they justify their passive attitude and simply share the video. The “awareness” reached by social media may in fact be a detrimental paradox.

External factors grabbed the audience, with empathy coming second. In a globally aware and humanitarian society, people would have already been concerned with this ongoing tragedy and been inspired to help by the empathy born from sharing the human experience. Instead, the millions of supporters were initially attracted by celebrities, and drawn in further by a video difficult for anyone with a conscience to disregard. What about the misfortunes without an eye-catching brand identity and celebrity endorsers? Should they take the backseat due to lack of appeal?

The people behind the campaign really know how to make us tick. They seem to have a better idea of how to influence the masses than the government does. Using tools proven effective by social psychology, such as good design, groupthink, a loud call to action, and clear instructions for reaching the goal, the STOP KONY movement was ingeniously devised. It might even be one of the most influential campaigns since Nazi propaganda.

Saturday, March 17, 2012

www.me.com

You can do whatever you set your mind to. Impossible is nothing. Never say never.  Sound familiar? Of course it does—such clichés are the white noise of our childhood. 

Think about the Disney movies, TV shows, and children’s books, where failure was never the outcome and everyone wins. Has this bred an overly optimistic yet under-qualified generation?

And then we grew up a little. How do we express ourselves nowadays? Facebook and Twitter. Status updates, profile pictures, mobile uploads, and 140-character witticisms. Each one nurtures the idea that our thoughts/actions/experiences are worthy of being published to the interweb, to our own personal websites, creating a false sense that everyone actually cares about what’s going on with each of us at any given moment.

Hungry? Pop something in the microwave. Voilà—dinner in minutes. What’s everyone up to tonight? Text a few people and be in the know within seconds. Last-minute question before an exam? Email the TA late the night before. No response? Complain to the teacher the next day.

What does this all add up to? Anything, everything, me, now: the mantra of Generation Y, aka Generation Me. What’s the worst part? We’re all victims. I’m a victim, writing this on my blog like anybody cares. But does anyone care? Is just thinking people care enough? These are problems my peers and I face, problems we don’t quite know the answers to yet.

Psychologists say we’re going through a Narcissism Epidemic, with everyone overconfident about their abilities and importance. Some even say it led to the economic crisis. “Sure I can afford that house.” “I can pay back that loan, no problem.”

Anything, everything, me, now. What I fear the most about this mindset is my upcoming graduation. I expect to get a fulfilling, enjoyable job, aligned with my ideals, that pays enough that I can live comfortably while paying back the student loans that were so graciously handed to me by the government and private lenders.  Has the environment that birthed me fostered unrealistic hopes for the future, or gotten me so high on myself and instant satisfaction that perhaps I will be able to fulfill my own expectations? Time will tell.

Saturday, March 3, 2012

Virtual Realit-Y?


Walking down a busy street in Manhatt…DUCK! A pterodactyl is coming straight towards you. Regain composure in just the nick of time to jump over the stream of molten lava. Wait…is it the end of the world or something? Are you on drugs? No, but you sure do look like a crackhead wearing those ridiculous glasses.

I wish I was talking about the end of the world or drugs or something else cool like that, but instead I’m referring to a new technology that’s been getting a lot of attention lately. Google recently announced they will be releasing virtual reality eyeglasses within the year. But despite the buzz, Google actually hasn’t unveiled too much about the product. Good. And I hope it stays that way for a while. Honestly, I think the last thing we need is another gadget weighing down our technological tool belt.

Good news for all you aspiring optometrists out there: I realized I spend probably about 90% of my day looking at a screen of sorts. At work? Staring at laptop. In class? Staring at laptop and projector. Doing homework? Glued to laptop, phone, and television. I’m not proud of this, but it’s the medium of my generation. And with the integration of virtual reality glasses, there goes the last 10% of the time I spend actually looking at the world.

I’m not denying that the technology would be useful, educational, and even fun. Run into someone you’ve met before whose name has slipped your mind? No problem, it’s floating above their head. Walking down the street and you see a historical monument? In the blink of an eye (well, probably before that), a full explanation of the piece is scribed in the sky above it.

But just because we can doesn’t mean we should. If you haven’t noticed, new technology doesn’t make our lives simpler, or even better. Sure, it makes many tasks easier, but the time saved usually isn’t spent relaxing; it’s filled with more tasks. According to my social psychology teacher, we are “primordial beings living in an advanced technological world.” Remember that episode when the Flintstones met the Jetsons? Kinda like that.

According to my teacher, we peaked during the time of the Roman Empire. We were in our physical prime, walking miles a day and eating the basics: protein, fruits, vegetables. Now, we suffer from a national obesity epidemic and are lucky if we walk to the water cooler twice within the hour. Humans haven’t evolved much in thousands of years, but our technology has.

Is my teacher right? Should we look to the past for solutions for the present? I think so, because the remedy to many of our self-inflicted problems is simplification. Take a look at this video that shows the future of the virtual-and-augmented-reality technology discussed above. The video focuses on empathy. Pay close attention to 6:45, when the character is given a lesson on empathy and nature after putting a cigarette out on a tree:

“Empathy is not only projecting ourselves on our fellows but also on the world around us. One should never forget that we are all stardust; we are one with the matter of the universe.”

Best advice I’ve gotten in a while from talking glasses. But why does it take a billion-dollar technology and a seven-plus-minute creepy video to tell us the same thing we could have learned by reading (gasp!) some Thoreau or get this—even venturing out into nature ourselves! (Gasp again, but this time for breathing in fresh air for the first time in years.)

But what advice is it that we should borrow from the past? Simplify. Be a minimalist. Get rid of everything you don’t need.  Join the Simplicity Movement.

Think of the times people associate with happiness. Childhood. College. Vacations. Traveling abroad. What do all of these have in common? Not a lot of stuff. People following the Simplicity Movement have traded in their McMansions, cars, technology, and luxurious material items for flats and studio apartments containing only the bare necessities. According to them, the less stuff you own, the less stuff owns you. And they claim to have never been happier.

Okay, so I’m probably not going to run home and throw my iPhone in the garbage disposal or burn my wardrobe, but I don’t plan on adding to my technological armoire any time soon. If you see me ducking a pterodactyl or jumping over a stream of lava anytime soon, hopefully it’s because I’m starring in Jurassic Park V, and not because I’m lost in a virtual world.

Saturday, February 25, 2012

You Say Potato, I Say Potahto: The Meaning of Life


I’m going to begin with one of the most devastating excerpts you will ever read, an idea that has forced me to construct a rebuttal strong enough to give my life purpose, a quote from psychologist Tom Pyszczynski:

“Self-esteem is a protective shield designed to control the potential for terror that results from awareness of the horrifying possibility that we humans are merely transient animals groping to survive in a meaningless universe, designed only to die and decay. From this perspective, each individual human’s name and identity, family and social identifications, goals and aspirations, occupation and title, and humanly created adornments are draped over an animal that, in the cosmic scheme of things, may be no more significant or enduring than an individual potato, pineapple, or porcupine.”

Okay. I’m sorry if that put you in a bad mood. I’m even more sorry if it ruined your day. But I’m not sorry if it undermined your aspirations and dreams. Because by the time you’ve finished reading this, your mood will be improved, your day will seem boundless, and you’ll realize how insignificant those aspirations and dreams were to begin with. And I have no qualms declaring that because I believe that as humans, conscious beings aware of our existence, we have complete self-autonomy. Though a product of both nature and nurture, two influences that have an undeniable impact on our present state, we have the final say in who we are and what we do. Because of this ability to completely control our impact on the present, we are not potatoes, nor pineapples, and not even porcupines, but rather complex social and emotional beings equipped with the power to devise our own happiness or sadness, satisfaction or disappointment. And if you decide to agree with me, then get ready: the world is your oyster.

But first, a quick preface: I am going to use words and ideas that have trickled into political and religious debacle, words and ideas whose affiliation with such “industries” has deemed them utopian fallacies in common talk. But my mentioning of such words refers purely to their philosophical meanings, because I don’t subscribe to the arbitrary laws implemented by such “superior” powers, but rather to the power of the individual and the empathy inspired by sharing the human experience: libertarianism and humanitarianism.

Simply stated, libertarianism refers to the belief in free will and is characterized by the individual’s power “to do otherwise.” In every situation, you are faced with a myriad of choices. What you ultimately choose is up to you, without any influence wielded by external powers or prefabricated equations. Think of a simple board game analogy: are you a pawn being moved wherever the player decides, or are you the player, making the decisions firsthand? If you consider yourself a pawn, who or what is controlling you?

But really. If you don’t own you, then who does? Whose voice is that in your head, if not yours? The default argument against free will is determinism—the idea that we are following a linear path created by the laws of physics, that we are just variables in a greater equation that explains the universe. According to this school of thought, we have created the illusion of free will, but really our destiny has been predetermined. But how can this be so? How can the pure randomness of our thought processes be the product of an equation? I guess that dream I had last night where I showed up to class naked was determined even before I was born. The millions of synaptic connections that strengthen and weaken perpetually and conjure up the most random of thoughts, memories, and ideas have been programmed to follow an algorithm that has existed since the beginning of time. An infinite number of years ago, it was determined that the six-hundredth-and-forty-second word in this post would be fuckleberry.

Okay, so maybe I’m being a little harsh on determinism. Maybe it wasn’t decided at the beginning of time that the six-hundredth-and-seventy-third word in my post would be crakpin. Perhaps some sort of third party power does in fact influence my thoughts and behaviors. Wait—scratch that. Something else definitely determines who I am: nature and nurture.

The two things that we have absolutely no control over are nature and nurture. We can’t help which sperm fertilizes which egg and in turn what genetic combination is created. We can’t control the decisions our mothers make as we develop in their wombs and furthermore, whether we reap the benefits of healthy choices or wallow in a bath of toxic chemicals. Once born, we have no influence on what hormones are released in our body and when, and how those hormones will permanently change our neural connections.

It’s not up to us whether we are born into an affluent neighborhood or third-world slums. And usually ending up somewhere in the middle, we can’t control the values our guardians instill in us, or the values we develop ourselves as a result of neglect and bad parenting. With every stimulus we encounter, the wiring of our brain changes forever. By the time we are old enough to form memories, we have already been exposed to millions of stimuli, none of which were in our control. The combination of our genetic makeup and the environment in which we were raised creates a disposition that will influence our every decision. One could argue that we are each subject to certain thoughts and behaviors as a direct result of the interplay of nature and nurture. Since neither can be controlled, yet both have brought you to your current disposition, your actions from this point on aren’t a result of your free will, but rather a byproduct of the interaction of nature and nurture throughout your life thus far.

Many fields, namely psychology, but also neuroscience, sociology, and philosophy, have been created and exhausted trying to better understand the human condition. Each of these subjects has been dedicated to discerning why humans do what they do. And after thousands of years of trying to figure it out, not much is known. The nature versus nurture debate is still just that: a debate. There is no way to predict what any one person will do with the rest of his or her life, let alone the next five minutes. In trying to gain a better understanding of humans, psychologists have done a lot of research, with the most revealing results found in twin and adoption studies. In twin studies, researchers have tracked down many identical twins who were separated at birth yet ended up eerily similar. Let’s take the “Jim Twins”:

“Jim Lewis and Jim Springer first met February 9, 1979, after 39 years of being separated…Both had been adopted by separate families in Ohio, and had grown up within 45 miles of each other. Both had been named James by their adoptive parents, both had married twice; first to women named Linda and second to women named Betty. Both had children, including sons named James Allan. Both had at one time owned dogs named Toy…In one test which measured personality variables (tolerance, conformity, flexibility), the twins' scores were so close that they approximated the averaging of the totals of one person taking the test twice. Brain wave tests produced skyline-like graphs looking like 2 views of the same city. Intelligence tests, mental abilities, gestures, voice tones, likes and dislikes, were similar as well. So were medical histories: both had high blood pressure, both had experienced what they thought were heart attacks, both had undergone vasectomies, and both suffered from migraine headaches. They even used the same words to describe these headaches. The twins discovered they shared alike habits too. Both chain-smoked Salems, both liked beer, both had woodworking workshops in their garages. Both drove Blue Chevys, both had served as Sheriff's deputies in nearby Ohio counties. They had even vacationed on the same beach in the Florida Gulf Coast. Both lived in the only house on their block.”

The Jim Twins make a very convincing case for the overwhelming influence of “nature”—identical twins, leading identical lives, separately, without the knowledge of their other half. But now let’s take a look at Winner and Loser Lane, the poster children for the influence of nurture. The Lanes, who lived in a housing project in Harlem during the late 1950s, had a son they had a particularly good feeling about. They decided to name him “Winner.” Several years later, they had another son. For reasons unknown, this one they named “Loser.” Let’s take a look at how they turned out:

“Loser Lane did in fact succeed. He went to a prep school on a scholarship, graduated from Lafayette College in Pennsylvania, and joined the New York Police Department (this was his mother’s longtime wish), where he made detective, and eventually, sergeant. Although he never hid his name, many people were uncomfortable using it. “So I have a bunch of names,” he says today, “from Jimmy to James to whatever they want to call you. Timmy. But they rarely call you Loser.” Once in a while, he said, “they throw a French twist on it: ‘Losier.’” To his police colleagues, he is known as Lou.
And what of his brother with the can’t-miss name? The most noteworthy achievement of Winner Lane, now in his midforties, is the sheer length of his criminal record: nearly three dozen arrests for burglary, domestic violence, trespassing, resisting arrest, and other mayhem.”

The Jim Twins make an outstanding case for nature. The Lanes make just as convincing a case for nurture. Is each story probably the most extreme example you could find to support either nature or nurture? Yes. Are there thousands, maybe even millions of studies out there that show both nature and nurture have a nearly equal effect on how you turn out? Probably. But the stories make my point—both nature and nurture have an overwhelming effect on who you become. But it is unclear what effect each has, and which, if either, is dominant. The reason it can’t be determined is because a third factor comes into play: free will. But if each case makes a pretty convincing argument against free will—one relying solely on nature and the other on nurture—why did I spend so much time discussing them? They illustrate the extremes of nature and nurture, not the more common interaction of the two that is achieved through the practice of free will. But how does free will come into effect? Self-awareness.

Technically defined, self-awareness is the “capacity for introspection and the ability to reconcile oneself as an individual separate from the environment and other individuals.” Usually associated with free will and libertarianism, self-awareness operates closely with the idea “I think, therefore I am.” I think, therefore I am…what? What am I? Me.

You are not fully you until that very moment you become self-aware and separate yourself from the people and environment that have for so long defined you. Now, you are truly living as you, and not as a product of your DNA and the countless experiences that have led to this moment. Before, I was Peter Stephen Routzahn, as defined by my genetic disposition, environment, and upbringing. A combination of my parents Stephen and Tara, a product of a little town in South Jersey, the result of the values my parents instilled in me. Now, I am Peter Stephen Routzahn, as defined by me. As the one in control, I decide who and what influences and shapes me, and no longer settle for what is merely available. Completely self-aware, I can no longer blame or praise my parents, my upbringing, or even my luck for the shortcomings or successes I have experienced thus far. Because I am the sole determinant of who I am, all of the weight, all of the pressure, all of the blame, and all of the praise is on me. I decide whether I want to be an optimist or a pessimist, someone who learns from his mistakes or someone who wallows in self-pity. And you get to decide for yourself as well.

But there is one more thing you must liberate yourself from before you can enjoy the freedom allowed by self-awareness: the future. As a believer in libertarianism and as a person who practices complete self-autonomy, you are in complete control of yourself, but you are not in control of other people. You aren’t in control of the weather, your friends, your car, or even your dog. So don’t waste time planning for or deciding around an unsure future that is the outcome of an infinite number of factors outside of your control. Instead, realize that you can control how you react to others, how you react to the weather, how you react to your friends, your car, and your dog. So live in the now; make decisions as events occur, not before they occur. This will help you enjoy the present most and decrease the chance of disappointment in the future.

Think about it. Say you plan for events A, B, C, and D, in that order. Since you exert no control on the universe, they don’t actually fall in that order, and instead, event D occurs first. And then unforeseen event E pops out of nowhere. And then event A finally happens, but it isn’t followed by B, but rather event D repeats itself. And before you know it, your plans were ruined, and you’re D-E-A-D. So moral of the story is, don’t plan—react. React to each event as it occurs and look forward to the plethora of opportunities created by each twist.

Just as you can control how you react to events, you can control how you receive and relate to other people. This is where religion and politics come in handy: they suggest a list of rules on how people should relate to each other. But there is no universal set that everyone abides by. Instead, there are different cultures, religions, and governments, which, although attempting to preach similar ideas, clash in the fight to have the dominant ideology, creating a world-scale culture war waged just to declare who is right. Instead, we should subscribe to the idea of humanitarianism and the empathy inspired by sharing the human experience. Take off the masks created by race, gender, religion, orientation, caste, political party, and age, and we’re all just human beings, and we should treat each other as such. Why waste time and energy disliking something? Thinking negatively towards others is just that: thinking negatively.

Actually, why waste your energy and time thinking negatively at all? Now that you’ve read about libertarianism and self-awareness, rid yourself of the stress and fear caused by the unpredictability of the future, perhaps reconsider how you relate to others, and celebrate your newfound freedom. From this point on, only do what you want and what you love. Don’t waste any time on things you don’t care about or things that don’t make you happy. I mean, who says you can’t? You’re in charge of you—no one else. Don’t settle for anything less than what makes you happy. Make changes. Quit. Try again. Move. Whatever. Or keep everything exactly the same. Said very succinctly by an angsty teenage boy in the film Little Miss Sunshine: “Do what you love and fuck the rest.” Said eloquently by Steve Jobs in his commencement speech at Stanford: “Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma, which is living with the results of other people’s thinking.” Follow in the footsteps of Steve Jobs, adopt his morning ritual, and ask yourself the following every day when you wake up:

“If today was the last day of my life, would I want to do what I am about to do today? Whenever the answer has been no for too many days in a row, I know I need to change something.”

So that’s that. There is my worldview, the doctrine I live my life by. It might seem absurd, unrealistic, and maybe even too optimistic. But it works for me. It’s a set of rules I’ve created for myself to ensure I make the best of every moment. And if you make the best of every moment, when you look back, you’ll have the best life you could have made for yourself. This is my protective shield against the horrifying possibility that we humans are merely transient animals groping to survive in a meaningless universe designed only to die and decay. If, in the long run, it turns out that we are no more significant or enduring than any individual potato, pineapple, or porcupine, who cares? I’m happy, and if I create a meaningful life that I enjoy, what else matters?

Thursday, February 16, 2012

Education Reform: Teaching How to Think, Not What to Think


When was the last time you witnessed a student, of any age, claim he or she was excited for school? Looking forward to studying? Bright-eyed and bushy-tailed, waiting by the door for the bus? Probably never. The education system is just a series of steps getting you to the next level of schooling, and this journey is becoming increasingly long with academic inflation. People spend the best years of their lives pursuing a bachelor’s degree, then a master’s degree, and eventually a Ph.D. And not everyone gets hired, for the skill set needed to land a job in our ever-evolving society changes perpetually.

The education system is proving ineffective because we are teaching students the wrong things. There seems to be a universal hierarchy in schools all over the nation: an emphasis on literacy, math, and science, and a neglect of the arts. The former teach formulas and organized thought; the latter emphasize creativity. But as we continue to cut the arts nationwide, do we see an improvement in the education students receive? In our evolving society, we must teach creativity so students learn to adapt to change, rather than teach them a curriculum that expires before it can be used in the real world.

Thoreau captures this sentiment eloquently: “What does education often do? It makes a straight-cut ditch of a free, meandering brook.” As it turns out, many of the people we regard as the literary, scientific, and innovative geniuses of both our time and our history books have it out for traditional schooling. Einstein. Gates. Jobs. Judging by their impact on our world, I might have to agree.

A few years back, Ken Robinson gave a TED talk called “Do Schools Kill Creativity?” In this lecture, he discusses the unpredictability of the future, both in the long run and even five years from now. We don’t know what the future holds, hence teaching static information seems irrelevant. Instead, we should teach creativity so that we can be prepared for the unsure future. He claims that we are born free thinkers and get educated out of creativity. Just think back to your early childhood. Remember how the students who always did the assigned work, followed the arbitrary rules of the classroom, and sat quietly with their hands folded on their desks were praised? And how the students who did anything but that were diagnosed with a “learning disorder”?

Michael Michalko, another advocate of teaching creativity and author of “Creative Thinking,” has a whole reservoir of quotes ready to chuck at you on the topic. Similar to Robinson, he claims “everyone is born a creative, spontaneous thinker, but people create mental blocks that keep them from trying something new.” Referencing the famous fact that Edison came up with 3,000 prototypes before inventing the light bulb, he encourages people to “desire success, but embrace failure,” as well as to “listen to experts but know how to disregard them.” If his ideas were employed on a daily basis, I think we would see more innovation and less fear of failure. Because failure is inevitable, embracing it creates a positive learning experience rather than a creativity killer.

I’m highly interested in education reform because I think a solid education, like the one discussed above, is the solution to many problems—with the least important being low test scores. What problem in our society couldn’t be fixed, or at least improved, by innovative and creative thinking? I’m not embarrassed to admit that I shed a tear or two during the film Freedom Writers. I’ve always been sensitive to the topic because I feel I’ve been shaped by traditional schooling—at first a product of it and eventually an adversary.

When I was a little kid, I could draw a sentence much more clearly than I could write it. My teachers told me I was a gifted artist, but also told me to focus on academics. I tried to integrate creativity into my daily life, but it got harder as school progressed and college loomed in the imminent future. By the time I graduated high school, I was the poster child of traditional schooling. Straight A’s. More extra-curriculars than I could count on my hands. Acceptance letters from great universities. But stressed to the point of sickness. School wasn’t fun—it seemed like a never-ending job.

Sometimes I regret the decisions I made in high school, but then I remember that if I hadn’t been such an obsessive student, I wouldn’t have gotten into USC. And now that I’m here, I’ve realized it’s time to control my education, instead of letting my education control me. I take classes that seem interesting and study the parts of the textbooks and lectures that intrigue me. Sure, I don’t have a 4.0 GPA anymore, but I look forward to school. I get excited to study because I focus on the material that interests me most. Rather than having a temporary storage box of material I forced myself to memorize, I have a free-flowing consciousness of information I want to know. Taking art classes again, I’m encouraged to be creative and to think outside the box on a daily basis. And I can honestly say I like school now. I’m lucky to have had this epiphany, because many students don’t. But if schools taught how to think, rather than what to think, this could be the mindset from day one.

The Future

The future, as brought to you by Microsoft. Very interesting. And I can't decide whether I like it or not. But I do find the music quite motivational.

O Rly?

So apparently, there is a scientific explanation behind crazy cat ladies.

Also, new craze "Inbread cats" has swept the interweb.


Just one of many examples of the latest meme phe-nom-e-noms. Google "inbread cats" for more cuddly cats garnished with grains.

Monday, February 6, 2012

Urinalysis & Not-so-fair Welfare


As more people turn to the government for assistance in times of economic hardship, states are fighting back by making it more difficult to receive aid. Employing the motto “If we have to piss to pass, so do you,” state governments are enforcing drug screens before eligible applicants can receive welfare, food stamps, and public housing. Is this an attempt to even the judicial scales or a ploy to curb money spent on government aid?

Either way, I’m not sure how fair it really is. By the logic of Hammurabi’s Code (“an eye for an eye,” “a piss for a check”), it makes sense. If I have to pass a drug screen to earn the money deducted for welfare, then you best pass to get it. But here’s the catch: drug tests at the workplace are not instituted by the government; drug tests for welfare recipients are. So a more accurate description is “I might have to pass a drug test to earn money, but you definitely have to if you want welfare.” Because each is regulated by different standards, the stereotype that those on welfare are more likely to be drug users is reinforced, making it even harder for the citizens hit hardest by economic woes to get back on their feet. Especially because there is no reported significant difference in drug use between the employed and those receiving government aid.

Again, this becomes a partisan debate. An article from the New York Times describes how Republicans favor the drug screening laws that encourage budget cuts, while Democrats believe it only creates more indignity among those suffering the side effects of our less-than-perfect economy. In situations like this, it seems like some politicians come up with the stock response of their party and stick by it. We’re all just humans, and we make mistakes. Sometimes people experiment with drugs. Some people go through a phase. No one is perfect, and we need a law that takes that into account. But we also need a law that considers America’s best interest, and such a law would limit budget spending on government aid and focus on things like education and healthcare.

Leave it to a state split down the middle to come up with the most reasonable solution: one that takes both the dignity of the person and the interest of the state into account. Florida requires participants to pay for their own drug tests. If they pass, they are reimbursed and receive aid as needed. People who fail are disqualified for one year (only six months if they receive treatment). Payments can continue through grandparents or other relatives so kids aren’t cut off as a result of their parents’ mistakes. Since enforcing this policy, Florida has seen the number of welfare recipients return to the same level as at the beginning of the recession.

Florida’s approach says, “If you pass, here’s your aid and $40 for the test. If you don’t, you have a chance to redeem yourself, and we won’t jeopardize your kids. See you in a year.” The results speak volumes about laws that favor a split and aren’t too heavily influenced by either party. This makes me rethink my previous sentiment about our two-party government, and, ahem, nation. Although such a divide can sometimes leave half the population unhappy, it can also create a compromise that pleases everyone. And I guess that’s what a democracy is all about.

Saturday, January 28, 2012

Rob Delaney...Wait Who?


Public intellectual. The phrase vibrates through my ears and my brain interprets the noise, but, in turn, doesn’t make much sense of it. Not because I don’t know phonics, but because it’s not a phrase thrown around too much nowadays.

Thinking a bit harder, my mind drifts to high school history and literature classes, when I learned about famous politicians and philosophers and the like. John Locke. Henry Thoreau. Now those guys are classic public intellectuals. And it’s easy to say now, looking back, because we have had the time to discuss and assess their contributions to society. But how do you correctly define a public intellectual in our present era, when we don’t know the outcome of one’s influence quite yet? How are we supposed to deem one person a true public intellectual when they have probably just as many enemies as followers? How can we, as a society, agree on a public intellectual when our population has conflicting ideologies, and both sides will go to extreme lengths to prove theirs right?

I’m not saying this hasn’t been a problem since the beginning of forever. And I’m not talking down on the intellectuals who have dedicated their lives to improving our world, either—it’s just that they are out of date. Had we listened to the theories of the thousands of “public intellectuals” before our time, wouldn’t the answers be obvious? Haven’t they bestowed upon us enough knowledge that by now we should all occupy our own personal Utopia? No. Instead, we “occupy” Wall Street. We “occupy” LA. We “occupy” Springfield. (And guess what—there’s a Springfield in almost every state of the US.) We “occupy” every city, town, and street where the people feel they are being mistreated by the group that claims superiority—which, if you haven’t noticed, is almost everywhere nowadays.

So rather than define “public intellectuals” as the people who have studied human behavior, politics, history, and philosophy, who have written the books we are required to read throughout our lives that hardly make any sense without twice as many footnotes as content, and who preach brilliant, revolutionary ideas that sound great in theory but have proven not to work for hundreds and even thousands of years, I am going to define the “public intellectual” as the average Joe. Someone who is very normal, someone you and I can relate to, someone who knows how to get his point across in a way people actually respond to, and lastly someone whose ideas aren’t revolutionary, but subtle enough that they could actually be put into effect and make a difference. Ladies and gentleworms, I present to you, Rob Delaney.

You don’t have to have your name out there to be a public intellectual. In fact, I argue that with the right amount of exposure, you and I have the potential to become public intellectuals. Rather than nominating the elite for the position, I nominate normal people who have an extraordinary influence on those around them. I like the way Stephen Mack explains it in his blog post:

Now, are some people ill-equipped for self-government? Of course. But the strongest alternative argument, the best argument for democracy, is not that the people are “naturally” equipped for self-government—but that they need to become so, and, moreover, experience is the only teacher. So here’s the point: Any argument for the public intellectual that rests on the assumption that common citizens are forever childlike and must be led by a class of experts is politically corrosive and historically dangerous.

If “common” people are the ones who need to be guided, who better to do it than one of them—a “commoner,” if you must?

Rob Delaney is a normal guy. According to an interview, he’s from a small fishing village in Massachusetts and is now married with a kid. I’m not sure what happened in between because he doesn’t have a Wikipedia page (yet), but from experience I know that he’s a comedian currently touring the country and has a show coming out on Comedy Central. Putting his beginning and end together, I assume a lot must have happened in between. Not sure what it was, but I do know he made a living for himself doing something he’s passionate about—something we can all admire.

If you already know who he is, or if you didn’t and just googled him and came across an array of his sexual fantasies about women and food (at least he’s honest), you’re probably questioning why I consider him a public intellectual. But don’t worry, that’s just his normal side. You might think he’s inappropriate and sometimes strange, but I think he provides a dead-on satirical commentary on our society. And this is my blog—so I’m right and you’re wrong.

Just kidding. But besides his nonsensical tweets, he has a column in a popular culture magazine where he talks about mostly serious things. And even the not-so-serious stuff is ingeniously crafted to make people realize how ridiculous our culture can be at times. Before I discuss his intellectual side, though, I want to emphasize how public he really is.

I learned about Rob Delaney from Twitter. When I first made my account, the little blue bird kindly reminded me that I should probably follow more people. As with most things little blue birds tell me to do, I took the advice unconditionally and sifted through the different categories Twitter has organized. When going through “Comedy,” I saw Delaney’s picture, quietly laughed to myself, and thought that alone deserved a follow.

He currently has about 300,000 followers, 6,000 tweets, and averages around 1,000 retweets per tweet. (I’m shocked to say there were no red squiggly lines under any of the words in that sentence.) So if most of his followers see his tweets, and then about 1,000 of them decide to retweet each one, some kind of magical algorithm that includes the number of retweeters’ followers, etcetera, tells me that each one of his witticisms reaches a lot, and I mean a lot, of people. Given this fact, he’s very different from a public intellectual in the traditional sense. Instead of the inner workings of his brain being read in books, heard at lectures, and reserved for the minds of the elite who even give the time of day to contemplate what defines a public intellectual at all, what he has to say can be accessed by anyone with internet access, easily, without actively making the decision to enrich themselves. His knowledge is spread online, to the bored students in lectures, to the people waiting in line at the grocery store, and to everyone catching up on their social media while sittin’ on the john. I think I just covered most people in America—especially with that last one. Point being, he’s easily accessible to millions of people at any given moment.
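If you want to demystify that “magical algorithm,” here’s a minimal back-of-envelope sketch. The 300,000 followers and 1,000 retweets come from the paragraph above; the average audience per retweeter is purely a hypothetical number picked for illustration.

```python
# Rough reach estimate for a single tweet; the retweeter audience size is an assumption.
direct_followers = 300_000       # Delaney's own followers (from the paragraph above)
retweets_per_tweet = 1_000       # average retweets per tweet (from the paragraph above)
avg_retweeter_followers = 500    # hypothetical average audience of each retweeter

secondary_reach = retweets_per_tweet * avg_retweeter_followers  # feeds reached via retweets
total_reach = direct_followers + secondary_reach
print(f"Rough potential reach per tweet: {total_reach:,}")      # prints 800,000
```

Even with a conservative guess for the retweeters’ audiences, one joke plausibly lands in the better part of a million feeds, which is the whole point.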

Now that you know quite how public Delaney is, I want to discuss why he should be considered an intellectual. Rob Delaney has a column called “Take a Stroll…With Rob Delaney” in a culture mag-and-web-a-zine popular with the younger crowd called Vice, where he writes about socially relevant and topical events. He covers everything from the upcoming election and debt ceilings to Kim Kardashian’s divorce and Katy Perry’s song “Last Friday Night.” But no matter how newsworthy his articles may or may not seem, they all have a clear take-away message.

I’ll start with his piece called “A Voter’s Guide,” about the upcoming election. He starts by explaining every candidate he’s voted for since he was 19 and why. He admits his naivety in his first few political decisions but reveals the finer-tuned strategies he employed as he grew older. Discussing the 2008 election, he says:

In 2008 I voted for Ralph Nader again. This will upset some people, and that’s fantastic. Please channel your angry energy into the outlet you feel will effect the most change. In the Democratic Primary, however, I voted for Barack Obama. But get this! I would’ve voted for John Edwards had he not already bowed out. The reason for my decision was that he had a better health care plan than Hillary Clinton or Obama. In fact, Clinton and Obama liked it so much they copied it for their campaigns! Thank the good Lord he deprived me of that opportunity. It is popular (and appropriate) to denounce Edwards now for his mind-shattering and mythic hubris, but at the time we didn’t know that he was secretly stinking, suppurating human garbage with a hot, gooey center of selfishness that could implode stars.

I placed so much stock in Edwards’ health plan because I am unable to shake the belief that there is nothing more important to our nation’s future than A. access to affordable healthcare and B. education. Make it easier for your citizens to be healthy and smart and they will save you in ways you have yet to imagine. Make it difficult and your nation will swirl history’s toilet on its way to hell. When a person spends energy worrying about access to affordable healthcare they don’t have the energy to dream up the next Google. I’m sorry that this is a newsflash to some of you, but we are born dying and will each of us have “problems” that need medical intervention; it is not something to be ashamed of or afraid to experience. It is a condition of being alive and I am shocked that ANYONE WITH A HUMAN BODY would place obstacles in the way of their brothers and sisters getting a pill or a procedure that could help them.

The same goes for education. When your citizens’ minds aren’t stimulated by an excellent education, they don’t have the tools to think up the next life-saving vaccine. A country that doesn’t invest in education cannot claim for one second to be interested in its future. There are plenty of words to describe politicians who don’t make their constituents’ health and education their top priority, but for now I’ll let you pick one somewhere on the spectrum between “misguided” and “evil.” I will insist you tack on the word “shortsighted” as well.

These ideas aren’t novel. They aren’t revolutionary. But they are things too many people ignore come election time, and he presents them in a straightforward, entertaining way. They are a reminder of how we should, but don’t, think. It’s simple advice, and if more people subscribed to this school of thought, our country would be in the hands of someone who cares about its people and its future. What’s more important than that?

A presidential election is something most public intellectuals frequently discuss. But let’s talk about something they probably ignore and might not even know about. In a recent article, Rob wrote about Kim Kardashian’s divorce. No, it wasn’t a gossip column in a tabloid, but an inspiring piece about divorce, a disease that currently afflicts roughly half of American marriages. As you learned from the previous article, Delaney is an advocate for education and healthcare, so he obviously rants about her over-the-top, publicly aired wedding and how that money could have been better spent. But the real meat of the article is his own rant about marriage, how it sucks, and how he’s wanted to end his own plenty of times. But he doesn’t, and that’s what the sanctity of marriage is all about: overcoming the problem at hand and becoming that much closer. He says it much more, er, eloquently than I do:

 I’ve been married for five years. To the same woman. I’ve wanted to divorce her at times. She’s wanted to divorce me at times. But one great thing about marriage, when it’s entered by regular folks, in good faith, is that it’s hard to exit. It costs money. You have to talk to lawyers during business hours except whoops—you have a job that you need to earn money to buy food and pants—so when are you going to both take the time to do that? By the time you’d have gotten around to it, you’ve forgiven each other and maybe even reached a new appreciation for each other as you worked through whatever seemingly insurmountable problem made you hate each other for 20 minutes while you sat in your shitty car outside a CVS yelling at each other and crying. Because guess what, Kim? That’s a huge ingredient in a SUCCESSFUL marriage. Sometimes it sucks. And I don’t mean lower-case “s” sucks. I mean it SUCKS so fucking hard you’re POSITIVE you’ll give yourself stomach cancer or an embolism as you try to make your spouse explode through telekinesis. When you relax, however, and remember that you’re a bigger asshole than they are, with enough neuroses and calcified bad habits to warrant their own card catalog, you realize that they’re struggling through life’s shit storm just like you. Then you take a shower together and fuck while laughing.

Personally, I think that quote should be tattooed on the forehead of every divorce lawyer, and his or her clients should be forced to trace it with their fingers before they even look at any paperwork. Again, he isn’t presenting a revolutionary idea, but he is presenting a simple idea well. It’s a concept that, if adopted by more people, would make them bite their tongues and think once—twice—maybe even THREE times before racing to the closest divorce lawyer and/or butcher shop to end their marriage prematurely. It’s not about making the whole world a better place, but about making each individual a better person so that they, in turn, can work together to solve the bigger issues.

Those still against the idea of Rob Delaney as a public intellectual, perhaps claiming he isn’t decorated enough, doesn’t write for any noteworthy publications, or will have no impact on the world: are you happy? If not, do you want to be? Positive advice relevant to our everyday lives has more power and potential than the fluffed-up jabber about things that don’t affect your wellbeing on a daily basis, which is far too often what your typical “public intellectual” provides. If you can’t relate to his opinion on marriage, maybe you can relate to what he has to say about racism, homosexuality, mental illness, alcoholism, feminism, or anti-Semitism. Or maybe some of his jokes will just make you laugh.

That brings me to my last point. Most people revered as public intellectuals are well-versed in a variety of fields, and they’ll make sure you know it within the first 30 seconds of listening to them speak or reading their publications. Designer vocabulary plus a “breadth and depth” of knowledge usually makes their language hard to decipher, and I don’t want to break a sweat doing so. It isn’t easy to present your ideas clearly and engage people at the same time. There are many ways to achieve this, but Delaney does it through comedy—a medium much harder to master than most people think. After all, when was the last time you met someone who both claims to be an intellectual and has you hyperventilating with laughter? People always have loved and always will love laughing, because it makes us happy, and isn’t that the ultimate goal? Humor isn’t a trait you can teach, and the power of a good sense of humor is often underestimated.

I’ve heard the quote “comedians are the philosophers of our era” several times, and the more I think about it, the more sense it makes. It’s kind of like they both watch society from a bird’s-eye view: each has an eerily accurate understanding of how humans work, but the difference is in how they communicate that understanding. Philosophers use their knowledge to draw up guidelines for how people should function in society. Comedians use theirs to make people laugh. That’s why we laugh at comedians: they point out things so obvious that once they say them, we hit ourselves on the head for not realizing it first. I don’t think either the philosopher or the comedian is right or wrong, just different. But in our current times, I think satire is a better tool than the academic essay for pointing out our flaws and inspiring us to correct them. Referring back to Mack’s article on the public intellectual, Jean Bethke Elshtain seems to agree with me:

So the public intellectual needs, it seems to me, to puncture the myth-makers of any era, including his own, whether it's those who promise that utopia is just around the corner if we see the total victory of free markets worldwide, or communism worldwide or positive genetic enhancement worldwide, or mouse-maneuvering democracy worldwide, or any other run-amok enthusiasm. Public intellectuals, much of the time at least, should be party poopers.

So if the public intellectual, a party pooper, a wet blanket, a bearer of bad news, is going to criticize us, how better to do it than through humor? If someone is going to point out my flaws, I’d rather they do it in a joking manner that makes me laugh than in a solemn academic essay that will probably make me feel dumb. Isn’t that the biggest concern in the long run—happiness? If everybody were happy, they wouldn’t complain. If everybody were happy, our society wouldn’t be so screwed up. But it is screwed up, because not everyone is happy. So if we must be told what we’re doing wrong, I’d rather hear it from a comedian than from a politician or philosopher or someone of equal “importance,” because I’d rather laugh at myself with others than feel humiliated alone.

ROB DELANEY FOR PRESIDENT!