April 30, 2010

I read the news

If you aren't head-explodingly outraged with the Catholic church, you aren't paying enough attention (or, alternatively, you're incapable of independent thought).

With that in mind, this video tickled me (language NSFW).





Unsurprisingly, I agree with the Hitchens/Dawkins initiative to have the Pope brought to court.

Edit: I realize this was an inflammatory post, especially since I didn't provide any links to what's going on or to the conversation taking place. Here are a few articles that guided my understanding:

  • Andrew Sullivan is invaluable in this discussion, as a compassionate Catholic who isn't afraid of following the truth wherever it leads. He's written a great article for The Times that lays it all out pretty well.

  • Sullivan continues with more damning (literally, I hope) facts. The Vatican's response so far has been along the lines of calling this 'petty gossip,' seemingly unaware that the documented rape of hundreds of children (and the documented ignoring of calls for help) does not constitute gossip. It is fact, and there is a difference.

  • How have Church representatives responded? By blaming the Jews, blaming the gays, and when that didn't work, blaming those slutty children who were totally asking for it.

  • This has raised the question: can the Pope be fired? As in, suppose tomorrow we find out he also burned orphanages, uses performance-enhancing drugs, was in Arizona illegally, and was actually not even Catholic. Could the Vatican defrock him? Turns out, it can't. Surprisingly, the Catholic hierarchy has no real system of accountability.

  • Some bishops do step down, however. The timing suggests they're not so sorry that they did it as they are that they got caught.

  • Christopher Hitchens was on Bill Maher's show, and while both are polarizing atheists who don't generally resonate well with believers (they're pretty inflammatory), this video lays it out pretty simply: would you accept a child molester (or someone who aided one) in your company? In your country? Then why is this any different?



  • The Onion, a satirical publication, wins with the best headline though: Pope Vows to get Church Pedophilia Down to Acceptable Levels. The funny thing is, he hasn't even done this.

  • In a major breakthrough, however, the Vatican may apologize in some way, shape, or form in June. Again, I have a feeling this is only happening because we're making such a big stink about it, since the abuse has gone on over the last century (at least) and nobody on their end had said a peep about it, other than to cast blame.


The Vatican, like Bill Donohue and most other shills, has demonstrated itself to be incapable of owning up to this, and has only shown its incompetence by blaming others. The best defense of the Catholic Church came just this morning, from Nick Kristof of the New York Times.

He basically says "don't mock so hard or cruelly, because these idiots in the Vatican aren't the entire Catholic church; many priests and nuns on the ground give their lives selflessly in the true spirit of the organization, and the mockery/criticism belittles their unmatched generosity."

Dan Savage addresses this, and I more or less agree. Standard accountability models are necessary, but mockery (especially honest and well-produced mockery, as in the video at the top of the post) will accelerate the response from the Church for as long as they continue to earn it. Until that response comes, the good, "real" representatives of the Catholic faith won't have the organization they deserve, and churchgoers will suffer as they always have.

Until then, regarding Ratzinger, fuck the motherfucker.

Edit 2 (5/3/2010): More from Andrew Sullivan, reflecting on the NYT summary published yesterday. I also left out the reporting on Marcial Maciel, the Legion of Christ, and other establishment blights that further illustrate how horrible the whole situation is (Sullivan, National Catholic Reporter).

Edit 3 (5/11/2010): Ratzinger finally says something more substantive. This doesn't let him off the hook for the scandals he's been a part of, and I'm still waiting for him to do something. It took long enough, but it's better than blaming outsiders or the victims.

April 29, 2010

Videos with me!

ME ME ME! <STOP>



MY GAME IS BETTER <STOP>



THE GAME I MADE WITH MY TEAM <STOP>



I WILL TRY TO WRITE MORE, FEWER VIDEO-ONLY POSTS <STOP> UNLESS OF COURSE YOU LIKE VIDEOS <STOP>

April 25, 2010

Patriotas!

I can write about Spring Weekend a bit later (this was the first and last time I'll do Spring Weekend as a student). In the meantime, I'll procrastinate by posting another favorite video. Sadly for you gringos, you have to speak Spanish to glean much meaning from it:




I used this as the basis for my first major project in an Electronic Writing course I took last year. Other works I did for the course are on the course wiki, including a programmatic riff on Pierre Menard (which also used a text generator I coded up that works from grammar files), and my final project.

April 23, 2010

Video Games, and the failure of the word "Art"

Roger Ebert decided to revisit a topic that got him a lot of attention a few years ago, when he claimed that video games weren't art. Now he's strengthened his claim, stating that video games can never be art.

My reaction to this was mostly along the lines of Penny Arcade's: there's nothing to see here. An older person who's never really played video games decides to classify them ungenerously. Whoop whoop.

And normally I would let it rest, but then PZ Myers decided to weigh in (he agrees with Ebert). Now, PZ is all about thinking rationally, letting evidence trump prejudice, etc. So it surprised me greatly that he made such a claim with so little knowledge and such poor understanding. It's very un-PZ.

Here are a few pennies for the conversation. This is really about two separate issues: not knowing about the medium you're criticizing, and more broadly, asking a stupid question to begin with.

---
Regarding video games, there are a few major blunders in PZ's argument. PZ believes (emphasis mine):

Art is a kind of distillation and representation of human experience, filtered through the minds of its creators. A great painting or poem is something that represents an idea or emotion, communicated through the skill of an artist, to make you see through his or her eyes for a moment. Computer games just don't do that. No team sits down to script out a video game with the intent of creating a tone poem in interactive visual displays that will make the player appreciate the play of sunlight on a lake, for instance.

But they do, at least as much as producers of any other 'artistic' medium do. Compare Hideo Kojima's process with any filmmaker's. Look at the Flower example given in the TED talk that inspired the whole discussion. Stating that "this simply isn't something game makers do" is like an old Pythagorean stating that irrational numbers don't exist: it's simply not true, by very observable counterexample.

Even if you don't accept my examples (Ebert's discussions in particular are full of monkey-patches and amendments to definitions to ensure that no example is quite right), making a sweeping generalization about an entire expressive medium because someone hasn't done a specific project you prescribe (or template for a project) is simply bad logic. There's nothing stopping me from doing it myself, today, and poof! I've beaten your argument in its own, irrational territory.

PZ continues:

Video games will become art when replaying the performance becomes something we find interesting, when the execution of those tools generates something splendid and lasting. It just doesn't now, though. If you want to see something really boring, watch someone else playing a video game. Then imagine recording that game, and wanting to go back and watch the replay again sometime. That's where games fail as art, which is not to say they can't succeed as something comparable to a sport — we may want to explore the rules of a game at length, and repeatedly, and we may enjoy getting better at it. But no matter how well or how long you play a game, it's never going to be something you can display in your home as a representation of an experience.

To quote an old TA of mine, "isolated but incorrect assertion" on the claim that watching others play video games is boring. Some of the only times I fire up video games these days are to watch strangers play Warcraft III, and Starcraft is broadcast on television in Korea. YouTube is loaded with video game clips, and surely that's only for sporty demonstrations, not because the games have had a transformative effect on someone!

A less "sporty," more personal example: while growing up my siblings would often come into my room to watch me play Final Fantasy VII or Skies of Arcadia. They didn't have an interest in playing, but were very invested in my playing it for them. "It's like watching a movie, but better," they would say.

Finally, you can't take this argument seriously because, if you follow it through, it enters a heads-I-win-tails-you-lose loop. There are games where playing them and watching the outcome is very much an artistic experience: think Mario Paint, UmJammer Lammy, or the Everyday Looper. The response is that the games themselves aren't art, that they're more akin to canvas, paint, etc., since they take the role of tools, and you only produce art when you play them (get the distinction?). But elsewhere, PZ wants to classify games as art only if playing them produces an experience you could hang on a wall. The definitions he prescribes have it both ways, so he never has to accept a new medium as viable alongside the others.

---

Which brings us to the second major point: the whole argument is stupid because it concerns the biggest failure in the English language: the word "Art." The word is completely meaningless; its use almost always means someone is trying to dodge a more difficult discussion involving more precise terms.

What is art anyways? Does it need to have an audience? Can anything be art in the right context? These questions are never answered because the word means whatever its listener wants it to mean at the moment it's said. Asking if X or Y is Art or High Art is like asking how many angels can stand on the head of a pin at one time: we can combine all we think we know about angels, pins, and the nature of standing, but the collection is still nothing less than a clusterfuck of misunderstanding with no fruitful answer.

If I defecate into my hands and spread it on people around me, I'm a candidate to be institutionalized. But if I do the same in a theatre, and say I'm doing it for Art, I'm to be taken seriously and you should be more open-minded. The word Art, like God, is frequently invoked to tell people to stop thinking and allow idiocy to pass for proper, constructive thought.

So when I made arguments in the first section responding to PZ's quotes, I was a Copernican using what I knew to be weak Ptolemaic arguments to convince a stubborn Ptolemaist who could hear nothing else. Just as the most reasonable answer to most "holy questions" is that there may simply be no God, the most reasonable answer to "can video games be art?" is usually "ask a better question, or clarify your terms."

April 20, 2010

Metaphors I use to describe software development.

Programming is Magic




You don't have to be much of a fantasy fan to have a conception of magic and wizards: with curiosity, hard study, and diligence, you can perform taxing and exacting actions to do something marvelous. After studying until you're old enough to have a white beard, you perform some sequence of actions that will turn a man into a toad.

We have magic, and it's programming. After much study, you can perform a very intricate mental task (the code a programmer produces is just an artifact of a much richer process) which yields impossible fruit.

And like magic, you have to do it just right. Compilers will annoyingly tell you it's Wingardium Leviosa, not Wingardium Leviosaaaa! The worst are spells that pass, but work incorrectly. You may end up spitting slugs.

Programming is the new literacy



It's only very recently that reading and writing became expected skills, ones we believe everyone has a right to learn. For thousands of years, only scribes could do it. The writing systems were themselves complex; it took privilege, education, and training before you could put anything to a tablet (and you lived very comfortably for it).

Somewhere in the last few hundred years, it became a basic skill everyone is expected to have. Knowing how to read and write isn't enough for a competitive, skilled job; you need to know how to write just to apply for one.

Programming is very new, and we're in the cave-painting phase of it. Long after I'm dead, programmers will look at today's languages and wonder how we managed to endure them.

And like reading, once you learn it, worlds and worlds of intellectual exercise and stimulation open up as a result.

I'm with the authors of How To Design Programs in that I believe everybody should learn to design programs, just as we expect them to learn to read and do math.

On imperative vs. functional


Most programmers are imperative programmers, whom I like to think of as boxers. They punch very hard, and very well. They have giant upper bodies and mean biceps, and when an obstacle presents itself, the great ones can take it down with a sustained set of punches.

Sadly, most boxers are a little brutish: there is an established and valuable body of boxing theory that consists of more than just punching hard, but few boxers put in the time and discipline to learn it. Most will just punch as hard as possible, as close to center as they know how. Boxers who study the theory tend to be marvelous in combat.

In this world, however, there are people who do things a little differently. Functional programmers are mixed martial artists. At some point they got tired of punching, and learned to grapple. Learned to kick. They can still punch, but it's not their favored solution, and for many, their arms aren't quite so beastly for lack of practice.

Boxers judge the mixed martial artists using tests made for boxers, and deem them unworthy. When allowed to use pressure points or grapples, the mixed martial artist will generally win, but most spars are set to boxing rules.

But in the world of big biceps, a few boxers get curious, and wonder what the other side can teach.

April 19, 2010

Another Remix

I went over some favorite remixes a few days ago, but here's another I forgot (I might do this from time to time): Wilford Brimley raps about his experience with diabetes. It really picks up after a minute and a half and, like Love and Trolls, is underscored by Ratatat.




Mooooovies!

I posted these on the last blog, months before I was digitally evicted. Last summer I participated with some friends in the 48-hour Film Festival, making two movies.

The game works as follows: at the start of the 48 hours, you're given a character, prop, and line of dialogue that must be present in your movie. You're also given a genre, with one chance to re-draw if desired.

The first, called Hair Today, Gone Tomorrow!, was a dry run of sorts, where we would identify bottlenecks before Game Time. The plan was to do it in 12 hours; we really did it in about 14.






We drew Science Fiction, a salesperson, a wooden spoon, and the line "I wouldn't go down there if I were you." Understanding the radio address in the beginning is pretty key: it introduces the Sci-Fi element on which the plot is based (namely, a drug has been invented that lets someone re-live another person's experiences with a sample of their hair).

I acted, with a touch of sound design. See if you can spot the errors! I count two.

For the 'real' movie we made for the festival (Not All Who Wander), we drew Horror, a pot of coffee, an addict named Sonia, and the line "I'm pretty sure that's not right."






I didn't act in this one; I was one of three people involved in sound. Being at the end of the pipeline (sound was done last), it was a hellish few hours, especially considering all the on-site audio collected during shooting was garbage. I am proud of the sound work I did on this, but mostly when I remember what the conditions were.

This one has three major, hard-to-spot production errors. Brownie points if you can find them; one of them in particular is hilarious.

April 17, 2010

Baal Bless the Internet

I love remixes, and hope sometime (probably after I graduate) I can get my lazy bum off the computer chair into the computer chair so I can make a few myself. Here are a few that I really love (I also link to the original sources, you should check them out if you get the chance):



Bale Out (warning, NOT SAFE FOR WORK) is a remix of Christian Bale's massive explosion on the set of Terminator Salvation. Aside from being absolutely hilarious, it's a pretty hip dance piece.



This is Sparta! is the classic. Thanks, Gerard Butler, for making that role so damn good (original) so we could get so much mileage out of it (more at Know Your Meme).




Love and Trolls - Boxxy takes the theatricality of Boxxy and makes it into a hell of an expressive piece in its own right. Know Your Meme covers Boxxy pretty well. As an aside, my roommate and I see lots of things on the Internet that gross out or shock many people, and we don't bat an eyelash. But when I watch the original Boxxy videos... man. It's hard to get through it all.



Ronald McDonald Insanity, both the first (above) and the second, are surreal. Not for everyone, but as a fan of electronic music and noise music, the kinds of dissonances and textures created (as well as the video cacophony) are right up my alley. For the curious, they're primarily voice audio overlaid onto established music (as with Love and Trolls, which used Ratatat); in this case the music comes from a series of shooter games in Japan. The Ronald McDonald Insanity songs are here (first) and here (second).

On Programming Interviews

Lots of great things have been written on the subject of programming interviews, but since I'll be entering the workforce very soon and have just run the job-search gauntlet, I've taken away a few notes on how I would like to conduct them in the future.

Regarding phone screens, I learned a lot from Steve Yegge's post on his process. To summarize, he believes the candidate should demonstrate some basic proficiency and understanding in five areas to get the on-site: coding, OO design, scripting/regexes, data structures, and binary. It's alright if the candidate struggles a little, but if their answer to 'describe a function to sort an array of integers' is 'Collections.sort(array),' you might want to think twice about bringing them in.
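
For calibration, even something as plain as a from-scratch insertion sort tells you far more than a library call would. A minimal sketch (in Java, and just my own illustration, not something from Yegge's post; any language is fine):

// Sorts the array in place. O(n^2), but the point is showing you can write a loop,
// track indices, and reason about what's sorted so far without reaching for a library.
static void insertionSort(int[] a) {
    for (int i = 1; i < a.length; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];   // shift larger elements one slot to the right
            j--;
        }
        a[j + 1] = key;
    }
}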

Another advantage of living in 2010 is that we can actually see some code during a phone screen: one interviewer had me use EtherPad while on the phone with them, and I would probably do something similar.

More generally, diversity in the phone screen process will help you eliminate candidates who can talk big in one or a few fields, but don't have (or can't form) a more complete picture of what's going on.

If the candidate makes it to an on-site, I would extend the diversity principle, but probably ask a few questions not listed above (if they demonstrated that they can write a regular expression in the phone screen, they don't need to show me one on-site). Here you could look at another round of fun brain-warpers that Joel Spolsky points out: pointers and recursion. I would love to write out a few problems, as he has, that simply show the candidate can read and write programs using these techniques.

Brain teasers are a contested part of interviews, but I love them too. Well, good ones anyways. I would like a candidate who smiles when they know a puzzle is headed their way. The puzzles would be very hintable, so I'm not anticipating an 'aha!' moment. An example of one that I think would rock comes from Skorks:

Write a quine, in whatever language you like (for those who don't know, a quine is a program that prints its own source code without reading itself).
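
For the curious, here's one classic construction (in Java; any language works). The trick is that %c with argument 34 prints a double quote and %n prints a newline, so the string can describe the very program that contains it; this sketch assumes the file is saved with the platform's line separator:

public class Quine {
    public static void main(String[] args) {
        String s = "public class Quine {%n    public static void main(String[] args) {%n        String s = %c%s%1$c;%n        System.out.printf(s, 34, s);%n    }%n}%n";
        System.out.printf(s, 34, s);
    }
}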

Now that I've written about it, I probably wouldn't use it. A risk you run with any programming question meant to challenge is that the candidate has run into it before, or researched the standard questions before arriving. While there's nothing wrong with researching beforehand, you probably want to see the candidate think, not just what they can remember. For this reason:

  • Pick nonstandard problems. For coding samples, probably avoid direct library functions, or anything from here. They should be simple, so maybe library functions with a twist. A favorite of mine was "write a function that takes a string, and returns whether the braces, brackets, and parentheses are matched" (one possible solution is sketched after this list). Challenging, but appropriate, and more applicable to any job than reversing a string.

  • Have backups. One interviewer asked me what a good data structure would be for the search function of an address book. Given that I'd just finished the Facebook Breathalyzer puzzle, we were finished with what might have been twenty minutes of material in five.

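Here's the sketch promised above for the brace-matching question. One straightforward approach keeps a stack of the openers seen so far (the names below are just for illustration):

import java.util.ArrayDeque;
import java.util.Deque;

// Returns true if every (, [, { in s is closed by the matching ), ], } in the right
// order; characters that aren't brackets are ignored.
static boolean isBalanced(String s) {
    Deque<Character> stack = new ArrayDeque<>();
    for (char c : s.toCharArray()) {
        switch (c) {
            case '(': case '[': case '{':
                stack.push(c);
                break;
            case ')':
                if (stack.isEmpty() || stack.pop() != '(') return false;
                break;
            case ']':
                if (stack.isEmpty() || stack.pop() != '[') return false;
                break;
            case '}':
                if (stack.isEmpty() || stack.pop() != '{') return false;
                break;
        }
    }
    return stack.isEmpty();
}

A good candidate will also ask about inputs like "(]" or "([)]", which is exactly the kind of thinking the question is fishing for.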

A few more notes:

  • I personally don't like whiteboard coding. I get nervous, can't edit/iterate the function rapidly, and can't go at an appropriate speed. So when my time comes, I might minimize that. Regardless, you can't be an actor unless you audition, so it'll stay.

  • A short quiz, on paper, would probably be included. This wouldn't be multiple-choice or anything silly: just a block of code, with the candidate asked to comment on all of its qualities (something like the snippet below).

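As a hypothetical example of such a quiz block (the trap is deliberate), the candidate should notice that == on boxed Integers compares object identity, which happens to work for small values because of the integer cache and silently fails for larger ones:

// What does this print, and why?
Integer a = 127, b = 127;
Integer c = 1000, d = 1000;
System.out.println(a == b);      // true: both refer to the cached Integer for 127
System.out.println(c == d);      // false on typical JVMs: two distinct boxed objects
System.out.println(c.equals(d)); // true: value comparison, which is what was meant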

The only flaw in my plan is that, with all the material I've mentioned, I'd probably need more than 45 minutes to get a feel for a candidate, and I doubt I'd be given more than that at a time. I'll have to find a way to resolve this ^_^

April 8, 2010

Software and Evolution

I think software is growing, and will continue to grow, the way lifeforms have grown and evolved on Earth. By this I mean we started with a single ancestor, likely a few proteins or perhaps a single cell, only to become a planet housing humans, echidnas, sponges, fungi, insects, trees, and more.

This mostly comes to mind when I look at essays like this series, by Mike Taylor, on how so much of coding these days is just playing plumber between various libraries, fixing leaks and disasters that occur when the piping isn't perfect. The argument is stated well by jdietrich commenting on the story (where else?) on Hacker News:

My biggest gripe with modern programming is the sheer volume of arbitrary stuff I need to know. My current project has so far required me to know about Python, Django, Google App Engine and it’s datastore, XHTML, CSS, JQuery, Javascript, JSON, and a clutch of XML schema, APIs and the like. Don’t get me wrong, I’m grateful for all of it, but it just doesn’t seem like what I was promised when I followed SICP for the first time. It just feels like I spend most of my time scouring through documentation and trying to remember umpteen different sets of syntax and class names rather than actually thinking in code.

Back in ye olden days, most programming tasks I performed felt quite natural and painless, just a quiet little chat between me and the compiler. Sometimes longwinded, sometimes repetitive, but I just sat and thought and typed and software happened. The work I do these days feels more like being a dogsbody at the tower of babel. I just don’t seem to feel fluent in anything much any more.

We talk about ‘flow’ quite a lot in software and I just have to wonder what’s happening to us all in that respect. Just like a conversation becomes stilted if the speakers keep having to refer to their phrasebooks and dictionaries, I wonder how much longer it will be possible to retain any sort of flowful state when writing software. Might the idea of mastery disappear forever under a constant torrent of new tools and technologies?

I happen to agree with most of the posts, but they're symptomatic of something that's been on my mind: our code is really inefficient. But more importantly: that's okay, and further, we will have to live with it in order to reach software at the level humans are at biologically.

Allow me to clear up the mapping. When we started with computers, we wrote in raw, unadulterated binary. Every machine instruction was treasured, coddled, and several amazingly clever hacks were developed so operations could use minimal resources.

This was a necessity! We had to! But then we moved up to assembly, then the Capital Languages (FORTRAN, COBOL), and so on, until computers got powerful enough that we could afford ourselves some abstractions. What level of abstractions? Imagine how Mel the Real Programmer and other hackers of the binary era must feel when we're using languages with immutable strings, and someone writes code like:

String container = "";
for (String suffix : suffixes)
    container += suffix;
return container;

In which every iteration of the loop allocates a new string! And the code doesn't render the program unusable!
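
For contrast, the allocation-conscious version Mel would reach for is only slightly longer (a minimal sketch): StringBuilder grows one shared buffer instead of copying everything on every pass.

StringBuilder container = new StringBuilder();
for (String suffix : suffixes)
    container.append(suffix);   // appends into the same buffer, no per-iteration copy
return container.toString();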

How does Mel feel? Probably how a bacterium (or other single-celled organism) would feel when I scratch an itch and kill or damage hundreds of skin cells, ostensibly for nothing.

Single-celled organisms are still with us, and will almost certainly outlast us. We still have them in programming as well. To this day, if you really want to bust out the performance, you still gain a lot by living close to the metal: I know a student in the introductory graphics class who implemented his linear algebra package by embedding x86 assembly in his C. And almost all projects for my combinatorial optimization class are done in C only because, true or not, we believe "it's the fastest" (it is really fast).

The truth is, while people are still busting out assembly and squeezing out whatever hardware gains they can, most of us can now get away with being pretty wasteful. And it's the only way we can build the truly large, monolithic systems people pay big money for.

What am I trying to communicate with this metaphor?

First, stop arguing that speed should be the limiting factor in a language or technology's eventual success. Every abstraction we use today (structured programming, object-orientation) was painfully slow when it was introduced, but it will be one of today's slow abstractions that proves to be the key to the next step in software evolution.

I recognize there are many good arguments against the use of functional programming, logic programming, and other alternative paradigms. Pointing at speed comparable to other non-C languages today and calling it slow is not one of them.

Second, the diversity of software will keep propagating. Bacteria, fungi, plants, and eagles all live in radically different ways. Learn this and love it. Saying 'my form of programming is the real way' is like saying fungi are a real life form but plant life isn't. Embedded systems have different needs than white-collar users of 'enterprise' software, which in turn differ from the needs of logicians.

Finally, as it relates to Mike Taylor's article, what we are seeing now with library hell are the bad mutations of software evolution, the ones that will die out until we figure out how to do it right. If software at this point is at a jellyfish level, our attempts at sorting out library and framework programming are all the failed experiments to grow bones, gills, feet, and wings. One of them will work eventually, but lots and lots of our software will die until it does.

April 7, 2010

Being lucky, being strange. Being lucky to be strange.

The Olympics brought more opportunities for people to be hatin' on Johnny Weir and his style of performance. I love what he said: "Every little boy should be so lucky to turn into me."

And, you know, it's sort of true. How wonderful if we were all so lucky that we could freely be who we want to*, without judgement?

I bring this all up because I saw this video yesterday (the action starts at about 1:08).



If we could only all be so lucky as Johnny Weir and this guy, the world would be a much more fun, interesting place to be.

*= Provided of course, nobody gets hurt. Sorry, murderers, pedos, etc...

April 6, 2010

Common Lisp

I've been playing around with Common Lisp recently, using Practical Common Lisp and Let Over Lambda as guides (not gonna lie, having LOL on the spine of your book is wonderful). Last year when I went through my Scheme phase (the original URL of this blog was littleschemer.blogspot.com), I never thought I'd see the day that I'd switch sides.



FWIW, this is my iPhone background image.


I do feel like a traitor though, since even though I'm a theory-head and advocate of FP, I'm loving Common Lisp. There are a few reasons for this:

  • Simpler macros. I'll bet some hardcore Schemers will disagree with me on this, but I feel that defmacro is much, much simpler to learn than syntax-case, and syntax-rules always leaves me wanting more. Maybe I'm deficient, but I took to defmacro immediately, whereas when I want to do anything non-trivial in Scheme, I find myself going back to Dybvig's explanations, taking far longer than I'd like.

    While in principle I'm for hygiene as the default, it's not too big an issue in practice. Hoyte gives a great little macro (one of the first in his book) that ensures you get fresh variables whenever you want them, without even having to declare them!

  • Language libraries. CL comes with every function you could ever want. It comes back to what Peter Norvig said in Paradigms of Artificial Intelligence Programming (paraphrasing): Scheme is one of the smallest languages to define (< 50 pages), whereas CL is one of the largest (> 1200 pages).

    While some Schemes provide these, since they aren't part of the standard you aren't guaranteed anything across implementations. In fact, my favorite Scheme in terms of libraries (good ole' PLT) even broke across versions when they enforced module declarations at the top of every file.

    Besides, once a Scheme gets these, it stops wanting to be called Scheme.


This isn't to say it's all peaches and cream: I still prefer Scheme's single namespace to Common Lisp's separate ones, and I prefer Scheme's naming conventions (map vs. mapcar, or worse, loop for elem in list collecting). And issues with lambdas, namely sharp-quoting and not being able to place functions directly in the function position (especially after learning the beauty of Scheme's semantics so well last semester), still throw me for a loop.

But after a year of mounting tension with the residents of Shelbyville, I realize they've got quite a bit right ^_^.