June 28, 2010

Coding socially

My Gravatar, for all these sites

I decided to finally get a GitHub account, since I've downloaded countless pieces of great software from it, and think it's about time I joined and gave back. So check out my profile, feel free to be my friend (or follow my project[s], whatever it is...), and check out my first "release" of a side project: a program to help you cheat at Scrabble!

Many other Scrabble cheaters have been written, but mine is in a silly functional language! Also, I have plans to make it more than just an anagram/word generator, even though that's all this release contains. You'll need an Erlang VM to run it.

I'll also mention I have a BitBucket account, but no public repositories. My video games group used Mercurial, so now that two of us have graduated, we've moved our projects to BitBucket. I put Rat Race and FlipTile on there as private repos. If you'd like to friend or follow me, or want access to those codebases, let me know ^_^

Finally, though this is on the sidebar, I'm also on Stack Overflow.

(The pic is my Gravatar, used on all these sites; it was taken when I was in Guatemala last year, with much longer hair and a budding beard. I think it's one of the few pictures that's not unflattering with that style. I look a bit different now.)

June 27, 2010

Video Games, and the failure of the word "Art" Redux

UPDATE (7/20/10): A little old, but Ebert closes the book by saying that, while he still thinks he's right, he shouldn't have brought it up in the first place. He essentially says "I'm not wrong, but I can't explain why I feel that I'm right." It's actually a pretty nice piece.

---

I wrote about Ebert et al. being naive and/or lame when they declare that games can't be art. This great little reddit thread, "Saddest moment in a game?", shows that, as the NYT summarizes rather well, it's more a generational misunderstanding of what 'art' should mean than one rooted in any actual thinking. See how many you've encountered yourself!

At the very least, I see how deficient I am for not having played any Metal Gear Solid beyond the first (which was excellent). Also glad Majora's Mask is getting a fair bit of love in that thread; it's the masterpiece nobody's played, like Skies of Arcadia.

June 25, 2010

Terrible, Wonderful Music Videos

Here's a bunch. The first is probably the most cringe-worthy. This is why the terrorists hate us.



Here's another:



Murrrrrrr

Now for some non-crappy ones.

Embedding is disabled by request, so you'll have to follow the link for the first one. It's Skindred, a group my former housemate showed me that combines reggae and metal, of all things. I think it works pretty well here.

This one is awful, but I love it:



"El Sonidito" means "the little sound." To quote my brother, who showed it to me: "You need to play this at your next party. I've never seen a cooler bunch of moderately looking guys. Also, I've never seen anyone look so cool playing one note on the cheap keyboard."

The funny/sad thing is, this took off and got way popular. They made a "real" music video with actors, a plot, unexplainable Hot Dancing Women... and it sucks. I like the other one so much better.

On this blog's theme of remixes, we've got a great, Bootsie-sounding one of Mr. Plow:



When I first heard this, I was underwhelmed. They were on the Colbert Report (as their record is released by Comedy Central, and what better way to promo them?). Still, after a bunch of listens, it's pretty hip:



Finally, we'll round out this tour with a little Pitbull, lest we get too classy:



It's more there as a contrast; I sometimes forget what machismo pop culture is like. Also, it's damn catchy.

(I wanted to make a PL parody of this. "Miranda she gets LAZY, Haskell it is LAZY, the Bash Shell it is LAZY, Clean language it is LAZY,..., Now thunk it let's get LAZY, thunk it let's get LAZY...").

UPDATE (6/26/2010): This one, in the 'awful' category, comes from Saurya, though I find it much less an offender than the first two:



I would also place this in less-than-wildly-lame, only moderately-lame-but-lovably-comical! It's dubiously dubbed Crabcore since they stand like crabs to look hardcore. Gotta love the sweet Techno Dance Hall interlude near the end.

June 23, 2010

Type Systems, From 1000 feet high

I posted a link on Facebook to one of my favorite articles ever, an article that is now gone. Here's the reddit link; it was about what you should know before you debate type systems, since most people have fuzzy notions of what type systems are, what they do, and what properties they have. The author said it best, but now that the article is gone, I'll go over a few points I remember him addressing.

(edit: Reddit provides a Wayback Machine link, so you can read the original!).

First off, what is a type system, really? This is a bit hard to answer, but a simple way to describe it is as a mechanism that prevents your code from executing nonsense by checking what operations you are performing on what data. An example would be "2 + potatoes" in your code: a type system would see that you cannot add a number to the symbol "potatoes" (nonsense!) and prevent you from doing so.

Note that we've already encountered a subtle distinction that is the source of confusion: when does this happen? There are two major forms of type systems: static types and dynamic types. A static type system will investigate your code before you run it, reporting any errors it sees, whereas a dynamic type system will tell you of errors during runtime.

When most people talk about a type system, they almost always mean static types. This does not mean that dynamically typed languages don't have type systems. Far from it. Compare this 'untyped' Ruby code:


my_array = [1,2,3]
# my_array[i] is nil once i > 2, and nil + 1 raises a NoMethodError at runtime
0.upto(100){ |i| puts (my_array[i] + 1).to_s }

with the 'typed' C code:

#include <stdio.h>

int main(void) {
    unsigned array[3];
    array[0] = 1;
    array[1] = 2;
    array[2] = 3;
    unsigned i;
    for(i = 0; i < 100; ++i) {
        /* unsigned + char* is pointer arithmetic, not addition: nonsense,
           but C compiles it (with a warning) and prints garbage */
        printf("%d\n", array[i] + "Stack Pointer!");
    }
    return 0;
}

The output of the Ruby:

2
3
4
types.rb:2:in `block in <main>': undefined method `+' for nil:NilClass (NoMethodError)
    from types.rb:2:in `upto'
    from types.rb:2:in `<main>'

And the C:

3915
3916
3917
3917
1606417842
... (continues for 100 lines) ...

We see that Ruby stops, while C plows right through! (To be fair, the C compiler will warn you of the type mismatch.) Ruby's type error shows that, while dynamic, Ruby does have a type system, and shouldn't be called untyped. And while C has something people call a type system, it doesn't really function as one might expect.

Which brings up the next point: what should a type system do? There are two major properties that a type system should strive to provide (incidentally, C/C++/Java provide neither):

  • Progress: A well-typed expression can be evaluated further, unless the computation is finished. In short, if the expression is well typed, the rules of evaluation and type system guarantee "there's something we can do with it" (this excludes exceptions: "1/0" is well-typed, but we can't detect this beforehand without solving the Halting Problem).

  • Preservation: Evaluation of a well-typed expression leads to another well-typed expression.


Type systems that provide these properties mean that if your program passes the type checker, it can't "go wrong." By that we mean: the computer will always know what to do (progress), and will never be led into a false corner (preservation). This doesn't mean your program will be bug-free, just that any bugs are logical bugs or unhandled exceptions (basically, your own fault).

Languages with great type systems (SML, OCaml, Haskell) have proven these properties about their type systems, and it makes programming in those languages a joy.

My favorite part of the article, however, was the Fallacies section: things people believe which just aren't true. I covered one of the biggest ones already with the example ("dynamic typing means untyped!"), but here are two others that really get my goat:

  • Typed code is longer, more verbose. This, again, is untrue. Most people saying this are really referring to type annotations, the text you write in your program to tell the compiler the type of everything. You'll find these in C, C++, Java, and C#; I once had a really ugly line of Java that looked something like:


    HashMap<String, ArrayList<Integer>> scoreMap = new HashMap<String, ArrayList<Integer>>();


    (I've ranted about Java's verbosity before). You won't find lines like that (or at least they're not mandatory) in SML, Haskell, or Scala. Using technology from the '70s, we can infer types from the context of the code. So those ranting about statically typed code being verbose should really rant against type annotations, which are a distinct issue.

  • Strong typing vs. weak typing. THESE TERMS MEAN NOTHING. At least, nobody's agreed on what they should mean. Even Wikipedia agrees with me. So stop saying them, and say what you mean; it's like calling a food 'flavorful.'
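
To make the annotations-vs-inference point above concrete, here's a small sketch in TypeScript (my example, not from the original article, and not one of the ML-family languages mentioned, but its local inference illustrates the same idea): you write the container's type once, at creation, and everything downstream is inferred yet still statically checked.

```typescript
// One annotation where the data is created...
const scores = new Map<string, number[]>();
scores.set("alice", [92, 88]);
scores.set("bob", [75]);

// ...and everything below is inferred with no annotations written:
// `lists` is number[][], `xs` is number[], `totals` is number[].
const lists = [...scores.values()];
const totals = lists.map(xs => xs.reduce((a, b) => a + b, 0));

console.log(totals); // [180, 75]

// Nonsense is still caught before the program runs: a line like
//   scores.set("carol", "oops");
// is rejected at compile time, because "oops" is not a number[].
```

Compare this with the Java line above: the type is written once instead of twice, and the loop body needs no types at all.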


It's a pity the original link is gone; the author said a lot more than I did, and a lot more clearly. Still, type systems are fun, and go a lot deeper than this. If you're looking for a good introduction to programming with types, The Little MLer is hard to beat. For the theory, people seem to love Pierce. I'm not too far into it, but it looks promising.

Playing Video Games Again

A brief summary of my video gaming:

  • When I was 3, I started on NES games. From there on in, I played almost exclusively console games until I was about 14.

  • A friend of mine from high school introduced me to Warcraft III, so I played that a bit. I was terrible. I more or less stopped playing other games, except for Warcraft III and Super Smash Bros. Melee. I flirted with the DC area's high-level Melee scene, and got murdered.

  • In college, I played Warcraft my first year. Found the competitive gaming scene, and a great site for competitive Warcraft.

  • I didn't play for the four years following.


I mostly mention this because, a few weeks ago, I started playing again. This is partly due to my madly brilliant neuroscience friend (now girlfriend ^_^), and let me say, I'm glad to be back.

You see, video games are the closest thing I have to sports, at least in the role sports are usually cast in, covering both participation and viewing. I'll contrast with my dad: he can go to bars and talk about why X trade was good for the Nationals, or discuss Jordan's very brief career with the Wizards.

I don't know enough about my teams to do this, but I can talk about why I prefer watching ReMinD over Moon, or why I still play solo DH against most races. And while my dad likes to DVR games and watch them before bed, I love loading up a good replay to end the day.

But also as a player, video games give me something to work on and master. This video is a bit long, and is good beyond this specific point, but describes what I'm trying to say very well:

Namely, that we enjoy doing things and getting good at them, for giggles. Like practicing your foul shots, there's a great feeling that comes from getting better at games.

Luckily for me, I feel these things about programming too. But video games have a softer learning curve (there's SO MUCH to computer science and programming) and aren't going to be my job.

With all that said, for those who don't know and love the pro scene already, here are a few videos that may illuminate. The first two are favorites: (context: in Korea, Starcraft games are televised, with many professional players. Luckily for us, the announcers learned from Latin American Soccer Announcers rather than the dry English-speaking ones. Here's an epic casting of Plague, a Zerg spell):

A similar description of the death of a Reaver (count how often they say it):

Some Pro Smash (Melee), for those who've never seen it:

Finally, a great epic game between my favorite Terran and Zerg players in the Starcraft 2 beta that has been shoutcasted. Skip around if you don't watch the whole thing (which you won't ^_^. The game starts at 3:05):

June 13, 2010

Quickie on Atheism

I love Andrew Sullivan's blog. It offers excellent analysis covering many sides of the news; what separates it from other blogs is the conversations he mediates with his readers.

One topic of great discussion is divinity: Sullivan himself is a Catholic (an exemplary one, I would say, of the compassion and humility the faith claims to provide) and he frequently discusses religion and the many conversations around it.

He recently posted a letter from an atheist reader that echoes how I feel about (what is being branded as) the New Atheists: yes they are loud, and their tone does little to persuade believers in an honest debate.

But their contribution to the atheist community isn't persuasion of believers; it's more a declaration of the right to exist, and of a place in the discussion. Like the earliest gay pride parades, they are there to say "We're here, our lack of belief is legitimate, and we won't be bullied out. We won't settle for less respect. Get used to us."

It was their voices, their arguments, and their courage that allowed me to come out to the world as an atheist.

An excellent, excellent debate between Andrew Sullivan and Sam Harris, a prolific atheist, is over at Belief Net.

June 9, 2010

Assorted interestings

I've always loved the Google "Did you mean?" (favorite is recursion), but this one came up recently while helping my sister with some definitions:



---

There are some words that people invoke to give false credence to ideas. The most obvious case is God; if you mention him/her, you can convince people of pretty much anything.

In non-God cases, it's normally an abstract term we use as shorthand but don't have a solid, working definition for. I've written about Art being a stupid word, and today I read an interesting blog post that adds 'neuroplasticity' to the list.

June 8, 2010

Keep Up that Racket!

PLT Scheme, formerly my favorite Scheme implementation and mentioned before in my writings, has been re-branded as Racket. I'm very excited about this: Racket is a language of unbelievable potential, and hopefully its re-branding will make people aware of this.

(While it is just a name change and a new website, I doubt Clojure would have gotten its momentum if it were just called "JVM Lisp".)

One of the wonderful things about Racket is its mailing list, and a cute discussion there generated a major treasure. Namely, someone brought up that computer programming isn't really that related to computer science, to which someone else mentioned that you could very much be a successful programmer even if you haven't studied the science.

This is sadly true (many people making their livings don't know what they're doing, many examples at The Daily WTF), but then a user named Joe Marshall simply wins:


> It's quite possible to be a productive and successful programmer without having a
> solid understanding of computer science.

That's the problem. Maybe it shouldn't be the case. Variations on this
statement are alarming:

"It's quite possible to be a productive and successful physician without having a solid understanding of medicine."

"It's quite possible to be a productive and successful airplane engineer without having a solid understanding of aerodynamics."

'Rocket Scientist' : 'Newtonian physics'
'Brain Surgeon' : 'neurology'

I attribute this more to the fact that we're living in the cave-writing stages of software: we've had computers for fewer than 70 years, personal computers for maybe 30, and connectivity for less. I'm sure after the invention of the steam engine, all sorts of idiots were designing inefficient, dangerous factories, and we've now got the software equivalent.

I'm lucky in that I got to study the science, and am genuinely interested enough in it to keep learning. But what a wonderful day it will be when software developers who know at least what I do come standard.

June 1, 2010

Really, Apple?

Apple just showed themselves to be a bit lame about security, at least on the Mac OS X of two years ago. Here's the story:

My sister had a MacBook to use for college when she was supposed to enroll in 2008. A few weeks later she got sick with encephalitis, and had her remarkable recovery story.

Unfortunately, she didn't remember the root password she'd used for her laptop, and it wasn't any of her 'standards' (we tried them all). So while she could still use the computer via the auto-login function, she couldn't install or update software (since that requires an authorized user to enter their password). As far as passwords were concerned, it was a stranger's computer.

This presented mild problems over the last few years, but it came to a head last night: with a very outdated iTunes, she couldn't sync her iPhone on her laptop while a summer student at Columbia's School for Continuing Education. This is on top of using an outdated and insecure OS, being unable to install new software...

So I googled "recover lost root password Mac OS X," and lo and behold, the instructions I found worked!

This should not happen. I shouldn't be able to set or reset root passwords just by having physical access to a stranger's computer. Below is the fix, in both technical terms and a non-technical metaphor, for the curious:

Technical: Holding Command-S on startup lets you boot into single-user mode, as root. So a simple mount -uw / followed by passwd root allowed me to set the password of user 'root' to letmein. Then I rebooted, and logged into Mac OS X normally, with admin privileges, as user root.

Non-technical metaphor: Suppose your building is guarded by a lazy doorman. He has a list of all the tenants in the building, but when you walk up to him, you notice he just looks at the placards on the mailboxes. I effectively scribbled 'root' on a postcard, taped it to a mailbox, walked up to the guard the next day, and said "My name is root!" Seeing it written on a mailbox, he let me in, and I changed the locks on a stranger's apartment.

While I haven't felt that elated in years (and saved my sister's computer from having its disk reset to factory settings, our best alternative), this is major lame sauce from Apple.