Neither my gf nor I could resist attending the 33rd annual Sisters of Perpetual Indulgence Easter Celebration in Mission Dolores Park, San Francisco today. Apparently the event has been going on since 1979 with the mission of celebrating diversity and eliminating needless guilt. Obviously the message resonated with a lot of folks, because the place was ridiculously packed - we're talking thousands and thousands of people:
They claim that the event has gotten bigger and bigger with every passing year. There's not much space left in the park for it to get much bigger than this!
Luckily we were right up in front of the stage, where we could see the Sisters' antics up close.
Here is one of the event's three hosts. Obviously this is a, er... very liberal gathering (though if you hadn't figured that out from the costumes, the male exotic dancers that showed up later were a dead giveaway).
The pièce de résistance of the entire celebration, after the various cabaret musical acts and the aforementioned 'dancers', was the 'Hunky Jesus' competition, wherein a large number of differently themed 'Jesi' were paraded on stage for the crowd to see. I took a video of the first ~7 mins of the event so that you can get a feel for what it was like:
In the end, 'Funky Jesus' won on account of having converted his cross into a slide electric guitar, which was pretty awesome:
Only in San Francisco, eh? It's part of why I think I already love this town. As always, I've created a Picasa Web Album with many more photos of the event and the other Jesi. Check it out if you're interested!
Realizing that I never watch television anymore, I cancelled my Netflix subscription this week and took advantage of a free 30-day trial of Amazon's Audible.com. Audible is basically a modern version of 'Columbia House'1, where you pay $14.99/month to download 1 audiobook/month and receive 'discount' pricing (at least as compared to retailers) on additional books. As mentioned in my last post, I find myself unable to meet my listening needs (jogging, benchwork, commuting, etc.) with podcasts lately, so audiobooks seemed like a good idea.
The book I chose for my free trial is the unabridged version of Walter Isaacson's Steve Jobs (2011; Simon & Schuster), the massive bestseller that came out a few weeks after Jobs' death late last year. I knew that Jobs himself had asked Isaacson to write his biography and that the work wasn't intended as a cash-in on his death; however, I was still quite loath to read it when it came out. As with any celebrity death, the media's evangelizing of Jobs was so over the top that I felt as though I had to wait until the furor had died down. The audiobook is divided into 3 long (8-hour) parts, and since I can't make notes and mark pages, I've decided to blog about each part individually.
You may have heard about the psychopath theory of corporate CEOs. It's basically pop psychology suggesting that the qualities typically associated with psychopathy (e.g., lack of empathy, narcissism, solipsism, etc.) are beneficial in the cutthroat world of big business. The aspect of this theory that I find disturbing is that some people see psychopathic behavior among corporate executives as justifiable when stacked against their accomplishments2.
Steve Jobs is a particularly interesting individual on this count because the first part of the book paints him as a man who a) had some very clever, very forward-thinking ideas at a young age and b) was a completely intolerable, unforgivable asshole. Specifically relevant to the pop-psychology theory outlined above, Jobs' 'psychopathy' was apparently so bad that it fundamentally harmed his company. While we can marvel at how impressive early Apple computers were, not all of the company's ideas and intuitions were successful.
Jobs' vision, from the beginning and up to his death, was that technology should be made available to the masses, and that in order to do so, ease of use and elegance should trump some functionality. '70s tech was intimidating to non-specialists, and companies of the era were catering to the hardcore. If nothing else, Jobs realized rather early that presentation and 'messaging' mattered. For instance, he would routinely ask his designers to implement features that, while not necessary, were 'cool' (such as the hovering, rounded windows of the original Mac OS). More often than not, journalists and customers would spend more time talking about the "insanely awesome" superficial features of a product than about the detailed specs that obsessed tech geeks, and Jobs knew it. He brought the idea of selling a concept or a lifestyle to electronics - a field dominated by tech heads.
The now-famous '1984' ad run by Apple to announce the Macintosh computer was a smash success, and was strongly championed by Jobs against the wishes of the other execs. It 'went viral' long before there was a YouTube and drummed up massive interest in the brand.
The flipside to all of these great ideas is that Jobs had such an abrasive personality that he frequently turned off investors, insulted clients and collaborators, and drove his own staff insane. He publicly insulted fellow employees and disparaged Apple products designed by 'rival' internal teams. It can be admirable to say that someone is uncompromising in their vision, but Jobs failed to see that his desire to control every aspect of his system would make it more expensive and less 'open' than other brands. There were many competing, incompatible brands of PCs in 1984, and ultimately the open standards pioneered by IBM drove down costs and netted more customers, even though it was clear for many years that Apple's products were more 'impressive'. Furthermore, Jobs went against the wishes of Apple's execs and forbade them from licensing the Mac OS software to other hardware manufacturers, a decision that would hurt the company down the road when Microsoft proved that the money was in the software.
In the early days of Apple, Jobs' artistic and forward-thinking design achievements were very impressive, but so too were his managerial flaws. Business acumen was not his strength, and he probably had too much control over operations in the company's early days. Surprisingly, there's almost nothing positive said about the man as a person in the first third of the book, which makes it a bit of a tiresome listen. The people who surround Jobs are so much more interesting as individuals than the man himself - if this is the real picture of Steve Jobs, I'm surprised that he bothered to seek out a biographer.
1I spoke too soon; apparently Columbia House is still in operation in the US.
2This could be an entire blog post on its own, but I've become acutely aware of how frequently people derive 'ought' from 'is', or alternatively assume that others do so. Observation of a phenomenon is not equivalent to condoning it (or vice-versa).
"We go right for the top shelf with our words now. We don't think about how we talk; we just say the right-to-the-fucking... 'Dude it was amazing!' It was 'amazing'? Really, you were 'amazed'? You were 'amazed' by a basket of chicken wings? Really?"
Is it just me or does it seem as though we've become incapable of discussing topics with any kind of verbal caution or reasoned analysis? Alright, perhaps it's a bit ironic that I'm beginning a post about hyperbole with such an incendiary statement, but when it comes to certain types of discussions, I'm beginning to feel that it's justified.
It seems as though there are way too many articles about perfectly normal situations written as though said situations were taken to some insane theoretical extreme. If a popular company has a bad quarter, all of the headlines start with 'Is it the beginning of the end for [INSERT CORPORATION HERE]?'. Similarly, upon introduction of a new product, all headlines will read, 'Will product X kill [INSERT DOMINANT COMPANY]'s product Y?'. A political candidate makes a faux pas, and everyone immediately asks whether their campaign is 'DOOMED', and so on.
What kills me is the lack of empiricism underlying these statements. We have abundant experience telling us that if something seems too good to be true, it probably is ('true' being defined here in terms of what will generate the best news copy). Similarly, things are almost never as bad as they first seem. I understand that you have to sell magazines or clicks, and as Drew Curtis of fark.com pointed out in his book, there's a lot of incentive to manipulate headlines. But this sort of crazy exaggeration extends far beyond the headlines into the body of the work itself.
I think that one big part of it is the rise of news aggregator blogs, like Gawker's garbage, TMZ, or Yahoo (sorry, I won't provide links to this stuff). I've already complained before about how these places have eschewed journalistic ethics in order to solicit 'clicks', and they're certainly guilty of making mountains out of molehills in terms of headlines. But I don't think that the entire blame can be laid at their feet - it's a broader cultural phenomenon. I think we've all been somewhat seduced by media that uses hyperbole and often comical extremism to make points, but enough is enough.
As I've mentioned over and over again, I listen to a lot of podcasts. Well, I should say that I have listened to a lot of podcasts, because I'm in a continuous process of culling my regular rotation of shows. A lot of this is simply because most podcasts are not very good1. But I've also become very sensitive to the overuse of hyperbole in current-events shows. Every single thing cannot possibly be one of the following:
a) The best thing ever.
b) The worst thing ever.
c) Revolutionary (or is that 'Resolutionary'?).
d) The death of modern [INSERT CONCEPT HERE]2.
e) A harbinger of the apocalypse.
f) [INSERT WORD HERE]gate.
You know what? I think that we are basically living in the midst of [INSERT WORD HERE]gate, wherein every random stupid thing that happens is directly equated to a national scandal that shook the foundations of people's confidence in democratically elected government. Remember the problems with the iPhone 4 antenna? Antennagate. Or how about 'Nipplegate'? I should point out that the above link to the list of 'gate' controversies is nowhere near exhaustive in terms of how often I've seen/heard this usage on blogs and podcasts.
As Louis CK asks in the bit quoted above, where do we go from here? We're already describing banality with linguistic extremes, so how do we describe events that are legitimate outliers? I don't think that this is the end of the world, or that it's all downhill from here (though that's what the subjects of this rant would probably say themselves). Rather, it's simply that I'm being pushed towards the more professional style of communication that isn't provided by every random source - and I'm willing to pay for it.
A little reasoned analysis would be appreciated, even if it's notoriously difficult to argue the middle ground.
1The best podcasts are those that learn from and apply the techniques that make good radio: good chemistry among the hosts/participants, good recording and production quality, and interesting topics. Some people have the 'knack' while others do not, unfortunately. It's very much an Anna Karenina problem, in that an otherwise great podcast can be ruined for me if people are continuously being dropped by Skype, or one host obviously has no idea what they're talking about, or people on the 'cast are obviously distracted and doing other things while recording. It's tough to judge whether a cast will be decent before trying it out, because sometimes the most amateur (in the literal sense) efforts are great, while big professional productions fall utterly flat.
P.S. I realize that this is an incorrect use of 'right'.
During my Ph.D., a few postdocs lamented that they felt as though they didn't have time to sit down and 'learn how to do things properly'. For instance, when you're analyzing data, there are often many ways to accomplish straightforward tasks such as removing redundant entries from a list or concatenating two large tables. Similarly, if you've ever had the opportunity to learn a bit of Perl scripting, you quickly find out that there are often many, many different ways to get the desired results. However, saying that there are many different ways to do something doesn't mean that all ways are equally efficient. I had a pretty shocking realization of this a couple of years ago when changing a few lines of a script cut its run time from overnight to ~5 mins (I learned about hashes).
Changing the time it took to generate some data from 8-10 hours to 5 mins was a massive gain in efficiency. Instead of taking several days to get results, I could process my whole dataset in a morning. In fact, even if it took me a whole day of searching, reading, and trial-and-error to learn how to boost my efficiency, it would probably still have been a net benefit.
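For anyone curious about what that kind of change looks like, here's a minimal sketch - not my original script, and the data and subroutine names are hypothetical - contrasting a nested-scan approach with a hash lookup for removing redundant entries in Perl:

use strict;
use warnings;

# Slow: for each entry, re-scan everything kept so far - O(n^2).
# On a list of millions of IDs, this is the 'overnight' version.
sub dedup_slow {
    my @out;
    for my $item (@_) {
        my $seen = grep { $_ eq $item } @out;    # linear scan per item
        push @out, $item unless $seen;
    }
    return @out;
}

# Fast: a hash gives constant-time membership tests - O(n) overall.
sub dedup_fast {
    my %seen;
    return grep { !$seen{$_}++ } @_;
}

my @ids = qw(geneA geneB geneA geneC geneB);    # hypothetical data
print join(', ', dedup_fast(@ids)), "\n";       # geneA, geneB, geneC

Both subroutines give identical output; the difference only reveals itself (dramatically) once the list gets large.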
Yet, most of the time, I find myself in the same mindset as the postdocs referenced at the beginning of this post - it's easier to just trudge through these data (or this protocol) using the techniques that I know rather than sit down and 'waste time' looking up a more efficient method1. Despite realizing that learning how to do something correctly now will reap future efficiency rewards, I nevertheless discount those future gains because of present circumstances (always pressing myself to 'get more done').
I suppose that it's a small comfort that this phenomenon has been quite well studied in both psychology and economics under the general framework of intertemporal choice. In general, people have a tendency to perform what is called 'delay discounting': they discount (often sharply) the value of long-term rewards once those rewards pass some arbitrary time threshold (e.g., would you rather receive $100 today or $110 in a month?).
Such 'bias for the present' (see this press release for an example) is logical under many circumstances, as the future is uncertain. It doesn't really make sense in my case, because I've seen the work put into learning some new aspect of Perl, or R, or Linux (or any number of things) pay off time and time again.
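To make the arithmetic concrete, here's a tiny sketch using the standard hyperbolic model from the discounting literature (subjective value V = A / (1 + kD), with amount A, delay D, and a fitted 'impulsivity' parameter k); the value of k below is made up purely for illustration:

use strict;
use warnings;

# Hyperbolic discounting: V = A / (1 + k*D), where A is the reward
# amount, D the delay, and k a fitted 'impulsivity' parameter.
sub discounted_value {
    my ($amount, $delay_days, $k) = @_;
    return $amount / (1 + $k * $delay_days);
}

my $k     = 0.01;                             # hypothetical rate per day
my $now   = discounted_value(100, 0,  $k);    # $100.00
my $later = discounted_value(110, 30, $k);    # 110/1.3 = ~$84.62
printf "Take the \$100 now? %s\n", ($now > $later ? 'yes' : 'no');

With a high enough k, $110 a month from now is subjectively worth less than $100 today - which is exactly the trade I keep making with 'learning time'.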
I've resolved to be more judicious in my allocation of time towards improving my skills in areas where such improvement would benefit my work. One way to offset the present 'cost' of such self-instruction is to invest more of what would otherwise be 'free time' into it - this may be a good way to avoid the 'guilt' of feeling like I'm 'wasting time' on learning.
Don't get the wrong idea - I am in a field where learning is perpetual. What's important to keep in mind is that it often pays off to learn how to do something properly (read: efficiently) from the get-go.
1In my experience it's often obvious that there's a better way to do whatever it is I'm doing. When you're developing a new protocol or method, you're typically well aware that you're breaking new ground. Otherwise, there's nothing new under the sun.
Everyone who's involved in academia probably has some angst about seminars. At the very least, it's always difficult to figure out which talks to attend when you feel like you already don't have enough time to take care of the work that you have to do. I've seen quite a few seminars at this point, and as the years have gone by, I think I've become more conflicted about them than ever.
I think that there are obviously many 'purposes' behind seminars. For instance, an invited speaker from another institution can tell you about interesting work that, if not directly relevant to your own projects, could at the very least be 'inspirational'. Regardless of whether you're happy attending seminars for interest's sake (I've done it many times myself and would encourage it whenever possible), I think that there is a way that many of these talks could be made far more helpful - and this is where my conflict regarding seminars lies:
I don't understand academic seminars that are primarily focused on 'results' rather than methods. Of course we're all excited to tell people about interesting new results, but methods are fundamental to the practice of science, and I often find it more interesting to learn how results were obtained.
Some people are probably reading this thinking that I'm crazy: of course people tell you how they obtained their results! While in some fields I think this is true, I've been surprised by how often, in genomics work involving large and complex datasets, people gloss over the important details of their analyses. In fact, people will often give only the most cursory statement about how a figure was produced, ignoring complicating factors such as which subset of a larger dataset is actually being shown.
I'm most shocked by presentations of a genre that I've affectionately come to label the 'dump truck' method of presentation. This is the situation where someone plasters each slide with an extremely complex, poorly labelled figure and basically asks you to take their word for it that it's revealing some amazingly novel aspect of biology. The few times that I've seen extremely egregious versions of this kind of talk, the only thing that ran through my head was that the speaker really wanted to convince us that they were doing an amazing amount of work... somehow1.
Perhaps I'm naïve or simply missing something fundamental about how seminars 'work', but I get far more out of talks that are as much about how something was determined as they are about what was determined in the first place. This is particularly true in the case of 'informal' seminar series, such as meetings among labs interested in the same topic. In many results-focused seminars, I'd rather the speaker had presented half of the 'neato' things they'd done if it meant getting a better picture of how those things were done.