Words of Wisdom

"Evolutionary biology is not a story-telling exercise, and the goal of population genetics is not to be inspiring, but to be explanatory."

-Michael Lynch. 2007. Proc. Natl. Acad. Sci. USA. 104:8597-8604.

Cycling

mi (km) travelled: 4,969 (7,950).

mi (km) since last repair: 333 (532)

-----

Busted spoke (rear wheel): 4,636 mi
Snapped left pedal and replaced both: 4,057 mi
Routine replacement of brake pads: 3,272 mi
Routine replacement of both tires/tubes: 3,265 mi
Busted spoke (rear wheel): 2,200 mi
Flat tire when hit by car (front): 1,990 mi
Flat tire (front): 937 mi
Flat tire (rear): 183 mi


Entries in Technology (12)

Tuesday
Mar272012

Audiobook Club: Steve Jobs - Part 1...

Realizing that I never watch television anymore, I cancelled my Netflix subscription this week and took advantage of a free 30-day trial of Amazon's Audible.com. Audible is basically a modern version of 'Columbia House'1, where you pay $14.99/month to download 1 audiobook/month and receive 'discount' pricing (at least as compared to retailers) on additional books. As mentioned in my last post, I find myself unable to meet my listening needs (jogging, benchwork, commuting, etc.) with podcasts lately, so audiobooks seemed like a good idea.

The book I chose for my free trial is the unabridged version of Walter Isaacson's Steve Jobs (2011; Simon & Schuster), the massive bestseller that came out a few weeks after Jobs' death late last year. I knew that Jobs himself had asked Isaacson to write his biography and that the work wasn't intended as a cash-in on his death; however, I was still quite loath to read it when it came out. Like any celebrity death, the media's evangelizing of Jobs was so over the top that I felt as though I had to wait until the furor had died down. The audiobook is divided into 3 long (8-hour) parts, and since I can't make notes and mark pages, I've decided to blog about each part individually.

You may have heard about the psychopath theory of corporate CEOs. It's basically pop-psychology that suggests that the qualities typically associated with psychopathy (e.g., lack of empathy, narcissism, solipsism, etc.) are beneficial qualities in the cutthroat world of big business. The aspect of this theory that I find disturbing is that some people see psychopathic behavior among corporate executives as justifiable when stacked against their accomplishments2.

Steve Jobs is a particularly interesting individual on this count, because the first part of the book paints him as a man who a) had some very clever, very forward-thinking ideas at a young age and b) was a completely intolerable, unforgivable asshole. Specifically relevant to the pop-psychology theory outlined above, Jobs' 'psychopathy' was apparently so bad that it fundamentally harmed his company. While we can marvel at how impressive early Apple computers were, not all of his ideas and intuitions were successful.

Jobs' vision, from the beginning and up to his death, was that technology should be made available to the masses, and that in order to do so, ease of use and elegance should trump some functionality. 70s tech was intimidating to non-specialists, and companies of the era were catering to the hardcore. If nothing else, Jobs realized rather early that presentation and 'messaging' mattered. For instance, he would routinely ask his designers to implement features that, while not necessary, were 'cool' (such as the hovering, rounded windows on the original Mac OS). More often than not, journalists and customers would spend more time talking about the 'insanely awesome' superficial features of a product than about the detailed specs that obsessed tech geeks, and Jobs knew it. Jobs brought the idea of selling a concept or a lifestyle to electronics - a field dominated by tech heads.

 

The now-famous '1984' ad run by Apple to announce the Macintosh computer was a smash success, and was strongly championed by Jobs against the wishes of the other execs. It 'went viral' long before there was a YouTube and drummed up massive interest in the brand.

 

The flipside to all of these great ideas is that Jobs had such an abrasive personality that he frequently turned off investors, insulted clients and collaborators, and drove his own staff insane. He publicly insulted fellow employees and underplayed Apple products designed by 'rival' teams. It's sometimes admirable to say that someone is uncompromising in their vision, but Jobs failed to see that his desire to have control over every aspect of his system would make it more expensive and less 'open' than other brands. There were many competing, incompatible brands of PCs in 1984, and ultimately the open standards pioneered by IBM drove down costs and netted more customers, even though it was clear for many years that Apple's products were more 'impressive'. Furthermore, Jobs went against the wishes of Apple's execs and forbade them from licensing the Mac OS software to other hardware manufacturers, a decision that would hurt the company down the road when Microsoft proved that the money was in the software.

In the early days of Apple, Jobs' artistic and forward-thinking design achievements were very impressive, but so too were his managerial flaws. Business acumen was not his strength, and he probably had too much control over operations in the company's early days. Surprisingly, there's almost nothing positive said about the man as a person at all in the first third of the book, which makes it a bit of a dreary listen. The people who surround Jobs are so much more interesting as individuals than the man himself - if this is the real picture of Steve Jobs, I'm surprised that he bothered to seek out a biographer.

 

1I spoke too soon; apparently Columbia House is still in operation in the US.

2This could be an entire blog post on its own, but I've become acutely aware of how frequently people derive 'ought' from 'is' or alternatively, assume that others do so. Observation of a phenomenon is not equivalent to condoning it (or vice-versa).

Monday
Mar262012

Rant: Words from the 'Top Shelf' or [INSERT WORD HERE]gate...


"We go right for the top shelf with our words now. We don't think about how we talk; we just say the right-to-the-fucking... 'Dude it was amazing!' It was 'amazing'? Really, you were 'amazed'? You were 'amazed' by a basket of chicken wings? Really?"

-Louis CK, Hilarious. (listen to it here; Image here).

 

Is it just me or does it seem as though we've become incapable of discussing topics with any kind of verbal caution or reasoned analysis? Alright, perhaps it's a bit ironic that I'm beginning a post about hyperbole with such an incendiary statement, but when it comes to certain types of discussions, I'm beginning to feel that it's justified.

It seems as though there are way too many articles about perfectly normal situations written as though said situation had been taken to some insane theoretical extreme. If a popular company has a bad quarter, all of the headlines start with 'Is it the beginning of the end for [INSERT CORPORATION HERE]?'. Similarly, upon introduction of a new product, all headlines will read, 'Will product X kill [INSERT DOMINANT COMPANY]'s product Y?'. A political candidate makes a faux pas, and everyone immediately asks whether their campaign is 'DOOMED', and so on.

What kills me is the lack of empiricism underlying these statements. We have abundant experience telling us that if something seems too good to be true, it probably is ('true' being defined here in terms of what will generate the best news copy). Similarly, things are almost never as bad as they first seem. I understand that you have to sell magazines or clicks, and as Drew Curtis of fark.com pointed out in his book, there's a lot of incentive to manipulate headlines. But this sort of crazy exaggeration extends far beyond the headlines into the body of the work itself.

I think that one big part of it is the rise of news aggregator blogs, like Gawker's garbage, TMZ, or Yahoo (sorry, I won't provide links to this stuff). I've already complained before about how these places have abandoned journalistic ethics in order to solicit 'clicks', and they're certainly guilty of making mountains out of molehills in terms of headlines. But I don't think that the entire blame can be laid at their feet - it's a broader cultural phenomenon. I think we've all been somewhat seduced by media that uses hyperbole and often comical extremism to make points, but enough is enough.

As I've mentioned over and over again, I listen to a lot of podcasts. Well, I should say that I have listened to a lot of podcasts because I'm in a continuous process of culling my regular rotating list of shows. A lot of this is simply because most podcasts are not very good1. But I've also become very sensitive to the overuse of hyperbole in current event shows. Every single thing cannot possibly be one of the following:

a) The best thing ever.
b) The worst thing ever.
c) Revolutionary. (or is that 'Resolutionary'?) 
d) The death of modern [INSERT CONCEPT HERE]2.
e) A harbinger of the apocalypse.
f) [INSERT WORD HERE]gate.

You know what? I think that we are basically living in the midst of [INSERT WORD HERE]gate, wherein every random stupid thing that happens is directly equated to a national scandal that shook the foundations of people's confidence in democratically elected government. Remember the problems with the iPhone 4 antenna? Antennagate. Or how about 'Nipplegate'? I should point out that the above link to the list of 'gate' controversies is nowhere near exhaustive in terms of how often I've seen/heard this used on blogs and podcasts.

As Louis CK asks in the bit quoted above, where do we go from here? We're already describing banality with linguistic extremes, so how do we describe events that are legitimate outliers? I don't think that this is the end of the world or that it's all downhill from here (though that's what the subjects of this rant would probably say themselves). Rather, it's simply that I'm being pushed towards a more professional style of communication - one not provided by every random source - and I'm willing to pay for it.

A little reasoned analysis would be appreciated, even if it's notoriously difficult to argue the middle ground.

 

1The best podcasts are those that learn from and apply the techniques that make good radio. There has to be good chemistry among the hosts/participants, good quality to the recording and production, and interesting topics. Some people have the 'knack' while others do not, unfortunately. It's very much an Anna Karenina problem in that an otherwise great podcast can be ruined for me if people are continuously being dropped by Skype, or one host obviously has no idea what they're talking about, or if people on the 'cast are obviously distracted and doing other things while recording. It's tough to judge whether a cast will be decent before trying it out, because sometimes the most amateur (in the literal sense) efforts are amazing, while big professional productions fall utterly flat and vice-versa.

2As an example of 'd', may I present exhibit A: Katy Perry unfriends Russell Brand on Twitter. Heralded as the death of journalism.

Sunday
Mar112012

People Are Crazy...

This is a follow-up to my post from a few days ago about hardware and software. When I wrote that post, I brought up a contrast between Apple's and Google's design philosophies. This just happened to be a convenient example, as I could snap screenshots of the two apps for the purpose of illustration. Unfortunately, I realized after rereading the post that the point I was trying to make with the comparison was muddled (that's what happens when you try to hammer out a post as quickly as possible), but I've also realized that it was a bad comparison in general. This is because people are crazy.

First, some background: I've been a pretty committed podcast consumer for ~5 years now: I listen to them while jogging, at the gym, doing repetitive benchwork, commuting, etc. I like to have a mix of 'news' casts to stay informed (e.g., NPR) as well as educational casts to learn interesting factoids (e.g., Econtalk, Stuff You Should Know, Caustic Soda). Recently, I've also started listening to a few different 'technology' podcasts focusing on news and reviews of the most recent products in computers, gadgets, and media. I was recommended a few new casts, specifically Tech News Today and Build and Analyze, so I listened to a couple of episodes of each (I'd recommend the former and eschew the latter, by the way).

I've noticed a common thread among all of the tech podcasts that, when you step back and think about it, is very weird: almost every time certain companies or products are brought up in any context, the discussion is either preceded or followed by extensive apologies to the listeners for having to bring up said company, coupled with defenses against 'bias' and 'selling out'. I'm serious: think of how mind-blowing it is that when a show's hosts discuss a new Android phone, pointing out its relative merits and flaws, they literally have to spend a minute or more apologizing for their opinions and defending them as being based on 'facts' and not because they 'love Apple'. Alternatively, discussion of the new iPad announcement is preceded by an apology to all of the people who 'hate to hear about Apple' as well as the suggestion that they 'scrub forward' a few minutes into the show to get back to the rest of the tech news.

 

Take 5 seconds to think about what this image means.

Seriously, people are crazy. Admittedly, why they get 'upset' about the products that they don't own is obvious: investing money in a brand is a reflection of your personal opinion and thus a statement of your 'values' (or whatever). Criticism of the brand that you chose (or, more ludicrously, praise of that brand's competitors) becomes internalized as direct criticism of your own choices and/or values. This is insane. Are we so thin-skinned and criticism-averse that we get upset when someone suggests that what we bought isn't perfect? Is it reasonable to throw tantrums when people simply point out facts1?

Now, I'm not completely unsympathetic to these, umm, passions. I can understand the behavior in kids, for example - I certainly fell into the trap myself. This is because purchasing decisions are arguably more permanent among kids, who only receive big-ticket items on rare occasions. If I decided to ask for a particular brand of PC for my birthday, only to find out a little later that a much better competitor was released, it's likely that I'm stuck with my decision for some time. Every time some tech podcast host points out how much better the new model is, it may cause my child-self to get 'defensive'.

Therefore, it strikes me that these podcast hosts are essentially wasting time and effort trying to appease 12-year-olds2. The alternative possibility - that grown adults are 'losing their s$%t' over comparisons of the number of pixels between two random cellular phones - is too terrifying to contemplate.

 

 

1I'm dead serious. You should listen to letters written to tech blogs in response to their listing of sales figures. Clearly, pointing out that iPads outsell Android tablets by some margin is 'bias'.

2There's another problematic issue here that could be an entire post on its own and can be thought of as the 'vocal minority problem' in tech and entertainment. The people who write in to complain about 'bias against Android/Apple' are a very specific crowd. They're certainly not representative of the market at large, and it's not even clear that they're representative of the total listenership of the podcasts that they're writing in to complain about. Therefore, podcasts that cater their discussion to the wants of this crowd become strangely 'out of touch' with what's happening in the market at large. It's fine to have a podcast that discusses HP products exclusively (if that's your thing), but no one would expect to get insight into broader market trends from such a limited focus. The people who only ever want to hear good things said about their chosen products should find appropriately skewed shows.

 

Wednesday
Mar072012

Soft- vs. Hard(ware)...

I've now been a Mac user for 3 years. Before buying into the Apple camp, I used Linux for computational work and Windows at home. My experiences with Apple products had typically been negative (why can't I properly maximize my windows?!), and I had a philosophical opposition to their 'closed' ecosystem1. Then one day, based partially on the recommendations of co-workers, but also somewhat on a whim, I bought a MacBook and became an Apple fan.

I still have a PC for playing the occasional, umm... videogame, but all of my work and most of my 'productivity' applications (such as writing this blog post) are done on my Mac. The reason is simple: I really enjoy using Apple's software. What all of my Mac-hating, Windows-using friends have always referred to as 'dumbed-down' interfaces, I now see as intuitive. There's a certain elegance and consistency of design among software on the Mac, such that you can immediately figure out how to do things on an application that you've never used before. As an example, take these screen grabs from two relatively similar pieces of software:

 

The first program is Apple's iPhoto.

 

The second is Google's Picasa 3.

Just looking at these two screen grabs, it's pretty clear to me that iPhoto is more sparse: there's less clutter on the screen, fewer information boxes, and fewer buttons, sliders, and toggles. Picasa has more functionality, but (and all of this is just my opinion, of course) that functionality is realized by continuously bolting more features (i.e., buttons and sliders) onto an already clunky interface. iPhoto is also arguably more 'aesthetically pleasing', but again, that's (common) personal opinion.

There are arguments for both views - shiny interfaces are often undesirable when functionality is at a premium, after all. But I think that this specific example illustrates a larger problem: why is there so often a disconnect between the quality of the hardware we use and the quality of the software running it? Or, for that matter, why do companies that produce very functional software (Picasa) often put so little stock into developing their user experience and aesthetics?

Another example: I got an HP printer-scanner combo last year with my new laptop, and while it produces great scans, the software that came with it is ABYSMAL. There are freeware programs that are so much better. You could repeat this example ad nauseam with various products that have come out over the years: digital cameras, cell phones, MP3 players, etc. Why does their in-box software suck so badly2?

Part of Apple's recent success is undoubtedly due to their concerted effort to improve both the quality of their hardware and their software. This is particularly evident on iOS devices, where the ecosystem encourages uniformity of design such that even very different apps 'feel' the same.

So I guess the real question comes down to that philosophical opposition discussed at the beginning of this post: 'freedom' seems intuitively better in some moral sense, but what if the 'walled garden' approach actually produces better and more stable systems? I'd rather have both options available, but now that computing is mainstream rather than the domain of basement-dwelling nerds, is expecting everyone to 'figure every new program out' reasonable? Can Apple be thought of as simply setting 'standards', something that can enhance a market in situations where too much competition leads to consumer confusion? 

 

 

1It's debatable how closed the ecosystem of software has been on the Mac/Macbook side in the past, but it's become more closed with the inclusion of the App store on desktops. The new version of the OS coming out this year may close it off a bit more. On the mobile side (iPhone/iPad), the ecosystem is pretty much as closed as can be: You can't install anything not authorized by Apple without hacking the device. 

2We can always say that these are often hardware companies and that software is a secondary consideration, I suppose. But someone, somewhere must've looked at some of these programs and asked, 'Are you serious?' The software for my old Sony mini-disc player felt like it had been made in a high-school programming course.

Wednesday
Nov232011

The Future is Now...

As mentioned in a previous post, I've been listening to some 'tech' podcasts recently. It was on these podcasts that I first heard about 'Siri', the cloud-based voice recognition software that Apple acquired a few years ago and made available in its current incarnation on the iPhone 4S (Apple's official, perhaps somewhat idealized, advertisement video can be found here). Having been very disappointed with previous voice recognition software1, I didn't really pay much attention to it.

Then I was biking home from work, listening to some music through my bluetooth headset, and thought I'd give the system a try. My phone sits in a pocket on my biking jacket, and activating Siri requires only holding down the iPhone's 'home' button for a few seconds. I tried a bunch of different ways of saying things, and while I got some odd results at first, I began to figure out things that work consistently.

For example, I can say 'shuffle playlist jogging' and it'll start playing that particular playlist. 'Next song' skips ahead, 'pause/resume music' do just that, etc. You can say things like 'Play album Thank You' or 'Play podcast Giant Bomb' and it'll actually do it. Furthermore, if I get a notification tone while biking, I can say 'Notifications' and it'll read off any new emails, text messages, tweets, etc. I've received. Maybe I'm easily impressed, but this is pretty Star Trek to me2. Oh, and is there anything better than a lab timer that you can talk to? ('Set timer, 45 minutes').
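The pattern behind these commands - a fixed phrase prefix followed by a free-form argument - can be sketched as a tiny dispatcher. To be clear, this is a toy illustration of the general idea, not Apple's actual implementation; every function and command name here is made up:

```python
# Toy phrase-to-action dispatcher, in the spirit of 'shuffle playlist jogging'
# or 'set timer, 45 minutes'. All names are hypothetical illustrations.

def play_playlist(name):
    return f"shuffling playlist '{name}'"

def next_song(_):
    return "skipping to next song"

def set_timer(spec):
    return f"timer set for {spec}"

COMMANDS = {
    "shuffle playlist": play_playlist,
    "next song": next_song,
    "set timer": set_timer,
}

def dispatch(utterance):
    """Match the longest known command prefix; pass the rest as the argument."""
    text = utterance.lower().strip()
    for prefix in sorted(COMMANDS, key=len, reverse=True):
        if text.startswith(prefix):
            arg = text[len(prefix):].strip(" ,.")
            return COMMANDS[prefix](arg)
    return "sorry, I didn't catch that"

print(dispatch("Shuffle playlist jogging"))   # -> shuffling playlist 'jogging'
print(dispatch("Set timer, 45 minutes"))      # -> timer set for 45 minutes
```

The longest-prefix match matters so that an utterance like 'set timer, 45 minutes' isn't accidentally swallowed by a shorter, overlapping command.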

The one area where Siri isn't so great, at least so far, is in taking dictation. That's not exactly fair - it's quite accurate if you stick to common, well-defined words. However, for a scientist like me who uses a lot of non-standard verbiage, Siri's not particularly useful; it does incorporate some good ideas, though. For one, it highlights all of the words that it detects as ambiguous - this includes both misunderstood words and homonyms. You can then click on them and select from a list of best guesses or edit them yourself. This is not particularly useful as a hands-free feature, of course. I can see some utility in being able to fire off quick acknowledgement replies to emails or text messages, but I doubt that many screenplays will be written by dictation.
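The tap-to-correct idea boils down to the recognizer returning, for each word, a confidence score plus alternative guesses, and the UI flagging the low-confidence ones. A minimal sketch of that data flow (the structure, scores, and threshold here are invented for illustration, not Siri's real internals):

```python
# Dictation output as tokens carrying confidence and alternative guesses.
# Values are made up; a real recognizer would supply them per word.
transcript = [
    {"word": "the",    "confidence": 0.99, "alternatives": []},
    {"word": "sell",   "confidence": 0.55, "alternatives": ["cell"]},
    {"word": "lysate", "confidence": 0.40, "alternatives": ["lie sate"]},
]

def render(tokens, threshold=0.8):
    """Show low-confidence words bracketed with their alternatives,
    mimicking a 'tap to pick the right word' UI in plain text."""
    out = []
    for t in tokens:
        if t["confidence"] < threshold and t["alternatives"]:
            out.append(f"[{t['word']}|{'/'.join(t['alternatives'])}]")
        else:
            out.append(t["word"])
    return " ".join(out)

print(render(transcript))   # -> the [sell|cell] [lysate|lie sate]
```

Homonyms ('sell'/'cell') and out-of-vocabulary jargon ('lysate') both surface the same way, which is exactly why lab-speak trips the system up.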

It's not clear from the Apple website, nor from the dreaded Wikipedia, how Siri works, but according to the Tested.com podcast, the system actually offloads data-processing to Apple's 'cloud' servers, where the heavy crunching is done. The results are then sent back as text to your phone (in practice this takes a matter of seconds)3. If this is the case, it's an interesting glimpse into the future of processing, where your PC will be little more than a box connected to the internet and the heavy lifting will all be done on servers.
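The client/server split described above amounts to a simple round trip: capture audio locally, ship it off for recognition, get text back. Here's a bare-bones sketch of that flow; the server function is a stand-in (in reality this would be a network request to Apple's servers, whose protocol is not public):

```python
# Sketch of thin-client speech recognition: the 'phone' does no speech
# processing itself, only a request/response round trip. The server here
# is faked locally; the real endpoint and payload format are unknown.

def cloud_recognize(audio_bytes):
    # Stand-in for the server-side heavy crunching (acoustic model,
    # language model, etc.). Returns recognized text plus a confidence.
    return {"text": "set timer 45 minutes", "confidence": 0.92}

def phone_round_trip(audio_bytes):
    # In a real system this would be an HTTPS POST over the data
    # connection; here we call the stand-in directly to show the flow.
    response = cloud_recognize(audio_bytes)
    return response["text"]

print(phone_round_trip(b"\x00\x01"))   # -> set timer 45 minutes
```

One consequence of this design falls straight out of the sketch: no data connection, no recognition, which is why the footnote below about needing connectivity just to play music seems odd.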

Stuff like this makes me think that current/future generations take technological progress for granted. In our parents' first 30 years of life, they went from what: AM to FM radio? Teletype to fax machine? Mimeograph to photocopier? Frustrating wheels to power steering? I'm exaggerating a bit, but it's difficult to overstate that we've gone from the Sears Wishbook of my youth to buying stuff on Amazon.com over 3G wireless internet on our cell phones, or from looking stuff up in a 12-volume encyclopedia to searching libraries of text on Google.

I challenge you to go back and read sci-fi books from the 70s. In much of the imagination of those authors, we're already living hundreds of years into the future4.

 

1Did anyone ever try Dragon NaturallySpeaking back in the day? This was a piece of software that would allegedly allow you to dictate Microsoft Word documents. I tried demos of a few versions and remember being stunned by how poorly the software worked - it's not exactly useful if I have to go back and correct the dictation by hand every 5 or 6 words.

2Incidentally, the pocket I use for my phone happens to be on the upper-left side of my chest. Thus activating Siri basically involves me 'tapping' a virtual communicator badge thingie...

3This explanation seems a bit weird, as it means that you'd require an active data connection to tell the phone to play music. It may, however, only do such cloud processing for internet-related searches and not for on-device lookups. This cloud arrangement may also explain why the system doesn't work on previous iPhones, even when 'hacked in' - Apple can tell the phone's model when the servers are accessed and can thus reject the data.

4For instance, in the classic sci-fi book The Mote in God's Eye (1974), the authors, Larry Niven and Jerry Pournelle, predict that by the year 3000 we'd have invented faster-than-light travel, yet we'd still lack wireless networking as well as contemporary laptop computers. The sci-fi novels that impress me are the rare ones that really 'nail' it. I'd still rank Neuromancer (1984) among the finest in this tradition, as well as the odd, yet interesting, Snow Crash (1992). This being said, I'm not the most well-versed in classic sci-fi. I'm always looking for recommendations, though!