Last year I bit…

I got a Mac.

The iPod wasn’t the gateway drug for me. I was buying a new computer and wanted a high-end laptop to run the big nasty software I need for my research on the go. A friend suggested I get a 17″ MacBook Pro. I’d been rather hesitant about Apple for a while due to things they’d been doing back in the Steve Jobs interregnum, but my friend—who knows and cares a lot more about such things than I do—was persuasive, so I figured I’d spend grant money as he said. Costing it out, it wasn’t a bad deal. Apples may be expensive, but feature for feature they are competitive on price. The difference is that Apple simply doesn’t sell the low end (under $1,000), but I wasn’t looking for anything like that.

I’ve long disliked Windows, and I’m sure I’m not alone. Does anyone really like Windows? There are some nice things about it, but its countless irritations rapidly overwhelm whatever good feelings one might have had. However, I am stuck needing Windows because a lot of the software I need exists only on Windows, kind of an inverse of what kept the Mac platform alive during the 1990s, when multimedia people needed to run things like Photoshop and the only really good platform for that was the Mac. In my case the software is scientific and statistical analysis software. Numerical integration, nonlinear optimization, 3D graphics, big data files, etc., all really like a powerful machine, for exactly the same reasons multimedia work does: floating-point calculation and big data files. Unlike multimedia software, though, most of these programs are written for Windows. Now that the market is trending toward Apple having a much larger share than “pathetic,” particularly at the medium to high end where the miserable failure of Vista has left a gap, the software vendors are starting to trend back too, but it will be a while before I can run everything I need.

I’m not a classic stereotypical Mac user. Profession aside (hardly diagnostic, believe me), I’m not a Whole Foods shopping, latte-sipping hipster. I listen to music that—while often off the beaten path—is generally twenty or thirty years old. I dislike nouvelle cuisine, have middlebrow taste in movies and TV (favorites: police procedurals, detective shows, historical dramas and nonfiction) and reading (mostly nonfiction or historical novels). In short, I’m pretty skeptical of things bobo. I am, sadly, spiteful enough to understand anti-Obama votes that come from the same basic motive (as opposed to genuine motives, whatever those are): a defiant desire to crank some good old-fashioned headbanger rock rather than hear the pathetic wailings of the latest wretched indie rocker that none of your friends have heard of quite yet, or a desire to avoid Apple products because of the jackoff Apple-is-my-life advocates on the intarweb.

So what is it that I like about the new Apples? The ideal OS, to me, is very much unlike the Mac-as-lifestyle marketing: in a nutshell, the less I have to acknowledge its existence, the happier I am with it. OS/X comes as close as anything I’ve found in two decades of heavy computer use, in which I spent a lot of time on DOS, Windows 3.x, OS/2, Windows NT/2000/XP, and Unix of various flavors, as well as the Mac back in the old days, which was obnoxious. Linux isn’t really an option for me—I’d have to do too much sysadmining, which means I’d have to know stuff about the OS, ergo be aware of its existence; it also doesn’t easily run the apps I need. For me, Linux is only free if I don’t value my time.

OS/X is not perfect: it has a few annoying quirks and I don’t like Mac keyboard layouts, but otherwise it meets my ideal because, 99% of the time, I do absolutely nothing with it but run the apps I want. I may be fighting with them (this means you, Office 2008), but that’s not Apple’s fault. Mostly I don’t think about the OS at all, with the occasional exception when Apple Software Update wants me to type in my admin password or I need to change some setting or another, a task which is similarly refreshingly easy. Unlike Windows Update, Apple Software Update is very much of a piece with the rest. It does its thing—after asking permission—and goes away. It’s not an “adventure” and it doesn’t leave its crap on the hard drive like a bunch of sloppy workmen who abandon their take-out wrappers and track mud on your carpet after fixing the bathroom. The Intel Macs were a brilliant idea and are what pushed me over the edge. People like me who have a fair number of Windows-only applications to run can do so with minimal fuss under Parallels or VMware—and, since Windows in that case is just another program, when virtual Windows blue-screens, it just gets killed like any other hung application. Sweet.

The transformation from Disneyland (OS/9) to libertarian paternalism (OS/X) is an amazing shift of philosophy. XP was bad enough, but Vista, from what I hear, has become downright Disneyland-totalitarian. That was a bullet Apple dodged ten years ago, when the original in-house design for OS/X (Copland), which sounds downright Vista-ish, died under its own weight and Apple brought in OPENSTEP as a replacement, along with Steve Jobs, who returned as CEO.

A few random points to conclude:

The Apple Stores look like absolute chaos inside, but I will give them this: they are efficient. They may hire body-pierced twentysomethings, but they don’t seem to put up with much BS from them.

Oh, in case you’ve not seen it, here’s an updated version of the classic “if your OS was an airline.”


ObFascismTag: With OS/X I am living in New Hampshire rather than Mussolini’s Italy. 😛


In case you missed my last article detailing my passionate hatred for the latest bit of consumer stupidity, known as hybrid automobiles, I’m back with a sequel piece. Here I’ll be employing the mighty power known as algebra to explain why buying a hybrid automobile makes no economic sense, except in what would best be classified as a nightmare scenario. To illustrate this, I want to compare the Toyota Prius with (in my opinion, at least) one of the best inexpensive cars on the market, the Toyota Corolla. The cars are of similar size and equal seating capacity (five). So what makes the humble Corolla more than a match for the mighty Prius from an economic perspective? The answer is simple: cost.

Consider the following vital stats about the two automobiles:

Car            City MPG    Highway MPG    MSRP Range
2008 Prius     48          45             $21,100 – $23,370
2008 Corolla   28          37             $14,405 – $16,415

According to Uncle Sam, it’s plausible to assume that the average driver puts about 15k miles on a car every year. Likewise, according to the DOT, the average American keeps their car about 4.5 years. That’s not a whole lot of time to recover the cost of the vastly more expensive Prius. But with the price of gas these days, it has to be a good deal, right? Wrong.

Assuming that the cars depreciate at an equal rate (or you just crash them into a tree and get nothing from your insurance company) and that inflation (now pushing 4% per year) drops to zero, here’s how long you have to own the Prius before it becomes the cheaper choice, as a function of the price of gas (considering only the lowest-end model of each car):

Price of gasoline (per gallon)       $2.00   $3.26   $5.00   $8.00   $10.00
Years until the Prius costs less     22.8    14.1    9.1     5.7     4.6
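If you want to check the arithmetic yourself, here’s a minimal break-even sketch in Python. The exact model behind the table isn’t spelled out, so the sketch makes some assumptions of mine: 15,000 miles per year split evenly between city and highway driving (making the combined MPG the harmonic mean of the two figures), the base MSRP of each car, and a constant gas price. With those assumptions it lands within a couple tenths of a year of the figures above.

```python
# Break-even sketch: years of fuel savings needed to repay the Prius premium.
# Assumptions (mine, not spelled out in the post): 15,000 miles/year, an even
# city/highway split, base MSRPs, and a constant gas price.

MILES_PER_YEAR = 15_000
PREMIUM = 21_100 - 14_405            # Prius base MSRP minus Corolla base MSRP

def combined_mpg(city, highway):
    """Equal miles in city and on highway -> harmonic mean of the two MPGs."""
    return 2 / (1 / city + 1 / highway)

PRIUS_MPG = combined_mpg(48, 45)     # about 46.5
COROLLA_MPG = combined_mpg(28, 37)   # about 31.9

def breakeven_years(gas_price):
    """Years until accumulated fuel savings equal the purchase premium."""
    gallons_saved = MILES_PER_YEAR / COROLLA_MPG - MILES_PER_YEAR / PRIUS_MPG
    return PREMIUM / (gas_price * gallons_saved)

for price in (2.00, 3.26, 5.00, 8.00, 10.00):
    print(f"${price:.2f}/gal -> {breakeven_years(price):.1f} years")
```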

So for a Prius to be more economically sensible for the “average” American, gas has to cost $10.00 a gallon. And this assumes 0% inflation. The numbers get worse when you factor in a 3% inflation rate. Assume that the gas price listed is the price today and that the cost of gas rises in line with the 3% inflation rate (Ben Bernanke and I are both being hopeful). Then the crossover point looks like this:

Price of gasoline (per gallon)       $2.00   $3.26   $5.00   $8.00   $10.00
Years until the Prius costs less     36      18      11      6       5
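Again, the post doesn’t show its work, so here’s one model that lands within about a year of each figure above (my assumption, not necessarily how the table was computed): let the dollars not spent on the premium compound at the 3% rate as an opportunity cost, let the annual fuel savings grow at 3% as well, and solve for when the accumulated savings overtake the compounding premium.

```python
import math

# Constants repeated from the previous sketch.
MILES_PER_YEAR = 15_000
PREMIUM = 21_100 - 14_405
PRIUS_MPG = 2 / (1 / 48 + 1 / 45)
COROLLA_MPG = 2 / (1 / 28 + 1 / 37)
RATE = 0.03                          # assumed inflation / opportunity-cost rate

def breakeven_years_with_inflation(gas_price):
    """Solve sum_{k=1..t} S*(1+r)^k >= PREMIUM*(1+r)^t for t."""
    s = gas_price * (MILES_PER_YEAR / COROLLA_MPG - MILES_PER_YEAR / PRIUS_MPG)
    c = s * (1 + RATE) / RATE        # geometric-series coefficient
    if c <= PREMIUM:
        return math.inf              # savings never catch the compounding premium
    return math.log(c / (c - PREMIUM)) / math.log(1 + RATE)

for price in (2.00, 3.26, 5.00, 8.00, 10.00):
    print(f"${price:.2f}/gal -> {breakeven_years_with_inflation(price):.1f} years")
```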

So unless we assume that gas prices are going to head up significantly faster than the inflation rate, it’ll still take $10.00-per-gallon gasoline to make the mighty Prius cost-competitive with the humble Corolla. Perhaps that’s something to think about the next time you head to visit the Toyota dealer…

A bit of background: I work in the midst of a slew of professional artists who, like most artists, cover their work areas in artwork of varying quality and propriety. The walls of the office are saturated with artwork ranging from pencil sketches to internationally renowned masterpieces. As with anyone around a wide variety of anything, a few of the pieces I find extremely irritating and patently inappropriate. However, being a reasonable person, I go about my day and get my work done. Little things like that don’t ruin my equilibrium because, being an adult, I’ve learned that not everything goes my way, and I save my effort for the important fights.

That being said, there is one place in the building with an ongoing comic strip that is written and drawn entirely by software developers, not artists. Obviously, the quality of the artwork pales in comparison with the best stuff made by the pros in the building, but it’s hardly the worst thing decorating a wall (that honor typically belongs to newspaper comics). I’m a big believer that good artwork doesn’t have to be complicated, especially not good comics.

After the strip had been up in its location for several years, with the latest episode posted for over a year (it’s not the world’s most prolific comic team — they’re busy writing software to support the artists, after all), someone complained about the handgun in one panel, and now it’s all been taken down. That strip was the one exposure the software staff’s artistic ability had in the building, next to the hundreds of thousands of pieces from the art group, and it was ruining some pitiful whiner’s day badly enough to get it canceled.

Now, if someone has some serious gun trauma in their background, I can understand that they might not like reminders of the violence, but the company’s current primary project involves elements including a helicopter gunship and missile-firing motorcycles. Missile. Firing. Motorcycles. Good thing the pencil sketch with a handgun in it got removed. Someone was almost in danger there… might give someone ideas…

Today it trickled down to me that the official reason given for removing it was that the quality of the artwork was too low. That “you can’t make good art if you look at bad art”. Seriously. That’s the reason they gave. Now, being a logical sort of guy, I’m puzzled how people who believe that can expect to ever create the world’s most amazing artwork in their field — which is their stated goal. A motto like that means that you can’t ever be the best — what artwork on the wall would inspire you to create something that the world has never seen? Wouldn’t any existing artwork only serve to “bring you down”?

Continuing along that illogical train of thought, the new insistence is that the space be filled with artwork from previous company projects. Now, that’s even worse if you’re so dependent on that magical space for inspiration to new world-beating heights: you’ll only be looking at stuff you’ve seen before, not anything that makes you think of anything new. And at the end of the day, all you’ve managed to do is squash a whimsical bit of entertainment from folks who are typically constrained in their tasks.

And another bit of fun dies in the name of political correctness.

It’s the sense of entitlement with which it was done that really gets to me, though. If I were the vindictive type, and since some amount of control over what now appears in that space falls to me, I might be tempted to take advantage of the situation, given that it evidently affects the artists’ performance so critically…

Angryman Challenge Problem No. 1

On the 8th of August, 1900, David Hilbert presented a set of ten problems at a conference at the Sorbonne in Paris. While anyone can throw out ten unsolved problems, Hilbert’s problems influenced much of 20th-century mathematics and seeded entire new lines of work; Gödel’s incompleteness theorems, to name one example, grew out of his second problem. The interesting aspect is that one man (Hilbert) was so in tune with mathematics that his problem set drove much of the mathematical work of the century. [There were 23 problems in all, but only ten were presented at the Sorbonne.]

A century later, in 2000, the Clay Mathematics Institute initiated the Millennium Prize, which, unlike Hilbert’s problems, comes with a $1MM prize for each solution. Interestingly, the Millennium Prize problems still contain the Riemann Hypothesis, one of Hilbert’s original questions.

As a member of the Angry Man technorati, I am continually impressed by the readership of our humble blog, as well as by the stream of email from my fellow Angry Men; and while I would not put myself in the same class as David Hilbert, I have been involved with technology for much of my life. As such, I feel inclined to throw out a few problems of my own. So, all you slashdottirs and sons, I present the AMB Challenge: the initial problem.

Search engines incorporate various algorithms to index and identify web content. Much of this indexing is handled by ‘robots’ and ‘crawlers’ that follow links. Google, Ask Jeeves, and Dogpile are really pretty good once you train yourself to ask the right questions (and ignore the first three sponsored answers). But do you ever feel something is missing?

The first challenge problem is to design a system that can be integrated into search engines to identify and return links to images. A browser plug-in would allow the user to upload an image, or a frame of a movie, for immediate analysis as a search key. The search engine would return content containing that specific image, whether in a static image file (.TIFF, .JPG, .PNG, .GIF, etc.) or incorporated into a movie (.MPG, .MOV, etc.). This would presumably be accomplished by indexing the web for images. As crawlers identify known image formats, an analysis would be performed and a compact representation of the image content would be stored with the link. The submitted image would similarly be reduced to a representation and used as the search key, returning the links, and perhaps thumbnails, associated with the close matches. Users could specify match thresholds in preferences.
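As one minimal illustration of the “compact representation” idea (my sketch, not part of the challenge itself), here is an average hash: every image is boiled down to a 64-bit fingerprint, and near-matches are found by Hamming distance between fingerprints. A real system would need something far more robust to cropping, scaling, and re-encoding, but the shape of the pipeline is the same. The sketch uses the Pillow imaging library.

```python
from PIL import Image

def average_hash(path, hash_size=8):
    """Reduce an image to a 64-bit fingerprint (an 'average hash')."""
    # Shrink and desaturate: throw away detail, keep the gross structure.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # One bit per pixel: brighter or darker than the mean.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > avg)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A crawler would store (fingerprint, URL) pairs in its index; a query image
# is hashed the same way and matched within a user-configurable threshold.
def search(index, query_path, threshold=10):
    q = average_hash(query_path)
    return [url for fp, url in index if hamming(fp, q) <= threshold]
```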

Consider a few applications. One might want to load an image of your girlfriend and return all links to any on-line content with her visage, such as MySpace, FaceBook, group photos posted by organizations, etc. Or as a fellow Angryman quipped “It would revolutionize the porn industry. You could search for your particular preference: a blond and two frogs.”

More seriously, trademark and service mark protections depend for their validity upon aggressive defense by the owner. McDonald’s Corporation hires law firms to search the Internet for misuse of the “golden arches” — much like the misuse parodied in Eddie Murphy’s “Coming to America,” where the McDowell’s restaurant is adorned with a set of somewhat similar arches.

Or consider all of that CCTV imagery collected in the streets of London. If the process can be used online to index web content, it can certainly be used to index stored video content. The national security implications are staggering.

Companies already invested in video web content (YouTube?) would have a vested interest in developing this technology. Competing companies (in Redmond, say) would have a powerful incentive to come up with a technology that would prevent certain search companies from attaining a 100% share of the search engine (and ad revenue) market.

I mention these few applications only in passing to note that whoever develops this technology (i.e., solves the challenge problem) will probably not have to worry about rising gasoline prices. While we at 12 Angry Men would not be able to match the Millennium Prize amount, it would not be unreasonable (hint, hint) for Microsoft or Google to cough up, say, $10MM for the winner. The NSA and the intellectual property lawyers are said to have money as well. We will throw in a free beer at the Man Lunch, provided the winner is local.

The current state of the art seems to be based on tags assigned to images by the provider or poster. That automated image search is still an immature field is evidenced by Google’s attempt to entice users into participating in tagging. Polar Rose is another attempt at making a user-friendly browser image tagger. To be truly useful, the search must analyze the image on the basis of its content, not its associated metadata. As a start on the problem, the following thoughts are offered:

  • The compact representation of the image must be generated in small polynomial time
  • “Compact” means very small compared to the size of the original image
  • The representational form may be equivalent to solving eigenvalue problems of high dimension
  • Formats need to be expanded to a common form for analysis
  • The search comparison will likely be multidimensional

I looked at this from the point of view that many images are self-similar and therefore subject to fractal compression techniques like Michael Barnsley’s Iterated Function Systems (IFS). In one iteration experiment, we were able to create a remarkably accurate coastline of Australia using five line segments and a set of iteration coefficients — extremely compact compared to the point set describing the continent. Unfortunately, while the rendering could be done in linear time, the cost of deriving the coefficients (the heart of the problem under consideration here) was high. Wavelet approximation also has some potential. Add to this the observation that it might be more efficient to classify images first, before trying to generate a compact representation. Kris Woodbeck is reported to have a process similar to the way humans process images, but a quick read suggests it’s more along the lines of a classifier than a reduction of the image to a searchable representation.
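To make that asymmetry concrete, here’s a minimal chaos-game renderer for Barnsley’s published fern IFS (a standard textbook example; I don’t have the coefficients from the Australia experiment described above). Rendering from known coefficients is linear in the number of points plotted; recovering a small set of maps like these from an arbitrary image is the expensive inverse problem.

```python
import random

# Barnsley's fern: four affine maps x' = ax + by + e, y' = cx + dy + f,
# each applied with probability p. A handful of numbers encodes the image.
FERN = [
    # (a,     b,     c,     d,    e,    f,    p)
    ( 0.00,  0.00,  0.00,  0.16, 0.00, 0.00, 0.01),
    ( 0.85,  0.04, -0.04,  0.85, 0.00, 1.60, 0.85),
    ( 0.20, -0.26,  0.23,  0.22, 0.00, 1.60, 0.07),
    (-0.15,  0.28,  0.26,  0.24, 0.00, 0.44, 0.07),
]
WEIGHTS = [m[6] for m in FERN]

def chaos_game(n_points=100_000):
    """Render the attractor: one random map application per point plotted."""
    x = y = 0.0
    points = []
    for _ in range(n_points):
        a, b, c, d, e, f, _p = random.choices(FERN, weights=WEIGHTS)[0]
        x, y = a * x + b * y + e, c * x + d * y + f
        points.append((x, y))
    return points
```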

Any such process must run in two places: on the server side, when the browser submits the search-key image, and in the crawler’s image-indexing pipeline. So far the human eye-brain system is the only process that comes close to doing this.

Good luck.

Occasionally on The 12 Angry Men we will post rants from invited guests. In lieu of our normally scheduled segment, today we feature an invited rant from an Angry Guest Woman. You may remember our current guest from her previous appearance, when she ranted about poor service and tipping. – The Staff of The 12 Angry Men

My company, like many others, decided a few years ago to outsource all IT-related work in an effort to “save money” and have more “effective” business practices by limiting the people who worked on our IT systems to “specialists.” Of course, in reality, we ended up with the opposite situation.

Sure, there were the comical incidents associated with the initial setup. Like the time I ordered my first Linux box through the new IT contract. It arrived, carried by a teenaged-looking guy with slicked-back hair, wearing chains (presumably required to keep his pants covering the bottom half of his boxer shorts), whose cologne I started to smell about 10 minutes before he entered my building. He had a set of Linux CDs in his hands and absolutely no clue how to use them. I ended up giving him a lesson on how to install Linux. (He had never done this before, whereas I had trouble remembering how many Linux boxes I had installed.) He insisted on driving the entire time because he was the “specialist” and I was not. Incidentally, upon completion of my setup, several key settings needed to be fixed. Yet I was not allowed to have the root password or su power on my box, so I had to keep calling the teenager and his associates to do things like set the correct date on my box. Each time, he had to call me to ask how to do this, or just give me the password and then change it again when I was done. Apparently setting the date and time is not intuitive to some IT professionals.

Since then, I have taken the company’s system administration certification exam, applied to administer my own box, and have had relatively few problems, except having to re-negotiate my status every time someone new sees I’m defying the system. But, my boxes have consistently worked, no thanks to our IT contractors.

Well — until the *only* thing of mine over which IT has control, my email, stopped working yesterday. I kept getting weird server errors whenever my email program attempted to connect to the server to send/receive messages. After we went through the normal process of me calling; getting someone who has never heard of email but promises to have someone else call me back; and 5 different people calling me back with different reasons why it didn’t work, AFTER insisting that their server errors must clearly be caused by the fact that I’m running Linux, and my telling them they’re full of it because server errors occur *on* the server, we finally had the problem solved. Despite the fact that I had been told IT did not know they were going to start expiring passwords, apparently my email password had expired. But they couldn’t tell this had happened, and they couldn’t notify me of the status of my email account because… get this… (this is my favorite IT excuse EVER) they didn’t have my email address!!!!! I should win an Oscar for making it through two phone calls this morning without bursting into laughter while two different men explained to me in very serious voices that my email address was not in their system (the system of the people who CREATED and ASSIGNED my email address and who RUN the email servers!) and that they needed to enter it. The first guy called to inform me of this epiphany. The second one called to check that they had entered my email address correctly. I presume both of them found my phone number in our company directory. (Incidentally, my copy of the company directory also lists email addresses.)

My sides hurt now.

When I stopped laughing, I was still unable to change my password because the web interface, which is the only way to do this from Linux, was broken. In response, IT has just released a statement saying that all of their problems are being caused by people running non-standard desktops and has issued a ruling that everyone must now use the same standard Windows desktop, with a few exceptions for Mac. I have been ordered to give up all of my boxes and replace them with one Windows box, which will have exactly the same installation as every other box at the company, including the machines running specialized equipment in the research labs and the box they give to our secretary. Did I mention that my job is to do research? I write experimental software for a living. On a machine with no compilers (because why does the secretary across the hall need a compiler?), this should be very interesting. Then there’s the issue that a lot of the software is written for operating systems that are not Windows… I complained to the decision-making head of IT about this change and he didn’t see a problem. Why am I not surprised? Probably because the person I spoke to didn’t know what a compiler was.

I’m off to fight again for the right to have a computer I can use to do my work. Please, if you’re a manager out there, think long and hard before outsourcing your IT department to another company. Every year or three we change IT contract companies, but they’re all the same: they charge you too much; they pay their employees so little that none of them stay long enough to complete the “training” process; and they waste your employees’ time while contributing to your IT problems instead of solving them. Then they fill out their own “customer reviews,” instead of sending them to the employees as they’re supposed to, ensuring they keep the contract. I’ll hold out as long as I can in an effort to be able to do my job effectively. Each time our entire building is taken down by a computer virus and my Linux box is one of the few machines left standing, I’ll take the time to smile and feel vindicated.

– Angry East Coast Guest Woman

Once upon a time, news trickled out into newspapers or magazines. Then radio brought news bulletins out on a twice or three-times daily schedule. Television merged the fast pace of radio with the graphic content of photographs but didn’t really accelerate things further. Over many years we doubled or tripled our daily dose, but that was about it.

Until cable. With the advent of CNN, Headline News, and all their successors, we had news on an hourly basis. Naturally the Internet would only take that further, with news now literally “on demand.”

So it was only a matter of time until some clever news agency merged various technologies to give us this: a fully embedded, Google map-based, interactive display of currently known hash houses in Florida:

Can a full merge of all this with Google Earth be far behind? Will we soon have “breaking news” layers for Google Earth allowing us to zoom in as events unfold? Will Google eventually stream live satellite coverage to allow us to watch police chases and shootouts in real time?

Is there even any downside? (Well, apart from the unfortunate inevitability that some poor sap will have his house displayed for national scorn due to a mistyped address…)

Pretty soon, will this scenario be not clever fantasy but simply the way it is?

If so, is that good or bad?


I’d like to kick off a new semi-regular feature here at the Angry Men: a celebration of Americans of all different stripes and backgrounds who have all, in their way, made America and the world a better place. They will be politicians, generals, entrepreneurs, scientists, and inventors; famous and obscure; figures of history and thoroughly modern folks. But together they will remind us of the diversity and unity of the United States, of our greatest principles, and of the great promise of America: you are free to pursue your dreams as best you can.

Without further ado, let’s raise a rousing chorus of Happy Birthday to our inaugural Great American: Walter Elias Disney.


The first few decades of Walt Disney’s life read like an almost stereotypical American success story: born the son of an immigrant, growing up across the Midwest in big cities and small towns, sneaking off to World War I as an ambulance driver with the Red Cross, hustling to get started in his chosen career, getting breaks from his brother and returning the favor, and making and losing businesses and fortunes. All by the age of 33.

But in 1934, Disney began something destined to change American entertainment forever and catapult him to new heights: producing a full-length animated film featuring both realistic human characters and fantastic cartoon characters. This doesn’t sound like much these days, but back then it was “Disney’s Folly,” because it had never been done, and conventional wisdom said it couldn’t be done. Disney bet the farm that conventional wisdom was wrong, and his competitors bet that he’d lose that farm.

Of course, as we know, Disney was right, and Snow White and the Seven Dwarfs was wildly successful, playing to standing ovations and winning an Oscar (one full-size and seven miniature Oscars, in fact). More than a personal triumph, it ushered in the golden age of American animation and set the stage for the staggering industry of animated features around the world. It also launched Walt Disney Studios in Burbank and bankrolled a skilled studio of master animators. Disney would go on to produce a whole cavalcade of classic animated films: Pinocchio, Fantasia, Dumbo, Bambi, The Adventures of Ichabod and Mr. Toad (which brought The Legend of Sleepy Hollow and The Wind in the Willows to many for the first time), Cinderella, Alice in Wonderland, Peter Pan, and many, many more. Some did not make much money, some were quite successful, but all have endured the test of time surprisingly well and stand as a tribute to Walt Disney’s vision that rich, complex stories could be told through animation.

After the Second World War, Disney brought his vision of a child’s fantasy amusement park to life in Disneyland, setting it on a huge lot and surrounding it with one of his favorite things in the world: a train. Throughout the 1950s, Disney Studios worked on Disneyland and released major live-action and animated features. Disney also turned his eyes toward the stars, working with Wernher von Braun to promote space travel through films.

The 1960s saw Disney at the peak of his success, with Mary Poppins sweeping box offices and Disney debuting his vision of the future at the 1964 World’s Fair. Not content with a one-time display of that vision, he laid the plans for an expanded and enhanced Disneyland, known in development as “Disney World” and sited on 27,000 acres in Florida. Although the plans included an expanded amusement park (to be known as the “Magic Kingdom”), resorts, and hotels, the centerpiece was to be Disney’s vision of the perfect future community, the Experimental Prototype Community of Tomorrow (EPCOT). In Disney’s expansive vision, EPCOT was to be a working city of the future, whose residents would focus on innovative science and advanced technology.

Sadly, Walt Disney would never live to see the fulfillment of this vision; he died of lung cancer in 1966, just two years after beginning the new project. His brother Roy came out of retirement to manage the project (and the company) and open the first stage of the new park, formally named “Walt Disney World Resort,” in October 1971. By December of that year, Roy too was dead.

EPCOT as envisioned by Disney never came to be, though the modern Epcot park does provide a showcase for future technologies, and embodies the spirit of international cooperation in its World Showcase. And Disney’s Celebration community, built by Disney Imagineering as a model planned community, comes closer to the original goal of EPCOT (though in a suburban rather than urban mode).

Of course, as we remember the man and his legacy, we should not overlook the darker side. Walt Disney never trusted organized labor, and his prejudice led him to make unsubstantiated allegations before the House Un-American Activities Committee in 1947. He spied on union activity for the FBI for years and may well have illegally intimidated union activists. He was, as many visionaries are, a notoriously difficult man to work with. In short, he was a man, with a full share of faults and limitations.

But he was also a visionary in the best American mode, with an optimistic and enthusiastic take on society, technology, science, and the future. He built places devoted to bringing joy to children and inspiring them to dream deeply. He gave the world the vast legacy of his dreams in film and concrete and has inspired millions around the world with a vision of pluralism, tolerance, kindness, optimism and joy. For all of these reasons, whatever his human faults and foibles, Walter Elias Disney is, indeed, one of the Best of Us.

UPDATE: Welcome! After you read this, feel free to have a look around. You’ve probably already seen this and this, but check out this fine piece about the One Laptop per Child program, this one about that nutcase Chavez, and, of course, this classic challenge.