January 28, 2013

and eat your quinoa . . .

I recommend to my family that we eat as much locally produced food as possible. I prefer to shop at farmers' markets (but actually do so only rarely . . . that should have been one of my new year's resolutions - none of which I actually made). I do buy local when possible from the outlets where I shop and will continue to do so; however, we also eat a variety of foods that we can only obtain elsewhere, and while we should be good citizens in our shopping, we should also not be led astray by parlor-room fainting spells . . .
"The appetite of countries such as ours for this grain has pushed up prices to such an extent that poorer people in Peru and Bolivia, for whom it was once a nourishing staple food, can no longer afford to eat it," writes journalist Joanna Blythman.

This was one of several stories published in the last few years by the likes of NPR, the Associated Press, and the New York Times that draw attention to the negative aspects of the boom in world demand for quinoa. Some, like the Guardian, went to the extreme of guilt-tripping readers against buying it.

But the idea that worldwide demand for quinoa is causing undue harm where it's produced is an oversimplification at best. At worst, discouraging demand for quinoa could end up hurting producers rather than helping them.

Most of the world's quinoa is grown on the altiplano, a vast, cold, windswept, and barren 14,000-foot Andean plateau spanning parts of Peru and Bolivia. Quinoa is one of the few things that grow there, and its high price means more economic opportunities for the farmers in one of the poorest parts of South America.

January 27, 2013

evaluating Chavez' record in Venezuela . . .

One of my pecuniary vices is my subscription to the New Yorker . . . I do enjoy the covers, reviews, cartoons and humor, but it is often the articles titled "Letter From . . . " that I enjoy most . . . or, if not a "letter from somewhere or other," then just the articles about a certain place or a certain culture, usually with ample background and usually with any political or cultural bias of the author not that apparent . . . the current edition has a piece about Venezuela and Chavez "the Slumlord" . . . I found its author's biases very apparent . . . and am glad this morning to have stumbled across a review of the article by Jim Naureckas at Fairness & Accuracy in Reporting (FAIR) that puts the article in perspective and, besides, agrees with my own assessment to a very large extent . . .
Of course, the idea that the Chavez-hating architect represents the majority opinion in Venezuela more than the Chavista community leader is dubious. As Anderson admits toward the end of the article, Chavez has won "one election after another." But that just makes Venezuelans "the victims of their affection for a charismatic man, whom they allowed to become the central character on the Venezuelan stage, at the expense of everything else."

. . .

Anderson's acknowledgment of this could hardly be more grudging: "The poorest Venezuelans are marginally better off these days," he writes. It seems like for the New Yorker, rising standards of living for the poor don't matter much when weighed against the fact that rich people lost some property they weren't using.

learning (all over again) how to cook beans . . .

Don't be afraid of a little salt when prepping your beans . . .
Most of us have been told at some point in our culinary careers that salting beans will cause them to toughen. It's incredible that this little bit of culinary mis-wisdom still lingers, for it couldn't be further from the truth. A simple side-by-side test can prove to you conclusively that salting beans (both the water used to soak them in and the water used to cook them) actually tenderizes the skins.

It's got to do with magnesium and calcium, two ions found in the bean skins that help keep the structure of the beans' skin intact. When you soak the beans in salt water, sodium ions end up replacing some of the magnesium and calcium, effectively softening the skins. Your beans come out creamier, better seasoned, and have a much smaller likelihood of exploding while cooking.

January 26, 2013

Software, Republicans, and more

I continue to be frustrated by the Blogger interface when it comes to commenting. I've had trouble with it ever since I learned about peripatetic patter, and - even more maddeningly - have only on rare occasions been able to get a comment posted. I can post, but Blogger just doesn't like my comments. About 95% of the time, even after signing in to my Blogger account and composing my comment, clicking on either Publish or Preview sends the comment to the bit bucket and me back to Go. My brother, who comments here without any reported problems, uses the same system software and the same Web browser as I do. Who the hell knows ...

Whatever. To start with, concerning Republican attempts to rig the voting game in their favor: It just goes to show how far some people will go to hold on to whatever power they think they may have accumulated. The thing that keeps surprising me, though, is just how brazen they have become in furthering the interests of the people who own this country. They don't even try to hide or even camouflage what they're doing any more. I guess they realize that the dumbass American public will gladly buy any old load of shit that comes down the pike.

I also want to expand a little on Bill's post about crapware. To be sure, it's really annoying and insulting to find that some application you've installed on your computer has also installed other software that you didn't know was being installed. The crapware that was at issue at least has the virtue, though, of - at some point - revealing its presence.

Far more insidious and threatening is software that is either part of the program you knowingly installed, or is installed separately by it, and that never makes itself known to the user. Many (or maybe even most - I don't really know, because I'm not a user of a great deal of software) programs, both commercial and shareware, now have a "feature" that "calls home": i.e., they will, without either your consent or your knowledge, contact the developer via the Internet, often to ensure that the software is properly registered, but potentially for any reason the developer sees fit. The vast majority of owners/users of the software have no clue that this is happening.
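Just to make the pattern concrete, here's a bare-bones sketch of what a "call home" check amounts to. The URL, the payload, and the response format are all invented for illustration - I'm not quoting any particular vendor's code - but the shape of the thing is about right:

```python
# Schematic sketch of a "call home" registration check (Python).
# The endpoint, payload, and response format are all invented for
# illustration - this shows the pattern, not any real vendor's code.
import json
import platform
import urllib.request

def phone_home(serial_number):
    payload = json.dumps({
        "serial": serial_number,
        "os": platform.platform(),   # the developer learns your OS version...
        "machine": platform.node(),  # ...and often your machine's name, too
    }).encode("utf-8")

    req = urllib.request.Request(
        "https://activation.example.com/check",  # made-up endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # The user sees none of this; the only outward sign is a brief
    # network connection - exactly what an outbound firewall catches.
    with urllib.request.urlopen(req, timeout=5) as resp:
        answer = json.load(resp)
    return bool(answer.get("registered", False))
```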

Web users - everybody who reads this blog, or any blog, or nearly any kind of Web site you could name - are especially susceptible to surreptitious bullshit going on behind their backs. I clicked on the link Bill provided to the original article about crapware on Slate.com - and the Slate server, in turn, connected to no fewer than 17 (that's seventeen) other third-party Web sites, each of which tried to leave its own cookie on my computer (I say "tried" to leave a cookie, because I have my Web browser set to refuse cookies from all third-party sites). I know about this because I have an essential piece of software that I keep running at all times called Little Snitch. The only function of Little Snitch is to inform me of any outside connection that any program installed on my computer is trying to make, and to offer me the choice of allowing it or not. Essentially it is a firewall - but an outbound firewall, as opposed to an inbound firewall, which is what nearly everybody, including me, also uses. Now, Little Snitch is Mac-only software, but FWIW, I understand there are some equivalents for Windows (google "Little Snitch Windows"). I strongly urge all you Winders people to get one and install it, posthaste. If nothing else, it's a real eye-opener as to what's really going on behind the scenes.
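If you're curious what an outbound monitor actually sees, here's a rough sketch that lists which programs on your machine currently have connections open to the outside world. It uses the third-party psutil Python package (my choice of tool, not anything Little Snitch itself uses), and unlike Little Snitch it only observes - it can't block anything or catch a connection before it's made:

```python
# Rough sketch: list programs that currently have outbound connections
# open. Needs the third-party psutil package (pip install psutil); on
# some systems you'll see more if you run it with admin privileges.
import psutil

def outbound_connections():
    for conn in psutil.net_connections(kind="inet"):
        # Skip listening sockets and anything without a remote endpoint.
        if conn.status != psutil.CONN_ESTABLISHED or not conn.raddr:
            continue
        try:
            name = psutil.Process(conn.pid).name() if conn.pid else "?"
        except psutil.NoSuchProcess:
            name = "?"
        print(f"{name:<24} -> {conn.raddr.ip}:{conn.raddr.port}")

if __name__ == "__main__":
    outbound_connections()
```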

And then, of course, there are the notorious Flash cookies (aka LSOs, or local shared objects), which get written to your disk any time you view streaming content - movie, sound clip, whatever - based on Adobe's Flash (formerly Macromedia's), which includes most popular media. These, too, are stored without either your knowledge or consent, but unlike normal Web cookies, which your browser manages and deletes when you clear your cookies, Flash cookies live outside the browser's cookie controls entirely: they survive cookie-clearing, they're shared across every browser on the machine, and the sites that set them use them to keep right on tracking - well, who knows what they track?
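If you want to see how many of these things have piled up on your own machine, here's a little sketch that just counts the .sol files Flash leaves behind. The path below is the usual Mac location, to the best of my knowledge; on Windows they live under %APPDATA%\Macromedia\Flash Player instead:

```python
# Count the Flash "local shared objects" (.sol files) sitting on disk.
# The path below is the usual Mac location, to the best of my knowledge;
# adjust it for your own system.
from pathlib import Path

FLASH_DIR = Path.home() / "Library" / "Preferences" / "Macromedia" / "Flash Player"

def list_flash_cookies():
    if not FLASH_DIR.exists():
        print("No Flash Player data directory found.")
        return
    sol_files = sorted(FLASH_DIR.rglob("*.sol"))
    for sol in sol_files:
        # The enclosing folder name is usually the site that set the object.
        print(f"{sol.parent.name:<32} {sol.name}")
    print(f"{len(sol_files)} shared object(s) found.")

if __name__ == "__main__":
    list_flash_cookies()
```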

Now, I understand in principle the usefulness, and even necessity, of cookies for Web sites - within strict limits of legitimate use. But cookies can be, and often are, only marginally useful, borderline, or downright malicious, and as for me and my house, tolerance of cookies is to be tightly controlled. I also refuse to allow any outgoing connection by any of my other programs (except my email client). There might well be acceptable reasons for such a connection, but since the developers have chosen to hide those connections from me in the first place, I err on the safe side by assuming there might also be unacceptable reasons and just don't let it happen at all.

It's just sickening to be aware of how few facets of life remain which have not been sullied by the relentless pursuit of profit via every conceivable avenue. At times I yearn for the release of death from this otherwise inescapable nightmare ...

January 25, 2013

winner take nothing . . .

This is happening now . . .
The US electoral college system is based on winner take all delegate allocation in all but two states. If you get just one more vote than the other candidate you get all the electoral votes. One way to change the system is go to proportional allocation. That would still give some advantage to the overall winner. But not much. The key to the Republican plan is to do this but only in Democratic leaning swing states — not in any of the states where Republicans win. That means you take away all the advantage Dems win by winning states like Ohio, Pennsylvania, Michigan and so forth.
But the Republican plan goes a step further.
Rather than going by the overall vote in a state, they’d allocate by congressional district. And this is where it gets real good, or bad, depending on your point of view. Democrats are now increasingly concentrated in urban areas and Republicans did an extremely successful round of gerrymandering in 2010, enough to enable them to hold on to a substantial House majority even though they got fewer votes in House races than Democrats.
In other words, the new plan is to make the electoral college as wired for Republicans as the House currently is. But only in Dem leaning states. In Republican states just keep it winner take all. So Dems get no electoral votes at all.
Another way of looking at this is that the new system makes the votes of whites count for much more than non-whites — which is a helpful thing if you’re overwhelmingly dependent on white votes in a country that is increasingly non-white.
This all sounds pretty crazy. But it gets even crazier when you see the actual numbers. Here’s a very illustrative example. They’re already pushing a bill to do this in the Virginia legislature. Remember, Barack Obama won Virginia and got 13 electoral votes. But as Benjy Sarlin reported today in a series of posts, if the plan now being worked on would have been in place last November, Mitt Romney would have lost the state but still got 9 electoral votes to Obama’s 4. Think of that, two-thirds of the electoral votes for losing the state. If the Virginia plan had been in place across the country, as Republicans are now planning to do, Mitt Romney would have been elected president even though he lost by more than 5 million votes.
Remember, plans to do this are already underway in Michigan, Pennsylvania, Ohio and other states in the Midwest.
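Just to spell out the arithmetic in that Virginia example: the state has 13 electoral votes - 11 congressional districts plus 2 at-large - and, as I understand the proposed rules, the 2 at-large votes would go to whoever carries the most districts. The 7-4 district split below is my own illustrative assumption, chosen to reproduce the 9-4 figure reported above:

```python
# Back-of-the-envelope arithmetic for the district-allocation plan,
# using Virginia 2012 as in the excerpt above. The 7-4 district split
# is my own illustrative assumption (chosen to reproduce the reported
# 9-4 result); the 2 at-large votes go to whoever carries the most
# districts, as I understand the proposed rules.
districts_won = {"Romney": 7, "Obama": 4}  # Virginia's 11 congressional districts
at_large = 2                               # the two statewide ("senatorial") votes

plan_winner = max(districts_won, key=districts_won.get)
electoral_votes = dict(districts_won)
electoral_votes[plan_winner] += at_large

print(electoral_votes)  # {'Romney': 9, 'Obama': 4} - 9 of 13 for losing the statewide vote
```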
Follow-up . . . But this is also happening now . . .
What’s going to stop these changes to the electoral system is not law, but politics. Republicans have a lot to lose by going down this road, which is why Florida’s legislative leaders have already balked at it. It’s also why you don’t see Republican legislatures simply reallocating Electoral College votes to themselves.

First, it is wrong to assume, as the right’s National Review and the left’s Think Progress have, that Mitt Romney would have won had this rule been in place. As Donald Rumsfeld might say, you go into a campaign with the Electoral College rules you have, not the rules you wish them to be. The Obama and Romney campaigns would have campaigned very differently in these states if they were under a district system, targeting not voters across the state, but voters in the key districts needed to win the election. Yes, Republican gerrymandering of districts would have given the GOP some advantage, but it is far from clear it would have been enough to defeat the Obama campaign machine.

Think about it: The last thing Republican legislators want is national Democratic campaigns scrounging for every vote in conservative-leaning districts. Fewer Republicans will win legislative and Congressional seats because Republican districts will become more competitive by design. Why would Republican legislators vote for a plan that will make it harder for them to keep their jobs?

January 24, 2013

I want a ticket . . .

for this movie . . .

crapware is taking over our world. . .

"Crapware is the annoying software that worms into your computer without your knowledge. You can get it when you buy your PC—software companies pay PC makers to install the stuff on new machines—or when you download some ostensibly useful program from the Web."
Crapware is the annoying software that worms into your computer without your knowledge. You can get it when you buy your PC—software companies pay PC makers to install the stuff on new machines—or when you download some ostensibly useful program from the Web. You might download Adobe’s Flash player, say, and only later discover that the installer also larded up your computer with a dubious “PC health check” program that tries to scare you into paying to “repair” your machine.
But ever since the summer of 2008, when Apple launched its App Store, the death of crapware has seemed imminent. The App Store promised to kill crapware by centralizing software distribution. Because it’s the only way to get apps on your phone, and because Apple prohibits crapware and reviews all the apps that get submitted to the store, you’ll never get unwanted programs when you install an app. There are lots of problems with this model—the App Store gives Apple too much control over the software market, letting it stifle competition and enforce prudishness. But one of the reasons the App Store has proved so popular is that it lets people try new software without having to worry that it will hurt their machines. That’s one reason why Android, Windows, Kindle, and the BlackBerry have all adopted similar centralized app stores. Many of these stores have more liberal review policies than Apple’s, but they all prohibit crapware. It seemed likely, then, that this scourge would soon be gone—if we all got our apps from app stores, and if someone was checking those apps to make sure they weren’t bundled with unwanted software, crapware would soon crap out.
But that’s not happening. Crapware has proved remarkably resilient, and now I fear it will stick around for years to come. That’s because device makers, cellular carriers, and some of the most prominent investors in Silicon Valley are keeping it alive. It’s also because Google and Microsoft, the only companies in a position to stop it, haven’t fought crapware with the passion it deserves. (Macs can get crapware through bundled downloads, too, but Apple doesn’t allow it to be preinstalled, and Apple’s centralized Mac App Store—which is becoming the favored way to distribute Mac programs—prohibits it.) And that gets to the main reason crapware lives on: There’s a lot of money in it. Indeed, the rise of app stores has perversely made crapware even more valuable than in the past. App stores are clogged with thousands of programs, so it’s harder than ever for software companies to get you to voluntarily download their stupid games, weather monitoring programs, and unnecessary security programs. That’s why they’re willing to pay a lot to get their stuff on your device without your permission—and that’s why crapware may never, ever die.

January 15, 2013

Just read an astute piece on the NPR Web site concerning the phony indignation displayed by Lance Armstrong in his many denials over the years of his doping, and it speaks to a number of things that have been on my mind lately. I'd like to take up a little space and a little time to ruminate on those things now.

It's almost 50 years now since the assassination of John Kennedy, and even though the actual anniversary is still nearly a year off, articles and commentaries on those events are starting to appear. E.g., Robert Kennedy Jr. was recently quoted as saying that neither he nor his father (who, of course, was also assassinated) put much credence in the Warren Commission report, and that he personally did not believe that Lee Harvey Oswald was the only gunman. I was kinda skeptical of the official conclusions at the time and for a number of years afterward, but I went along because I was too young and green to know any better. After my military service, though, and ever since, I have been a confirmed doubter. Anyone who has seen the Zapruder film, read the accounts of the ER physicians at Parkland Hospital, and seen the autopsy photos cannot possibly doubt that John Kennedy was killed instantly by a single shot from the front. There were several shots, certainly, and Oswald may have fired one or more of them, but Lee Harvey Oswald did not kill John F. Kennedy from the 6th floor of the Texas School Book Depository. Yet the official story still stands, assented to and defended by the mainstream, despite its obvious untruth.

Unlike the Kennedy assassination story, I never for one minute believed the official story of the 9/11 events. Incontrovertibly, two airliners did fly into each of the Twin Towers and did much damage - but those collisions did not cause the ensuing and near-immediate collapse of either tower. Such a conclusion directly contradicts all the engineering and architectural evidence that can be amassed. More than all that, though, for me the strongest evidence against the official story is this: I cannot imagine that even the dumbest-ass gaggle of radical ragheads could not have known that the magnitude of the US response to such an attack would have been overwhelmingly against their own best interests. And if such attacks were indeed their agenda, why have there not been further attacks in the ensuing 12 years, despite how ridiculously easily they can be perpetrated, even in the face of the best that Homeland Security can offer?

Finally, I note the recent mini-controversy over the treatment of CIA torture in the film Zero Dark Thirty. It seems that a few commentators are concerned that the US public may get the impression that such torture is somehow justifiable in the "War On Terror." Well, here are some propositions for you to consider: What if Zero Dark Thirty, and the books by SEALs purportedly involved in the manhunt for Osama bin Laden, are all accounts of events that never happened? How is it that Saddam Hussein, a man with vastly more resources at his disposal than Osama bin Laden, could not evade even the US Army on the ground in Iraq for a full year, while it took the CIA and all the Special Ops forces the US could muster more than a decade to find bin Laden? How come bin Laden's supposed corpse was buried at sea within 24 hours of his supposed demise, without so much as photos, never mind an autopsy or positive identification? What if Osama bin Laden has been CIA all along, and is now living incognito somewhere in Saudi Arabia, or Tuvalu, or Kansas, or any of a myriad of other out-of-the-way locales in the world?

Call me a tin-foil crackpot if you will, but I must tell you that I owe a great deal of my skepticism and paranoia to my 4-year stint in the Army Security Agency. Now, any ASA vet will tell you that we were not, essentially, a part of the military, at least after basic training. We became anonymous little cogs in the vast US intelligence gathering apparatus, and were only nominally in the Army. None of us knew much at all about that apparatus, being subject to the "need to know" doctrine, to be sure. Yet even that little sliver of exposure was a window to the whole of the enterprise, and anyone with a smidgen of curiosity and a little initiative could discover much more. The point is that all of this is stuff that the average citizen has no concept of; most would even deny that it exists at all. And, especially since the end of World War II, it has become the real government of the United States, accurately called a shadow government, running behind the charade of the government we see on TV and read about in the print media.

What it's all about is honesty - or more accurately, the lack thereof - in both public and private life. Now, if you believe that you can make it on your own, that you don't need other people to help you through this life, then deception and dissimulation are nothing more than tools to help you along your way. But if you understand that even the strongest, most well-armed, most cunning, most ambitious person cannot make it without other people, you must conclude that dishonesty, in any form, destroys utterly the basis for living as a human among other humans. I speak from experience: For many long years, I was a diligent practitioner of deception, and it did help me - in some respects - to get what I thought I wanted. But in recent years I discovered just how tangled and strangling was the web I had woven, and I've since been trying very hard to hack my way out of it. It's been painful, and I very much regret the damage I've done to myself and to the relationships I've had with others. But I can tell you that it's also enormously freeing to drop all that bullshit and just shoot straight from the hip. (I must tell you that, though I'm no longer a believer in any religion, the faith I was raised in, Roman Catholicism, had [has?] a wonderful institution, one of the Seven Sacraments, called Confession. I always had a great deal of trouble with it, but I've come to understand just how useful it is. The only change I would make is to make it a public confession, so that one no longer has to maintain the deception in public.)

One more thing, and I'll shut up for a while: I think it's one of our highest priorities as a polity to start stripping away the layers of secrecy in our government. At this point, since I see no "enemies" out there bent on conquering this country, I think I'd go so far as to recommend no secrecy at all. And I think a good place to start is with the Kennedy assassination(s), 9/11, and the whereabouts of Osama bin Laden. I know, that's asking a lot, but since we're dreaming anyway, what the hell ...

January 08, 2013

thinking about a trip somewhere . . .

climate change . . . and ability to think . . .

Every time there is a flurry of snow somewhere, one or another of my neighbors or office colleagues (buttoning up their longjohns . . . ) tells me that it proves climate warming is bogus . . . do none of the schools continue to teach critical thinking . . . ?

The National Oceanic and Atmospheric Administration's National Climatic Data Center tells us that 2012 was the warmest, and the second most extreme, year on record for the contiguous U.S. From the State of the Climate report:
2012 marked the warmest year on record for the contiguous United States with the year consisting of a record warm spring, second warmest summer, fourth warmest winter and a warmer-than-average autumn. The average temperature for 2012 was 55.3°F, 3.2°F above the 20th century average, and 1.0°F above 1998, the previous warmest year.

SEC! SEC! SEC!

One - or several, actually - of the sports pundits said that the real National Championship game is not the BCS game, but the SEC Championship game. Anybody care to argue otherwise?

January 06, 2013

blessings of atheism . . .

Susan Jacoby, in The New York Times Sunday Review, shares thoughts on being forthright about atheism and compassion . . .
This widespread misapprehension that atheists believe in nothing positive is one of the main reasons secularly inclined Americans — roughly 20 percent of the population — do not wield public influence commensurate with their numbers. One major problem is the dearth of secular community institutions. But the most powerful force holding us back is our own reluctance to speak, particularly at moments of high national drama and emotion, with the combination of reason and passion needed to erase the image of the atheist as a bloodless intellectual robot.

The secular community is fearful of seeming to proselytize. When giving talks on college campuses, I used to avoid personal discussions of my atheism. But over the years, I have changed my mind because such diffidence contributes to the false image of the atheist as someone whose convictions are removed from ordinary experience. It is vital to show that there are indeed atheists in foxholes, and wherever else human beings suffer and die.

Now when students ask how I came to believe what I believe, I tell them that I trace my atheism to my first encounter, at age 7, with the scourge of polio. In 1952, a 9-year-old friend was stricken by the disease and clinging to life in an iron lung. After visiting him in the hospital, I asked my mother, “Why would God do that to a little boy?” She sighed in a way that telegraphed her lack of conviction and said: “I don’t know. The priest would say God must have his reasons, but I don’t know what they could be.”

Virtual Lazarus

As I told Bill in a recent email, my absence for the last couple of months probably had him believing I'd finally bought the farm. But my death - and, with this post, my resurrection - have taken place only in this weird virtual world we all increasingly inhabit. In what we call the "real world" (a term many seem to use disdainfully), I'm still plugging along, with age and disease my constant companions. (My brother says I'm too ornery to die any time soon.) Naw, what actually happened was that, along about Thanksgiving, my beloved 9-year-old Mirrored Drive Door Mac, the vehicle that permitted my existence in this virtual world, finally began succumbing to the constant heat it generated and refused to do my bidding reliably any longer.

You must understand that I've been a committed, exclusive Macintosh user since 1990. At the time, I was a DOS-head, just like nearly all PC users, struggling along with the command line. But I took a job as a church secretary, and the pastor had just installed a little 2-machine Mac Classic network for the church - one for him and one for the office, each with its little 9-inch screen, 2 MB RAM, and 40 MB hard drive - so I had little choice but to learn how to use a Mac. Realize that Windows 3.0, the first even remotely usable version of a Graphical User Interface for the PC (as opposed to the Mac, which has always had a GUI, and from which Bill Gates shamelessly copied the idea), had just then been released and had not yet taken the PC world by storm. I'd never used it myself, so a GUI was a whole new beast for me. It took about a month to really learn the operating system, but I've never looked back from then until now.

You most likely already know that Apple computers have always been more expensive, often much more so, than PCs. I've owned a number of Macs over the years, but, being a poor person, I've always bought them used - even my latest, the MDD. For a long time, that higher cost was more or less justifiable, because the hardware was indeed better than PC hardware. But somewhere in the late '90s, along about the time that Apple was licensing its operating system, that began to change. (In fact, that's why, when Steve Jobs came back to Apple, he pulled the plug on OS licensing: People like Power Computing, and others, were eating Apple's hardware lunch.) And that's especially been the case since Apple switched from PowerPC CPUs to Intel CPUs. Since that happened, there has arisen a dedicated bunch of hackers who've found ways to run the OSX operating system on off-the-shelf PC hardware, which is cheaper (for equivalent capabilities) than Apple hardware by a factor of 3.

Now I'd known for about a year that I was going to have to replace my MDD with something newer. I considered my options: I could buy a stripped-down, used Mac Pro, the original Intel model released in 2006, for about $700, or I could invest another $100 and buy all-new, latest-generation PC hardware - case, power supply, and all - which would have about 3 times the capability of the Mac Pro. For me, it was no contest. Only problem: $800 is more than I get a month from Social Security, so the only way I could do it was like Johnny Cash did, "one piece at a time." I was about halfway there when the MDD went south on me, and it sent me into a depression, looking at 3-4 months of living without a functional computer. But a dear friend, seeing the funk I was in, kindly and generously offered to front me the money to complete my project, which I immediately and shamelessly accepted.

Be advised, those of you who might be considering something similar (yeah, I know, there might be one or two), it's not like inserting the Install Disc and being up and running in half an hour. There are a lot of hoops you gotta jump through, and in precisely the right order, to get it to work. I'm not a hard-core geek, though I've done some hacking and I can get around hardware and software pretty well (though not so much the software any more). Still, it took me about 10 days of installing and reinstalling to finally get it right. Extreme patience and attention to detail are absolutely required. Anyhow, my new Hackintosh has been running stably and reliably for over a week now, so I think (knock on wood) that I'm over the hump.

A final note: For me, this whole thing has been about the operating system, not so much the hardware. I've used, out of necessity and on other people's computers, Windows 95, 98, and XP Pro, and I'm utterly convinced that even earlier versions of the Mac OS, never mind OSX, are way ahead of anything Microsoft has on offer. I've never understood, despite much pondering, why Apple has never sold a version of its operating system that would run on stock PC hardware. Especially today, when the Apple hardware that runs the Mac OS (not iOS, which runs on all its mobile devices like the iPhone and iPad) represents an ever-shrinking slice of its total revenue. They're trying to protect that? What the hell for? If they sold the MacOS for, like, $119, in about 6 months Microsoft would be history, a minor player, and Apple would own the OS market in the same way Windows does now. And they could have done it at any time in the last 25 years. One of the stupidest decisions in the world of profit-making that I've ever seen.

The upshot is that I'll always be a Mac operating system user, come what may. Winders, as I call it, is about 15 years behind, and will never catch up. But I've totally given up on Apple hardware, with its absurd pricing and, frankly, inferior technical capabilities. Even at that, I'm only running the last version of OSX 10.6, Snow Leopard, which is now two generations behind 10.7 (Lion) and 10.8 (Mountain Lion), though I could install either. Neither offers me anything, and in fact, both break most of my application software, which I have no intention of replacing for the sake of being "up to date." Apple Inc. (notably no longer Apple Computer, Inc.), with their iDoodad product line, can rot inside their walled garden.

Sorry to ramble on and on so long with this post, but it's been my world for the last 2 months, and I needed to get it off my chest as a sort of catharsis or epitaph or whatever. I've got a mountain of backlogged work in front of me, which is going to consume most of my time, but I'm going to try to put in a screed here every now and then, just to keep my hand in. Thanks for your indulgence.

January 02, 2013

the old ball coach . . .

South Carolina wins Outback Bowl.

view of the fiscal cliff bill . . .

Paul Krugman's take on the fiscal cliff bill appears to be mostly a warning that diligent progressives need to hold the President to his vow not to negotiate over the debt ceiling . . . how will that work out? Some of us are optimistic . . .
So why the bad taste in progressives’ mouths? It has less to do with where Obama ended up than with how he got there. He kept drawing lines in the sand, then erasing them and retreating to a new position. And his evident desire to have a deal before hitting the essentially innocuous fiscal cliff bodes very badly for the confrontation looming in a few weeks over the debt ceiling.

If Obama stands his ground in that confrontation, this deal won’t look bad in retrospect. If he doesn’t, yesterday will be seen as the day he began throwing away his presidency and the hopes of everyone who supported him.

January 01, 2013

Bach and better hair . . . Happy New Year!

Natalie Angier suggests in her piece "The Life of Pi, and Other Infinities" in The New York Times that "your doppelgängers may be out there and many variants, too, some with much better hair who can play Bach like Glenn Gould." I'm not sure if the "better hair" refers to you or to Glenn Gould . . .