I was reading something this morning that referred to someone drinking dozens of cups of coffee. After getting over the vicarious shock of the caffeination, my thoughts immediately turned to Ella Mae Morse.
Morse wasn’t an early Starbucks exec. In fact, to my knowledge, she never worked in food service. She did have a hit with the tune “40 Cups of Coffee,” though, along with “Cow Cow Boogie,” “The Blacksmith Blues,” and many others in the 1940s and 1950s.
Her 1942 performance of “Cow Cow Boogie,” in fact, was Capitol Records’ first-ever gold single. The next decade was one of stardom for Morse, peaking with “The Blacksmith Blues” in 1952, which sold over a million copies.
Ella Mae Morse’s style (Wikipedia places it into the Jump Blues genre) was hip, in the swing way (think WWII and immediately post-war), with smooth big-band backing, rhythmic interest, and catchy melodic hooks.
About fifteen years ago, a box set of all of her recordings (including some studio work tapes that were fascinating to hear) was released, and, remarkably, they largely hold up just as well today as 70 years ago. What’s most noticeable when listening to the whole canon is its versatility. Indeed, she hit the top of the charts in pop and R & B, which, during the late 1940s and early 1950s, was rare. She even performed in a handful of movies during the period. But it was that versatility, I think, that prevented her from becoming the long-resonant star that her performing ability warranted.
Ella Mae’s work intersected with lots of other amazing professionals of the time, too. Her 1952 recording of “Oakie Boogie” was one of Nelson Riddle’s first arrangements ever. Being at ground zero of early Capitol Records meant that she crossed paths with many amazing artists at the early stages of their careers (including, among others, Sinatra).
Morse’s personal relationships were difficult: two marriages, six kids, and lots of grandchildren, and she long felt guilty about how she raised them. But man, the creative output she churned out in only a decade or so is amazing. It’s enough to get your toes tapping, even in the Facebook and YouTube era. In fact, some of her tunes are over at YouTube — I heartily encourage a listen.
This weekend, my wife and I celebrated the holiday weekend as we do every year — seeing movies. If you’ve read the blog a bit, you know that it’s Oscar season, so the movie-watching is in full gear.
In a confluence of the calendar and the schedule of viewing, we ended up seeing Blue Valentine this weekend. It’s a good thing that we’re already married, because this is not a date movie.
The movie is a slow-motion portrait of the dissolution of a marriage, basically. Flipping back and forth between two time periods, we see the protagonist (?) couple during their meeting and wooing, then in the present tense, with tense being the operative word.
In some ways, it’s a picture of what happens if people allow one or two positive attributes to dominate their personality — they become liabilities. Ryan Gosling’s character, for instance, is seen in the earlier period as a light soul, someone who eschews care and is a fixer, helping others solve their problems (including Michelle Williams’ character). In the present period, though, that care for those immediately around him becomes consuming. The result, ironically, is a lack of caring and responsibility.
Williams is nominated for her performance, and it is a strong one. Her character is a go-getter in the past, with big plans that get derailed. Whether it would get nominated in a year with stronger lead female performances is less clear, however — I thought Gosling’s work was just as strong, but he didn’t nab a nomination in part because the field there was tougher.
The movie initially garnered attention because of its NC-17 rating, which the filmmakers and studio appealed. The ratings board reconsidered and gave it an R instead, without the film being changed in any way. Upon viewing, it really doesn’t feel like an NC-17; rather, it falls in the middle of the R range, with some sex and language. Perhaps the NC-17 was because the sex in the movie is really, in some ways, anti-sex. It’s not passionate, and in fact, marks another milestone on the path toward relationship dissolution.
Indeed, the feel upon leaving the movie was blue, with a pall of sadness that lingered for a while. This was due in part to the performances, to the writing, and to the look of the movie.
So it’s a good movie, even if it’s not an upper. Worth a rental. Just maybe not on Valentine’s Day.
The elements were all there. It was virtually brimming with possibility. But Country Strong just couldn’t hit the right notes.
I like Gwyneth Paltrow. Not personally, mind you — we’ve never spoken. But her past work is largely very solid, and with a significant range. A career that goes from Sylvia and The Royal Tenenbaums to Iron Man and Austin Powers is impressive.
And doing a project that requires her not only to sing, but to do it early, often, and well, is a risk. And she does well, vocally. But there’s a passion missing. Some of that may be character-driven (her character is a Grammy-award-winning country artist whose heart isn’t really in it anymore), but there are points in the story where the passion is supposed to run high, and it barely registers above a mild pulse elevation.
Before seeing his last film performance (in The Blind Side), I thought the idea of Tim McGraw as an actor was scoff-worthy. But in that performance, he held his own capably, although there wasn’t too much to do, given Sandra Bullock’s filling of every frame she inhabited. So in this movie, I had higher expectations, looking forward to a more nuanced portrayal and character evolution.
But sadly, it wasn’t really there. I’m not convinced it was McGraw’s fault, though. His character wasn’t given that much to do, emotionally. Detached-stoicism-cum-faded-passion is a theme of the movie, apparently, and he did that consistently and well. In this case, maybe it was enough.
Supporting cast was good — if Garrett Hedlund needed another breakout performance to add to his reel in a year where he also was key in Tron: Legacy, this is it. It’s quite a flip-side from the frenetic, agitated son in Tron to the more tranquil singer-songwriter here. Leighton Meester also turned in an adequate performance.
But the fundamental weakness of the writing just couldn’t be overcome by any of the actors. Some sequences didn’t ring true at all, while others just were, in Randy Jackson parlance, “pitchy.”
This could have been a story as dramatic and engaging as Walk the Line or Ray — triumph, tragedy, redemption with a 20th-century soundtrack. Unfortunately, it ended up more like a TV-movie made for the CMT cable network.
Nature or nurture? It seems like an eternal question, although as science progresses, we may get more and more of a definitive answer. There are several of my attributes that I can easily ascribe to nurture (both good and bad — sorry, mom!), and certainly a whole bunch that are based on the caprice of genetics.
One of the most profound attributes that I ascribe to nurture, though, is an internal locus of control. And it has shaped me more than most other characteristics.
Let’s start with dropping a little Wikipedia knowledge:
Individuals with a high internal locus of control believe that events result primarily from their own behavior and actions. Those with a low internal locus of control believe that powerful others, fate, or chance primarily determine events.
That’s a neat summary, but it really glosses over the implications.
I have some fairly distant relatives who, for instance, when something good happens to them, ascribe it to luck. “Ah, that’s good luck!” is a common phrase. Or, perhaps more commonly, they shake their fist at the universe when something bad happens.
My perspective tends to be quite different. It’s not that I believe I control every aspect of my life (indeed, several events of the past few years have highlighted that I don’t!), but rather that my behaviors and actions are a primary cause of what happens to me next.
The result when I was a kid was that I was more observant of my actions and their consequences. I acted out less than most kids (sometimes to the ridicule of my wife, who is convinced I was the Most Boring Child Ever). I focused on effort and achievement. I nurtured key relationships, especially with influential adults. And I was happier.
As an adult, I find that the last of those is the most important side effect: I’m generally a happy, optimistic person. Because fundamentally, I think that if (when) things go wrong or events converge to cause havoc, the results of my actions will make a difference. So if I act well, the outcome will certainly improve. And to the degree that I can control it or influence it, it surely can’t be as bad as it seems.
So thanks, parents. Nice going on that one.
Over the weekend, my Oscar research continued, as my wife and I took in four nominated movies in as many days. After a few more weeks of this, it’s entirely possible that my body will take on the form of a movie theater seat. (Some may say that’s already happened, but I digress.)
In any event, some thoughts about the four movies we saw:
This is a prototypical Mike Leigh movie. Deeply drawn characters whose evolution comes through a period of ruminative time. A slice of time marked not for its Big Events (there are no car chases, bombs, or great inventions here), but rather for its smaller ones (a new girlfriend, buying a car, etc). And very strong performances.
Let’s focus on that latter part for a minute. In particular, the women of this film are very convincing. They inhabit their characters in what appears to be an effortless way. From the flighty Mary (played by Lesley Manville) to the grounded Gerri (played by Ruth Sheen), it’s impressive. The male characters have much less to do, although Jim Broadbent does it admirably.
So what’s not to like? Length, mostly. The reflective pace can sometimes stall a bit into a drag, and as a result, there could have been some trimming along the way. In depicting vignettes of a year in this couple’s marriage (and their friends), there were a few points where I thought it might be real-time…
All in all, though, worth a rental. For a couple of hours, it will slow things down and maybe even give you a chance to think about the seasons in your own life.
I’m convinced that, in some little-known dialect of Spanish, “Biutiful” is the word for “long.” On paper, it’s two and a half hours. In the theater, it feels more like three.
First, a positive: Javier Bardem’s performance is very strong. He’s in the majority of the frames of the movie, and his evolution over that time (did I mention it’s a LOOONG time?) is nuanced and appropriately despairing. You can watch him slowly come to terms with his options disappearing one by one.
But that could just as easily have been portrayed in two hours. Or maybe even 1:50. Do I hear 1:45? In fact, the brevity of a shorter cut might have amplified the emotional effects. It may have been slightly less of a tour de force by Bardem, but likely much more moving.
Probably not worth a rental, but if it’s on TV and you’ve got the time…
OK, I’ll admit it. I really had no interest in seeing this. I remember the hype when it first came out, and at the time, it sounded like it might be interesting. But over the ensuing months, it began to sound more ponderous. Plus, I’m not a huge fan of Leonardo DiCaprio. Well, to be fair, I’m mostly not a fan of some of his choices of roles. (Gangs of New York? Really?)
But this ended up being better than I feared it would be. Sort of like Memento, the hype about it being difficult to follow proved to be just that: hype. The performances were good for a popcorn movie, and the visual effects really were incredible. I think I was actually more impressed with Joseph Gordon-Levitt than DiCaprio; although the latter had more emotional range to display, Gordon-Levitt struck a good tone of jocular focus.
For what it was, an effects-driven popcorn movie, it was pretty good, actually. There was a little thought required, which elevates it within the genre for me, and although there were a few places where the editing could have been stepped up, the technical prowess shown on the screen is remarkable.
Worth a rental, especially if you have a tricked-out home theater in which you can really see the effects.
Finally, the popcorn-iest (and pop-corniest) of the bunch: Tron. When the first one came out a couple of decades or more ago, it was amazing to me. Even after just watching the first fifteen minutes, I knew it was completely forgettable as a Great Film, but I also knew it was showing technical effects that blew my mind.
This time around, both reactions were mediated a bit. The movie? Well, if the original Tron was a C or C-, this was a B-. It was better. And the effects? Well, if the original was an A+, this was an A.
Let’s start with the movie part. Jeff Bridges did a fine job, trying his best to not make the dialogue come across as cheesily as it was written. And, oh, man, was it cheesy. Perhaps The Dude abides, but Kitsis and Horowitz (and the other credited writers) give one lactose intolerance. Garrett Hedlund did an adequate job, too, carrying the primary protagonist role with zeal. Neither actor had much emotional nuance, but that’s less their fault than that of the vehicle itself. (Perhaps the most scenery-chewing came from Michael Sheen, who definitely made the most of his character!)
The effects, though, are really the stars of the show. And they were darn good. It’s clear that they were intended to depict an electronic reality as convincingly as an actual one, and they do that well. It’s more a tribute to the state of the art today that there aren’t the breathtaking moments of the original (visually) than it is a failing of the effects themselves. Perhaps the ability to enhance the realism of the world helps to make it less overtly wondrous. It’s not pervasively showy like Avatar (or Inception), but just… pervasive.
Worth seeing in HD on TV, and maybe a rental for those who are nostalgic for the original or those who love Jeff Bridges, regardless of the context. If it comes out in 3D on DVD and you’ve got a 3D set at home, it’s a good one for the collection.
It’s that time again, Super Bowl weekend. I have to confess that much of the allure of the event itself eludes me.
It does seem to be an excuse to eat and drink to major excess, which can, from time to time, be something I can get behind. But the tone of it seems excessive, too. And on a more fundamental level, I guess I’m just not into football much. I get the strategic nature of it (at least at a high level; the details are a little fuzzy in some cases), and I like that. But the tactical nature of the celebration of Big Hits just seems one step shy of pugilism, which is another “sport” that I just don’t get. Thus my characterization of it as a Stupor Bowl, a time in which observers eat and drink themselves into a stupor and in which participants beat on each other until they, too, are in a stupor (either now or in 15 years, when their repeated concussions result in functional brain injury).
But. I tend to follow the advertising industry, so in that sense, it really is the peak of the season, with the possible exception of the announcement of the CLIOs (the Oscars of advertising) or the AdAge awards. At some points, in fact, I will fast-forward through the game and just go from commercial break to commercial break (which is exactly the opposite of how I watch most TV). So there’s some allure there.
Plus, I think it’s interesting that this is still one of the few times in our media-fragmented world in which a Very Large number of people all tune in to the same thing at the same time. It’s one of very few throwbacks to the last-episode-of-M*A*S*H kinds of TV ratings. (In fact, last year’s Super Bowl got such high ratings that it finally unseated that M*A*S*H episode as the most-viewed program of all time — a record that M*A*S*H held for nearly three decades!)
Why that is, though, is a cultural phenomenon that just stumps me.
Last week, the U.S. Supreme Court handed down a decision that only got a modicum of notice, and mostly for its result. In Chase Bank v. McCoy, the Court said that regulations prior to the financial reform law didn’t prohibit credit card companies from jacking up your interest rate without warning. Of course, that’s moot now, because the law has changed and they can’t do it anymore. So it’s kind of a non-event, right?
Well, not so fast. In this case, it’s not the outcome that was helpful to the future, but rather the reasoning. You see, the Federal Reserve Board wrote the regulations that addressed the ability of credit card issuers to change terms (like raising rates), and thought they were pretty clear.
But courts didn’t get the memo. They started interpreting the Fed’s rules in ways that the Fed didn’t think were right. And the Fed said so. But the real question is: who should the courts rely on? Their own interpretation of the regulations, or what the Fed said they meant?
There are reasonable arguments for both sides. After all, it is the province of the judiciary to say what the law IS. But the Federal Reserve Board actually made the rules, and you’d think they’d have a good idea of what they meant.
Well, the Supreme Court said that in this case (and, one assumes, more generally), increased deference goes to the agency (i.e., the Fed). While agencies can’t rely on the fact that they’ll be consulted and trusted in every case of interpretation, the likelihood is ever growing.
Of course, this raises an issue of the relative powers of the branches. As you increase the ability of the executive branch agencies to be the deciders, you’re reducing the independence of the judicial branch (theoretically). And if the judicial branch is less politically motivated than the executive (since the executive stands for election on a regular basis and federal judges don’t), subjecting regulations to the whim of the elected official may not always be a good idea.
It’s a balancing act, of course, and the “right answer” seems to oscillate over time. But the McCoy case should be noticed for more than an issue of credit card fees — it’s another data point on how much power our executive agencies have.
The tumult going on in places like Egypt and Tunisia and Jordan right now caused me to look at how the United States’ power is manifesting itself throughout these demonstrations and governmental changes. And I must say, I’m pretty upbeat about it.
First, from all appearances, we don’t look like we’re doing anything hugely stupid. In some of these countries, that’s almost a first. What I mean is, we’re not flying our own hand-picked leader into the country and setting him (it’s always a him, it seems) up to lead. Which, inevitably, means that person will become dictatorial and either get deposed later, cause the country’s populace to hate us more, or — in a bonus move — both.
The fact that we’re being fairly hands-off overtly (while certainly doing what we can on a diplomatic front) is heartening. But how are we shaping the events on a more subtle basis?
An analyst for al-Arabiya was interviewed this morning on the Today show about Egypt, and despite spouting some opinions that I don’t quite agree with, he made an interesting observation. He noted that Egypt will do whatever it can to avoid a “Tiananmen Square moment” in large part because the media coverage of it would damage its reputation with the US, Egypt’s supplier of weapons (and money, in some cases).
That kind of soft power through commerce (indeed, through the export of hard power) hasn’t flagged, despite the hand-wringing over the country’s economic station in the world. In fact, it seems like one thing that our let’s-spend-like-it’s-WWIII military budgets have done is made us the worldwide go-to weapons shop. So, assuming our weapons sales negotiations aren’t completely divorced from our other foreign policies, countries will do what they need to do to keep that flow coming.
That’s not entirely an inspirational outcome, since it relies on us arming other nations to the teeth — including those who would be even MORE despotic if they could get away with it. But it does speak to the notion that we still have some “super” in our “superpower” tank. And that’s not too shabby.
First, let’s get one thing out of the way: this is not a feel-good movie. Set in a rural American area during the winter, it looks bleak. When you factor in the poverty and violence of the environment, it just gets depressing. A jaunty, mirthful romp this is not.
What it is, though, is resonant. This evocativeness begins from the first frames, thanks initially to Michael McDonough’s cinematography. It’s not the artistic sheets of white of Fargo, but that’s not what’s called for here. To show the backbreaking work of living in this environment, McDonough layers the physical beauty of the white on the trees with the shanties and scars of those who inhabit them. At one point, he even managed to make an otherwise bucolic lake seem positively ugly and sinister.
From there, the next hat-tip goes to the writing and directing. The pitch of the dialogue seems right on, although not having spent much time in this setting, it’s hard to tell. What is clearer from the writing — and it’s echoed in the directing — is that the pace of the movie, from the dialogue to the shots themselves, is languid. Which fits the content perfectly.
The bulk of the awards for the movie, though, have gone to the actors. Jennifer Lawrence, who is reminiscent of a young Renee Zellweger, is remarkable, carrying the emotional center of the film from beginning to end. It is her social status — the presence, or lack, of options in her life — that is most evocative from beginning to end. She is stolid and vulnerable at the same time, and that’s a difficult combination.
John Hawkes has gotten attention as Teardrop, the gruff but sympathetic uncle who ultimately risks a fair amount to help our protagonist. His portrayal was strong, indeed. Whether it ranks in the top-five supporting actor roles of the year I’m less certain.
Was it a perfect movie? No. The movie runs only 100 minutes (1:40), so I’m not sure where I’d cut it, but it seems there are some places where cuts could be made or the tempo altered a bit. I realize I congratulated the movie for its languid pace, but there were moments where the languid started to veer toward the torpid.
Winter’s Bone is a small movie. It’s the prototypical depressing indie film, of which there is lots of competition (see Frozen River, for instance, for an interesting comparative). Immediately after watching it, I was ready to consider the parts (acting, writing, etc) and then file it away. But what altered my opinion was its staying power, as the next day, I found myself again considering the plight of those in the poverty of the movie.
It’s that resonance that gives it a whole greater than the sum of its parts for me.
Okay, this is not what I was intending to write about today. In fact, I had a whole entry prepared about how small changes to one’s environment can make big changes in behavior, using the example of my Starbucks experience this morning, which was notable.
But not anymore. Now I’m mad.
I work in an environment with cats. Three of them. They’re not my cats; they belong to the business’ owner, and working around them is a condition of employment. So here I sit.
It’s not that they’re not cute (well, most of them, anyway). And we’ll set aside the fact that they shed all over everything and knock things over such that they’ve crippled electronic equipment and loaded up the laser printer with enough hair to look like a new breed of LabraDoodle. And that they periodically vomit on the floor of my office.
No, I’m agitated now because just as I was finishing up my previous post, one of the cats jumped onto my desk and walked over and hit the multi-button mouse in some way as to erase the whole of my writings. And because it’s a web form and doesn’t auto-save with the regularity of a full application, it’s gone.
Does anyone know if you can sous vide a feline?
(No, no, PETA people. I’m kidding. About that last part anyway.) I’d grumble aloud, but the cats might take it for purring and shed some more on my carpet.