Bride-(who-wants)-to-be-(thinner), Jessica Schnaider, who spent $1,500 for eight days on a feeding tube to make sure her wedding photos — if not her new body — would look good forever. Credit: Barbara Fernandez for The New York Times
Months of winter forcing you to chow down on ‘comfort food’ making you feel like Violet Beauregarde? You’re in luck: the New York Times was kind enough to shine a spotlight on the latest diet trend sweeping the nation (in this case, a nation made up almost solely of prospective brides looking to shed a couple-twenty pounds): nasogastric tubes, which provide a highly controlled daily dose of calories after being inserted through the nostril, down the esophagus, and into the stomach.
Needless to say, this type of weight loss regimen isn’t for everyone — and in some experts’ opinions, shouldn’t be for anyone. As Time reports:
Dr. David Heber, director of the UCLA Risk Factor Obesity Program, says complications can also include aspiration, infection of the lung, kidney failure and erosion of tissues in the nose and throat. “People are taking an unnecessary medical risk by putting in a [feeding] tube,” he says. “To do it for no reason seems to me overly risky. Without medical supervision, if the protein and electrolyte levels are not monitored, it’s not safe.”
Of course, humans have experimented with destructive ways to maintain their figures since people realized they had figures. The ancient Romans would vomit between courses to make room for the next round. William the Conqueror adopted an all-alcohol diet in 1087 after becoming too fat to ride his horse. Unfortunately, he died that same year after supposedly falling off said horse, which gives a whole new meaning to the phrase “crash diet.” Even Lord Byron had some less-than-appealing practices to keep the poet pounds down:
Existing on biscuits and soda water or potatoes drenched in vinegar, he wore woolly layers to sweat off the pounds and measured himself obsessively. Then he binged on huge meals, finishing off with a necessarily large dose of magnesia.
Of course, if they’d existed in his day, I’m sure Byron would have been one of the first to purchase one of the many variations of the Jiggle-a-Tron 5000 that became popular in the first half of the last century, especially considering that its main attraction seemed to be how little physical effort its users actually had to exert in order to reap its supposed benefits.
That said, it’s hypocritical to pretend that many of today’s high-tech solutions are any better. Take Japan’s 4 in 1 Pressotherapy Slimming Machine, which promises (in its own delightfully poetic way) that
With perfect combination with magicconversion curve & lymph conduction, fat elimination, breach of fat, magnetic skin tension, integrated with weight loss, body beautification, massage, and exercise, four-in-one body beautification instrument transmits 32 different types of myriametric wave signals to strengthen normal electrochemical process of human nerve endings and generate 32 different moving modes of fat in the body of patients so that the fat in different people will be completed decomposed.
Couldn’t have said it better myself. Especially with a feeding tube up my nose.
Oh technology, is there any way you can’t make us look stupid? First you tricked us into believing that rolling over a relatively smooth surface was more efficient than upright bipedalism (wheels! amirite???). Then, a few years later, crazy folks lost their monopoly on talking to themselves in public when some Smurf-obsessed nerd invented the Bluetooth and it began to proliferate among business types and young people — two truly dismal demographics which, to this day, can be seen having extensive, passionate conversations about the stock market and/or their latest STDs with interested dust motes and subway posters.
Now, entering into this startlingly fractured technoscape for the first time is none other than that erstwhile manufacturer of Happy Days-themed car fresheners and body-shaping undergarments for older women, The Google.
Google Inc is getting into the eyewear business with a pair of thin wraparound shades that puts the company’s Web services in your face.
The experimental “augmented reality” glasses – from the same team that is developing self-driven cars – can snap photos, initiate videochats and display directions at the sound of a user’s voice.
The prototype digital glasses, unveiled on the company’s Google+ social network on Wednesday, are still being tweaked and tested, and are not available in stores yet.
Here’s a (presumably After Effects’d) videographic demonstrating what Google Goggles could one day do for you:
I’d be lying if I didn’t admit that this actually seems pretty cool — assuming, of course, you don’t fall through an open manhole while you’re wearing them. And of course, in this day of instant high-production-value witticism, one tech-savvy wag who I wish was me has already created a commendably tongue-in-cheek riposte:
On second thought, maybe I’ll save my money for Google Contacts.
Let me preempt this post (ontological question: is something really preemptive if it’s the first sentence in an essay, or is it merely introductory? Oh well…) by assuring you that I’m not writing it from my front porch while lording over an ever-growing collection of kites, soccer balls, and frisbees as I yell at the neighborhood kids to get the hell off my lawn.
That said, here’s another assurance: your two-year-old needs an iPad about as much as you need a diaper. (Which is to say, sure, it might be a treat once in a while, but let’s not go all Lisa Nowak here, okay?)
Three years ago, when he was just 2 years old, Max Fuller got his first iPhone. His father, Craig Fuller, the CEO of a banking technology company, said it’s been an “enormous tool” for teaching Max the basics about colors, shapes and letters, and most recently the names of all of the dinosaurs and how they lived.
Okay, yeah, sure — education, innovation, keep up with the times, Trevor the Troglodyte. Obviously, you’ve missed the trend train and are attempting to analyze the current state of affairs from the engine fumes-engulfed platform of your pump-action handcar:
According to data gathered from September to December 2011 by global strategic marketing agency Kids Industries, 20% of children ages 3 to 8 own their own iPod touch, while 24% of U.S. children in this age group own their own iPad and 8% own their own iPhone. For teens, the numbers are considerably higher. An April 2011 survey conducted by financial adviser firm Piper Jaffray found that 80% of U.S. teenagers owned a type of mp3 player, with the iPod by far the most common, 17% owned an iPhone (38% expected to buy one in the ensuing six months), and 29% owned or had access at home to a tablet device (and 22% said they expected to buy an iPad in the ensuing six months).
Which is all well and good for Apple investors, but perhaps not so keen for early (as in, pre-pre-pre-teen) adopters:
According to many experts, so much screen time can have permanent effects on the brain. The American Academy of Pediatrics discourages any media use by children younger than 2. Dr. David Hill, a member of the American Academy of Pediatrics’ Council on Communications and Media and the author of the forthcoming book “Dad to Dad: Parenting Like a Pro,” agrees and recommends that any child over the age of 2 limit screen time to two hours a day.
“Evidence suggests that viewing the sorts of rapid fire images present in videos or video games can lead to future problems in children’s ability to concentrate,” he says, adding that some research suggests a strong link between media exposure and ADHD. He says problems are likely to surface when the device is used as a substitute for communication between parent and child.
Jane M. Healy, an educational psychologist who specializes in the effect of computer technology on growing brains and the author of “Different Learners: Identifying, Preventing and Treating Your Child’s Learning Problems,” says technology offers no benefits to young children.
“All indications are that instead of increasing their intelligence, it’s going to dull it down,” she says. What’s most important for a young child’s brain development is participating in conversation, a skill that children preoccupied with an iPad, cellphone or computer fail to practice, she says. “It’s language that will later help them become physicists, scientists and imaginative computer programmers.”
Again, this isn’t a screed against a harried parent handing their screaming toddler their touchscreen-enabled smartphone to quiet him down at the mall or in a restaurant; it’s a screed against anyone who would use such technology to outright replace time that they would have otherwise spent interacting with the treasured fruit (Apple, in most cases) of their loins. Of course, that’s only half the issue, because while it’s one thing to let your kid use your fancy-ass future gizmo once in a while, it’s another thing entirely to give him one of his own — and not because you might spoil him (though there is that), but because you might literally and permanently reconfigure his brain chemistry for the worse.
Don’t get me wrong: it’s not like I wouldn’t have killed for the latest interactive miniaturized gadget as soon as I was old enough to start requesting Disney movies by name, but the fact that I didn’t have ready access to pre-canned digital entertainment meant that I spent most of my youth careening through the unlimited confines of that wondrously weighty buzzword, IMAGINATION.
If I’d owned an iPad, do you think I would have spent the majority of my free time running around outdoors or reading piles of books animated in proprietary HD (head-defined) ImagiVision? Shit no! I’d have been hunkered down on the couch with a bag of Nacho Cheese Doritos on one side and a bottle of Cran-Raspberry juice on the other, alternating my time between the latest YouTube sensation and marathon battles spent launching disgruntled fowl at ravenous porkers into the wee hours of the morning.
So yes, this is an “everything in moderation” rant, but I think it’s an important one. Because while it’s absolutely true that, with
an increasingly technology-focused society and economy…exposure to technology, no matter how early, will only help children develop into the tech-savvy adults the country needs[,]
it’s also true that hundreds of people die of exposure each year. (Yeah, I went there.) So, Mr. Fuller, next time you want to teach your kid about colors and letters, why not try Dr. Seuss? And if he wants to learn about dinosaurs, I bet he’d love the ones in a museum even more than the ones on the tiny screen in his hand. Because there’s always going to be time for him to get his Retina Display on, but once those vital synapses and cerebral crenellations begin to solidify, there’s literally no going back. Then it won’t matter how many apples a day you feed him.
I started thinking about this when we ran out of apples this morning and I had to raid our hurricane stash for a container of Earth’s Best KIDZ Organic Apple Sauce (yeah yeah…): why is it that almost every fruit under the sun has its own juice, but only apples have their own sauce — sauce, at least, that you can reliably buy in stores? (Maybe you can get cherry sauce at some hippy-dippy specialty store, but I sure as hell have never seen it.)
Important side note: making sauce from this particular Apple is illegal in most states.
Is it simply because all fruits are inherently juiceable due to their high water content, but only the grainy texture and consistency of apples make them conducive to “saucing” — that bizarre meta-state residing awkwardly between liquid and solid nutrition? Or is it that, while apple sauce has achieved universal popularity over the years (perhaps due to the simple ubiquity of the apple itself in both culture — Eden, Newton, certain bottom-hugging jeans — and cooking), other fruits remain ideologically associated with the types of purees typically reserved for baby food, such that any non-apple “sauce” is inextricably linked to a subconscious infantilization, thus making it commercially unviable to the average adult?
It’s also five years behind the eight ball (and not just because the eight ball is the black one).
Forthwith, an early (2006) short film by Emmy Award-winning director Jeremy Levine, featuring the delightfully racist Dutch folk hero, Zwarte Piet, and his clamoring clan of misfit Negroes.
From the director:
The Dutch celebrate Saint Nicholas Day in much the same way that Americans celebrate Christmas, only our elves are replaced by their Black Petes. Every December, white Dutch citizens paint their faces black, cover their heads in curly wigs, and carry on a tradition that has long passed its admissibility in The Netherlands’ multi-ethnic society. Inspired by David Sedaris’s “Six to Eight Black Men,” this film provides a first-hand look at one of the most shocking and offensive traditions still in practice today.
For added value/context, check out some of the comments the video has received, as well as Levine’s characteristically thoughtful responses.
I have a photo blog that I set up last year. I receive a steady stream of email notifications asking me to moderate comments on my posts.
I post only my images, with no commentary other than a title for the image or collection of photos.
Out of hundreds of comments, only two have been from actual people.
My photo website lives in Westworld, where my work is viewed more by algorithms than by eyeballs.
In this collaborative piece, I work with cyber demons sent from Russia and Nigeria. As I find it difficult to choose descriptions for my photographs, I have accepted help in the form of robo comments from spambots.
I present, “That is Things I Wanted!”
Help, I’ve been informed and I can’t become ignorant.
Massachusetts, as everybody knows, was founded by a bunch of religious folks eager to escape British persecution for their disparate beliefs so that they, in turn, could be free to persecute a different bunch of religious folks for their disparate beliefs. (Maybe David Brooks should move there!) This worked pretty well (give or take recorded History) until just last week or so when, once again, the 21st Century was forced to make space at the kids’ table for the 17th. As the Boston Globe points out, they really don’t have much in common any longer:
With the year’s biggest shopping blitz just 10 days away, major retailers across the country have already released their doorbuster deals and crowd-control plans for the Friday after Thanksgiving. But many merchants are scrambling to figure out one crucial detail: what time to open in Massachusetts.
Some chains that had promoted midnight sales are amending their early-bird hours to comply with the state’s 17th-century blue laws. The rules prohibit retail employees from working until the clock strikes 12 a.m. after Thanksgiving – leaving no time for staff to prepare for midnight openings.
The laws, penned by the Puritans during the 1600s, were created to prevent Colonists from straying from church to drink or conduct business on the Sabbath. They include many regulations that are rarely enforced, such as a ban on dancing on Sundays.
Okay, so the dancing law is still useful (who the hell wants to stay out at the local discotheque past midnight?), but why the hell are so many other hoary headscratchers still on the books? Lawmakers know that Massachusetts legalized gay marriage, right? I mean, I don’t know any Puritans personally so I can’t say for certain, but I’m pretty sure that decision would have bunched a few petticoats. But fuck retailers, right? (Note the singular “t” in the first word of that sentence, please.)
Hurst and officials at the Massachusetts Department of Labor Standards said they have spoken with merchants over the past several days to clarify the rules.
“We’ve gotten a lot of calls from many, many retailers on what the law is,’’ said Patricia DeAngelis, general counsel for the department.
“This agency has not issued a statewide permit to allow retail stores to be open or permit work on Thanksgiving Day or Christmas Day. The spirit of the law and intent is to give people a day off, and that is why this state has exercised that authority in the way it has.’’
Fortunately, as a consumer, you actually shouldn’t care all that much about when stores are going to open, because the truth of the matter is, you’ll find equal or better deals online this Black Friday — not in line. Plus, you’ll be less prone to being upsold on other crap you don’t need and significantly less likely to be trampled by stampeding mobs. (Unless you kick back with your laptop in front of the bathroom door after everyone has begun to digest Aunt Gerry’s famous apple rhubarb pie, in which case, you have only yourself to blame.)
The stories have been everywhere these last few weeks. These two happened to be featured today in Google News and MSN, respectively, but they all say essentially the same thing, which is that Alabama has royally[1] fucked itself with its new toughest-in-the-nation immigration law — definitely in the short-term and, assuming nothing is revised or repealed, perhaps for years to come.
The tl;dr of not only the articles above but the entire issue in general is that Americans don’t want to work their asses off (literally) doing back-breaking labor (again, literally) for minimum wage — “minimum,” in this case, being meant ironically, since illegals could only aspire to that threshold. And that doesn’t mean that Americans are too fat, soft, or lazy to hold these jobs; it means that, like you or me (aka, people with a reliable internet connection), the vast majority of unemployed Alabamians still have not faced true desperation. But more than that, it demonstrates how long-in-coming cultural shifts cannot be overcome by a single flourish of the executive pen. Jobs performed primarily by Americans through the first half of the last century have, perhaps irreversibly, become the domain of immigrant laborers, with Americans no longer willing to subject themselves to such strenuous working conditions day after day for minuscule wages and non-existent benefits.
As is pointed out in the Bloomberg piece:
The notion of jobs in fields and food plants as “immigrant work” is relatively new. As late as the 1940s, most farm labor in Alabama and elsewhere was done by Americans. During World War II the U.S. signed an agreement with Mexico to import temporary workers to ease labor shortages. Four and a half million Mexican guest workers crossed the border. At first most went to farms and orchards in California; by the program’s completion in 1964 they were working in almost every state. Many braceros—the term translates to “strong-arm,” as in someone who works with his arms—were granted green cards, became permanent residents, and continued to work in agriculture. Native-born Americans never returned to the fields. “Agricultural labor is basically 100 percent an immigrant job category,” says Princeton University sociologist Doug Massey, who studies population migration. “Once an occupational category becomes dominated by immigrants, it becomes very difficult to erase the stigma.”
Massey says Americans didn’t turn away from the work merely because it was hard or because of the pay but because they had come to think of it as beneath them. “It doesn’t have anything to do with the job itself,” he says. In other countries, citizens refuse to take jobs that Americans compete for. In Europe, Massey says, “auto manufacturing is an immigrant job category. Whereas in the States, it’s a native category.”
But history is never as compelling as humanity, so let’s put a face to the story, shall we?
On a sunny October afternoon, Juan Castro leans over the back of a pickup truck parked in the middle of a field at Ellen Jenkins’s farm in northern Alabama. He sorts tomatoes rapidly into buckets by color and ripeness. Behind him his crew—his father, his cousin, and some friends—move expertly through the rows of plants that stretch out for acres in all directions, barely looking up as they pull the last tomatoes of the season off the tangled vines and place them in baskets. Since heading into the fields at 7 a.m., they haven’t stopped for more than the few seconds it takes to swig some water. They’ll work until 6 p.m., earning $2 for each 25-pound basket they fill. The men figure they’ll take home around $60 apiece.
Parse that ‘graf again: that’s an 11-hour work day doing the type of labor you and I would consider cruel and unusual if forced to continue after the first hour, which these men undertake day after day for less than $5.50 an hour. I make that much on a longer-than-average bathroom break, and believe me, the only thing I’m rolling in is debt. Even the excessively fit bow down to the immigrant work ethic:
In the weeks since the immigration law took hold, several hundred Americans have answered farmers’ ads for tomato pickers. A field over from where Juan Castro and his friends muse about the sorry state of the U.S. workforce, 34-year-old Jesse Durr stands among the vines. An aspiring rapper from inner-city Birmingham, he wears big jeans and a do-rag to shield his head from the sun. He had lost his job prepping food at Applebee’s, and after spending a few months looking for work a friend told him about a Facebook posting for farm labor.
The money isn’t good—$2 per basket, plus $600 to clear the three acres when the vines were picked clean—but he figures it’s better than sitting around. Plus, the transportation is free, provided by Jerry Spencer, who runs a community-supported agriculture program in Birmingham. That helps, because the farm is an hour north of Birmingham and the gas money adds up.
Durr thinks of himself as fit—he’s all chiseled muscle—but he is surprised at how hard the work is. “Not everyone is used to this. I ain’t used to it,” he says while taking a break in front of his truck. “But I’m getting used to it.”
Yet after three weeks in the fields, he is frustrated. His crew of seven has dropped down to two. “A lot of people look at this as slave work. I say, you do what you have to do,” Durr says. “My mission is to finish these acres. As long as I’m here, I’m striving for something.” In a neighboring field, Cedric Rayford is working a row. The 28-year-old came up with two friends from Gadsden, Ala., after hearing on the radio that farmers were hiring. The work is halfway complete when one member of their crew decides to quit. Rayford and crewmate Marvin Turner try to persuade their friend to stay and finish the job. Otherwise, no one will get paid. Turner even offers $20 out of his own pocket as a sweetener to no effect. “When a man’s mind is made up, there’s about nothing you can do,” he says.
Unsurprisingly, Alabama’s current governor, Robert Bentley[2] (R), is doubling down on the tough love rationale:
“If they [employers] are using illegal workers right now, will it hurt them? Possibly,” Bentley says. “Especially this first year or maybe the second year. But eventually, it will not hurt them, because we will get back to doing things the right way.”
Ah yes, the “right” way — aka, hiring legal residents at a fair wage commensurate with the difficulty of the job and the competitiveness of the market. Sounds great, right? Sure, if you can magically cut an equal amount of costs in other areas to avoid having to raise your prices. Cue cucumber farmer, Jerry Danford:
As we park and walk toward the fields, Danford talks about how many workers he needs to harvest all the cucumbers. Danford supplies a lot of the major pickle brand names you’d recognize. All those acres represent $20 million in retail pickle sales.
Since Danford doesn’t think a pool of labor, apart from immigrant workers, exists, he says he won’t be able to plant so much produce anymore.
But what if he paid a higher hourly wage? The going rate now is $10 an hour.
“The [pickle] company wouldn’t buy it from you then,” he says. They’d turn to suppliers in other states where labor is cheaper — states that allow undocumented immigrants to continue working under the radar.
Across Alabama we heard the same thing, from watermelon growers in the south to tomato farmers up north.
Whether cukes or catfish, the fundamental problem remains the same:
Skinning, gutting, and cutting up catfish is not easy or pleasant work. No one knows this better than Randy Rhodes, president of Harvest Select, which has a processing plant in impoverished Uniontown, Ala. For years, Rhodes has had trouble finding Americans willing to grab a knife and stand 10 or more hours a day in a cold, wet room for minimum wage and skimpy benefits.
Rhodes says he understands why Americans aren’t jumping at the chance to slice up catfish for minimum wage. He just doesn’t know what he can do about it. “I’m sorry, but I can’t pay those kids $13 an hour,” he says. Although the Uniontown plant, which processes about 850,000 pounds of fish a week, is the largest in Alabama and sells to big supermarket chains including Food Lion, Harris Teeter, and Sam’s Club (WMT), Rhodes says overseas competitors, which pay employees even lower wages, are squeezing the industry.
So what’s the solution? Jesus, who do I look like: Stephen Colbert?
Here’s an idea though. Why not return to our green card-granting ways, providing a path to citizenship for formerly illegal aliens willing to put in the blood, sweat, and tears required to help get the nicely packaged food we all take for granted from the fields and rivers to our grocery store shelves? Since we know they’re willing to do the work, we don’t have to pay them anything more than we were before (that’s called Capitalism!), and as long as they keep working, we’ll keep not-deporting them. Then we can turn an open eye — rather than our accustomed-to blind one — to more pressing matters: for example, teenagers using vodka-soaked tampons to get drunk! (Sneak preview: the ladies use ‘em how you’d expect, but the fellas…well, they had to get creative!)
___________________
[1] Fun fact: This is the only context in which you will ever see the word “royal” or its derivatives used to describe Alabama!
[2] Are you kidding me? “Bentley”???
A ballot measure going before voters in Mississippi on Nov. 8 would define the term “person” in the State Constitution to include fertilized human eggs and grant to fertilized eggs the legal rights and protections that apply to people.
I see red(state) people.
The rest of the piece goes into detail about why this is a terrible, ridiculous idea — convenient, since I don’t have any desire to delve into the more highfalutin ethical and legal ramifications anyway. However, I worry about some of the lesser absurdities that such a ruling might lead to.
For example, if life truly begins at conception, then age must also logically begin at conception, which means that someone’s day of birth (what some people call a “birth day”) is no longer a valid indicator of how old he or she is. Furthermore, since life is life no matter how far along in it you already are, presumably any such ruling would have to apply retroactively. Which means that, if Mississippians really believe in the new law, they’ll have to start giving 17-and-three-month-year-olds (give or take) the right to vote and 14-and-whatever-year-olds the ability to earn their learner’s permit and on and on and on. Because age is merely a reflection of how long someone has been a person, right? And if personhood is applied at conception to a gloopy dribble of non-sentient cells, then that’s when the clock starts ticking. To argue otherwise would be to deprive your citizens of their sperm-given humanity.
(And yeah, I know that, as a nation, we’ve already complicitly — if tacitly — agreed to compromise on the birthday convention for purposes of convenience, but we’re talking about a single state here, so if this law passes, they better be prepared to put their penis where their vagina… er, money where their mouth is.)
[Editor's note: I'm not sure if the parodic title of this post makes any sense, but I'm a little punch drunk this week. My thinking was: it's almost Halloween, "Flying Purple People Eater" is one of the few well-known Halloween-type songs, babies don't fly but they cry, they're a little purplish looking in the womb, this story is about the legal definition of "people," and classifying them as such at conception is cheating. So we cool?]