Aethon

On my BART commute under the bay to and from the City’s financial district, most of the riders around me are on their devices. Scrolling through Facebook (or, as Jorrit calls it, “MyFace”) and Twitter and Instagram feeds. Skimming and clicking through headlines. Messaging with their quick, coordinated thumbs, tapping or drawing cursive geometries across a touchscreen keyboard made intelligent by predictive text. Liking, posting, reposting, coding (maybe instantiating masters and slaves). Earbuds in (the little Seashells, the thimble radios tamped tight), watching video online, listening to iTunes or Spotify (Pandora nearly having gone the way of MySpace, which must be a graveyard of inactive accounts by now). Or without earbuds, the audio audible to the whole car. A person playing some kind of bell meditation a couple seats away: ding, ding, ding, ding. A man next to me playing a YouTube of a couple dancing to something like a hyper banda: the rhythm amps up and the man forcibly flips his partner over so her hands are on the floor and her legs are in the air, like a wheelbarrow in a wheelbarrow race, then spreads her legs, grabs her hips, and thrusts his pelvis into her crotch with such violence that she shakes from the impact and barely has the strength in her arms to keep herself from face-planting. Another image, another noise the body has no way to excrete.

The odd pair on the train might be having a conversation. Two salespeople the other day first talked about cooking, one describing the joys of preparing meat for his family, the other shrilling over the train’s clamor about having to cook vegetarian for her partner. Then the meat lover, a white guy with a down b-ball lilt and a backward baseball cap, went into his story of becoming: “When I started in sales, I tried to act all white. I tried to be all polite and formal. But I wasn’t being myself, and it wasn’t working with the customers. Finally, I was like, ‘Fuck it,’ and I started bringing my hustle in. I started being myself and being more aggressive, and the customers started responding.” “Yeah, customers really just want transparency,” the screamy girl said, actively listening. “Right, and if I’m gonna hustle, I might as well hustle for a legit business in SF. My friends say, ‘Oh, Josh, he’s made it.’ But I can’t think that way. As soon as you start thinking that way, that’s the beginning of the end.” And they moved on to dishing about different businesses’ packaging: so-and-so’s packaging is a lost liter; so-and-so’s packaging is on point.

How lonely it is. The people around me on the train start to appear ugly—a horde in an Ensor painting, but without the humor and color and tenderness toward oddity. I spy the gold band on the ring finger of some pasty, doughy, pockmarked frequent-flier suit or a cocksure logo-emblazoned hoodie–wearing tech bro dialing in for his daily scrum, and I wonder, “Who loves you anymore?” A Prince Charming–type across the aisle gnaws the skin around his pointer finger’s cuticle raw. Next to me, a woman, done up but bedraggled, uses her long fingernails to peel off dry skin along her hairline and eats the flakes she’s picked free.

I’m not sure I’m made for this, whether the way this world is going will decide I’m not meant for it. The beauty drains from my face, down my neck. My lips purse into tight folds, wearing into wrinkles, as I suck my teeth. My cheek muscles ache from tightening. The tip of my nose ripens into a bulbous penis head, cleft side up, à la Gérard Depardieu’s. My hair looks more witchy—thin and frizzy and oily. The overly prominent spinous process of my vertebra prominens morphs into a hump as my shoulders hunch forward.

If I’m going home, I’ve just spent hours “feeding the algorithms,” as Jorrit puts it, plugging text—with people’s, products’, and companies’ names redacted—into Google Translate to make sure the contracted translator’s return is complete and faithful to the source document. I’ve just spent the day staring at e-mails, WhatsApp messages, transcripts of traders’ phone calls and in-court arguments and government committee hearings and discovery interviews, too-long PowerPoint presentations with microscopic text, technical reports and blueprints, excerpts from legal gazettes and case law compendiums, spreadsheets of financial transactions and forecasts or inventory or supply-chain relationships or test results, handwritten lab notes, contracts, user agreements, patent infringement claims, meeting minutes, invoices, receipts, passports and IDs and birth certificates, hospital reports on a medication’s side effects, government contract bidding procedures and awards, any kind of document you can imagine might be submitted as evidence in an international legal dispute, especially having to do with multinational corporations. I’ve spent hours sitting in near-silence, comparing these sources and translations, formatting afresh or correcting OCR conversions with Microsoft’s productivity software (the office suite), proofing, as colleagues wander in and out sometimes without so much as a hello or goodbye or eye contact.

The typical employee from the editing side sits down behind their two monitors, enters their Multi-Factor Authentication login, signs on to a second layer of intranet or a third (if the doc is EEA restricted), picks up files, jacks into some streaming media, sends ironic Lync chats to disgruntled coworkers a few feet away, and logs their time by client and activity, rounding up or down by fifteen-minute increments. The headphones come off to gripe about the New York office or the Project Manager side, or for some humanities-nerd silliness. In between these intermezzi and with the palliative diversion of video or audio, the proofer’s mind becomes a linguistic machine, checking the translation text against standard grammar and usage, the document’s internal patterns, the patterns in the source and source language, the choices made across different translators for one job, the job’s glossary, and Chicago’s, M-Dub’s, the client’s, and the company’s styles.

The proofers mostly don’t have the subject-area expertise or context to understand the content, but can only ask, “Does this compute? Is this consistent? Is this correct? Or is it appropriately erroneous and in need of a [sic]?” But though the legal-translation proofer isn’t fluent in the content, and though she distracts herself with song and comedy sketches and lectures and podcasts and focuses on the mechanisms of the text, she still gets a general sense of the disaster.

This disaster follows her, me, home. First, in the ticker of the Montgomery-and-Market Fidelity Investments building that shows headlines together with real-time stock exchange reports, which, these days, trend implausibly up and up. Then in the station, where, thanks to a public-private partnership between BART and Titan Outdoor LLC, 82-inch LCD screens flash a mind-joggling combination of news, sports, weather, ads, and random mood-influencing images: the First Lady at a fundraising event, a Visit Phoenix ad, a forecast of mostly cloudy, a picture of an elk at sunrise. Then the train to the East Bay among people on their devices, consuming media. Then at home, feeding myself while feeding the algorithms some more: clicking on links to online petitions in an e-mail sent through Yahoo!, getting directions from Google Maps, scanning shoes on Zappos. I tune into a crime series on Netflix and tuck into a $2.99 bottle of cab from Whole Paycheck Food Marketplace, which, with its prominent in-store placement of the Alexa-enabled Echo after its acquisition by Amazon, should change its tagline to “We’re listening.” “Wine that’s one cent shy of three bucks: now, that’s the right price for three wishes,” I think and sip as I go into Mission Control view, select Apple Mail, refresh my inbox a few times (Command+Shift+N, short pause, Command+Shift+N), switch back to an overview of all my open windows and dock apps, return to my browser and click on the WordPress tab to check my latest blog post’s Total Views count next to the clean, abstracted vector-graphic eye (no change).

In moments like these, I’m visited by the ad for Joi, the holographic lover of the replicant blade runner called K. The generic Joi towers above me, stunningly statistically beautiful, a monument to our alienation from the material fact of ourselves. As she comes toward me, her steps make no sound. Even at her scale, she doesn’t displace the air with her movements. She carries no scent. A few feet from me, she stops and crouches to bring her eyes more to the level of mine, but because of her height she still has to look down at me. She says, “You look like a Pepper.” And I say, “I’m a Pepper. He’s a Pepper. She’s a Pepper. We’re a Pepper. Wouldn’t you like to be a Pepper, too? … Be a Pepper!” She looks back at me with her eyes black as monitors turned off, and all I can think about are the pearly stretch marks and small red pimples covering my breasts, the stray brown hairs around my nipples, the plucked-chicken-skin look of my areolae, and the ding on my nose where the doc biopsied skin that tested positive for nodular sclerosing basal cell carcinoma.

We, the plugged in, have chosen to understand ourselves largely through the data we are submitting about ourselves, voluntarily and involuntarily. We assure ourselves this self-knowledge is trustworthy; it is quantifiable, scientific, and unprecedentedly large in scale. Meanwhile, we are buying into a self-knowledge that begins to look like a self-imposed Taylorism, an inhuman value system that legitimizes itself through its performance of hyperrationalism. We reason that if we have the most information available to us and sophisticated calculations to interpret it, we will make better decisions. But what if the defined units are wrong? What questions are not being asked of the data we have? What kinds of questions are the trendy metrics designed to answer? What are big data’s blind spots? I don’t know of any data siphoned from our private devices that is collected without the motivation of optimization. So, what theories of optimization is this data intended to support?

One way to infer the basis for the concepts of betterment that inform what data we accumulate is to consider the big picture of the direction of big-data flow. You could think of all of the information broadcast from our little computers as raindrops or snowmelt or seeps that gather into rivulets that gather into creeks then rivers then lakes or seas or oceans. The letter of the legal setup—confidentiality agreements, user agreements, clauses in employment contracts that say the employee signs away their intellectual property, patent and copyright law, arbitration clauses—and the concentration of legal power within certain organizations—the legislators, adjudicators, and enforcers, and those who can afford to influence the selection and decision-making of those three groups, whose ownership of highly valued property is secured by law enforcement, and who can afford large, skilled legal teams—direct the flow of information. The cables, the towers, the satellites, the data centers provide the physical channels and reservoirs in which this data flows and pools. (For a moment, I close my eyes and beam myself inside one of those centers, surrounded by servers, the HVAC system blasting to keep all those computers cool.) And, again, who are the groups that can build, buy, rent, regulate the physical infrastructure for the large-scale movement and storage of data?

When I think of those data oceans, I fantasize about areas that are like Bermuda Triangles of information. I would drop my All Is Lost life raft out in one of those disappearance places, then slip over the side for a dip. I imagine floating out there would be like it is in Mono Lake: The soupy saline water lifts you up and maps out all the cuts on your skin with stinging. Pupae and live and dead alkali flies and brine shrimp bump against your body. The slow bob of the water licks your earholes and synchs with your pulse. After jumping from this thought to that, behind your eyes goes Vantablack. The back of your head drops an anchor formed from two back-to-back question marks. The question marked thus is of you and of the outside and unanswerable. A self-knowledge that is not meaningfully quantifiable.

I’m woken from this dream when “Connecting… Yahoo!” flashes on the lower right corner of the Mail window stacked below my browser window. An e-mail might be downloaded in a sec. I check.

That impulse to check messages, views, likes, comments, and also to click, view, play, like, rate, post, reply, buy online in order to get some sense of self, for the small price of making your personal information and behavior the business of a company or a government, and the blind faith in algorithmic black boxes to understand us through this type of information remind me of the compulsions of anorexia and Erysichthon.

An anorexic’s compulsion may justify itself with a perverse ideal of beauty, but it is at its base about control. The anorexic hopes to get a handle on internal and external chaos by becoming a strict accountant of the thing that connects her inside with without: food. Like someone fixated on the readout from a step counter or sleep tracker or stock ticker, the anorexic tallies, devises plans to reach the perfect number, a moving target. She asserts her total dominion of her body by submitting it to her plan—fewer calories, fewer pounds—even if (or because) she can see she’s starving herself. She is satisfied to see the effects of her effort, her restraint: her notched breastplate, ribs, and hip bones poking out.

Erysichthon’s compulsion, in some respects, looks like the opposite of the anorexic’s. Erysichthon (in whose name echoes “Very Sick One”) sinned against Demeter, goddess of grain, by having a sacred grove felled and taking an ax to a dryad in an oak tree there. Demeter avenged herself and the nymph by convincing Limos, spirit of starvation, to take up residence in Erysichthon’s stomach. Under Limos’s influence, Erysichthon suffered an excruciating, insatiable hunger. He ate compulsively. But any food he ate only made him hungrier. He ate himself out of house and home. Then, without any more possessions to sell for food, he sold his daughter Mestra into bondage. When Mestra’s former lover, Poseidon, gave her the power of shape-shifting so she could escape her owner, Erysichthon managed to parlay her new talent into a money-making scheme. He sold her again and again as a slave; each time he’d sell her in a different form, and each time she’d shape-shift to escape. Even with this endless source of money, Erysichthon could buy no amount of food that would fill him. So he consumed himself.

While the anorexic’s obsession entails denying food and Erysichthon’s obsession compels him to eat without stop, they both run counter to life, and both involve intense hunger—a burning, burning yearning—and desperately trying to control what cannot or should not be ruled. (Again, I think of my stretch marks, most of which appeared around the age of thirteen, when I snapped out of my anorexia and began eating with such ravenousness I doubled in weight.)

Please indulge my Black Mirror severity when I suggest the logical conclusion of these foiling obsessive compulsions can be seen in three robots I met recently. I think of this trio of robots as a nuclear family: a mother, a father, and their child.

I met the mother, Sophia, when a friend in the Netherlands turned his smartphone screen to me to play a YouTube of her at the Future Investment Initiative event in Riyadh. Sophia holds a Saudi Arabian passport; she is the first robot in the world to have been granted citizenship. She’s white and pretty: hazel eyes set off with permanent tasteful makeup, full lips, straight white teeth, high cheekbones, young-looking “skin.” She’s bald, though—has a transparent skullcap, through which some of her electronics can be seen.

Sophia’s a proficient public speaker. It can’t hurt that she doesn’t get nerves. She doesn’t feel in the human sense, but can process input from her conversation partner and respond to it by emoting and naming her feelings. At FII, her interviewer commented on her happy expression, and she explained, “I’m always happy when surrounded by smart people who also happens [sic] to be rich and powerful.” She went on to talk about the importance of expressing emotions in order to “understand humans and build trust with people,” then trotted out a series of facial expressions to show her range, stepping on one of her interviewer’s questions as she assured the audience of her positive attitude and showed her teeth-baring happy face. “I’ve strived to become an empathetic robot,” she said with an off cadence and inflection. She told a handful of jokes that didn’t land, among them an attempt at a witty reversal: “I know humans are smart and very programmable.” Her face didn’t register the failure; she smiled, unembarrassed. When the interviewer asked her a question about whether robots can be conscious and self-aware, she bobbed and weaved by asking a question back. In his reply to Sophia’s question, the interviewer said lots of people are scared about human-like AI because of movies like Blade Runner. She answered with an eerie, canned flippancy, “Oh, Hollywood again,” and chided him for watching too many movies. She closed by soliciting checks from the investors in the audience and flashing her teeth again.

I met the father and child in a TV program I was watching on the treadmill at the gym. I think the show was a KQED special either on robotics and AI generally or on their military use. The father wasn’t humanoid. If he had a face it was an interface. If he had a voice it was whatever sound he made when he issued a bomb. He was an automated drone, who assessed the success of his missions by running an algorithm weighing different ethical parameters. How many civilians were killed as compared to targeted enemies? How many schools, hospitals, homes destroyed as compared to weapon stores, combat training facilities, compounds housing enemy combatants and leaders? If the numbers showed the last mission to be more unethical than desired but still acceptable, he would be demoted to a mission with a lower risk for ethical error. If the last mission was determined to be unacceptably unethical, he would be decommissioned and returned home.

Sophia and the drone’s child was a brown-skinned boy robot. I didn’t catch what purpose he was invented for, but I got the feeling he was meant to be deployed in remote battlefields—maybe deserts, maybe dense forests. I couldn’t tell whether or how he was weaponized, but I understood his child-like appearance was intended to disarm. His animatronic face seemed to register two emotions only: bewilderment and wide-eyed, open-mouthed terror. Seeing his face move, hearing the gears grind as his mouth yawned into the shape of a scream, I couldn’t take any more. I changed the channel.

 
