
Friday, May 9, 2014

Inson Wood on Steve Jobs' Tweaker vs Jacques Derrida's Critic - Philosophy, Design and Genius.

The journey is the destination - Inson Wood approaches design from a philosophical stance, often quoting the French literary critic Jacques Derrida, while imbuing his designs with the flexibility of Steve Jobs's iPhone: "an app for every client and every lifestyle. A true designer should be both a bartender and a psychologist - adapt to any mind and be a good listener."
Maximalism with art by Jeff Koons - Upper East Side townhouse by Inson Dubois Wood. The beauty of design is that, with a relatively empty space, any style can be imposed - it is the overall experience that is the takeaway. Rock-star chic or Louis XVI with a mod twist.
Inson Wood tells us that the philosopher and literary critic Jacques Derrida, whom he reveres, did as much for the design community as Steve Jobs did for the tech world. Both died prematurely of pancreatic cancer, but both also left an indelible mark on the psyche of the planet in myriad ways.


A powder room by Inson Dubois Wood with minimalist features and only man-made materials, such as a white carbon-fiber ceiling, glass floors, and titanium walls.

According to Inson Wood, there is no such thing as a period room; even in the time of Louis XVI, a gift presented by a foreign diplomat would be coveted and incorporated into the overall scheme - it was considered fashionable to have things from distant lands such as India, China, or Africa.
Inson Wood likes to make the furniture he designs as streamlined as possible, as if worn by a thousand years of use - pared down to its most elemental parts.

A collaboration with Hermès, in which Inson Wood mimicked the ceramics in a fantastic vaulted ceiling with a similar pattern and color scheme... Steve Jobs was all about detail - endless attention to detail.

Jude Law Residence - designed by Inson Dubois Wood LLC - combines the beauty of an 1890 church, with its hand-scraped floors and beams, with a 1960s mid-century modern desk by Wabbes and an Isamu Noguchi sculpture. Derrida's influence - the idea that elements at the periphery can be highly valuable and coveted - shows in this Moroccan high-pile shag rug, of a kind normally laid out on a mud floor in a tented space, here situated in a church.

A family retreat in a remote part of Northern Thailand designed by Inson Dubois Wood. The simplicity of Zen was one of Jobs's favorite mantras.

NOVEMBER 14, 2011

Jobs’s sensibility was more editorial than inventive. “I’ll know it when I see it,” he said.


Not long after Steve Jobs got married, in 1991, he moved with his wife to a nineteen-thirties, Cotswolds-style house in old Palo Alto. Jobs always found it difficult to furnish the places where he lived. His previous house had only a mattress, a table, and chairs. He needed things to be perfect, and it took time to figure out what perfect was. This time, he had a wife and family in tow, but it made little difference. “We spoke about furniture in theory for eight years,” his wife, Laurene Powell, tells Walter Isaacson, in “Steve Jobs,” Isaacson’s enthralling new biography of the Apple founder. “We spent a lot of time asking ourselves, ‘What is the purpose of a sofa?’ ”
It was the choice of a washing machine, however, that proved most vexing. European washing machines, Jobs discovered, used less detergent and less water than their American counterparts, and were easier on the clothes. But they took twice as long to complete a washing cycle. What should the family do? As Jobs explained, “We spent some time in our family talking about what’s the trade-off we want to make. We ended up talking a lot about design, but also about the values of our family. Did we care most about getting our wash done in an hour versus an hour and a half? Or did we care most about our clothes feeling really soft and lasting longer? Did we care about using a quarter of the water? We spent about two weeks talking about this every night at the dinner table.”
Steve Jobs, Isaacson’s biography makes clear, was a complicated and exhausting man. “There are parts of his life and personality that are extremely messy, and that’s the truth,” Powell tells Isaacson. “You shouldn’t whitewash it.” Isaacson, to his credit, does not. He talks to everyone in Jobs’s career, meticulously recording conversations and encounters dating back twenty and thirty years. Jobs, we learn, was a bully. “He had the uncanny capacity to know exactly what your weak point is, know what will make you feel small, to make you cringe,” a friend of his tells Isaacson. Jobs gets his girlfriend pregnant, and then denies that the child is his. He parks in handicapped spaces. He screams at subordinates. He cries like a small child when he does not get his way. He gets stopped for driving a hundred miles an hour, honks angrily at the officer for taking too long to write up the ticket, and then resumes his journey at a hundred miles an hour. He sits in a restaurant and sends his food back three times. He arrives at his hotel suite in New York for press interviews and decides, at 10 p.m., that the piano needs to be repositioned, the strawberries are inadequate, and the flowers are all wrong: he wanted calla lilies. (When his public-relations assistant returns, at midnight, with the right flowers, he tells her that her suit is “disgusting.”) “Machines and robots were painted and repainted as he compulsively revised his color scheme,” Isaacson writes, of the factory Jobs built, after founding NeXT, in the late nineteen-eighties. “The walls were museum white, as they had been at the Macintosh factory, and there were $20,000 black leather chairs and a custom-made staircase. . . . He insisted that the machinery on the 165-foot assembly line be configured to move the circuit boards from right to left as they got built, so that the process would look better to visitors who watched from the viewing gallery.”
Isaacson begins with Jobs’s humble origins in Silicon Valley, the early triumph at Apple, and the humiliating ouster from the firm he created. He then charts the even greater triumphs at Pixar and at a resurgent Apple, when Jobs returns, in the late nineteen-nineties, and our natural expectation is that Jobs will emerge wiser and gentler from his tumultuous journey. He never does. In the hospital at the end of his life, he runs through sixty-seven nurses before he finds three he likes. “At one point, the pulmonologist tried to put a mask over his face when he was deeply sedated,” Isaacson writes:

Jobs ripped it off and mumbled that he hated the design and refused to wear it. Though barely able to speak, he ordered them to bring five different options for the mask and he would pick a design he liked. . . . He also hated the oxygen monitor they put on his finger. He told them it was ugly and too complex.
One of the great puzzles of the industrial revolution is why it began in England. Why not France, or Germany? Many reasons have been offered. Britain had plentiful supplies of coal, for instance. It had a good patent system in place. It had relatively high labor costs, which encouraged the search for labor-saving innovations. In an article published earlier this year, however, the economists Ralf Meisenzahl and Joel Mokyr focus on a different explanation: the role of Britain’s human-capital advantage—in particular, on a group they call “tweakers.” They believe that Britain dominated the industrial revolution because it had a far larger population of skilled engineers and artisans than its competitors: resourceful and creative men who took the signature inventions of the industrial age and tweaked them—refined and perfected them, and made them work.
In 1779, Samuel Crompton, a retiring genius from Lancashire, invented the spinning mule, which made possible the mechanization of cotton manufacture. Yet England’s real advantage was that it had Henry Stones, of Horwich, who added metal rollers to the mule; and James Hargreaves, of Tottington, who figured out how to smooth the acceleration and deceleration of the spinning wheel; and William Kelly, of Glasgow, who worked out how to add water power to the draw stroke; and John Kennedy, of Manchester, who adapted the wheel to turn out fine counts; and, finally, Richard Roberts, also of Manchester, a master of precision machine tooling—and the tweaker’s tweaker. He created the “automatic” spinning mule: an exacting, high-speed, reliable rethinking of Crompton’s original creation. Such men, the economists argue, provided the “micro inventions necessary to make macro inventions highly productive and remunerative.”
Was Steve Jobs a Samuel Crompton or was he a Richard Roberts? In the eulogies that followed Jobs’s death, last month, he was repeatedly referred to as a large-scale visionary and inventor. But Isaacson’s biography suggests that he was much more of a tweaker. He borrowed the characteristic features of the Macintosh—the mouse and the icons on the screen—from the engineers at Xerox PARC, after his famous visit there, in 1979. The first portable digital music players came out in 1996. Apple introduced the iPod, in 2001, because Jobs looked at the existing music players on the market and concluded that they “truly sucked.” Smart phones started coming out in the nineteen-nineties. Jobs introduced the iPhone in 2007, more than a decade later, because, Isaacson writes, “he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to.” The idea for the iPad came from an engineer at Microsoft, who was married to a friend of the Jobs family, and who invited Jobs to his fiftieth-birthday party. As Jobs tells Isaacson:

This guy badgered me about how Microsoft was going to completely change the world with this tablet PC software and eliminate all notebook computers, and Apple ought to license his Microsoft software. But he was doing the device all wrong. It had a stylus. As soon as you have a stylus, you’re dead. This dinner was like the tenth time he talked to me about it, and I was so sick of it that I came home and said, “Fuck this, let’s show him what a tablet can really be.”
Even within Apple, Jobs was known for taking credit for others’ ideas. Jonathan Ive, the designer behind the iMac, the iPod, and the iPhone, tells Isaacson, “He will go through a process of looking at my ideas and say, ‘That’s no good. That’s not very good. I like that one.’ And later I will be sitting in the audience and he will be talking about it as if it was his idea.”
Jobs’s sensibility was editorial, not inventive. His gift lay in taking what was in front of him—the tablet with stylus—and ruthlessly refining it. After looking at the first commercials for the iPad, he tracked down the copywriter, James Vincent, and told him, “Your commercials suck.”

“Well, what do you want?” Vincent shot back. “You’ve not been able to tell me what you want.”
“I don’t know,” Jobs said. “You have to bring me something new. Nothing you’ve shown me is even close.”
Vincent argued back and suddenly Jobs went ballistic. “He just started screaming at me,” Vincent recalled. Vincent could be volatile himself, and the volleys escalated.
When Vincent shouted, “You’ve got to tell me what you want,” Jobs shot back, “You’ve got to show me some stuff, and I’ll know it when I see it.”
I’ll know it when I see it. That was Jobs’s credo, and until he saw it his perfectionism kept him on edge. He looked at the title bars—the headers that run across the top of windows and documents—that his team of software developers had designed for the original Macintosh and decided he didn’t like them. He forced the developers to do another version, and then another, about twenty iterations in all, insisting on one tiny tweak after another, and when the developers protested that they had better things to do he shouted, “Can you imagine looking at that every day? It’s not just a little thing. It’s something we have to do right.”
The famous Apple “Think Different” campaign came from Jobs’s advertising team at TBWA\Chiat\Day. But it was Jobs who agonized over the slogan until it was right:

They debated the grammatical issue: If “different” was supposed to modify the verb “think,” it should be an adverb, as in “think differently.” But Jobs insisted that he wanted “different” to be used as a noun, as in “think victory” or “think beauty.” Also, it echoed colloquial use, as in “think big.” Jobs later explained, “We discussed whether it was correct before we ran it. It’s grammatical, if you think about what we’re trying to say. It’s not think the same, it’s think different. Think a little different, think a lot different, think different. ‘Think differently’ wouldn’t hit the meaning for me.”
The point of Meisenzahl and Mokyr’s argument is that this sort of tweaking is essential to progress. James Watt invented the modern steam engine, doubling the efficiency of the engines that had come before. But when the tweakers took over, the efficiency of the steam engine swiftly quadrupled. Samuel Crompton was responsible for what Meisenzahl and Mokyr call “arguably the most productive invention” of the industrial revolution. But the key moment, in the history of the mule, came a few years later, when there was a strike of cotton workers. The mill owners were looking for a way to replace the workers with unskilled labor, and needed an automatic mule, which did not need to be controlled by the spinner. Who solved the problem? Not Crompton, an unambitious man who regretted only that public interest would not leave him to his seclusion, so that he might “earn undisturbed the fruits of his ingenuity and perseverance.” It was the tweaker’s tweaker, Richard Roberts, who saved the day, producing a prototype, in 1825, and then an even better solution in 1830. Before long, the number of spindles on a typical mule jumped from four hundred to a thousand. The visionary starts with a clean sheet of paper, and re-imagines the world. The tweaker inherits things as they are, and has to push and pull them toward some more nearly perfect solution. That is not a lesser task.
Jobs’s friend Larry Ellison, the founder of Oracle, had a private jet, and he designed its interior with a great deal of care. One day, Jobs decided that he wanted a private jet, too. He studied what Ellison had done. Then he set about to reproduce his friend’s design in its entirety—the same jet, the same reconfiguration, the same doors between the cabins. Actually, not in its entirety. Ellison’s jet “had a door between cabins with an open button and a close button,” Isaacson writes. “Jobs insisted that his have a single button that toggled. He didn’t like the polished stainless steel of the buttons, so he had them replaced with brushed metal ones.” Having hired Ellison’s designer, “pretty soon he was driving her crazy.” Of course he was. The great accomplishment of Jobs’s life is how effectively he put his idiosyncrasies—his petulance, his narcissism, and his rudeness—in the service of perfection. “I look at his airplane and mine,” Ellison says, “and everything he changed was better.”
The angriest Isaacson ever saw Steve Jobs was when the wave of Android phones appeared, running the operating system developed by Google. Jobs saw the Android handsets, with their touchscreens and their icons, as a copy of the iPhone. He decided to sue. As he tells Isaacson:

Our lawsuit is saying, “Google, you fucking ripped off the iPhone, wholesale ripped us off.” Grand theft. I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong. I’m going to destroy Android, because it’s a stolen product. I’m willing to go to thermonuclear war on this. They are scared to death, because they know they are guilty. Outside of Search, Google’s products—Android, Google Docs—are shit.
In the nineteen-eighties, Jobs reacted the same way when Microsoft came out with Windows. It used the same graphical user interface—icons and mouse—as the Macintosh. Jobs was outraged and summoned Gates from Seattle to Apple’s Silicon Valley headquarters. “They met in Jobs’s conference room, where Gates found himself surrounded by ten Apple employees who were eager to watch their boss assail him,” Isaacson writes. “Jobs didn’t disappoint his troops. ‘You’re ripping us off!’ he shouted. ‘I trusted you, and now you’re stealing from us!’ ”
Gates looked back at Jobs calmly. Everyone knew where the windows and the icons came from. “Well, Steve,” Gates responded. “I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.”
Jobs was someone who took other people’s ideas and changed them. But he did not like it when the same thing was done to him. In his mind, what he did was special. Jobs persuaded the head of Pepsi-Cola, John Sculley, to join Apple as C.E.O., in 1983, by asking him, “Do you want to spend the rest of your life selling sugared water, or do you want a chance to change the world?” When Jobs approached Isaacson to write his biography, Isaacson first thought (“half jokingly”) that Jobs had noticed that his two previous books were on Benjamin Franklin and Albert Einstein, and that he “saw himself as the natural successor in that sequence.” The architecture of Apple software was always closed. Jobs did not want the iPhone and the iPod and the iPad to be opened up and fiddled with, because in his eyes they were perfect. The greatest tweaker of his generation did not care to be tweaked.
Perhaps this is why Bill Gates—of all Jobs’s contemporaries—gave him fits. Gates resisted the romance of perfectionism. Time and again, Isaacson asks Jobs about Gates, and Jobs cannot resist the gratuitous dig. “Bill is basically unimaginative,” Jobs tells Isaacson, “and has never invented anything, which I think is why he’s more comfortable now in philanthropy than technology. He just shamelessly ripped off other people’s ideas.”
After close to six hundred pages, the reader will recognize this as vintage Jobs: equal parts insightful, vicious, and delusional. It’s true that Gates is now more interested in trying to eradicate malaria than in overseeing the next iteration of Word. But this is not evidence of a lack of imagination. Philanthropy on the scale that Gates practices it represents imagination at its grandest. In contrast, Jobs’s vision, brilliant and perfect as it was, was narrow. He was a tweaker to the last, endlessly refining the same territory he had claimed as a young man.
As his life wound down, and cancer claimed his body, his great passion was designing Apple’s new, three-million-square-foot headquarters, in Cupertino. Jobs threw himself into the details. “Over and over he would come up with new concepts, sometimes entirely new shapes, and make them restart and provide more alternatives,” Isaacson writes. He was obsessed with glass, expanding on what he learned from the big panes in the Apple retail stores. “There would not be a straight piece of glass in the building,” Isaacson writes. “All would be curved and seamlessly joined. . . . The planned center courtyard was eight hundred feet across (more than three typical city blocks, or almost the length of three football fields), and he showed it to me with overlays indicating how it could surround St. Peter’s Square in Rome.” The architects wanted the windows to open. Jobs said no. He “had never liked the idea of people being able to open things. ‘That would just allow people to screw things up.’ ” 
Jacques Derrida: Decomposing Philosophy

Jacques Derrida was one of the best-known twentieth-century philosophers. He was also one of the most prolific. Distancing himself from the various philosophical movements and traditions that preceded him on the French intellectual scene (phenomenology, existentialism, and structuralism), he developed a strategy called “deconstruction” in the mid-1960s. Although not purely negative, deconstruction is primarily concerned with something tantamount to a critique of the Western philosophical tradition. Deconstruction is generally presented via an analysis of specific texts. It seeks to expose, and then to subvert, the various binary oppositions that undergird our dominant ways of thinking—presence/absence, speech/writing, and so forth.
Deconstruction has at least two aspects: literary and philosophical. The literary aspect concerns textual interpretation, where invention is essential to finding hidden alternative meanings in the text. The philosophical aspect concerns the main target of deconstruction: the “metaphysics of presence,” or simply metaphysics. Starting from a Heideggerian point of view, Derrida argues that metaphysics affects the whole of philosophy from Plato onwards. Metaphysics creates dualistic oppositions and installs a hierarchy that unfortunately privileges one term of each dichotomy (presence before absence, speech before writing, and so on).
The deconstructive strategy is to unmask these deeply sedimented ways of thinking, and it operates on them through two steps: reversing dichotomies and attempting to corrupt the dichotomies themselves. The strategy also aims to show that there are undecidables, that is, things that cannot conform to either side of a dichotomy or opposition. Undecidability returns in the later period of Derrida’s reflection, when it is applied to reveal paradoxes involved in notions such as gift giving or hospitality, whose conditions of possibility are at the same time their conditions of impossibility. Because of this, it is undecidable whether authentic giving or hospitality is possible or impossible.
In this period, the founder of deconstruction turns his attention to ethical themes. In particular, the theme of responsibility to the other (for example, God or a beloved person) leads Derrida to abandon the idea that responsibility is a matter of behavior publicly and rationally justifiable by general principles. Reflecting upon tales from the Jewish tradition, he highlights the absolute singularity of responsibility to the other.
Deconstruction has had an enormous influence in psychology, literary theory, cultural studies, linguistics, feminism, sociology, and anthropology. Given that it is poised in the interstices between philosophy and non-philosophy (or philosophy and literature), it is not difficult to see why this is the case. What follows in this article, however, is an attempt to bring out the philosophical significance of Derrida’s thought.

In 1930, Derrida was born into a Jewish family in Algiers. He was also born into an environment of some discrimination: he either withdrew from, or was forced out of, at least two schools during his childhood simply on account of being Jewish. He was expelled from one school because there was a seven-percent limit on the Jewish population, and he later withdrew from another school on account of its anti-Semitism. While Derrida would resist any reductive understanding of his work based upon his biographical life, it could be argued that these kinds of experiences played a large role in his insistence upon the importance of the marginal, and of the other, in his later thought.
Derrida was twice refused admission to the prestigious École Normale Supérieure (where Sartre, Simone de Beauvoir, and the majority of French intellectuals and academics began their careers), but he was eventually accepted to the institution at the age of 19. He then moved from Algiers to France, and soon after began to play a major role in the leftist journal Tel Quel. Derrida’s initial work in philosophy was largely phenomenological, and his early training as a philosopher was done largely through the lens of Husserl. Other important inspirations on his early thought include Nietzsche, Heidegger, Saussure, Levinas, and Freud. Derrida acknowledged his indebtedness to all of these thinkers in the development of his approach to texts, which has come to be known as ‘deconstruction’.
It was in 1967 that Derrida really arrived as a philosopher of world importance, publishing three momentous texts: Of Grammatology, Writing and Difference, and Speech and Phenomena. All of these works have been influential for different reasons, but it is Of Grammatology that remains his most famous. In it, Derrida reveals and then undermines the speech-writing opposition that he argues has been such an influential factor in Western thought. His preoccupation with language in this text is typical of much of his early work, and since the publication of these and other major texts (including Dissemination, Glas, The Postcard, Spectres of Marx, The Gift of Death, and Politics of Friendship), deconstruction has gradually moved from occupying a major role in continental Europe to becoming a significant player in the Anglo-American philosophical context as well. This is particularly so in the areas of literary criticism and cultural studies, where deconstruction’s method of textual analysis has inspired theorists like Paul de Man. Derrida also held lecturing positions at various universities the world over. He died in 2004.
Deconstruction has frequently been the subject of controversy. When Derrida was awarded an honorary doctorate at Cambridge in 1992, there were howls of protest from many ‘analytic’ philosophers. Derrida also had extended exchanges with philosophers like John Searle (see Limited Inc.), in which deconstruction was roundly criticised, although perhaps unfairly at times. However, what is clear from the antipathy of such thinkers is that deconstruction challenges traditional philosophy in several important ways, and the remainder of this article will highlight why this is so.