Matthew T Grant

Tall Guy. Glasses.

The Irony of Authenticity and the Authenticity of Irony

Seems like nowadays, authentic is the thing to be.

Mitch Joel calls authenticity “the cost of admission” in the Web 2.0 world, though he warns: “Being authentic isn’t always good. Let me correct that, being authentic is always good, but the output of being authentic [i.e., revealing your flaws, shortcomings, and ‘warts’ – Matt] is sometimes pretty ugly.”

HubSpot TV called the “marketing takeaway” of a notorious scandal involving a company paying for positive online reviews: “Be authentic. If not, you will get caught.”

When CC Chapman was among the Twitterati recently profiled by the Boston Globe, one of his Facebook friends asked, “Ever wondered why you have such a following?” He responded, “I wonder it all the time actually. I asked once and the general theme in the answers was my honest approach between life, family and work when it came to sharing things.” To which another friend replied, “Exactly right CC. You don’t try to be someone you’re not. It’s that authenticity that attracts people.”

Among the first to identify this flight to authenticity were James H. Gilmore & B. Joseph Pine II, who wrote Authenticity: What Consumers Really Want (2007). What notably separates them from contemporary partisans of authenticity is that their take is tinged with irony, an irony most evident in their promise to define “how companies can render their offerings as ‘really real.’”

This irony is refreshing because invocations of authenticity regularly fail to acknowledge or appreciate what is inherently contradictory about the concept. Said failure begins with the mistaken equation of authenticity and honesty (see above). Honesty may be a characteristic of an individual, but it is not a defining characteristic of authenticity. For example, an authentically honest person is being “authentic” when she is being honest, but an authentically devious person is being just as authentic when he is lying.

Similarly, we don’t call a painting an “authentic Rembrandt” because it is honest; we call it authentic because it was really painted by Rembrandt, unlike the forgery which only looks like it was painted by him. In other words, we call it authentic because it is what it seems to be. Herein lies the essential contradiction of authenticity: Authenticity isn’t about being real; authenticity is about really being what you seem to be.

The centrality of “seeming” to authenticity becomes even clearer when we call a person “authentic.” Such a designation usually means, “the way this person acts transparently or guilelessly reflects who they really are.” Because our sense of their authenticity depends on an assessment of a person’s behavior, we should pay special attention to the fact that authenticity is performed; as paradoxical as it may sound, authenticity is an “act,” in the theatrical sense. (Which is why I always say, “Be yourself. It’s the perfect disguise.”)

The bigger problem, though, is that our notion of authenticity assumes we really know who someone is, and likewise the imperative to “be authentic” assumes we know who we really are.

Our identity, “who we really are,” is always contingent, provisional, and changing. It is an amalgam of who we want to be, who we mean to be, who we’re supposed to be, who we have to be, and who we are in spite of ourselves. Moreover, no matter how much we’d like to think so, we are not the authority on who we really are, since it includes much that cannot be known by us. Indeed, and again paradoxically, we can’t know anything about ourselves without assuming the perspective of another, that is, by identifying with someone else and precisely NOT being ourselves.

Just as one must consult an expert to determine the authenticity of a treasured heirloom – it can’t speak for itself – we can’t call ourselves “authentic”; that is for others to decide. At best, and this is the irony, we can only ever strive to “seem” authentic. True authenticity calls for acknowledging that “who you are” is an open question and, moreover, a collaborative work in progress.

In the end, we must distance ourselves from our claims or pretensions to authenticity. We must call it into question and even suggest, especially to ourselves, that it may just be an ego-driven pose. (Hey, it just may be!) This distancing, implicitly critical and potentially mocking (or at least deprecating), is the classic stance of irony. And though the dodginess of irony (“did he mean that or didn’t he?”) seems to put it at a distinct remove from authenticity (“this is exactly what I think”), it actually mirrors the open-ended, unresolved, and ever-changing “dodginess” of reality itself.

Which is to say that irony, as a posture, an attitude, and an approach, is more authentic (in the sense of “really being the way reality seems to be”) than honesty, sincerity, openness, or any of the other qualities that pass for such. The tragedy (or irony) is, however, that it will always seem less than authentic due to the all-too-human suspicion of ambiguity, indeterminacy, uncertainty, and, lest we forget, the wily intelligence native to irony and the ironist.

Image Courtesy of Mary Hockenbery.

More Thoughts on Design Thinking

Pull a thread on the Web and it unravels the universe. Having accidentally stumbled across the concept of “design thinking,” I found that there was a whole, thriving discourse on the subject. Who knew? I wrote a brief series of posts on my discoveries. This was the third and was originally published on March 14, 2007.

I’m a latecomer and a slow learner.

My thoughts on design thinking began as a reaction to something written by Dan Saffer of Adaptive Path. Little did I know, as I was penning my post entitled “Thinking about ‘Design Thinking,’” that the self-same Dan Saffer had written a post with the exact same title almost exactly two years ago! That article includes a helpful stab at defining the characteristics of design thinking, “if there is such a thing,” as he wrote way back then.

One characteristic is “Ideation and Prototyping” – “The way we find … solutions is through brainstorming and then, importantly, building models to test the solutions out.” Actually making things to see if they work or solve the problem at hand is key to designing anything – hence his lament as he sees design schools move to an overly conceptual notion of design thinking, one that neglects craft and making and, ultimately, produces designers who can’t.

Oddly enough, I found Saffer’s earlier post in a rather roundabout fashion. The first event in this twisted chain came in the form of an email from David Armano, whom I had name-checked in my previous post. He pointed me to a post on his blog concerning the evolution of creativity in a decidedly inter-disciplinary and multi-dimensional direction. As an example of someone who embodies this emergent creativity, Armano referred to the site of one Zachary Jean Paradis, who graduated from the Institute of Design at the Illinois Institute of Technology.

What did I find on Mr. Paradis’ blog? You guessed it, a long, thoughtful essay on none other than “Design Thinking.” In fact, it was via this essay that I “discovered” Mr. Saffer’s earlier thoughts and my own intellectual tardiness.

Before I leave the topic of design thinking and return once again to more familiar ground, like Second Life, I will mention what I found most illuminating about Paradis’ perspective. First, he conceives of design thinking as an approach to “developing new offerings” which should not, to Mr. Saffer’s point, be equated with “professional design as it is taught.”

Secondly, because this approach is “purposeful,” he sees it as inherently integrative. He writes, “When developing some new offering with a team, members share the common goal of producing something contextually relevant.” The complexity of product/offering development, and the fact that the process must result in something that works in the world and meets definable needs of end-users/consumers, imposes the dual need for multiple disciplinary perspectives and their successful integration.

Finally, and as he says, “most importantly,” design thinking provides guidelines for collaborative work rather than prescribing a specific process for executing it. This kind of collaboration requires individuals who possess “a certain breadth and depth of knowledge of complementary disciplines,” precisely the new kind of “Creative” David Armano describes on his blog. Paradis ends his essay by insisting that, “… organizations must begin to recognize that moderately deep breadth is as important if not more so than deep specialization in addressing complex problems.”

To bring things more or less full circle, I think it bears stating that only by doing work on a series of increasingly complex and diverse projects, and not through schooling of any sort, can one acquire this “moderately deep breadth.”

Image Courtesy of dbostrom.

Design Thinking and the Serendipitous Web

This was the second of a brief series of posts that I wrote on the subject of design thinking. It was originally published on March 9, 2007.

I had never really thought about “design thinking” until I read the blog post at Adaptive Path that led me to write my last post. The funny thing is that as I started to research the concept, I noticed that earlier that same day I had bookmarked, obviously without much thought, a blog called Design Thinking Digest, which is maintained by Chris Bernard, Microsoft User Experience Evangelist, and to which I was introduced via this post on David Armano’s blog.

As if it weren’t strange enough that the mighty and mysterious Web would bombard my subconscious with secret messages about “design thinking” so as to get me to write about it, today Bernard is blogging about the design approach of BMW’s Chris Bangle and, guess what? Mr. Bernard is very taken with the fact that when designing cars, Bangle focuses on “the doing.” He writes, “His teams get outside to look at the car, they craft and sculpt designs with their hands. They are constantly on the lookout for new ways that they can make things, they spend as much time thinking about not the actual creation but the TOOLS they use to create with too.”

That is, a critical component of true “design thinking” as practiced by a successful designer like Bangle and admired by an evangelizing software designer like Bernard is “doing” – getting your hands dirty, working with tools, making things. But that was, like, exactly the point I was “making” in my initial post on “design thinking”!!!

Is the Web reading my mind?

More frighteningly, is the Web writing my mind?

Thinking about ‘Design Thinking’

An article by Dan Saffer at Adaptive Path got me thinking about design thinking, which led to a series of posts on the subject. This post was first published on March 7, 2007.

I subscribe to the feed from Adaptive Path’s blog because, as they say here in Boston, the people who work there are “wicked smaht.” As a result, and thanks to the magic of RSS feeds, I spotted this impassioned plea from one of the Adaptive Pathers, Dan Saffer, for design schools to start teaching design again.

Saffer’s main complaint is that design schools have moved towards a curriculum centered around “design thinking” and away from a well-rounded, practical education focused on “thinking and making and doing.” In his view, the real work of design consists in the process of moving from concept to realization; stopping at the idea stage means you’ve only done the easy part. He writes, “Some notes on a whiteboard and a pretty concept movie or storyboard pales in comparison to the messy world of prototyping, development, and manufacturing,” and then puts a finer point on it by adding, “It’s harder to execute an idea than to have one…”

Having encountered this lament in one form or another many times – “No one understands good typography anymore”; “People try to design when they can’t even draw”; “They think the computer’s going to do it all for them”; etc. – that aspect of his argument wasn’t new. Rather, what drew my attention was the phrase “design thinking” and his characterization of it as “just thinking.”

Since I was pretty sure that it meant more than that, I did a little research and found a Business Week article from last October called “The Talent Hunt,” which describes Mozilla turning to the folks at Stanford’s Hasso Plattner Institute of Design (aka, the “D-School”) in search of a strategy for expanding the adoption of Firefox. In light of Saffer’s comments, I was struck by the following sentences: “Business school students would have developed a single new product to sell. The D-schoolers aimed at creating a prototype with possible features that might appeal to consumers.” Likewise, in a lecture at MIT entitled “Innovation Through Design Thinking,” IDEO’s Tim Brown talks about how the process they follow often involves “a hundred prototypes created quickly, both to test the design and to create stakeholders in the process.”

As I understand it, the “thought leaders” behind “design thinking” (you can find a good overview of them and their thoughts here on Luke Wroblewski’s site) advocate the application of design methods to problems of business strategy precisely because it places a heavy emphasis on prototyping and real-world pragmatics. If Saffer is correct that “design thinking” as taught in design schools is primarily about thinking, and not about making things and seeing if they work, then I would say the real problem is that they are not actually teaching “design thinking.”

But then again, I never attended design school. If you have, do you think that Saffer’s criticism rings true?

Image Courtesy of dsevilla.

Information Design and Visual Complexity

Investigations into the use of ActionScript 3.0 and dynamically generated representations of data sets led me to write this post on visual complexity, first published on December 1, 2006.

My research on Amaznode (see this post) reminded me that there are a lot of folks out there working on innovative and practical ways to display complex sets of data and networks of information.

While I thought I was so special for stumbling across Amaznode at Adobe Labs, I soon discovered that someone had actually referenced it back in September in a comment on this post from David Armano’s blog. In the aforementioned post, Armano praises the Visual Thesaurus, which depicts relationships between words in the same way Amaznode depicts relationships between products on Amazon (except the Visual Thesaurus actually allows for much deeper exploration of the related words it displays).

The Visual Thesaurus is just one example of information design that shows up on the site Visual Complexity, the stated intention of which is “to be a unified resource space for anyone interested in the visualization of complex networks.” This site is endlessly fascinating, both because of the ingenious (and sometimes oddly beautiful) ways that people have devised to portray complex, densely interrelated systems and because of the range of data, be it business-related (for example, what patterns might we discern by examining 10 million receipts from a large DIY store?) or just strange, that they have chosen to model.

As the serendipity of blogging and intellectual interest would have it, a colleague of mine brought The Baby Name Wizard’s NameVoyager to my attention yesterday. The NameVoyager shows the waxing and waning fortunes of baby names from the 1880s to the present. (It allows you to see, for example, that the name “Chester” peaked in popularity around 1910.) The creator of the NameVoyager is Martin Wattenberg, who has come up with a number of methods for graphing complex processes, such as the editing history of Wikipedia pages, among other things. In fact, he has been so inventive that several samples of his work show up on the Visual Complexity website, which I didn’t even know existed two days ago!
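(For the technically curious, a chart in the NameVoyager’s spirit is easy to sketch for yourself. The snippet below is a minimal, hypothetical Python example: it assumes you have a CSV of yearly counts per name, along the lines of the public Social Security baby-name data, and it has nothing to do with Wattenberg’s actual implementation.)

# Hypothetical sketch: plot one name's popularity over time.
# Assumes a file "baby_names.csv" with columns: year, name, count.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("baby_names.csv")
chester = df[df["name"] == "Chester"].sort_values("year")

plt.plot(chester["year"], chester["count"])
plt.xlabel("Year")
plt.ylabel("Babies named Chester")
plt.title("Popularity of the name Chester (peaks around 1910)")
plt.show()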

Image Courtesy of michael.heiss.

Product Placement in the Real World

Another blast from the past, originally published on Aquent’s Talent Blog, December 11, 2006. Summary: Advertisers must consider “all the world’s a stage” and manufacture ubiquitous product placement.

“As a result of the growing popularity of consumer-generated pictures, videos and e-mail messages on Internet sites like YouTube and Myspace, advertisers are getting consumers to essentially do their jobs for them,” according to this New York Times article, which focuses on the emergence of Times Square as “a publishing platform.”

In brief, thanks to the ubiquity of digital cameras and the rise of user-generated and social networking sites, marketers are finding that “experiential marketing” (aka, “publicity stunts”), such as Charmin’s fancy public restrooms, is growing long legs on the Web. These restrooms, “[u]sed by thousands in Times Square, [were] viewed by 7,400 Web users on one site alone.”

While this raises a lot of interesting questions about the meaning of “product placement” and whether or not advertisers should start courting, and compensating, particularly popular or prolific private citizens for featuring their products on Flickr and YouTube, I was particularly struck by the formulation “getting consumers to essentially do their jobs for them.” Now it is certainly the case that YouTubers and Flickr-ers are, wittingly or un-, doing things that benefit advertisers and the brands they promote. But so is anyone wearing a t-shirt with a visible logo.

It is not the job of advertisers to wander around the city in sandwich boards; it is their job, however, to come up with novel ways of getting brand-specific messages out to the world. If they create a spectacle noteworthy enough to generate spontaneous buzz promoted by random individuals, then they have done exactly what they are supposed to do. In fact, by now, I’d be astonished if the folks who conceived of and executed these events weren’t planning on a significant “web” effect. In a sense, if no one had posted this stuff to the Web, then you could rightly accuse advertisers of shirking.

Or do I, and not the paper of record, misunderstand what advertisers are supposed to do?

Image Courtesy of funadium.

Credentials, Connections, and Authority

For several years, I wrote an “Ask the Expert” career-advice column for the American Marketing Association. This post grew out of that. It was originally published on Aquent’s Talent Blog on February 27, 2009. – Matt

I was working on my latest Ask the Expert column for the AMA and was surprised by the number of questions I got concerning certification specifically (“Can you recommend a marketable web certification?” “How do I become a Professional Certified Marketer?”) or credentials more broadly (“Is there a possibility a company won’t hire me (even if I have a Masters Degree) just because they do not think the University I went to meets the ‘top-of-the-line’ criteria?”).

I understand that people are looking for something to give them an edge in a highly competitive market and that they may have time to devote to education and personal development, but the value of certification per se seems dubious to me, particularly in the interactive/marketing space.

While there are some certifications that I’m told are meaningful – Google Adwords Certification and Project Management Professional Certification being two examples – my basic assumption is that they are at best a useful addition to a record of proven experience as a practitioner in a particular discipline.

My thoughts on this subject were paraphrased by Dave Atkins on Twitter yesterday when he wrote, “Connections are more indicative of authority than credentials.” Some pointed out that connections may be more indicative of personality than authority, and I can see that, but the more important point to me was that authority does not come from credentials. Authority reflects a respected position within a community, a position generally earned by demonstrated ability and measured by influence.

In other words, if you want to be a more attractive candidate for a marketing or interactive position, focus on establishing authority by earning the respect and recognition of your peers. Your authority and experience make your credentials meaningful, not the other way around.

Image Courtesy of jurvetson.

Just a Moment

Went to see a jazz trio called “Fly” last night: Mark Turner (saxophone), Larry Grenadier (bass), Jeff Ballard (drums). Their performance reminded me how much I love improvised music played by intuitive and gifted people who know how to spontaneously combine harmonic complexity and dynamic subtlety with a searching and startling lyricism.

Just as we’re taught that a line contains an infinite series of points, music, for its part, shows us the infinite divisibility of time. The limits of this division are set, on the one hand, by the frequency of tonal or rhythmic variation attainable by the musicians and, on the other, by the patience, attentiveness, and perceptual acuity of the audience.

Events apparently never exhaust the between of instants, which always allows for ever more vanishingly brief happenings. By contrast, a moment is not a measure of time, but a state of consciousness. Music, like the music I heard that night, ebbs and crashes around this moment of awareness, causing us to ask not how soon is now, but how long?

Image Courtesy of overdrive_cz.

Is 4-D the New 3-D? Thinking about Photosynth

One thing that irks me about the 3-D world is that it’s hard to find things in it. I’ve often found myself looking for my keys or a book or a CD and wished that I could just open up a search box, type in the object of my fruitless and frustrating search, and instantly locate the darn thing. The fact that 3-D spaces can be difficult to search visually is one thing that stands in the way of the 3-D desktop metaphor, IMHO.

Then I remembered Photosynth, a piece of software that allows you to make 3-D models of places from 2-D images which, thanks to the magic of tagging, come replete with a conveniently searchable 4th dimension (raising the question: Is information, and not time, the 4th dimension?).

I first wrote about Photosynth on Aquent’s Talent Blog in 2007. Here’s the original post:

Visual Information, Design, and the Future

A friend of mine passed this link along to me. It is a video of a software demo at the TED Conference back in March. The speaker is Blaise Aguera y Arcas who was demoing two software packages – Seadragon, which is used to browse large amounts of visual data, and Photosynth, which organizes pictures into navigable, 3-D spaces.

This stuff really has to be seen to be believed. It represents the future of how we will interact with visual data and also highlights that we are already creating virtual models of the world we live in by uploading content to websites like Flickr. There is also a cool example of an explorable, high resolution advertisement for Honda. Imagine if a picture in a magazine contained the richness of data you could find on an entire website. Mind-boggling.

Microsoft acquired Seadragon back in February. Aguera y Arcas makes a funny comment about that when people start clapping at the amazing things he’s showing them. Have you ever attended a software demo where people burst into spontaneous applause?

Image Courtesy of Live Labs.

The Consolations of Conspiracy Theory

Ever since I realized that there was an “official story,” on the one hand, and a very complicated, to some degree unknowable, and to some degree intentionally obscured, reality on the other, I’ve been interested in conspiracy theories. From Holy Blood, Holy Grail to Loose Change, from occult Nazism to the reign of the reptoids, I’ve consistently been amused, amazed, and disturbed by the fantastic proliferation of alternative world histories and astonishing speculation about who’s really running things.

I’ve always tended to approach these theories with a Mulderian “I want to believe” attitude, but have also always been disappointed when I dug down into the details. While it may be true that I’ve never met a conspiracy theory I didn’t like, it’s also true that I’ve never met a conspiracy theory that wasn’t riddled with holes, hallucinations, and brain-rending leaps of (il)logic. Reading this stuff has frequently been edifying and even, in a strange way, inspiring, but it has never been convincing.

Although the truth is undoubtedly out there, conspiracy theories are not about the truth. Their primary purpose is to forge a semblance of order from the relentless rush and incomprehensible sweep of events on both the human and cosmic scale. Scientific discovery has unveiled a universe of overwhelming temporal and spatial vastness, the mass media continually inundate us with an unassimilable torrent of devastating reports from the bottomless well of human suffering, and the traditional (i.e., religious or mythical) filters no longer have the power to channel our experience into comforting or even remotely manageable frames of reference.

Still, the thought that our lives are “meaningless commas in the sentence of time,” or that we are blindly stumbling, ‘neath a protective veil of self-deception, through a labyrinthine vortex of genetically driven ego-trips and Nietzschean/Orwellian power-games devoid of exit or purpose, is for most of us unbearable. So we clutch at the straws offered by the conspiracy theorists (not to mention the good old news) because they tell us that, even if it is the Greys or the Illuminati or the CIA or the World Bank or the Council on Foreign Relations or the Trilateral Commission, or whatever, at least SOMEONE is in charge and everything is at least going according to SOME plan, as nefarious, diabolical, or alien as that plan may be.

The question is: Why are we consoled by the thought of SOMETHING, even SOMETHING MALEVOLENT, behind EVERYTHING?

Why, on the contrary, is the notion that all we are and experience arises from and inevitably returns to a primordial, entropic chaos – in other words, to nothingness – so difficult, even impossible, to accept let alone embrace?

Or is it?

Image Courtesy of Midnight-digital.