The tendency to deconstruct business functions into ever finer units of specialization – what I call business reductionism – threatens the kind of coordinated action required to execute on a good business strategy and must be resisted. Marketing in particular seems to have become susceptible to this sort of reductionism, brought on by the introduction of new marketing technologies, and hope for reconstitution rests on getting back to first principles.
Archive for the ‘Social Technology’ Category
Posted in Digital, Innovation, Marketing, Social Technology, tagged ad tech, branding, digital, digital marketing, marketing, social marketing, social media, technology on June 13, 2014
In the aftermath of the financial crisis of late 2008 and the resulting global recession, defenders of the US financial system maintained that it was a source of US competitive advantage. Our capital markets facilitate the exchange of money and risk and thereby not only help maximize productivity but also attract businesses to our economy.
With the exception of some of the more exotic financial structures, I tend to agree. Just as the shift from barter to physical currency enabled new economic prosperity, more sophisticated financial instruments and capital markets (properly regulated) benefit individuals, businesses and entire economies.
The same could be true of social capital. Today people already exchange favors, debts of gratitude and obligations of reciprocity on a daily basis. Indeed, reciprocity seems to be deeply ingrained in the social nature of humanity. Dunbar’s number could be explained as the result of the cognitive limits on mental accounts in social groups.
What if companies could make all these invisible exchanges more visible? How might collaboration and innovation benefit? What would a system of social currency look like? How would it affect rewards and incentives?
Such a system is both possible and relatively easy to implement using a combination of gamification concepts and social technologies. Here is a hack that I have been mulling over recently.
Every year, allocate to employees a set number of social currency points – say 200 per employee. They can choose to hold onto the points, or they can award points to colleagues as thanks for helping out – say, 1 point for a discrete favor or for providing some much needed insight as a subject matter expert, 5 points for consistently being a team player who goes above and beyond, 10 points for saving the day and making the difference on an important project. (Publishing very basic guidelines will help.)
Require some information on why the points are being awarded, and make that information public. This will enforce some discipline and avoid frivolous exchanges. Making the information public also reinforces the inherent value of the social currency.
For 1 point, the information required can just be an option in a drop down; for 5 points require an additional one or two sentences – for 10 or more, maybe a short blurb. Include the date. As you collect information and track awards, you build a data set on social interactions that can be analyzed later.
At the end of the year, everyone is entered into a raffle for prizes with the number of entries per person being somehow proportional to the points that the entrants have accrued by the year’s end. There can be multiple winners, and smaller prizes are probably better than really large ones. The goal is to make the whole process fun and slightly augment the value of the currency without distorting the normal social incentives so much that employees start gaming the system just to win.
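The mechanics described above are simple enough to sketch in code. Here is a minimal, hypothetical Python illustration – the class and method names are my own invention, but the 200-point allowance, the award tiers, the required public reason, and the points-proportional raffle all come from the scheme above:

```python
import random
from collections import defaultdict

# Award tiers from the guidelines above (hypothetical labels)
TIERS = {1: "discrete favor", 5: "consistent team player", 10: "saved the day"}

class SocialCurrency:
    def __init__(self, employees, annual_allowance=200):
        self.balance = {e: annual_allowance for e in employees}  # points left to give
        self.accrued = defaultdict(int)                          # points received
        self.ledger = []                                         # the public record

    def award(self, giver, receiver, points, reason):
        """Transfer points; a public reason is required at every tier."""
        if points not in TIERS:
            raise ValueError("awards must be a 1-, 5-, or 10-point tier")
        if not reason:
            raise ValueError("a public reason is required")
        if self.balance[giver] < points:
            raise ValueError("not enough points left to give")
        self.balance[giver] -= points
        self.accrued[receiver] += points
        self.ledger.append((giver, receiver, points, reason))

    def raffle(self, winners=3):
        """Year-end drawing: one entry per accrued point, multiple winners."""
        entries = [e for e, pts in self.accrued.items() for _ in range(pts)]
        drawn = []
        while entries and len(drawn) < winners:
            w = random.choice(entries)
            drawn.append(w)
            entries = [e for e in entries if e != w]  # each person wins at most once
        return drawn
```

A real implementation would live inside a collaboration platform with authentication and a public activity feed; the point of the sketch is that the core bookkeeping – and the ledger that becomes your social dataset – is trivial to build.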
At the end of each year you can see who are the experts and who are the team players. You can map out interactions and social networks and begin managing people with a new set of metrics.
We are only just beginning to understand the potential gains from harnessing social data. The Enron corpus has proven to be an invaluable trove of data for analysis. Imagine adding datasets from the exchange of social currency, and one day maybe even data from gadgets like those already gaining popularity in the quantified-self movement. All sorts of new, more rewarding and more productive organizational structures and management practices could be possible.
A conversation at work recently got me thinking a lot about the value of openness. The irony of that statement should become clear by the end of this post.
Since open innovation entered business parlance, one of the unsolved challenges has been intellectual property rights. In a more open world, who owns the rights to an idea? How can we more efficiently share rights and returns on our ideas?
Lawrence Lessig has written eloquently on the subject in books such as Free Culture: The Nature and Future of Creativity, and solutions such as Creative Commons have emerged to address the issue. Still, this problem of who has the rightful claim to an idea seems to persist.
Undoubtedly, in a capitalist system, great thinkers need to be assured of some return on their ideas to maintain the incentive for sharing their great thinking. On the other hand, “no man is an island.” The myth of the sole inventor is just that – a myth. The instances of simultaneous innovation are numerous and well documented, despite the predilection of popular consciousness to selectively remember only inventors such as Edison over Tesla.
The true origins of an idea are diffuse and imprecise. The “aha!” moment is as much a fallacy as the myth of the sole inventor. Insight comes from a pattern of multiple connections in more than one very real sense.
In neural activity, it is characterized by the gamma-waves commonly attributed to the formation of new neural connections in the brain. The brain holds all sorts of disparate ideas in memory, and the “aha!” moment of insight and innovation is the experience of finally connecting those ideas together in some novel way. (For a more accessible discussion of the neuroscience than the Beeman and Kounios article, I recommend a quick read of Imagine by Jonah Lehrer.)
Steven Johnson was spot on when he said, “Chance favors the connected mind.” Of course, Johnson was also referring to social connections, which is the other very real sense in which insight comes from connections. Social interactions expose us to ideas, the same ideas that are held in the memory of our brain. There they sit, lying in wait for a pattern to emerge when one idea connects to another and still another – what Johnson terms “the slow hunch.”
I think about innovation as a chemical reaction. A chemical reaction isn’t something you do; rather it’s more like something that happens. You can manipulate the conditions for a chemical reaction, by adding catalysts or energy in the form of heat or motion, but you cannot will it into being. A chemical reaction occurs when the molecules and elements collide, breaking old bonds and forming new ones. The more collisions are created, the faster the chemical reaction.
Innovation is the serendipitous collision of ideas. Those ideas originate from different places and at different times – intense brainstorming sessions at work, a good read on a long flight, a relaxing stroll on the beach. If one of those ideas came from a conversation with a coworker, does my employer have claim to my insight? (Now do you see the irony?)
If we want to innovate, it seems counter-intuitive that we would also want to reduce the number of collisions by talking only among ourselves in soft tones. The Internet has been such a fabulous engine of ingenuity because it is such a transparent and highly visible medium.
I suppose some compounds are more reactive than others and don’t need to be spurred on; some problems can be solved in the walled off gardens of R&D labs or stealth start-ups. I’m not convinced that’s the case with the really big game changers though.
AnnaLee Saxenian wrote a very interesting book on why Silicon Valley is in California today instead of along Route 128. Her argument that the openness of Silicon Valley (e.g. non-proprietary standards, decentralized organization, and cooperative exchange) was its advantage is very compelling.
Look at the example of the Homebrew Computer Club. Apple wasn’t invented in secret; the technology that Wozniak and Jobs used to revolutionize the computer industry was shared freely among like-minded hobbyists. Indeed, if you read Walter Isaacson’s Steve Jobs, it seems much of what contributed to the early success of Apple did not actually originate within Apple but at Xerox PARC.
Apple out executed. That’s why being the most innovative doesn’t actually matter. It’s getting the business model right that wins. If you can’t do that, it doesn’t matter how long you hide your light under a rock. When you finally do start marketing to customers, another fast follower is going to eat your lunch.
Let me reiterate: I am not denying the risk that someone else could be more successful with your idea than you. I am questioning whether that is a bad thing – exposing people to that risk and letting the best person win. Wouldn’t that make for a more efficient market? It’s the only way you are going to get out of the building and be sure you are onto something. It’s the only way you are going to avoid groupthink and positive illusion about your own ideas.
Stop wasting time and energy trying to keep your ideas from others. Focus instead on unlocking the value from those ideas – before someone else does.
Just yesterday I had a very brief Twitter exchange with Grant McCracken that centered on an HBR article he had written. Grant was gracious to even acknowledge and reply to my tweet since I don’t really have the credentials to criticize a published author, respected academic and experienced consultant. (Please excuse the use of the more familiar first name; it represents no claim to true familiarity.) I suppose all is fair in love, war, and the pursuit of knowledge.
Today I felt compelled to develop my argument a bit further than 140 characters, and hence this post. Keep in mind there may only be a distinction without a difference between Grant and me; I am coming at this issue as a music fan only, not with a more circumspect ethnographic/anthropological perspective or the years of experience as a music writer that Simon Reynolds has (the latter of which might actually be more of a burden in the context of a paradigm shift).
The question I think Grant and I are exploring is, “Is innovation in pop music really on the decline?” I feel the answer is no, as I suspect Grant does too, but my reasoning is different.
The Long Tail
Pop music is being displaced by indie music to such effect that my indie music snob friends have a name for it: mindie (mainstream + indie). The Big 4 are losing their grip on power. Some of the brightest stars in the recent music firmament have come from comparatively small labels. Top 40 has become formulaic. Sure, club bangers will still sell because they still have their place, but now much more other music has its place as well. In terms of innovation and what music does for culture and society, the long tail is the new pop.
By now most people are familiar enough with the concept of the long tail, first introduced by Chris Anderson, that it has entered the popular lexicon. While Anderson explored the economic implications, the long tail is a social phenomenon as well. Small, spatially or temporally separated communities are able to form around interests and connect in ways that were previously prohibitively costly or difficult. This is as true of furries as it is of eclectic tastes in music.
The means of music production and distribution have become accessible to the masses enabling much more experimentation. Not everything is going to be good but that’s how innovation works. You try something, fail, learn, try again, fail, try again . . . hence the Silicon Valley adage, “Hurry up and fail so that you can succeed sooner.” With more experimentation comes more variety and more choice for the music consumers, the fans.
When I was in high school and college, big time music fans and festival goers wore hemp necklaces and traded bootleg tapes in the tradition of The Dead, and punk rockers and rude boys copied song lyrics on the school photocopier. It was all too much effort for most people so radio ruled.
Now, discovering and sharing music is so much easier that everyone is participating in ways they never did before. People aren’t dependent on terrestrial radio to get turned on to new music. There has been a boom in the festival scene, and Bonnaroo is booking acts far removed from Phish and Widespread Panic (although Widespread brought the festival to a beautiful close this year, if I do say so myself). As Perry Farrell noted at the recent Spotify party, reaching 1 million people with your music is doable in ways it never was before.
I have to admit, I haven’t listened to terrestrial radio in a long time so I’m not sure what really does or does not get a lot of play, but I’m pretty sure Skrillex isn’t getting play time on your average pop radio station. Has his music been any less influential? Is it any less innovative? Are the crowds any smaller for it? He’s still on the cover of SPIN. Popular is being redefined, not by payola and the “powers that be” but by the fans who are connecting with the music and one another in totally new ways.
Take the juggalos as another example. Scary perhaps but no less an American subculture created from music. I can’t say I really understand it or if I think it is really a good or bad thing that it exists. It seems as though being a juggalo has given them a way to overcome the intense social isolation and resentment that inspired the tragedy at Columbine years ago, without having to act out the violent imagery of the scare-core rap they all enjoy. The point here is that trends are still finding expression in sound, style and sensibility, however alarming they may be.
Maybe Reynolds feels that there’s no value, no meaning, cultural significance or social purpose to these new forms of music; that it’s all vapid at best or deranged at worst. I haven’t had a chance to read his book, but if so, he’s just as out of touch with youth culture as MTV. I wasn’t around to hear it, but I would venture to speculate that the older generation was as dismissive of Woodstock hippies as Fox News has been of the Occupy Wall Street movement and others were of PLUR and the rave scene. This is a paradigm shift none of us are likely to really understand except in retrospect.
This post wouldn’t be complete without also defining the terms we’ve been using: pop and innovation. (I suppose if I were a better debater I would have defined my terms upfront, but copying and pasting it all to rearrange this post is so much effort.)
The Cool Kids Lunch Table
In some sense, this entire conversation makes no sense because we’re mixing concepts of pop and subculture. By definition, a subculture is something other than popular culture – pop – so it might be argued that innovation in pop music isn’t on the decline since there’s never been all that much innovation in pop music to begin with. The innovation happens on the edges of what’s popular, only later to be assumed into pop culture. Grant makes reference to punk culture but with the exceptions of Rancid and Green Day, was punk ever really pop?
That might be where the greatest divide between Grant and me lies. I deny that popular music is, or ever really was, “great lab bench for our culture.” I think the dissemination of cultural innovation happens in a way more similar to Malcolm Gladwell’s description in The Tipping Point. If we accept this premise, then my argument above can be synthesized and made more simply: music and subcultures are still doing what they always have for society, only now the effects are being amplified by social technologies.
We also must be careful not to confuse innovation and originality. Innovation is a new way of thinking that creates value. Innovation does not require originality however. It can simply be a new perspective on something old and familiar that makes it fresh, more relevant or simply more accessible. To quote Barbara Grizzuti Harrison, “There are no original ideas.” (The irony is I remembered first hearing a quote to that effect somewhere else, from someone much further back in history. I thought it was Descartes but with a quick internet search could find nothing attributable to him so went with what was more accessible. Great quotes can be like that.)
After the recent, tragic loss of probably the greatest innovator in my lifetime (and perhaps many lifetimes to come), I am reminded of the context in which the iPod was launched. It wasn’t the first MP3 player on the market, and it didn’t introduce a revolutionary new technology per se. Apple and Steve Jobs took what was already out there and combined it in a novel way that was very compelling for consumers. They created a simple, elegant, and revolutionary whole product.
Isn’t that what a mash-up essentially is? Does anyone deny Greg Gillis‘s talent as a musician and innovator? Watch Morgan Spurlock’s profile of him if you are a doubter. Better yet, experience the electricity of one of his performances. That sort of visceral, emotional experience is what binds people together into subcultures. It offers people the kind of social identity they crave, especially young people still trying to define who they are.
To be sure, I think this assessment of innovation is fairly consistent with the argument that Grant makes. Where we might part ways is the notion that innovation in music has been crowded out by other forms of innovative media. There is enough cognitive surplus to go around. If anything, music is more important than ever. In an always-on, always-connected world, with so much competing for our attention (read my blog!), the 24-hour news cycle, and fretting about a generation that has lost its ability to focus, fostering an emotional connection is essential, and emotionality is at the core of great music.
Music will always play an important role in culture and society. It has an expressive quality that is almost mysterious. It inspires and imparts meaning. We aren’t seeing a decline in innovation in music. We are seeing a proliferation. It’s a very exciting time in the music industry, full of disruption and opportunity. I have no doubt that artists around the world are working on the anthems to which we will all march (or dance) together into the future.
Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma – which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary. -Steve Jobs, 1955-2011
I am fascinated with questions of how we know what we know and why we think and act the way we do, topics frequently touched on in the investigations of cognitive psychology, evolutionary psychology and neuroscience. So I was very excited to read an article recently in McKinsey Quarterly that discussed the implications of cognitive biases on business (a recent article only to me).
I can recount a number of anecdotes from my own consulting experience where cognitive bias has interfered with team collaboration and good decision making. One of my favorites was a particularly contentious and political exchange that really centered on highly fallible recollections. I took great private satisfaction later in being able to produce a teleconference recording vindicating my side of the story, and I was admittedly tempted to send an anonymous copy of The Invisible Gorilla to one of the meeting participants; I resisted the temptation in favor of showing some semblance of social grace.
Much of the thinking I have advanced in the area of why acquisitions fail to deliver value (assuming good target selection) is based, in part, on the interference of cognitive bias. Put simply, integration teams are too risk averse, succumbing to the cognitive bias called loss aversion. No one loses their job for preserving the status quo, but the status quo won’t position the integrated company to realize the sales growth and cost savings that justified the transaction price.
I don’t think enough business professionals acknowledge the impact of cognitive biases on their own decision making, to their great detriment. This is much of the premise of the works of Nassim Taleb, among others. The fields of behavioral economics and behavioral finance try to incorporate learnings from psychology, although any breakthrough predictive models probably only exist outside academia as the closely held secrets of hedge funds. Those two areas of inquiry, however, seem more focused on understanding the effects of others’ decisions rather than learning to correct our own.
There is a June HBR article with many valuable techniques for countering cognitive bias, in addition to just employing a more data-driven approach to decision making (note: even statistics are subject to confirmation bias). We’ll never be able to eliminate it, and the real frustration is the inherent difficulty in recognizing it in the moment. Nonetheless, we can cope, and those who do will be at an advantage over their competitors.
As a young, single professional, I spend a lot of time thinking about my career and dating. It should be no surprise, then, that I have come to see a lot of parallels between job hunting and dating, particularly online dating. Recruiters should take notice as well.
In online dating, you fill out a profile presenting a filtered version of yourself (your resume) filled with positive illusion and tailored to (hopefully) attract the attention of the kind of person or persons you would like to date. You then spend a lot of time browsing through other profiles (job postings) looking for someone that approximates your ideal mate (employer), all the while knowing on some level that (s)he has created a profile with the same positive illusion and ulterior motives as you.
When you are lucky enough to find a good match, you carefully craft a message (cover letter) that shows you’ve read the profile and have something in common that makes you a good match, while still trying to stand out somehow in the cacophony of other messages. If you succeed, you might exchange numbers, then text or talk on the phone (phone interview) before meeting up for a first date (real interview), usually in some abbreviated format like happy hour drinks. A few dates later, if all goes well, you (ostensibly) decide to commit to one another in an exclusive relationship and delete your dating profile (you’re hired!).
In both job hunting and online dating, you are trying to learn as much about someone with limited time and imperfect information before making a decision about how good a fit you are for one another. If it’s going to work, it has to be reciprocal. You look for shared values and similar expectations from the relationship. In dating and careers, there are the gold diggers and the people that want to connect on a deeper level.
There is a bit of a cat-and-mouse game to both job hunting and online dating. You both want to highlight your best attributes and downplay your worst faults, but then you aren’t really painting an accurate picture of yourself. What’s she hiding? If he’s so great, why is he on this dating site? People say they hate games, but still we play them. Wouldn’t it really be more efficient just to announce our shortcomings so the other person could decide up front whether they want to hire/date you in spite of them? No one is perfect after all.
New jobs and new relationships always seem to start out with a lot of enthusiasm and optimism that dissipates over time. Paralleling the decision more people are making to put off marriage or opt for nontraditional family structures, the era of the organization man is giving way to the free agent and creative class.
I am surprised that no one has picked up on these commonalities and built a career site modeled more like a dating site. The problem with most career sites, for both the recruiters and the job hunters, is the sheer volume. There are too many resumes for a recruiter to effectively sort through, and it is too easy to get lost in the crowd when you submit a resume, witty cover letter notwithstanding.
What if you had a site that was free for anyone to use, recruiters and job seekers alike, but you could only access a finite number of job postings or applicant resumes per time period without paying, and which ones you saw were decided by an algorithm? Maybe charge a nominal finder’s fee for matches, say a $5 charge to reveal an applicant’s contact information or $1 to submit your resume. For a paid account, you could access more postings/applications and actually toggle between the full population and a subset determined by the algorithm.
Build it on Facebook Connect and you can match candidates to openings on a much richer data set than just key word searches (watch out Branched Out!). Let users create questions with structured answers the same way OkCupid does today to help discover interesting correlations. Recruiters could attach the questions to their job postings and candidates can pose questions to companies to learn more about what it’s really like to work there.
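As a rough sketch of how the gating and matching might fit together – everything here is hypothetical: the quota, the scoring formula, and all field names are illustrative, not a real product spec – consider:

```python
from dataclasses import dataclass, field

FREE_DAILY_QUOTA = 10  # postings a free account may see per day (hypothetical)

@dataclass
class Posting:
    title: str
    keywords: set
    answers: dict = field(default_factory=dict)  # structured Q&A, OkCupid-style

@dataclass
class Seeker:
    skills: set
    answers: dict = field(default_factory=dict)
    paid: bool = False
    seen_today: int = 0

def match_score(seeker, posting):
    """Blend keyword overlap with agreement on structured questions."""
    skill_overlap = len(seeker.skills & posting.keywords)
    shared = set(seeker.answers) & set(posting.answers)
    agreement = sum(seeker.answers[q] == posting.answers[q] for q in shared)
    return skill_overlap + agreement

def daily_feed(seeker, postings):
    """Rank all postings, then apply the free-tier quota."""
    ranked = sorted(postings, key=lambda p: match_score(seeker, p), reverse=True)
    if seeker.paid:
        return ranked  # paid accounts toggle to the full ranked population
    quota = max(0, FREE_DAILY_QUOTA - seeker.seen_today)
    return ranked[:quota]  # free accounts see an algorithm-chosen slice
```

The interesting product decisions all live in match_score and the quota: the algorithm decides which slice of the market a free user sees, and that engineered scarcity is exactly what makes the paid tier worth paying for.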
If Monster wants to stay relevant in the face of competition from LinkedIn, they need to innovate. A good start might be poaching some of the talent over at Match.