The trouble with being SMART

January 30th, 2016 — 1:41am

2016 is here. Worldwide, managers are setting SMART – Specific, Measurable, Achievable, Relevant, and Timely – performance goals. Even though the underlying theory – Management by Objectives (MBO) – has lost credence, this catchy mnemonic, which George Doran coined in 1981, has not. The time is upon us to retire SMART for all managers and executives from whom we need discretionary effort.

MBO lost credence because “The boss knows best” paternalism no longer works well. Global companies are transforming structurally to free emerging businesses from top-down strictures. Open source communities, coordinated by respect for expertise, not central authority, are creating technologies and products. Innovative workplaces are giving employees time off the clock and free resources, and benefitting from their unmeasured, untracked tinkering. Such environments thrive on distributed leadership and decentralized, uncounted action, and SMART goals can’t add to, and inevitably subtract from, them.

The problems with SMART run deeper and can damage even organizations that don’t need to unbundle business units or use open source approaches. The business environment has fundamentally changed. Companies no longer compete individually, but as members of networks: Apple couldn’t create the iPhone, or Airbus the A350 aircraft, without collaborating with others. Network members may be located half a world away, and inevitably have their own strategies, processes and cultures. So, complexity, uncertainty, and ambiguity abound, allowing problems and opportunities to flash across these networks with blinding speed and meaningfully affect performance. SMART goals implicitly assume staid environments that are far removed from these realities and can keep executives from responding appropriately.

Problems with SMART arise from virtually all elements of the acronym. ‘Specific’ goals, clear-cut and definite, are easy to articulate and act on. They enable quick assessments of individuals’ successes. However, when used extensively, they reduce discretionary activity and limit broader action. I once facilitated a meeting between two groups of senior executives, each from a well-known global company, whose businesses had merged. One group described how its corporate values drove performance evaluation and gave it freedom to act. The other retorted that its values were the five tasks set for each manager by his/her boss; each manager could, and did, decline to work on any initiative unless specifically tasked to do so. Guess which company had acquired the other? Guess which one’s stock price has usually outperformed the other’s?

‘Measurable’ goals have become an unshakeable article of faith, commonly justified by physicist Lord Kelvin’s dictum, “If you can not measure it, you can not improve it.” Such goals make it easy to decide not just whether someone has performed, but how well. In so doing, they implicitly emphasize efficiency (doing something optimally, even if it is the wrong thing) over effectiveness (doing the right thing, which may be hard to discern). To make this point, I often ask senior executives to identify a single factor whose absence would destroy their businesses. They inevitably – and quickly – converge on ‘Trust.’ They are right – how much would you get done if you had to personally check every single word you were told? I then ask, “How do you measure trust?” They don’t – and can’t: this critical driver of business success is immeasurable. Instead of spouting Lord Kelvin out of context, executives should internalize the words attributed, perhaps apocryphally, to Albert Einstein: “Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted.”

Since ‘Achievable’ may produce inadequate outcomes, many companies set ‘Ambitious but achievable’ goals. Either way, this criterion disregards what we know about human motivation. As Daniel Pink has brilliantly summarized in a YouTube video popular in business schools, carrot-and-stick approaches improve performance only when work is physical. Intellectual work benefits from allowing people to develop mastery in a field and giving them the autonomy to act therein. So, the quintessential carrot-and-stick nature of Achievable goals limits their relevance to managers. To drive supernormal performance, we should instead give people responsibility for accomplishment, and allow them to set their own targets consistent with organizational goals.

Goals are supposed to be ‘Relevant’ – not just to the organization, given its environment, but also to the specific individuals and groups for whom they are set. The first criterion is undoubtedly reasonable, and on a prima facie basis, so is the second: why set a goal that isn’t applicable to, or deliverable by, the people in question? In reality, ‘relevance’ inevitably produces an enduring, widespread problem: organizational silos that hinder collaboration. For example, a sales team accountable for customer satisfaction is likely to have conflicts with a supply network team accountable for minimizing inventory. However, these silos would collaborate in their own interest if each were assessed (in part) on the other’s accountability – in effect, measuring each on something that isn’t relevant to its daily work, as the sketch below illustrates.
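To make the cross-accountability idea concrete, here is a minimal sketch of a blended scorecard, in Python. The weights, metric names, and scores are hypothetical illustrations, not anything prescribed above:

```python
# Hypothetical blended scorecard: each silo is judged mostly on its own
# metric, but partly on its partner silo's metric, so undermining the
# partner lowers one's own score. Weights and values are illustrative.

def blended_score(own_metric: float, partner_metric: float,
                  own_weight: float = 0.7) -> float:
    """Score on [0, 1]: mostly your own accountability, partly your partner's."""
    return own_weight * own_metric + (1 - own_weight) * partner_metric

# Sales is judged mostly on customer satisfaction, partly on inventory
# performance; the supply network team gets the mirror image.
sales_score = blended_score(own_metric=0.9, partner_metric=0.6)    # ≈ 0.81
supply_score = blended_score(own_metric=0.6, partner_metric=0.9)   # ≈ 0.69
print(sales_score, supply_score)
```

Each team now has a stake in the other’s number, so hoarding inventory to boost one metric, or promising customers the moon at the warehouse’s expense, visibly costs the team that does it.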

How could ‘Timely’ not be legitimate? Very simply because it has become code for “as soon as possible.” We have made a virtue of speed to the exclusion of every other meaningful and important organizational goal. Business textbooks assign critical importance to “first mover advantage” even though irrefutable counterexamples are readily available. When was the last time Apple launched a truly first-in-the-world product? Was Google the first search company? Was Facebook the first social media offering in its niche? How are Chinese and Indian multinationals, Johnny-come-latelies to international markets, giving established Western firms a run for their money? When ‘timely’ equates solely to speed, creativity, effectiveness and yes, even efficiency, suffer, sometimes irreparably.

What should executives do? They should reserve SMART goals solely for people who have limited discretion. For everyone else, they should begin goal setting with goals that are non-specific, qualitative, diffuse, seemingly “can’t be done,” and for which time is only one of several criteria – goals that give people autonomy and mastery. Indeed, they should urge them to propose goals for themselves. They should add SMART goals only where they are truly unavoidable, and there too, with enough fudge factors to ensure they don’t become constraints.

In effect, throughout the goal setting process, they should ask themselves: Am I paying attention to issues that truly matter? Am I truly leading an organization of people, or am I merely checking boxes to show that my job matters? Being SMART is easy, but that doesn’t make it right.

———

A shorter version was published by Forbes on January 12: http://www.forbes.com/sites/forbesleadershipforum/2016/01/12/it-may-be-time-to-get-rid-of-smart-management/#2715e4857a0b6e160d5e3bbc

Comment » | Business Tools, Company Performance, Corporate Culture, Leadership

The “Republic of India” in Europe

December 30th, 2015 — 9:20am

Amit’s note: After a four-year absence, I am back writing. As a Professor of Leadership and Strategy at IMD, I will, off and on, produce articles for a publication that IMD calls “Tomorrow’s Challenge.” Often, these articles are shortened for publication by the media outlets that publish them. I will, on a case-by-case basis, post either the original version or the published version. For example, a shorter version of the following post appeared in 10 media outlets (that I know of) from Finland to Hong Kong during October and November 2015. I will, of course, also occasionally post opinions that don’t fit the criteria for Tomorrow’s Challenge. Let me know what you think.

—————-
The EU is wrestling with seemingly insoluble human and financial crises. Pundits routinely draw (unfavorable) parallels to the US to illustrate needed changes. A mechanism to move resources to areas where they are needed. A central bank with powers to set monetary policy and regulate all major financial institutions. Greater political integration. They note that with a debt of $72 billion that it cannot pay because of a decimated economy, Puerto Rico faces a crisis similar to Greece’s. Yet, financial markets have not panicked, assuming America’s fiscal, monetary, political and judicial mechanisms will enable a soft landing for this US Territory.

This technocratic prescription, though valid, doesn’t address the EU’s real problem: the EU is similar not to today’s US, but to the US of 1776–1789. After winning independence, the US states functioned as “these United States” only in name. Each focused only on its own needs and forswore responsibility for the immense war debt. After 13 years of chaos, the states adopted the US Constitution and installed George Washington as the first President. The EU is in its “thirteen years” now, in need of its own reform. However, despite the present crises, the situation still isn’t bad enough to force fundamental change.

Progress before the situation gets “bad enough” will require learning from India. This will be challenging; no European I know thinks an emerging economy where corruption is rife can teach the EU much. They are wrong. The EU’s challenge isn’t creating a “US of Europe,” but a “Republic of India” in Europe:

• The EU must unify very diverse peoples. In 1947, India integrated over 600 independent and semi-independent kingdoms and the erstwhile British India, and over a few years, consolidated them into language-based states. (There are 29 today.)
• The EU has 24 official languages, and so does India (including English). Westerners are dismissive when I claim to speak two Indian languages. But the people who speak these languages – as different as English and Polish – live as far apart as London and Warsaw. Half of India can’t even recognize the other half’s alphabets. So, educated people use English to communicate.
• Like Europeans, Indians swear by the cultures and food of their states. Only a tiny minority eats regularly what Westerners call “Indian cuisine.”
• Europe is less religiously diverse than India. India has more Christians than all but five EU countries, and more Muslims than all but two countries worldwide. The Hindus are also diverse; for example, the rituals of the two areas whose languages I speak differ considerably.
• As in the EU, people in India (still) harbor false beliefs that people in some states are industrious while those in others are not.

The EU’s efforts at managing diversity have been a near complete failure. Its politicians haven’t made a clear and consistent case why diverse peoples should come together. The absence of an emotion-laden “I am loyal to the EU because …” rationale for unity has produced today’s “What’s in it for me?” ruptures along national and linguistic lines, and the alienation of European Muslims.

EU politicians don’t understand a basic truth we teach in Leadership and Change Management courses: when people rally around a shared vision, driving change becomes easier. Why does the EU exist? Surely not to prevent a German-initiated World War III? That rationale became irrelevant decades ago. Politicians – like Jean-Claude Juncker – who ardently champion the EU offer technocratic rationales, not ones that ordinary people can feel in their guts.

In contrast, India’s efforts at forging a common identity – the concept of “India” had not existed in the prior 4,500 years – have been a substantial success. Indian politicians got a lot wrong, but this they got right. They adopted a national anthem that lauded, by name, every part of the country. They adopted a flag with colors associated with the three major religions. They made political decisions that made no logical or economic sense, but helped manage diversity. They drummed the message of “Unity in Diversity” into every child’s head.

And despite India’s periodic, ugly, politics-driven religious killings, they championed religious diversity. Four of India’s 12 presidents were Muslims, as were 4 of India’s 42 Chief Justices, many senior bureaucrats and ministers, and the senior-most leaders of its armed forces. Forbes lists Indian Muslim billionaires, and India worships the many Muslims in its movies, cricket team, and the arts. EU leaders should ponder why so many British Muslims have joined ISIS while few (if any) Indian Muslims have, even though Britain’s Muslim population is 1.6% of India’s.

I first made my case for “The Republic of India in Europe” in 1989 at a dinner with the executives of a Flemish-Belgian multinational. Europeans must learn, I said, that sometimes a major investment must be made in a particular region not because it makes sense, but because “it’s their turn.” An executive whom I respected retorted, “I don’t care, as long as it isn’t in any French area!” The others laughed.

In the early 1990s, as a professor at INSEAD, the “European Institute of Business Administration,” I observed French companies recruiting only French students, German companies only Germans, British companies only British and so on. On a visit in 2005, I heard former colleagues ruefully note that the situation hadn’t improved very much. Hopefully, for Europe’s sake, it is a lot better now.

I had hoped that the EU policy that required teaching children two non-native languages would help Europeans appreciate their diversity. On the plus side, a 2012 European Commission report noted that 73% of young students were learning English, while German and French were common as second languages. However, the time individual countries devote to languages varies sharply. The UK mandates no specific time commitment, and unsurprisingly, while living recently in a well-off London neighborhood, I heard children speak only English. Spain devotes only 5% of instructional time at the primary level and 10% at the secondary level. Again unsurprisingly, during my recent visit to five Spanish cities, I met very few young people who admitted to speaking English. A waiter who spoke good English bemoaned the quality of his daughter’s English education.

People can drive change themselves, but they must want to – and it takes much longer. In the India of the 1970s, my fellow students and I ridiculed the efforts of a language institute, modeled on the Académie Française, that coined long-winded Hindi equivalents of simple English: “railway signal” became “lahu-puth-gameni-awat-jawat-soochak-danda.” Though today’s BJP government is trying to reintroduce similar silly ideas, DJs and program hosts on Indian TV and radio tend to speak smooth amalgamations of the local language with English (e.g., “Hinglish” combines Hindi and English). So, even illiterate people learn – and use! – English words, facilitating interactions. Unity in diversity, writ small.

Virtually every European leader is running away from the richness of European culture. Instead of unifying people, they are perversely pushing them apart. Wolfgang Schäuble mused that Greece should temporarily leave the Euro zone. David Cameron promised a referendum on EU membership unless the EU acceded to British demands. Greece is flirting with Russia. Viktor Orban wants the EU refugee/migrant policy to ensure that Europe remains Christian. This depressing list is unending. Disunity in diversity, writ large.

(It’s worth noting that today’s EU refugees are a fraction of the roughly 10 million Muslim refugees that India hosted during the 1971 bloodbath that birthed Bangladesh. That India, unlike the EU, was dirt-poor.)

The EU will stop lurching from crisis to crisis only if its leaders ensure it stands for something that makes people proud. They must set an extraordinary, but human, vision for the EU no European country can fulfill on its own. They must learn to give something up first, in order to get something in return. They have to champion policies and ideas that might have limited value for their own countrymen – and indeed, may be to their detriment in the short run – but are essential for the EU’s longer term success.

David Cameron, Angela Merkel, and François Hollande have not shown they are up to the challenge. What are they willing to give up? What policies will they promote that aren’t in their own countries’ best interests? However, I know in my gut that some other Europeans are. After all, ordinary Europeans created Médecins Sans Frontières, and instead of staying in the comfort of their rich homelands, at great risk to themselves, they regularly take light and hope to the darkest corners of the world. Surely others can see the value in ensuring the EU embraces European diversity?

Comment » | Politics

Grokking Jobs on Campus

September 1st, 2011 — 2:01pm

I’ve been Executive in Residence at Babson College since January. As Fall creeps up on New England (You’re beautiful, but can you please stay away for a little longer?) and students return, my thoughts are a continent away, at two other campuses: The California Institute of Technology and the Apple campus in Cupertino.

This summer, I learnt a piece of Caltech lore: when Apple visits Caltech to recruit undergraduates in computer science, it brings an open checkbook. Even unreasonable salary expectations don’t preclude the hiring of those it likes. Initially, the story seemed inconsequential.

Then, a few days ago, Steve Jobs resigned his position as Apple’s CEO. Apple’s iconic co-founder has reportedly lived a decidedly iconoclastic life, at least in comparison with those of the CEOs of most global companies. He dropped out of college but, living on friends’ sofas, continued to attend classes he liked. Thus exposed to calligraphy, he incorporated a range of fonts, not just Pica and Elite, on the original Macs. He then dropped out altogether and went to an ashram in India, from which he returned a Buddhist. He embraced the counter-culture and reportedly regards his doing so as a critical formative experience. In short, as a young man, he was the complete antithesis of the people that Apple is seemingly hiring at Caltech.

I am not begrudging the Caltech seniors, particularly those who have worked diligently, their high-paying jobs! Nevertheless, the juxtaposition of these events raised in my mind a critical question for Apple and a more general one for businesses and academia. The roots of these questions lie in an amazing interview Jobs gave to Wired magazine in 1996, before he returned to Apple. In part, he said:

“Some people think design means how it looks. But of course, if you dig deeper, it’s really how it works. The design of the Mac wasn’t what it looked like, although that was part of it. Primarily, it was how it worked. To design something really well, you have to get it. You have to really grok what it’s all about. It takes a passionate commitment to really thoroughly understand something, chew it up, not just quickly swallow it. Most people don’t take the time to do that.

(Jobs probably used the word grok very deliberately; if you don’t grok it, read Robert Heinlein’s Stranger in a Strange Land.)

“Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. That’s because they were able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or they have thought more about their experiences than other people.

“Unfortunately, that’s too rare a commodity. A lot of people in our industry haven’t had very diverse experiences. So they don’t have enough dots to connect, and they end up with very linear solutions without a broad perspective on the problem. The broader one’s understanding of the human experience, the better design we will have.”

I wonder if those responsible for on-campus hiring at Apple have grokked this interview. People have long debated Apple’s ability to create lifestyle-altering experiences in a post-Jobs era. A die-hard Apple fan, I had no doubts it could – if it institutionalized Jobs’ perspective on design. (I call this process – making the private knowledge of an individual the public knowledge of many – organizational learning.) However, if Caltech’s lore is true (and broadly representative), Jobs’ insights haven’t become organizational. This won’t be a problem tomorrow, but will be when the individuals so hired rise to managerial positions. Will they prize staff who lack deep knowledge but who, by virtue of their life experiences and broad knowledge, can connect seemingly unconnectable dots?

More broadly, in a world that prizes “deep, micro-knowledge” more than “broad, macro-knowledge,” how do we produce great designers, managers, and indeed, leaders? How do we ensure people are, in Jobs’ words, “able to connect experiences they’ve had and synthesize new things” because “they’ve had more experiences or they have thought more about their experiences than other people”?

I am not arrogant enough to believe I have the answer, but will leave you with a proposal. For college students, I’d make a “semester abroad” a requirement, not an option. And an American going to Western Europe (or vice versa) wouldn’t count.

What do you think?

Comment » | Corporate Culture, Design, Education, Leadership

The Duel of the Physicists

April 11th, 2011 — 3:21pm

Peruse any forum on business and sooner or later, you will find a discussion on the value of metric-based performance appraisals. Almost inevitably, you will also find a reference, attributed or not, to Lord Kelvin’s famous dictum, “If you can not measure it, you can not improve it.” (Here’s one example: Metrics, metrics, metrics. “If you don’t measure it, it won’t happen.”) Finally, you’ll also probably note that the discussion’s initiator heavily favors measuring. In my example, the next lines read, “Do you believe that every department/function and employee should have measurable goals? Can you share your successes?” Quite possibly because of this aphorism, Lord Kelvin is better known to managers than most other physicists.

Now consider Albert Einstein. True, he wasn’t a “Lord”, but surely his various other achievements compensate for this deficiency? He said, “Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted.” As managers and students of management, if we must turn to physicists for insights, can we not cite him equally often? We don’t, and this is sad, for we need this particular insight very, very badly. Consider just two examples:

In a world in which most businesses rely on others for core products and services, as a manager you rely on work done by people you don’t know, who work for other companies with different goals, cultures, and risk tolerances. Here, you have two choices: You can base your hopes for a mutually beneficial relationship on tightly structured, measurable Service Level Agreements that your teams of lawyers can help enforce. Or, recognizing that if you have to rely on lawyers to enforce the relationship, you can’t possibly succeed at it, you could choose to invest in building a level of mutual trust (which you could never measure) that would smooth the imperfections of your “good enough” SLA.



How about innovation? How many of the greatest products or services you use today (or those that were used in the past) were created in workplaces that operated under the philosophy, “If you don’t measure it, it won’t happen?” Some of the most innovative workplaces of the world have long given employees free “off the clock” time and free resources – and benefited from the results of the unmeasured, untracked tinkering they did.

One and even two generations ago, the doyen of quality, W. Edwards Deming, tried to use statistics to convince managers that measuring the performance of individuals often made no sense. Most weren’t willing to consider this advice. Now they need to follow it more than they did then. (And no, Google’s recent efforts, to which I will devote a separate post soon, don’t obviate this opinion.)


Metrics and measures have a place in business, just not a central one. A manager who firmly subscribes to “If you don’t measure it, it won’t happen” will most likely ensure that only one thing definitely happens: he/she will be outperformed by those who understand that, at the very least, not all statements about physics should be applied to management – and, more correctly, that management is most definitely not physics.

3 comments » | Business Tools, Leadership

The Joys and Perils of Dancing on a Knife’s Edge

February 25th, 2011 — 11:09am

The tumultuous crowds that brought down a dictator in Egypt had an unintended impact far from their homeland: they drowned out – rightfully! – the announcement of a strategic partnership between Nokia and Microsoft. I admire the Nokia I researched; yet I acknowledge it is currently in deep trouble. I have long disdained Microsoft for its product quality and its reliance on monopolistic power instead of innovation (sole exceptions: Xbox and Kinect; and yes, I admit that Office 2011 for the Mac is far better than iWork!). So, what do I think of this alliance?

A key “prerequisite” question is: Do I still believe the ideas in The Spider’s Strategy? Absolutely! Toyota’s “unexpected acceleration” fiasco and its resultant recalls of millions of cars didn’t discredit Lean Enterprise. Why, then, should Nokia’s recent challenges discredit Networked Organizations? Indeed, Nokia got into trouble because in the key area of product innovation, it stopped applying the ideas that powered its 17% compounded annual organic growth rate (in revenues and operating profits) from 1995 to 2006.

Nokia violated a subtle rule embedded in my third Design Principle, “Value and nurture organizational learning.” It used to learn rapidly by setting seemingly impossible targets that demanded the periodic reinvention of its business model. Simultaneously, to keep control, it insisted its managers follow a “no surprises” policy. This brilliant rule is the proverbial knife’s edge. Balance well, and you can pull off miracles. Tilt toward “big risk” and you can lose your shirt. Tilt toward “no surprises” and you will bring innovation to a screeching halt. As it grew, Nokia made the mistake many other large companies have: it tilted toward “no surprises.” So, unlike Apple, it didn’t build a network of complementary product makers to buttress its proprietary Symbian software. Unlike Google, it didn’t attract a different type of sustaining network by making Symbian open source – until it was too late.

The alliance with Microsoft was in the cards from the day Nokia’s Board appointed Stephen Elop CEO. Nokia’s press release spoke of a strategy to “build a new global mobile ecosystem” with Windows Phone software at its core; “capture volume and value growth to connect ‘the next billion’ to the Internet in developing growth markets;” and make “focused investments in next-generation disruptive technologies.”

The second element – a continued focus on markets like India and China – is key, though the notoriously developed-world-focused financial analysts may not care. Apple has ignored these markets, and Windows still has a true monopoly among operating systems there. These facts, plus Nokia’s still-dominant market share there, give the alliance a strong base on which to build: Nokia can instantly create volume for the Windows Phone, and seamless integration with Wintel computers may give it an edge over low-cost Chinese phone makers. At the very least, this element will buy the alliance time; at best, the “next billion” is a huge market. That’s where the first element is also critical.

To bring the alliance value, the goal of building a mobile ecosystem must truly assimilate the lesson of a recent New York Times story about a start-up company that hoped to build a business around enabling group dates. The founders noted that the site’s users were mostly South or East Asians, but filed that fact away as “Interesting, but Unimportant.” Success came only when they reluctantly acknowledged that group dating wouldn’t fly in the US and shifted their focus to India. The world, as Thomas Friedman said, is flat. But that doesn’t mean people’s needs are the same everywhere. That’s why the word “global” in the language of this strategic element is troubling. Its use may seduce financial analysts, but unless an ecosystem is tailored to specific markets, it won’t amount to a hill of beans. At one time Nokia knew this lesson; it had had anthropologists in Indian villages whose work strengthened its market position there. Does it still remember that lesson, and can it convince a monopoly to learn it too?

The third element is critical for the long term and most troubling: Will two companies that haven’t created any disruptive technology recently be able to do so in the near future? Nokia’s Chairman Jorma Ollila had championed the Networked Organization philosophy and, as CEO, had managed its phenomenal growth. I could make a cogent case that he and the Board had no choice but to create the alliance with Microsoft. (Which would explain why they pursued Mr. Elop in the first place.) Now, he must ensure that Mr. Elop realizes that his most critical tasks are (1) putting into leadership positions those within Nokia who are still capable of dancing gracefully on a knife’s edge and (2) using his deep knowledge of Microsoft to convince Mr. Ballmer to do the same. Then, and only then, will the alliance succeed. If so, I may one day once again become an enthusiastic customer of both companies.

Comment » | Business Environment, Company Performance, Corporate Culture, Leadership

‘Will no one rid me of this troublesome priest?’

January 12th, 2011 — 2:48pm

On January 10th, I was driving to a business school to lead a symposium on leadership in a networked world. On the radio, I heard the debate about whether the vitriol common in American politics today triggered the carnage in Tucson, Arizona on January 8th. (A man had attacked a centrist US Congresswoman, Gabrielle Giffords; she is recovering from a serious bullet wound to her head, while six others are dead and twelve more are wounded.)

Some people – typically those on the political left – decried the language used by those on the right. Their Exhibit A was a map Sarah Palin, their bête noire, put up before the recent US mid-term elections: it had a marksman’s crosshairs drawn on 20 congressional districts (including Ms. Giffords’) held by the Democrats. Others – typically those on the political right – accused the left of politicizing a tragedy. The killer didn’t belong to any right-wing group, and quite possibly was psychotic. Words and images like Ms. Palin’s map were merely rhetorical political devices, not incitements to violence; linking these to a psychotic’s actions was wrong.

I considered focusing the symposium on the link between words and action. Ultimately, I chose not to; this issue was key to leadership, but not necessarily to the concept of a networked world. What would I have said if I had made a different decision? Without a doubt, I would have begun with the words in the title to this post.

Henry II, King of England and ruler of a part of today’s France, supposedly uttered them from a sickbed. (Other records suggest that he said, “What miserable drones and traitors have I nourished and brought up in my household, who let their lord be treated with such shameful contempt by a low-born cleric?”) The priest in question was Thomas Becket, his one-time closest friend and confidant, who, as Archbishop of Canterbury, had successfully blocked a key law Henry championed. Four of Henry’s knights acted on his words. In one version of what happened next, they went to Canterbury to kill Becket and succeeded. In another version, they went to arrest Becket, but backed off and went to bed when he resisted. The next day, they again tried to drag Becket out of the cathedral. Somehow Becket got hit on his head. This accident triggered bloodshed: the knights then drew their swords and slew him.

Henry might simply have been delirious – or merely frustrated – when he spoke. In doing so, whether he intended it or not, he set in motion Becket’s assassination. The four knights were not mentally ill; they acted deliberately to please their lord. They might not have intended to commit murder, but even in the passion-of-the-moment version of the events, bystanders became “collateral damage.” In either version, I doubt they would have acted against “God’s personal representative in England” had they not felt that their lord was implicitly urging them to do so.

Far from issues of life and death, the essential lesson of this story for any business manager is simple: Words of those in positions of authority always have consequences, even if they aren’t immediately palpable. This lesson is valid for positive words as well, and most annoyingly, for words – positive and negative – that aren’t spoken when they could have been.

Why? Because most people try to fit into their chosen group. Because they value praise from their superiors. Because they try to find meaning for the humdrum of their daily work. Because they routinely look to their superiors’ words for cues about what they should do. Because they analyze whom a new boss speaks to first; if he/she talks to them, they conclude they have been anointed, but if he/she talks to someone they consider incompetent, they conclude the boss has “been captured by the wrong people.”  Because they read much into whom their boss has lunch with, ignoring the fact that the lunchtime companion may merely be an old friend. Much of this scrutiny is way over the top, but there is no escaping from the fact that it happens every day in every organization, including informal ones.

So, if you aspire to positions of authority or leadership, teach yourself to be very careful about the words you use. Conversely, if you don’t accept – or don’t want to live by – this lesson, don’t seek positions of authority or leadership. You certainly shouldn’t be given such a position, for you will have the potential to do enormous damage.

Finally, it is worth noting that both the right and the left in today’s debate are wrong. The right is disclaiming a link for which there is ample evidence. The left is applying the link way beyond what is reasonable: the issue is not vitriol per se, but its source. Identical words spoken by two people will have divergent impacts if one is an average citizen and the other, someone with a substantial following. The words the latter uses in difficult and/or emotionally charged situations can give us insight into whether he/she has the capacity for greatness or is merely a power-hungry mortal. Unfortunately, instead of using such situations as guides, we convince ourselves that everyone, without restriction, who is on “our side” is capable of greatness, while everyone, without restriction, who is on “their side” is a power-hungry mortal.

My best wishes for this New Year. May it rise beyond the horrific sights from Tucson.

Comment » | Leadership, Politics

“Oh, You Spoke the Culture and She Spoke the Language”

October 6th, 2010 — 11:07am

The quintessentially Indian term “License Raj” pejoratively describes environments where governmental approval – administered by imperious bureaucrats – is essential for the day-to-day activities of people and businesses. I used the term at a recent meeting I attended to describe my sojourn (1992–1994) in France. (Incidentally, none of this is a commentary on today’s France or India.)

French state-run utilities regularly sent notices that said, “We will come on X day at Y time to read your meter. If you are not at home, we will levy a 50-franc charge. If you need to change this appointment, you must also pay a 50-franc charge.” My wife (India-born, but US-bred) spoke French well, but found such demands incomprehensible. I didn’t speak French, but having grown up in India, readily understood them. Ultimately, she found France more challenging than I did.

While several people laughed at my story, one observed, “Oh, you spoke the culture and she spoke the language!” The depth of this casual statement, spoken in jest, stayed with me.

Many scholars and adults believe that learning someone else’s language allows us to understand how he/she thinks. So, premier universities had – and still have – language requirements for undergraduates. Years ago, the European Union, against significant opposition, passed laws requiring children to learn languages that were not their own national languages. When the best of our companies open offices in countries far from home, they populate these offices with people who have more than a nodding acquaintance with the host country’s language.

Indeed, when the Disney company opened EuroDisney outside Paris (while I was living there), it hired bilingual – often fluently so – people. Yet, EuroDisney was an unmitigated disaster at its opening; I took my toddler there and swore I would never return. The Park turned around only when Disney acquiesced to a European – French – management takeover.

So, knowledge of a language, however proficient, may not necessarily translate into an understanding of the associated culture. Conversely, understanding of a culture may not necessarily require fluency in the language. In the language of mathematicians, neither is necessary nor sufficient.

As we create ever more complex – networked – organizations that span the globe, we must understand this issue in greater depth than we do. How do you think businesses should deal with this culture-language issue?

Comment » | Business Environment

Better on a Camel?

June 9th, 2010 — 3:54pm

It has been exactly 99 days since I last posted. I hadn’t meant for all this time to pass, but life intervened. So, I’m going to welcome myself back by first looking back 40 years.

If you are old enough – or have a deep interest in commercial flying – you might know that, long ago, British Airways used to be British Overseas Airways Corporation. During those days of genteel competition, airlines’ acronyms often became amusing nicknames. BOAC was “Better on a Camel;” industry insiders used this moniker affectionately. Decades later, however, one would truly be better off on a camel than on British Airways.

Why? Three words: People, people, people. BA and its employees are constantly at war. Their mutual acrimony routinely spills over into public view and affects passengers. Both sides seem to loathe the customers who keep them employed.

In the late 1990s, BA put up signs at Heathrow threatening to prosecute passengers who were discourteous to its employees. It neglected to tell its employees that they too needed to be polite. And with that omission, it unleashed trouble. At a check-in counter once, I expressed mild irritation that I was not given the seat I had reserved. Red Queen style, the agent literally turned crimson with fury. How dare I complain, he asked? If I didn’t like the seat he was giving me, I didn’t have to fly.

Fast forward a few years to an ever-lengthening Business Class check-in line. One of the two agents designated to attend to it was enjoying a long, uproariously funny phone conversation. A passenger left our line and politely requested that he end what was clearly a non-urgent call. The agent followed the man back to the line and, as the rest of us stood around stunned, began screaming, “Who are you to tell me what work I must do?” His rant lasted a couple of minutes, and then he went back to his call.

Fifteen minutes later, he was still on his call and the line was becoming ever longer. Another passenger screwed up his courage and asked a passing agent to summon a manager. This one also became Red Queen incarnate: “You’re telling me to do something? Who are you to tell me what I should do?” He hadn’t heard the “please” the rest of us did and felt it was completely appropriate to abuse a premier passenger.

I’m not making these up! More recently, a business class counter at Brussels was open, but the BA agent was missing. I chatted with a couple of other waiting passengers. Each of us had multiple such horror stories. One called me “lucky” since I only had to fly to London, while he was stuck with BA till Sydney.

This is one sad, sad airline, whose service is worse than even the deficient service (by Asian standards) available in the US. As I write this, BA cabin crew are finally on the strike that judges had forbidden twice before. Once was last December, but by the time the judicial edict came down, they had hurt thousands of vacationers during the Christmas holiday period. Another time was last April. I was on a round-the-world business trip that began in Europe, and I actively avoided all BA long-haul flights even though they were theoretically the most convenient. Unable to avoid a short Madrid-to-London flight, I waited with bated breath for signs of trouble. Fortunately, I wasn’t affected.

Some readers might blame such behavior on the presence of unions. Maybe so, but they are, at worst, only partly at fault. To me, the clear onus for such disregard for customers must be placed on management. BA management, it seems, has long believed that “service” means more amenities. BA has generally been among the leaders in introducing new technology – like flat-bed seats in business class. But in the far more difficult area of creating a more positive corporate culture, in well over a decade, its management has failed – miserably, in my opinion. Nor has its approach to management created much value for shareholders. Which raises the question: Why do these managers still have their jobs? (I know, Richard Branson’s been asking this for a long time.)

Economists point to the virtues of free markets; if enough people felt like me, they say, we could take our business elsewhere and punish BA. In a world of networks, however, that is not true; BA is a key member of the One World alliance and as long as I choose to fly One World, I will have to put up with BA, at least occasionally.

Sometimes, good things have very bad consequences.

Comment » | Company Performance, Corporate Culture

“Don’t be evil” meets “Do no harm”

March 1st, 2010 — 3:33pm

Last week, an Italian court gave three Google executives six-month suspended sentences. Their case dealt with a video uploaded to YouTube in Italy, giving the court (and the prosecutors) jurisdiction. The video, which showed a group of teenage boys bullying another boy, who has autism, quickly became an Internet sensation. A couple of weeks later, Google received a complaint and removed the video within three hours. By then, over half a million people had viewed it. The Google executives were deemed guilty of violating privacy laws.

In the real world, most issues worth reflecting on – like this one – have no simple answers. It asks us to weigh the relative benefits of privacy and free speech. I don’t know all the possible arguments people made about this case, but I’ll address a few that I heard repeatedly.

The first ignores the specific details of the case and suggests it was a cynical ploy by the billionaire Italian Prime Minister Silvio Berlusconi to clip the power of the Internet since it was threatening his vast “old media” empire. I don’t know much about the Italian judicial system, but if one has a reasonable understanding of realpolitik and of Mr. Berlusconi’s repeated cavalier disregard of a variety of laws, this view is hard to dismiss as a ridiculous conspiracy theory. If true, the court’s decision could have a very negative impact around the world.

It isn’t unheard of for ruthless executives to take unethical, albeit legal, positions to further their ends. The big deal here is that this decision was handed down in a Western democracy on an issue with very high stakes. Undoubtedly, many ruthless people are currently assessing how they could win similar rulings in their bailiwicks. The Ahmadinejads and Mugabes of the world are preparing arguments along the lines of “But this is acceptable in the West.” So, the decision has made the world much more fraught with risk for decent people.

The second viewpoint has attracted most commentators. In essence, it compares the Internet to traditional communications – like telephones and the post office. Telephone companies aren’t subject to criminal charges when their equipment and services are used to plan crimes, no matter how nefarious. So, why should companies like Google?

I am not a lawyer, but for me, this argument doesn’t have legs. Progress in law almost always lags progress in technology. In The Spider’s Strategy, I argued that our legal systems haven’t caught up with the fact that sense-and-respond capabilities are erasing the traditional boundaries of companies and taking us into uncharted territories. So, inadequate laws shouldn’t be a defense here.

Besides, Google’s defense was that it took down the video within 3 hours of being informed about it. The real question is: should it have acted proactively? After all, when I go to my local post office, I am routinely, proactively asked to confirm that the letter or package I am shipping has nothing dangerous in it. Legally, the post office doesn’t have to ask me (at least not that I know of!), but it is commonsensical for them to do so, if for no other reason than to protect its own people. I am sure that if I give them cause for concern, someone will take some proactive action and at least screen my package. Indeed, increasingly, the post office is trying to screen all packages.

But Google responds that every minute, twenty hours of video are uploaded to its systems around the world. It just doesn’t have the ability to screen everything. This argument also seems specious. Google doesn’t have to screen everything. However, can’t it – doesn’t it – have filters to screen on an exceptional basis? If a tag or a comment says “school yard bully,” couldn’t that particular video be checked out? Let’s assume that this filter would itself get swamped by volume. How about using an additional decision point? “If a video hits 100,000 views, or if a video is shooting up the popularity index very rapidly, check its appropriateness.” Saying “We want the right to search every book in the world and make money out of giving people access to these” seems incompatible with “We can’t possibly be expected to scan every video – or even a fraction of the videos – currently on our system.”
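To make those decision points concrete, here is a minimal sketch of such an escalation filter, in Python. The field names, watch-list terms, and thresholds are invented for illustration; this is not any real platform’s pipeline:

```python
from dataclasses import dataclass

# Hypothetical escalation rules: queue a video for human screening when
# its metadata matches a watch-list or its popularity crosses a threshold.

FLAGGED_TERMS = {"school yard bully", "bullying"}
VIEW_THRESHOLD = 100_000     # "if a video hits 100,000 views"
GROWTH_THRESHOLD = 5_000     # views gained per hour ("shooting up the index")

@dataclass
class Video:
    title: str
    tags: list
    views: int
    views_per_hour: float

def needs_human_review(v: Video) -> bool:
    """Return True if a video should be queued for manual screening."""
    text = (v.title + " " + " ".join(v.tags)).lower()
    if any(term in text for term in FLAGGED_TERMS):
        return True          # metadata (tag/comment) filter
    if v.views >= VIEW_THRESHOLD or v.views_per_hour >= GROWTH_THRESHOLD:
        return True          # popularity triggers
    return False

# Example: this clip trips both the metadata and the popularity rules.
clip = Video("Playground fight", ["school yard bully"], 120_000, 8_000.0)
assert needs_human_review(clip)
```

The point is not the specific numbers: a filter like this screens a tiny, high-risk fraction of uploads rather than everything, which is exactly the middle ground the paragraph argues Google could have occupied.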

The third viewpoint focuses on biases rooted in the divergent histories of people around the world. Americans favor the freedom of information over all else because their nation’s birth was in part driven by the oppression of a government using information inappropriately. That’s why it is the First Amendment to the US Constitution (part of the Bill of Rights, which enshrines the first ten amendments); the Constitution was adopted on September 17, 1787 and the Bill of Rights on December 15, 1791. In contrast, there is no explicit “right to privacy” in the US Constitution or its amendments; this right was imputed to exist (on the basis of several of the other Bill of Rights amendments) as late as 1965, by a much-disputed ruling of the US Supreme Court.

In contrast, Europe has suffered severely as a result of a lack of a fundamental right to privacy. Throughout history, dictators and totalitarian regimes have terrorized their people by collecting huge amounts of secret information and using these to justify punishments, torture and killings. And so, it is no surprise that Article 8 of the European Convention on Human Rights says, “Everyone has the right to respect for his private and family life, his home and his correspondence.”

Supporters of the “information first” logic point out that today’s totalitarian states block access to information, particularly that acquired through the Internet. So, Google rightfully stood by its corporate motto and “did no evil”: it shouldn’t have – and didn’t – act preemptively to block the video, but took action when it was appropriate. Supporters of the “privacy first” logic (which, incidentally, the Italian court adopted) argue that Google had a fiduciary responsibility to protect the autistic child’s right to privacy. Above all, Google should have “done no harm.”

In the years to come, we will face the two facets of this third viewpoint time and again. Sense-and-respond capabilities will not only benefit businesses and society, but will also raise this issue in ways that we can’t even imagine. (For example, listen to “Different Strokes,” an “On the Media” program from National Public Radio that discusses technology that tracks where someone goes on the Internet on the basis of his/her typing pattern.)
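For readers curious how such tracking could work even in principle, here is a minimal sketch of keystroke-dynamics matching, in Python. The timings, the similarity measure, and the threshold are invented for illustration; real systems use far richer features and models:

```python
import math

# Identify a typist by rhythm: compare a normalized vector of
# inter-keystroke intervals against an enrolled profile.

def profile(inter_key_ms: list) -> list:
    """Normalize a vector of inter-keystroke intervals (milliseconds)."""
    norm = math.sqrt(sum(t * t for t in inter_key_ms))
    return [t / norm for t in inter_key_ms]

def similarity(a: list, b: list) -> float:
    """Cosine similarity between two equal-length typing profiles."""
    return sum(x * y for x, y in zip(a, b))

enrolled = profile([110, 95, 180, 120, 140])   # user's known typing rhythm
observed = profile([112, 99, 175, 118, 143])   # rhythm seen on another site

# Nearly identical rhythms: most likely the same person.
print(similarity(enrolled, observed) > 0.99)
```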

My own bias is towards privacy; I think it will increasingly become hard to live as an individual unless privacy safeguards are strengthened. And the day when this becomes a real issue for everyone is not far off; it will happen, as I’d indicated to a pharmaceuticals industry audience in May 2002, because of genetic-profile based medicine. Even “open information” stalwarts in the US will have to think about whether they want companies and governments to have unfettered access to their own specific genetic structures. That is why I did not howl in protest when I read about the Italian court’s decision – but as I indicated in my discussion of the first viewpoint, I am not one hundred percent convinced that it was the right decision.

1 comment » | Business Environment, Corporate Culture, Online Business Models, Politics

The Michael Crichton Strain

January 29th, 2010 — 11:01am

Michael Crichton was the author who ensured that English-speaking children know – and can perfectly pronounce – the names of at least ten dinosaurs. I read the first of his 26 novels, The Andromeda Strain, in 1976 and several others – including the ones about dinosaurs, Jurassic Park and The Lost World – in subsequent years. He also created the extremely popular TV show ER; I didn’t see even a single episode of it. He passed away in November 2008.

I liked reading his books because many, if not all, of them dealt with the complexities of a world I knew well: the intersection of advanced technology and business. However, I am definitely not a “Crichton groupie;” I stopped reading him in the early 1990s, because I felt that his 1992 book, Rising Sun, had racist undertones. This decision means that Mr. Crichton may well have held positions about which I know absolutely nothing.

Mr. Crichton’s writings introduced me to an extraordinarily powerful idea: humans are creating ever more complex technological systems without truly understanding their implications. They think they can completely control these, but the reality is they can’t. For example, consider the following extract from a speech on environmentalism, as it is reported on “Michael Crichton, The Official Site”: “Most people assume linearity in environmental processes, but the world is largely non-linear: it’s a complex system. An important feature of complex systems is that we don’t know how they work. We don’t understand them except in a general way; we simply interact with them. Whenever we think we understand them, we learn we don’t. Sometimes spectacularly.”

I couldn’t help but be reminded of this idea when the news about Toyota’s ever-expanding recall came into the public spotlight. How could a company so admired and emulated falter so badly? One explanation is that the Company’s relentless pursuit of growth over the last decade caused it to take its eye off quality. Toyota’s new CEO, Akio Toyoda, shares this view; when he got the job in October 2009, he apologized profusely in public for the quality problems that Toyota had experienced. As time would tell, those were nothing compared to what’s happening right now. (I will return to this explanation in a future post.)

A second possible explanation drove me to introduce Michael Crichton here: we are building cars so complex that we really don’t understand how they function and why they do what they do. So far, no one knows what ails the Toyotas. Is it a mechanical problem with the accelerator pedal made by the US company CTS? These pedals are being replaced not just on Toyotas but also on other cars. But even Toyota doesn’t think this is the key explanation. Mechanical problems are generally easy to diagnose because we can actually see what’s wrong. The “improper floor mats” explanation is also, at best, a secondary one. Right now, the focus seems to be on the electronics that control acceleration – and possibly even the embedded software. But no one has yet figured out what this problem is. So, unless the real story has not been made publicly available (which is always possible), this explanation is still speculation; perhaps informed speculation, but speculation nevertheless.

Many years ago, I had started writing – and then abandoned – a book on manufacturing. In that effort, I had assailed a belief that some software companies popularized in the 1990s: “Get it 80% right and ship.” Customers will tell you what is wrong – and you can fix it then. An incredibly simplistic belief in the power of being first to market drove this view. (I hope it gets buried soon, for Apple is only the latest company to show that first-mover advantages are highly overrated.) Couple this view with Mr. Crichton’s lesson, and the dangers of following it become immediately obvious.

In 2006/2007, I was writing The Spider’s Strategy. I pointed out that the holy grail of modern product development – “make it modular” – had major limits. Companies like modularity because it gives them (1) the flexibility to use the same parts in different places and (2) the ability to outsource design and manufacturing work in discrete chunks. I cited examples of product failures that had afflicted some of the best-known brands in the world, including Toyota, and argued that the weakness of this thinking lay in the electronics and software. This limitation made it essential for companies to collaborate closely with their design and manufacturing partners.

Toyota understood this fact better than most other companies. That is why it focused on building strong partnerships with its suppliers. Those partnerships had helped it make the jump from a Lean company to a networked company. It is truly sad that, somewhere along the way, its management unlearnt this critically important lesson.

Comment » | Company Performance, Corporate Culture, Leadership
