
Reflection – learning to live with complexity

SPOTLIGHT ON MANAGING COMPLEX ORGANIZATIONS

Gokce Sargut is an assistant professor at Governors State University, in University Park, Illinois. His research focuses on strategy and structural change in creative industries. Rita Gunther McGrath is a professor at Columbia Business School. She researches strategy and innovation in volatile environments.

ARTWORK: Jen Stark, How to Become a Millionaire in 100 Days, 2007, one million pieces of hand-cut paper, size varies (average: 3′ x 3′)

Learning to Live with Complexity
How to make sense of the unpredictable and the undefinable in today’s hyperconnected business world
by Gokce Sargut and Rita Gunther McGrath
Harvard Business Review, September 2011

Managing a business today is fundamentally different than it was just 30 years ago. The most profound difference, we’ve come to believe, is the level of complexity people have to cope with.

Complex systems have always existed, of course—and business life has always featured the unpredictable, the surprising, and the unexpected. But complexity has gone from something found mainly in large systems, such as cities, to something that affects almost everything we touch: the products we design, the jobs we do every day, and the organizations we oversee. Most of this increase has resulted from the information technology revolution of the past few decades. Systems that used to be separate are now interconnected and interdependent, which means that they are, by definition, more complex.

Complex organizations are far more difficult to manage than merely complicated ones. It’s harder to predict what will happen, because complex systems interact in unexpected ways. It’s harder to make sense of things, because the degree of complexity may lie beyond our cognitive limits. And it’s harder to place bets, because the past behavior of a complex system may not predict its future behavior. In a complex system the outlier is often more significant than the average. Making matters worse, our analytic tools haven’t kept up.

Collectively we know a good deal about how to navigate complexity—but that knowledge hasn’t permeated the thinking of most of today’s executives or the business schools that teach tomorrow’s managers. How can we bring that knowledge to the fore? Let’s take a close look at what complexity is, the problems it raises, and how those problems can be addressed.

Complicated Versus Complex

It’s easy to confuse the merely complicated with the genuinely complex. Managers need to know the difference: If you manage a complex organization as if it were just a complicated one, you’ll make serious, expensive mistakes.

Let’s back up and start with simple systems. These contain few interactions and are extremely predictable. Think of switching a light on and off: The same action produces the same result every time. Complicated systems have many moving parts, but they operate in patterned ways. The electrical grid that powers the light is complicated: There are many possible interactions within it, but they usually follow a pattern. It’s possible to make accurate predictions about how a complicated system will behave. For instance, flying a commercial airplane involves complicated but predictable steps, and as a result it’s astonishingly safe.

Implementing a Six Sigma process can be complicated, but the inputs, practices, and outputs are relatively easy to predict. Complex systems, by contrast, are imbued with features that may operate in patterned ways but whose interactions are continually changing. Three properties determine the complexity of an environment. The first, multiplicity, refers to the number of potentially interacting elements. The second, interdependence, relates to how connected those elements are. The third, diversity, has to do with the degree of their heterogeneity. The greater the multiplicity, interdependence, and diversity, the greater the complexity.
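To make the distinction concrete, here is a minimal toy simulation (ours, not the authors’; every name and parameter is illustrative). Each element follows a simple, predictable rule of its own, which is enough to forecast a merely complicated system; once the elements also interact, the same starting conditions can end up in different places depending on the interaction history.

```python
# Toy sketch, not from the article: six elements, identical starting conditions,
# run twice. The deterministic rule alone (interdependence = 0.0) gives identical
# outcomes; add uncontrolled interactions (interdependence = 0.9) and the runs
# diverge even though nothing about the starting point changed.
import random

TARGETS = [10.0 * i for i in range(6)]          # each element's own "patterned" pull

def run(start, interdependence, steps=200, seed=0):
    rng = random.Random(seed)
    state = list(start)
    n = len(state)
    for _ in range(steps):
        # the "complicated" part: every element follows its own predictable rule
        state = [0.9 * s + 0.1 * t for s, t in zip(state, TARGETS)]
        # the "complex" part: elements also interact, in an order no one controls
        pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
        rng.shuffle(pairs)
        for i, j in pairs:
            if rng.random() < interdependence:
                state[i] = state[j] = (state[i] + state[j]) / 2 + rng.gauss(0, 0.01)
    return state

start = [1.0] * 6                                # the same starting conditions...
for dep in (0.0, 0.9):
    a = run(start, dep, seed=1)
    b = run(start, dep, seed=2)                  # ...but a different interaction history
    gap = max(abs(x - y) for x, y in zip(a, b))
    print(f"interdependence={dep}: max divergence between two runs = {gap:.4f}")
```

The exact numbers are meaningless; what matters is that the second run differs from the first only in which interactions happened when, precisely the kind of detail a manager cannot control or observe.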

An organic growth program, for example, is highly complex—it contains a large number of interactive, interdependent, diverse elements. Practically speaking, the main difference between complicated and complex systems is that with the former, one can usually predict outcomes by knowing the starting conditions. In a complex system, the same starting conditions can produce different outcomes, depending on the interactions of the elements in the system. Air traffic control, a complex system, constantly changes in reaction to weather, aircraft downtime, and so on.

The system is predictable not because it produces the same results from the same starting conditions but because it has been designed to continuously adjust as its components change in relation to one another. It’s possible to understand both simple and complicated systems by identifying and modeling the relationships between the parts; the relationships can be reduced to clear, predictable interactions. It’s not possible to understand complex systems in this way, because all the elements are interacting continuously and unpredictably.

Idea in Brief

In just a short time, most businesses have gone from complicated to complex: They contain numerous diverse, interdependent parts. This makes managers’ jobs much more difficult. They can’t predict what will happen when various parts of the business interact; the same starting conditions may yield different results.
• Seemingly simple actions produce unintended consequences.
• Human beings’ cognitive limits mean that no manager can understand all aspects of the business—but many refuse to acknowledge those limits.
• Rare events can be more significant than average ones—and may occur more often than we think.
Managers can navigate these difficulties by making fundamental changes to how they approach key tasks:
• Forecasting
• Mitigating risks
• Making tradeoffs
• Ensuring diversity of thought

The Problems of Complexity

We’ve observed two problems commonly faced by managers of complex systems: unintended consequences and difficulty making sense of a situation.

Unintended consequences. In a complex environment, even small decisions can have surprising effects.

Researchers have identified three situations in which this is likely to happen. The first is when events interact without anyone meaning them to. Nintendo’s Wii provides a recent example. Its innovative motion-sensing feature was designed to significantly expand the gaming market. To appeal to novice gamers and keep the price down, the company made the rest of the console relatively simple. It believed that its core audience would appreciate the new technology and forgive the less-sophisticated console. Nintendo succeeded in its immediate goal of pulling in new customers.

But traditional, hard-core gamers saw the motion-sensing technology as a gimmick and perceived the system as unserious. Over time, third-party developers increasingly released titles for Xbox 360 and PlayStation 3 but not for the Wii—partly because of the console’s limitations but also because they, too, had come to view the Wii as a “casual” gaming machine. This long-term consequence of the company’s decisions would have been hard to foresee. A positive unintended consequence occurred when Ford CEO Alan Mulally agreed to join his fellow U.S. automotive CEOs in testifying to Congress in support of an industry bailout—even though Ford was the only carmaker not requesting TARP money. (He did this in part because the industry’s supply chains were so intertwined that a GM or Chrysler shutdown would have hurt Ford, too.) Press reports of his action were quite favorable, and public perception of Ford’s quality and desirability rose dramatically.

How Complexity Disrupts Business Ecosystems

Many companies that once functioned within simple, self-contained markets now face competition from unexpected players. Consider, for example, the payments business.

Card issuers such as Visa, MasterCard, and American Express make money from two sources: annual cardholder fees and payments from vendors who accept the card. New players, including mobile-telephone operators and technology giants such as Google, are now racing into the payments market. Because these companies don’t need to make money from payments—their business models are supported by advertising—the collateral damage could be considerable. As business ecosystems become more interconnected and thus more complex, this kind of disruption becomes more common and causes more harm.

About the Spotlight Artist

Each month we illustrate our Spotlight package with a series of works from an accomplished artist. We hope that the lively and cerebral creations of these photographers, painters, and installation artists will infuse our pages with additional energy and intelligence and amplify what are often complex and abstract concepts. This month we showcase Jen Stark, an American artist who creates intriguingly elaborate designs using simple paper-cutting techniques. Her works focus on replication and infinity, drawing on patterns found in nature. They track the question our authors explore in this package: How can we identify basic values in hugely complex entities? View more of the artist’s work at jenstark.com.

The second situation concerns unintended consequences that are based on an aggregate of individual elements, not a single occurrence.

The 2008 financial meltdown, for example, can be traced to numerous distinct but interconnected events: the relaxation of banking regulations, the invention of instruments that allowed lenders to shift risk off their balance sheets, monetary policies that kept interest rates low, the evaporation of reasonable credit standards and conventional down-payment requirements, ignorance on the part of borrowers, and so on. As we have now painfully learned, many observers could see some of these elements, but almost no one saw them all or anticipated the consequences of a drop in housing prices on the entire economic system.

A third situation is when policies and procedures remain in place long after the reason for their creation becomes obsolete. By then the logic underlying the procedures has often been forgotten. Employees at a major New York financial institution, for example, had to key in a code to enter the restrooms because of concerns about uninvited people gaining access. After 9/11 the firm instituted security screening at the building’s entrance, making the restroom key codes unnecessary—but it took years to get rid of them! In the meantime, life was more difficult for employees, clients, suppliers, and other visitors, for no reason at all.

Making sense of a situation. It is very difficult, if not impossible, for an individual decision maker to see an entire complex system. This is essentially a vantage point problem: It’s hard to observe and comprehend a highly diverse array of relationships from any one location. Many have argued that Citigroup’s near collapse, in 2008, stemmed from an organizational design that locked people into silos; employees with information about the consequences of the bank’s involvement in subprime lending were not connected to those making strategic decisions.

It didn’t help, of course, that the CEO at the time, Chuck Prince, conspicuously chose to ignore any warning signs of excessive leverage, as a now-famous remark to the Financial Times in 2007 demonstrates. “As long as the music is playing, you’ve got to get up and dance,” Prince said, adding, “We’re still dancing.”

We are further hampered by cognitive limits to our understanding of the effects of other people’s actions and our own.

Most executives believe they can take in and make sense of more information than research suggests they actually can. As a result, they often act prematurely, making major decisions without fully comprehending the likely consequences for the system. Durk Jager, the former head of Procter & Gamble, was pilloried for implementing sweeping organizational changes that mangled essential informal ties; in effect, he failed to grasp critical interdependencies in the firm. He lasted just 17 months in the top job.

His successor, A.G. Lafley, did very little to change formal structures, focusing instead on realigning incentives and rebuilding informal connections. In June 2000, when Lafley took over, P&G’s market capitalization was $69.8 billion. By 2007 it was $231.9 billion. In addition, we now know that focusing on one thing can prevent us from seeing others. A recent study documented substantial “inattentional blindness”: Subjects who had been instructed to concentrate on a task failed even to notice dramatic events going on around them.

Rare events pose particular problems for those trying to make sense of complex systems, because they don’t repeat themselves often enough for us to learn how they will affect the system. Recall that air traffic control is generally a manageable system because it continuously adapts to changes. That adaptability is possible only because the system’s designers (sense makers) observed patterns that emerged over time and found the root causes of failures by conducting excruciatingly thorough postmortem reviews.

When the system was confronted with a rare event—the 2010 eruptions of Iceland’s Eyjafjallajökull volcano, which created a dust cloud whose size and properties had never been encountered in aviation history—it could not cope and had to be shut down, at enormous expense. Similar systemic shutdowns followed Hurricane Katrina, in New Orleans, and the earthquake and tsunami in Japan. Collectively, these problems mean that complex systems pose challenges in at least three areas of managerial activity: forecasting the future, mitigating risks, and making tradeoffs. Let’s explore some remedies for each.

A Counterintuitive Approach to Hiring

In The Difference (Princeton University Press, 2007), Scott E. Page, a social scientist and complex systems expert at the University of Michigan, examines a number of topics relating to diversity. One concerns strategies for hiring people who will maximize the cognitive variety within a company. Consider the test results below, which represent the responses of three people being considered for two open positions; each X indicates a correct answer. The candidates chosen will join a research team in which diverse thinking is of the utmost importance. Which two would you hire?

[Sidebar table: application test results for Jeff, Rose, and Spencer on questions Q1–Q10; the exact pattern of correct answers is not recoverable from this copy.]

Jeff has the highest number of correct answers (7); Rose and Spencer have 6 and 5, respectively. Assuming that everything else is constant, most of us would conclude that Jeff should definitely be hired. You’d probably also hire Rose. Page argues that this might not be the best decision, though. Notice that every question Rose answered correctly was also answered correctly by Jeff; her knowledge is likely to duplicate his. Moreover, although Spencer got the fewest correct answers, he gave the correct answer to every question that Jeff got wrong; he’s apt to bring something different to the table. The lesson: If your organization needs people with diverse points of view, your HR strategy should try to complement the Jeffs with the Spencers. (Adapted from The Difference.)

Improved Forecasting Methods

Managers faced with complex systems can take several steps to increase their predictive abilities. They should:

Drop certain forecasting tools. Embedded in many analytic tools are two assumptions that don’t hold for complex systems.

The first is that observations of phenomena are truly independent; this is often not the case in complex systems, with their highly interconnected parts. (Think of the well-known “butterfly effect,” when something small that happens early in a chain of events causes disproportionate consequences by the end.) The second is that it’s possible to extrapolate averages or medians to entire populations. Take a controversial case in medicine—the U.S. Food and Drug Administration’s deliberations (ongoing as of this writing) over whether to withdraw approval for the use of the drug Avastin in treating breast cancer.

The issue has caused an uproar among the estimated 17,000 U.S. women who take the medication. Follow-up clinical trials revealed some potentially serious side effects and failed to show that the drug helps the statistically average patient. However, many doctors and patients have suggested that it prolongs life and improves quality of life in certain patients and completely cures a few. Cancer treatment is a complex system, but the agency is applying the logic of a complicated one. In business, the problem shows up when companies try to predict customer behavior on the basis of average responses.

On average, people loved New Coke, but the product ultimately flopped. It shows up when they fail to consider that outliers are often more interesting than the average case. And it shows up when they fail to account for the future importance of early events. Boston Scientific paid a huge amount for the cardiovascular device manufacturer Guidant, despite revelations during the bidding process of quality problems and cover-ups. Had it understood that those revelations signaled deeper problems going back many years, it could have avoided overpaying for a company it then had to pour vast resources into fixing.

Boston Scientific’s stock has yet to recover. And in complex systems, events far from the median may be more common than we think. Tools that assume outliers to be rare can obscure the wide variations contained in complex systems. In the U.S. stock market, the 10 biggest one-day moves accounted for half the market returns over the past 50 years.

Only a handful of analysts entertained the possibility of so many significant spikes when they constructed their predictive models.

Simulate the behavior of a system. Instead of extrapolating from irrelevant medians, look for modeling that will give you insight into the system and the ways in which its various elements interact. Examples include the customer-relationship-management models used by telecommunications companies to anticipate a person’s vulnerability to defection, and the data-mining tools used to predict consumer responses to various types of advertising.
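As a sketch of what such modeling can reveal (invented numbers, not market data), compare a thin-tailed model of daily moves with a heavy-tailed one. Under the normal model the biggest days barely matter; under the heavy-tailed model a handful of days carries a disproportionate share of all movement.

```python
# Illustrative only -- synthetic numbers, not market data. The point: a model with
# thin (normal) tails makes the biggest days look negligible, while a heavy-tailed
# model lets a handful of days account for a much larger share of all movement,
# echoing the observation about the 10 biggest one-day moves in the U.S. market.
import random

def top_share(moves, top=10):
    """Fraction of total absolute movement contributed by the `top` largest days."""
    sizes = sorted((abs(m) for m in moves), reverse=True)
    return sum(sizes[:top]) / sum(sizes)

rng = random.Random(7)
days = 12_500                                   # roughly 50 years of trading days

thin_tailed  = [rng.gauss(0, 1) for _ in range(days)]
heavy_tailed = [rng.choice((-1, 1)) * rng.paretovariate(1.2) for _ in range(days)]

print(f"thin-tailed model:  top 10 days carry {top_share(thin_tailed):.1%} of all movement")
print(f"heavy-tailed model: top 10 days carry {top_share(heavy_tailed):.1%} of all movement")
```

The share attributed to the top days varies from run to run; the point is the order-of-magnitude gap between the two models.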

Further, make sure that your forecasting models incorporate low-probability but high-impact extremes. The complexity researchers Pierpaolo Andriani and Bill McKelvey observed that 16,000 minor earthquakes occur in California every year, but a really big one happens only once every 150 or 200 years. The average earthquake, then, is not very dangerous. It would be foolhardy, though, to base building codes on the average quake when what matters most is the big one. So, too, in business: What matters most may be the extreme but rare possibility, not the most likely one.

Use three types of predictive information. If it’s impossible to predict the future in a complex system with a high degree of accuracy, and if organizations must nonetheless place bets with the future in mind, what’s the wisest course for leaders who need to put some stakes in the ground? How can they find a happy medium between excessive and convoluted scenarios about what might happen and linear predictions that are over-reliant on past knowledge?

We advise managers to be explicit about what they think will be applicable from past experience and what might be different this time around. One way to do this is to divide your data among three buckets:

• Lagging: data about what has already happened. Most financial metrics and key performance indicators fall into this bucket.
• Current: data about where you stand right now. Your pipeline of opportunities might be in this bucket.
• Leading: data about where things could go and how the system might respond to a range of possibilities.

If the bulk of your information is in the lagging bucket, that’s a warning sign.
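A minimal sketch of a bucket audit along these lines; the metric names and the threshold are our own illustrative choices, not the authors’.

```python
# Illustrative bucket audit. Metric names and the 50% threshold are assumptions
# made for this sketch; the three buckets themselves come from the article.
from collections import Counter

portfolio = {
    "last quarter's revenue":          "lagging",
    "trailing 12-month churn":         "lagging",
    "cost per unit, last fiscal year": "lagging",
    "year-over-year headcount":        "lagging",
    "current sales pipeline":          "current",
    "scenario: key supplier fails":    "leading",
}

counts = Counter(portfolio.values())
total = len(portfolio)

for bucket in ("lagging", "current", "leading"):
    print(f"{bucket:>8}: {counts.get(bucket, 0)}/{total}")

if counts.get("lagging", 0) / total > 0.5:
    print("Warning: most of this information is lagging -- "
          "decisions based on it assume the future will look like the past.")
if counts.get("leading", 0) == 0:
    print("Warning: nothing here anticipates how the system might respond to change.")
```

Run against a real metrics inventory, a check like this only makes the imbalance visible; deciding what belongs in the leading bucket remains a judgment call.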

Basing decisions mainly on lagging indicators is essentially betting that the future will be like the past. At least some of your information should be in the leading bucket. This information will be fuzzy and subjective by definition: The future hasn’t happened yet. But without it, you’re apt to be blindsided by change. For an example of how the leading bucket prompted action to avert a possible system failure, recall the Y2K dilemma—the concern that computers would go haywire at the turn of the century because many used a two-digit year format.

Early programmers expected that the software they created would be completely overhauled long before the millennium rolled over, but many critical legacy systems using the two-digit format remained (a fact we would place in the lagging bucket). The catastrophic scenarios in the leading bucket were so vivid and plausible that enormous efforts were made to bring complex computer systems into compliance before the year 2000 arrived (the plans to this end would be placed in the current bucket).

When the time came, only a handful of problems surfaced, most of them minor. Note that while the bucket tool simplifies reality, it doesn’t assume away complexity, unlike traditional forecasting tools.

Better Risk Mitigation

Minimizing risk is crucial for anyone in charge of a complex system, and traditional approaches aren’t good enough. Managers must learn to:

Limit or even eliminate the need for accurate predictions. In an unpredictable world, sometimes the best investments are those that minimize the importance of predictions. Take product design.

In a conventional system, manufacturers must guess which configuration of features customers will purchase, and at what price. They run a high risk of being wrong, especially when the product is complex. It’s possible to eliminate this guesswork by designing a system that puts users in charge of the decisions, allowing them to create the outputs they want. Lulu, for example, has upended the traditional publishing model by giving writers control over key elements of the process. In the conventional model, publishers pay authors an advance and print books without knowing how many copies will sell.

In the Lulu model, authors upload content to the company’s website and name their price. The books (or other outputs) are printed only after customers visit the site and decide to buy them. The authors receive 80% of the revenue—more per copy than is typical—and Lulu avoids the risk of printing books that end up on the remainder table, sit in warehouses, or are destroyed. By structuring the decision process so that books are produced and funds change hands only when a buyer is ready to pay, Lulu has more or less eliminated the danger of getting it wrong.

Boeing’s wildly successful 777 aircraft series exemplifies this principle at a much higher level of product complexity. The company engaged eight major airlines to help with the development process, producing iterative models whose design evolved according to these customers’ input. It used advanced visualization techniques such as 3-D modeling to reduce unexpected interactions between airplane systems and capture feedback as early as possible.

Use decoupling and redundancy. Sometimes elements of a complex system can be separated from one another to decrease the systemic consequences if something goes wrong. Decoupling yields two benefits: It shields parts of the organization from the risks of an unexpected event, and it preserves parts that may be needed to mount a response. Contrast the Windows operating system with Software as a Service (SaaS) applications.

With Windows, the operating system and your data are tightly entwined; when you upgrade to a new version of the system, all your information is erased, meaning that you need to back it up and reload it to your computer. With SaaS, uniform interfaces tell the computer where your data are. You can upgrade away, and the data won’t be touched. And because the software and the data are uncoupled, the risk that both will be harmed simultaneously is significantly reduced. Elements can also be designed to substitute for one another in case part of the system goes down.
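A small sketch, not from the article, of substitution in code: the caller never depends on a single provider, so the failure of one part degrades the system instead of stopping it. The provider names are hypothetical.

```python
# Toy redundancy-by-substitution. The provider names are hypothetical; the pattern
# is simply "try the primary, fall back to a substitute, keep a record of what failed."
from typing import Callable, Sequence

def with_fallbacks(providers: Sequence[Callable[[str], str]]) -> Callable[[str], str]:
    """Return a handler that tries each provider in order until one succeeds."""
    def handle(request: str) -> str:
        errors = []
        for provider in providers:
            try:
                return provider(request)
            except Exception as exc:             # broad on purpose in this toy example
                errors.append(f"{provider.__name__}: {exc}")
        raise RuntimeError("all providers failed: " + "; ".join(errors))
    return handle

def primary_site(request: str) -> str:
    raise ConnectionError("primary data center unreachable")

def secondary_site(request: str) -> str:
    return f"handled {request!r} at the secondary site (reduced capacity)"

handle = with_fallbacks([primary_site, secondary_site])
print(handle("order #123"))   # the primary's failure is absorbed, not propagated
```

The same pattern is what intentional redundancy buys at larger scale, as the next paragraph notes.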

Intentional redundancy makes it more likely that the system can continue to operate to at least some degree even when portions of it are challenged. Decoupling and redundancy involve added expense, but the investment can be worthwhile. Of course, there are limits to the decoupling and redundancy you can contain (and afford) within a single organization. You may need to call on external resources to expand the adaptive responses your organization can muster. The consultancy Accenture, for example, has an extensive network of partners to whom it can quickly turn if a client has an unanticipated need that Accenture cannot address.

It also uses partnerships (including an arrangement with one of us, Rita) to conduct research that might not be part of its mainstream business but could yield early warnings of interest to its clients.

Draw on storytelling and counterfactuals. Another aspect of mitigating risk is making sure that people view unlikely but potentially catastrophic future events as real. Sharing anecdotes about near misses and rehearsing responses to a hypothesized negative event can help focus attention on a possibly significant future occurrence. Posing counterfactuals—asking “What if?”—is a terrific but surprisingly underutilized way of coming up with scenarios that are unlikely to be surfaced by traditional techniques. In business, “soft” approaches like these are valued less than the supposedly more rigorous activity of number crunching. We instinctively associate stories and counterfactuals with literature and fantasy and look to data for science, reason, and truth. But when traditional methods repeatedly fail to make sense of the rare and unexpected (precisely the things that most interest us), it’s time to reconsider. Stories can give us great insights into complex systems, partly because the storyteller’s reflections are not restricted by the available data.

Triangulate. As powerful as storytelling is, it comes with a disadvantage. The sky’s the limit as far as our imagination goes—and therein lies the problem. There are no boundaries around where we should look or when we should stop looking. That’s where triangulation comes into play. Triangulation means attacking a problem from various angles—using different methodologies, making different assumptions, collecting different data, or looking at the same data in different ways.

One of the best ways to understand a complex system is to do precisely that. For example, comparing snapshots of various elements taken at a given point in time (an activity social scientists call cross-sectional analysis) yields a different understanding than looking at how a single element evolves over time. Or you can do both, studying how numerous elements evolve over time; in fact, this is the bread and butter of much sophisticated econometric and financial analysis.
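A small sketch of those two cuts applied to the same data (the figures are invented): a cross-sectional view ranks units at a single point in time, while a longitudinal view follows each unit across years, and each view can contradict the impression left by the other.

```python
# Invented figures, purely illustrative: the same revenue data viewed two ways.
revenue = {                      # unit -> {year: revenue, ...}
    "North America": {2009: 120, 2010: 118, 2011: 115},
    "Europe":        {2009:  80, 2010:  88, 2011:  97},
    "Asia":          {2009:  30, 2010:  42, 2011:  60},
}

# Cross-sectional view: who is biggest right now?
snapshot_year = 2011
by_size = sorted(revenue, key=lambda u: revenue[u][snapshot_year], reverse=True)
print(f"{snapshot_year} snapshot (largest first): {by_size}")

# Longitudinal view: who is growing, and who is shrinking?
for unit, series in revenue.items():
    years = sorted(series)
    growth = series[years[-1]] / series[years[0]] - 1
    print(f"{unit}: {growth:+.0%} over {years[0]}-{years[-1]}")
```

Here the largest unit in the snapshot is also the only one that is shrinking, exactly the kind of tension triangulation is meant to surface.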

Despite its obvious advantages, triangulation had limited application until very recently, but the tools it requires have gotten better and easier to use. Combining “ soft” but flexible storytelling techniques with “ hard” but rigid quantitative analyses can be an extremely powerful way to make sense of complex systems. The former help us explore unlikely but important possibilities and unintended consequences, while the latter give us concrete insights into the relationships of the system’s visible components.

Managers confronted with complexity should avail themselves of both.

Smart Tradeoff Decisions

In a complicated environment, it’s relatively easy to make good tradeoffs: Simply figure out the optimal combination of elements and invest in those. It’s similar to an engineering problem. In complex environments, however, making good tradeoffs is more difficult. Two strategies can help.

Take a real-options approach. This means making relatively small investments that give you the right, but not the obligation, to make further investments later on. The goal is to limit your downside while maximizing the value you can capture on the upside. Gradually building a portfolio of small investments keeps the stakes low until you’re able to reduce the most significant uncertainties you face. A real-options strategy helps you manage failure by containing costs, not by eliminating risks (an approach Duke University’s Sim Sitkin and others have called “intelligent failure”).

The idea isn’t to avoid making mistakes but to make them cheaply and early, learning from them and increasing your resilience as you go.

Ensure diversity of thought. What kinds of HR tradeoffs might you make if you realized you were dealing with a complex system rather than a merely complicated one? Complicated systems are like machines; above all, you need to minimize friction. Complex systems are organic; you need to make sure your organization contains enough diverse thinkers to deal with the changes and variations that will inevitably occur.

Who in your company regularly talks to people you might not interact with yourself, comes up with things that are a little off the beaten track, and is attuned to underlying risks and trends that your other managers might overlook? In a complex system, finding the right people for the job means seeking out these sorts of thinkers (see the sidebar “A Counterintuitive Approach to Hiring” for an unusual but effective strategy).

WE HAVE made tremendous progress in our ability to operate complicated systems, even large ones; we’ve done this by studying breakdowns and adjusting accordingly.

We have made less progress in our ability to operate complex systems, which defy conventional modeling and challenge traditional management practices. Leaders need to use better tools for anticipating how these systems will behave—tools that can help us understand the constant interactions of numerous elements and the impact of rare but extreme events. By taking steps to mitigate risks, making measured tradeoffs that keep early failures small, and gathering diverse thinkers who can deal creatively with variation, we can approach decision making in our complex organizations with more confidence and increase our chances of success.

HBR Reprint R1109C

