Representatives of the U.S. financial establishment have increasingly warned that replacing workers with software and robots will eventually lead to higher unemployment.
Optimizing the use of energy sources could save between $900 billion and $1.6 trillion worldwide by 2035, according to a new McKinsey Global Institute report.
For the first time in the ten-year history of the Cartier Women's Initiative Awards, a competition for women entrepreneurs, a contestant from Russia has reached the finals. She is a winner of last year's Forbes Young Billionaire School. In all, 18 finalists were chosen from 1,900 applications.
McKinsey has a new study out on an important topic — the question of whether corporations systematically take too short a view and do not invest enough for the long term. If they do, as many CEOs believe, this is a serious indictment of current corporate governance arrangements and has important policy implications. To take one close to my heart, if short-termism causes underinvestment, it will be a cause of secular stagnation. I am not sure what to believe in this area. On the one hand, there are many anecdotes suggesting that pressures to manage earnings hold back investment. And the short-termism view is very widely believed. On the other hand, some of what is done in the name of managing for the long term may be unmonitored waste. The observation that many “unicorn” companies with no profits — and sometimes no revenues or even fully developed products — get valued so highly makes me skeptical of the idea that the capital market is systematically myopic. It is also the case that the companies generating the highest immediate cash flows, which should be overvalued on the myopia theory, historically have had the highest stock market returns, implying undervaluation rather than overvaluation. I was therefore excited to see that McKinsey had a new empirical study that provides evidence in favor of the view that corporations should take longer views. It has a reasonable methodology. It divides the sample between companies that take a long-term view and those that do not, and then compares their performance. It finds that companies that take a long-term view perform better on many metrics, such as employment growth and shareholder return. Its findings deserve much discussion, debate, and attempts at replication. At this point, though, I would give a Scottish verdict of “not proven” to their case. They may be right, but I do not think they have provided evidence that would convince anyone other than a prior believer. Consider an analogy. 
It is doubtless the case that golfers with long swings, like Phil Mickelson, hit the ball farther and more accurately than golfers with short swings, like myself. An index of swing length would be highly correlated with almost any measure of golf performance. Does this mean I should lengthen my swing? I doubt it. Those with more flexibility and coordination are able to take and control longer swings and to play golf better. If I were to try to swing like Phil, I would mishit the ball and maybe break my back. Some companies have great ideas, great management teams, and compelling strategies. They invest heavily, seek to grow revenue, ignore the management of earnings, and do limited stock buybacks. These are the criteria McKinsey uses to measure long-termism. Other companies lack vision and have mediocre management. They invest less, cut costs more, manage earnings, and buy back their stock. McKinsey deems them short term–focused. No surprise, the long-term companies outperform the short-term companies. But this may be due to their vision and execution capacity, not their long-term focus. Mediocre companies seeking to imitate them will be like me trying to imitate Phil — painful failures. I do not see any basis in the McKinsey results for saying that companies should extend their horizons. McKinsey tries to address this issue by doing comparisons within industries. But everything we know suggests that there are substantial differences in company quality within industries, as well as across industries. Again, it may be that the long-termism hypothesis is right, and there may be ways of teasing causality out of the very interesting data set that McKinsey has created. But at this point, I think the issue is still unresolved.
The United States has thousands of workforce development and training programs, run by the public, social, and private sectors. Some are excellent; others, not so much. The problem is that we don’t know which are which. That lack of knowledge is costly. According to the Georgetown University Center on Education and the Workforce, spending on programs in the U.S. for those not going to four-year colleges — everything from federal and state jobs initiatives to on-the-job training, certifications, community college, and employer training — is at least $300 billion a year. But according to the World Bank, only 30% of youth employment programs are successful, with many of those offering only marginal benefit. And most programs have no positive effect at all. Yet workplace training is more necessary than ever, as technology and globalization continue to change the types of jobs that are available. In a dynamic economy workers are expected to adapt, to change not just jobs but sometimes careers, to pick up new skills when necessary. That requires successful training programs, which means we need to know which ones work. Most existing training programs do try to assess their effectiveness. Many measure cost per student. Some measure job placement rates. A minority track on-the-job retention. These metrics are useful but miss the big picture, in part because they mistake a program’s cost for its value. Think about it. If a program has a low cost per student but fails to actually help people forge a solid career, then the fact that the failure is cheap does not make it any less of a failure. Conversely, some programs may promise high rates of job retention, but at such a high cost per student that the program proves impractical or impossible to scale. Or, if the jobs themselves are low paying and don’t offer students a viable career path, they may not be worth it regardless of the high retention rates. 
Cost per student is good to know, but it doesn’t mean much if students don’t succeed in the workplace. Job placement matters, but a high placement rate is meaningless if the participant leaves after a week or if the job itself is temporary or doesn’t pay well. Conducting an accurate cost-benefit analysis requires a holistic approach, one that incorporates costs and job placement and also accounts for how participants are doing after they leave the program. We need to adopt something similar to a “total cost of ownership” (TCO) analysis. Now common in industry, TCO considers both direct and indirect costs over time. Applying a form of TCO to workforce programs makes sense because, instead of concentrating on inputs (in the form of spending), this approach emphasizes outcomes (in the form of long-term results). We have come to this realization the hard way — through experience. For the last two years we have been implementing Generation, a youth employment program that is part of the McKinsey Social Initiative. So far, Generation has served nearly 10,000 young people in five countries: India, Kenya, Mexico, Spain, and the United States. As we sought to measure Generation’s results, we began to understand the limitations of current practice. We developed a new metric — cost per employed day (CPED) over the first six months — that we believe better defines how well employment programs work. CPED combines elements of existing measures into a powerful, readily understandable one. It measures the social and economic benefits of employment programs with much greater precision. Here’s an example. Program X serves 1,000 students at a cost of $1,000 each, or $1 million total. Five hundred individuals are placed into work (a 50% “job placement” rate), and they stay employed for an average of 60 days in the first six months. That adds up to 30,000 days on the job, at a cost of $33 per employed day. 
Program Y, on the other hand, has an up-front cost of $2,000 per student, but a placement rate of 80%, and graduates stay on the job for an average of 120 days. That comes out to 96,000 working days, or $21 per employed day. Clearly, Program Y, which at first blush looks twice as expensive as Program X, provides far more value in terms of helping participants find and keep gainful employment. At Generation, the CPED figure varies depending on the market, ranging from about $5 in India to $26 in the United States. Debating the utility of specific metrics might seem like a minor thing. But adopting more-accurate measures of success increases accountability. And accountability drives results. For example, once Generation managers realized the power of CPED, they used it to make operational improvements. On the basis of what we learned from CPED, we began to work more closely with employers to track retention rates and we increased our emphasis on mentoring in the first days on the job. Generation is also developing tools to improve data collection and management. While the data needed to make comparisons with other job training programs does not yet exist, our sense is that using CPED would reveal tens of billions of dollars in inefficient spending, in the form of programs with subpar CPED performance. Perhaps the biggest challenge to widespread use of CPED is that workforce development programs are fragmented, with thousands of providers and almost as many ways of doing things. That makes getting basic information next to impossible. And because reporting requirements vary from place to place, practitioners spend an inordinate amount of time fulfilling compliance obligations that may be pointless. CPED, by contrast, provides a simple and effective way to measure performance. For it to be adopted more widely, or even to become standard, all programs would need to collect data on cost per student, job placement, and retention. 
In addition, to enable everyone to learn what works, there should be a centralized database in which this information can be gathered and then easily accessed. Funders could help by adopting CPED and mandating that programs collect the necessary data. Despite the promise shown by CPED, we have significant work ahead to improve this new metric and make it the standard across training programs. Today, for instance, many programs would struggle to measure CPED at the three-month mark, let alone at the six-month mark. Our hope is that once we at Generation and other programs take this next step, we can extend the timeline for CPED, and perhaps even incorporate wages — both of which would make CPED a richer, even more accurate metric. While CPED can continue to be improved, it's a big step in the right direction and can help us better measure the effectiveness of worker training programs. "What gets measured gets managed" has become a cliché. Like many clichés, this one earned its status because there is a large element of truth to it. In a world in which 73 million young people are unemployed and over 200 million more struggle in unstable or dead-end jobs, it is surely possible to do much better. Data and metrics are part of the solution.
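The Program X / Program Y comparison earlier in the piece reduces to a few lines of arithmetic. Here is a minimal sketch of the calculation; the function name and structure are illustrative, not taken from Generation's actual tooling:

```python
def cost_per_employed_day(total_cost, students, placement_rate, avg_days_employed):
    """CPED over the first six months: total program cost divided by
    the total days graduates spent employed in that window."""
    employed_days = students * placement_rate * avg_days_employed
    return total_cost / employed_days

# Program X: 1,000 students at $1,000 each, 50% placement, 60 days employed on average
cped_x = cost_per_employed_day(1_000_000, 1000, 0.50, 60)   # ~ $33 per employed day
# Program Y: 1,000 students at $2,000 each, 80% placement, 120 days employed on average
cped_y = cost_per_employed_day(2_000_000, 1000, 0.80, 120)  # ~ $21 per employed day
```

Despite its higher up-front cost, Program Y delivers each employed day for about a third less, which is exactly the insight a cost-per-student metric alone would miss.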
SOUTH BEND, Ind. ― Earlier this month, South Bend Mayor Pete Buttigieg, a dark horse candidate to chair the Democratic National Committee, settled into the cramped studios of Radio Sabor Latino, a local Spanish-language news and music station. Armed with Google Translate and a cheerful attitude about his español defectuoso, Buttigieg set about describing South Bend’s new municipal identification program, created to provide people without official government IDs access to local facilities like schools and libraries. The anxiety permeating the interview was palpable. “La función de la policía es la seguridad de nuestros residentes,” Buttigieg said, looking over at the DJ for an assist as he explained that the role of the police is to ensure residents’ security. “Vamos a … vamos a ... We’re going to care for each other.” The Trump administration’s harsh anti-immigrant rhetoric has rattled communities with sizable undocumented populations like South Bend, eroding an already tenuous relationship with members of law enforcement, who are often viewed as a conduit to deportation. This is why the city has a local nonprofit manage the new ID system, one that is not subject to the same transparency obligations as city government. This may be somewhat surprising since, thanks to cultural touchstones like Notre Dame’s Fighting Irish and “Rudy,” the South Bend of our imagination is a hardscrabble Irish-Catholic town, disproportionately populated by the type of white, working-class voters who flocked to President Donald Trump’s candidacy last November. In reality, one-quarter of the city’s population is African-American, one-tenth is Hispanic and a sizable university presence bolsters its white-collar workforce ― so the population bears closer resemblance to the country as a whole than some kind of Caucasian working-class hamlet. 
Yesteryear does loom large in South Bend, which endured a decades-long economic decline when the Studebaker automobile company, which was headquartered in the city, shuttered in the mid-1960s. The city lost roughly one-quarter of its population between 1960 and 2010, and the signs of that economic contraction are evident everywhere, whether in the guise of shuttered storefronts, abandoned lots or dilapidated Victorian mansions that used to house the beneficiaries of a long-gone prosperity. If you’ve never visited a place like South Bend, you’ve probably read about one in the thousands of pieces demystifying the so-called “economic anxiety” of Trump voters during the 2016 presidential campaign. Buttigieg notes that South Bend’s population has begun to grow again, and his able stewardship of the city and popularity in this politically purple area ― he was re-elected in 2015 with over 80 percent of the vote ― are the cornerstones of his DNC campaign. “There are still a lot of empty teeth here,” he conceded, alluding to the abandoned and bulldozed properties that dot the city, but he nevertheless sees a success story. Indeed, Buttigieg’s career has been one well-manicured success story itself, as if conceived in a round of Democratic Mad Libs. Peter Paul Montgomery Buttigieg, a mere 35 (young-ish age), is a graduate of Harvard (prestigious university) and Oxford (prestigious university), where he was a Rhodes scholar (academic accolade). Despite hailing from Indiana (flyover red state), he came out as gay (orientation) in a 2015 op-ed (public action). Mayor Buttigieg has proudly served in the Navy Reserves (military branch), earning a Joint Service Commendation Medal (military commendation) while serving in Afghanistan (theater of war). He is a true millennial, managing his own Twitter (social media service) account; he even met his boyfriend (partner noun) on Hinge (dating app)! 
He has learned the importance of data-driven governance from his time at McKinsey & Company (tremendously boring place to work) and has dealt extensively with Silicon Valley. You can hear him wax wistful about his hometown’s shuttered Studebaker plant (local totem of past economic glory) and its burgeoning data industry (local modernization initiative) by watching his TED Talk (TED Talk). After the Radio Sabor Latino interview, Buttigieg took a turn playing enthusiastic tour guide, navigating an aide’s Hyundai through some of South Bend’s previously robust industrial areas. He’s spent much of his time as mayor engaged in a kind of NIMBY whack-a-mole, tearing down abandoned industrial facilities, repurposing others and enticing tech companies to build data centers in his city (South Bend sits near a major fiber optic cable artery). As such, Buttigieg possesses a singular ability to be excited about empty plots of land. “There were acres and acres of old Studebaker factories,” Buttigieg recalled about the city’s old skyline. Now, many of the facilities ― the ones that haven’t been bulldozed ― serve as data centers, owing in part to South Bend’s cold weather and relatively cheap energy prices. “I actually don’t remember my first ribbon-cutting, there’ve been so many,” he said, before mentioning with a note of pride the increasing number of Notre Dame students who are staying in the area after graduation. In many ways, Buttigieg’s ascent mirrors that of another rising star in the party, New Jersey’s Democratic Sen. Cory Booker, who came to national prominence as the charismatic mayor of another down-on-its-luck city, Newark. Like Booker, Buttigieg has forged close ties with Silicon Valley and other nodes of coastal power and has received praise for his job performance. It’s not uncommon for Buttigieg to reference former Harvard classmates or interactions with tech moguls in conversation. 
In June 2016, The New York Times’ Frank Bruni asked whether Buttigieg would be America’s “First Gay President.” In 2014, The Washington Post labeled Buttigieg “The most interesting mayor you’ve never heard of.” If Buttigieg doesn’t already know a spot in Davos that has great kalberwurst, he probably will soon. To his credit, Buttigieg indulges in neither Booker’s inspirational poster rhetoric nor his unbridled adulation for the Charlie Rose set. South Bend’s mayor possesses a far more laid-back personality than his methodical rise might indicate, maintaining an easy rapport with his staff and an ability to speak policy without devolving into talking points. And like Booker, Buttigieg has an almost cartoonishly friendly appearance: Teeth fixed in a slight grin and framed by a boyish face, he could easily pass as a children’s daytime television host. As a literal public face of the party, Democrats could do worse. Buttigieg admits that becoming the DNC chair is an uphill climb. Conventional wisdom dictates that when party officials vote at their winter meeting later this month, they are most likely going to choose Minnesota Rep. Keith Ellison or President Barack Obama’s labor secretary, Thomas Perez. But Buttigieg argues that he could play Democratic peacemaker, uniting a party split largely between the Obama-Clinton establishment wing, which has coalesced around Perez, and the insurgent wing, populated by supporters of Sen. Bernie Sanders (I-Vt.) and coalescing around Ellison. If one of those two wins, “half the party is going to feel like they lost,” said one Buttigieg aide. Buttigieg’s proposed approach to leading the party does not differ terribly from that of his opponents: He believes in cultivating a 50-state strategy at the grassroots level, focusing the party’s myriad coalitions and keeping up relentless pressure on the Trump administration. There are specific proposals, too, such as shifting the DNC’s regional staff out of D.C. 
and into the states, but his overarching agenda isn’t especially distinctive. On Trump, Buttigieg’s approach isn’t terribly unorthodox, either. Senate Democrats, he said, need to take a tough line on opposing Trump’s nominees. “We’ve never been a party to obstruct for obstruction’s sake,” Buttigieg said, “but I think we have to be fierce in how we respond to this stuff.” A loss might not necessarily be the worst thing for Buttigieg, however. Party chair jobs are inherently partisan positions that can derail a politician’s personal ambitions ― just ask former chairwoman Debbie Wasserman Schultz, who remains persona non grata in many circles for her tenure atop the party apparatus during the DNC hack. And while Virginia Gov. Terry McAuliffe and Sen. Tim Kaine have found political success despite chairing the DNC, Virginia is far less red than Indiana and is home to a Democratic base centered near that most politicized of places, Washington, D.C. If Buttigieg’s bid fails, he will still get the benefit of an increased profile without the politically damaging effects. Indeed, it’s hard not to hear Buttigieg’s rhetoric about his can-do mayoralty and think this is all practice for future campaigns. When DNC officials approached Buttigieg during the 2016 cycle to ask if he’d serve as an LGBTQ surrogate, he offered no opposition, but said he would prefer to discuss defense matters, a far safer issue politically. In the meantime, however, he is still running for DNC chair and could actually win. Why risk that? “This isn’t Virginia, obviously,” Buttigieg agreed, “but I don’t think you should be in elected office just to have it.” 
Huffington Post reporter Eliot Nelson’s book, The Beltway Bible: A Totally Serious A-Z Guide to Our No-Good, Corrupt, Incompetent, Terrible, Depressing and Sometimes Hilarious Government, is out now.
Too many companies respond to increased competition by lowering prices. The problem is that lowering the price often comes with unintended consequences. It can cheapen the image of your product and send the wrong message to customers who recently paid more. What is a company to do? The answer is to set the price at the right level by finding out what buyers are willing to pay.

What buyers are willing to pay

Pricing novices tend to be "inside-out" thinkers. They assume that buyers always want to pay less because they, themselves (inside their own heads), want to pay less. What the data shows is that this is true only if the product is validated. If you know and trust the company and the product you are buying, of course you would like to buy it for less. What if the product is not validated? You have never heard of the company, or you have never heard of the product. Time and again, a lower price for an unknown product from an unknown company means one thing to the buyer's brain - the product is probably not very good.

What gives you control over the price

Economists tell you that the market determines the price, and that an economic price is where supply equals demand. Rather than fight with economists, good marketers know that positioning (product branding) gives you control over the price. The more uniqueness and desirability you put into your positioning strategy, the more control you have over the price you can charge for your product. A few examples should help to illustrate how positioning gives you this control. Record-breaking baseball. A baseball sells for a few dollars. The baseball that Barry Bonds hit into the stands to break Hank Aaron's home run record sold for $752,467.20. It was unique, and desired by the man who bought it. Apple's smartphone profits. Even though the iPhone had only a 12.1% share of the smartphone market, Apple commanded 91% of the smartphone profits in the third quarter of 2016. If you want an iPhone, only one company makes it - Apple. 
If you want an Android phone, there are a number of choices. Entertainment venue food and beverages. When you go to see a popular professional sports team in a stadium, or a movie or concert in a theater, you might notice that the price of candy, popcorn, hot dogs and other items is considerably higher than if you buy them outside the venue. Why? Even if the items themselves are not unique, they gain uniqueness because the distribution channel you are in has no competition nearby. If you want to eat or drink something, you have to buy it from the venue at the prices it charges. Once you understand that positioning gives you control over the price, you are ready to develop pricing strategies to help your sales and profits.

Pricing is more complex than you think

Pricing is one of the fundamental building blocks of marketing. Many believe price is easy to understand when, in fact, it is one of the most difficult elements of marketing to get right. As McKinsey points out in an article entitled "Shedding the Commodity Mind-Set," too many companies leave large amounts of money on the table. Pricing experts know that setting the right price is difficult because, in addition to the physical factors of cost and profit, price is subject to psychological factors. The best that companies can do to gain control over these psychological factors is a great job of branding. And to get the branding right, companies have to know how to develop the right underlying corporate image and positioning strategies. In short, creating a brand image of the product that is impossible, or extremely difficult, to copy is the key to having control over your pricing strategies. If you are able to do that, you will be able to employ the most powerful and effective of all pricing strategies - What The Market Will Bear (WTMWB).

What the Market Will Bear

In markets where there is little or no competition, companies can employ a pricing strategy that optimizes profits. It is often called a What The Market Will Bear (WTMWB) price. 
This strategy sets the price based on the maximum price the market will pay for the product. On the one hand, the company wants to realize the highest profits possible in the shortest amount of time to help recoup high start-up costs, such as R&D (research and development), production, and marketing costs. On the other, it may not want its profits to be so attractive as to entice cutthroat competition to enter the market within the time window it needs to build market share and establish a leadership position. This strategy typically works because those most likely to buy a new product - the Innovators and Early Adopters - are not particularly price sensitive. If there is considerable uniqueness and desirability built into the product brand, your company can employ a WTMWB strategy. If not, you might consider other effective pricing strategies.

Gross Profit Margin Target

In almost all cases, pricing strategies should begin with a Gross Profit Margin Target (GPMT) strategy. Companies typically know the gross profit margin they need to pay back their expenses and generate positive net income and cash flow. Once your company knows the cost of sales (cost of goods and services sold) of a particular product and the Gross Profit Margin Target it wants, it can easily employ a GPMT strategy. Gross Profit Margin is defined by the formula (P-C)/P, where P=Price and C=Cost of Sales. Anybody can put this formula into a spreadsheet program and, as costs change, recalculate the price that will produce the targeted Gross Profit Margin. Most companies know the GPMT they want. If you don't, there are some common guidelines you can follow. 
- Manufacturers typically aim for a GPMT of 50%.
- Distributors (wholesalers) usually need a GPM of 10 to 15%.
- Dealers (retailers) require a GPM of 30 to 50%. The higher percentage is for retailers that have to train people (customers and employees) to use the product; the lower margin is for retailers selling a product that does not require after-sale support.

The price, or marked-up cost, to achieve these target GPMs is as follows (P=Price and C=Cost of Sales):

- Manufacturers: P=2C, so the formula is (2C-C)/2C = 0.5, yielding a GPM of 50%.
- Distributors: P=1.18C, so the formula is (1.18C-C)/1.18C ≈ 0.15, which gives them a 15% GPM.
- Dealers: P=1.5C, so the formula is (1.5C-C)/1.5C ≈ 0.33, for a 33% GPM.

When I develop pricing strategies for a client that is a manufacturer, I always start with a GPMT pricing strategy that is twice their cost, or 2C, since that is an easy calculation that will give them their GPMT of 50%.

Most Significant Digit Pricing

For products that will be sold to consumers, most companies employ a Most Significant Digit (MSD) pricing strategy. Why? Studies and experience show that sales will be significantly higher if a product is priced at, say, $29.95 or $29.99 instead of $30. Most humans focus on the most significant digit - the "2" in this case. To them, $29.95 or $29.99 seems a lot less than $30 even though it is only 1 to 5¢ less. Even expensive homes in Beverly Hills might sell for $7,995,000 rather than $8 million. There are exceptions. In upscale restaurants, it is usually a mistake to price an entrée at $31.95. Instead it will be priced at $32. For some reason, people do not think the food is as good if MSD pricing is used in a high-end restaurant.

Combining all three

If a product is positioned as unique, smart marketing companies will typically use all three of these strategies in combination. For example, Apple has priced its iPad Air and iPad Pro starting at $399 and $599, respectively. 
Apple is using an MSD strategy in addition to a WTMWB strategy because the iPad has uniqueness built in, since Apple controls the platform. It also aims for a GPMT, which is not officially published but which is in the 30 to 50% GPM range of well-positioned products in competitive markets. When Johnson & Johnson launched a margarine developed in Finland that lowers cholesterol, it priced a tub of this margarine at between $5.79 and $5.99. At the same time, a tub of regular margarine sold for 99¢. Based on this pricing, which used MSD and WTMWB strategies, many speculated that J&J priced the product at 8C, which gave it a GPM of roughly 87.5%.

Pricing your products

When you are pricing your products, what gives you control over the price is the uniqueness and desirability built into your positioning, or branding, strategy. If you have created a product image that is impossible, or very difficult, to copy, you can employ a WTMWB price that will give you a good GPM and enable you to achieve your desired GPMT. And if you sell your product in a consumer market, it would be a good idea to also employ an MSD pricing strategy. For example, if you are a manufacturer targeting a GPM of 50% and your cost of sales is $15, you might consider selling the product for $29.99 - a penny less than the 2C price of $30 that yields a 50% gross profit margin. Best of luck.
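The GPMT and MSD arithmetic above can be checked with a short script. This is a minimal sketch of the formulas as stated in the article; the helper names are illustrative, not from any published source:

```python
def price_for_target_gpm(cost_of_sales, target_gpm):
    """Solve (P - C) / P = target for P, i.e. P = C / (1 - target)."""
    return cost_of_sales / (1.0 - target_gpm)

def gross_profit_margin(price, cost_of_sales):
    """GPM as defined in the article: (P - C) / P."""
    return (price - cost_of_sales) / price

# Manufacturer with a $15 cost of sales targeting a 50% GPM: P = 2C = $30
p = price_for_target_gpm(15.0, 0.50)

# Apply MSD pricing: drop a penny below the round number
msd_price = p - 0.01                           # $29.99

# The resulting margin is ~49.98%, effectively the 50% target
margin = gross_profit_margin(msd_price, 15.0)
```

The same `price_for_target_gpm` helper reproduces the markup table: a 15% distributor target gives P ≈ 1.18C, and a 33% dealer target gives P = 1.5C.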
Back in 1950, close to 30% of the global population lived in cities. As Visual Capitalist's Jeff Desjardins notes, that has shifted dramatically, and by 2050 a whopping 70% of people will live in urban areas – some of which will be megacities housing tens of millions of people. This trend of urbanization has been a boon to global growth and the economy. In fact, McKinsey estimates that the top 600 urban centers contribute a whopping 60% of the world's total GDP.

SEVEN TYPES OF GLOBAL CITIES

With so many people moving to urban metropolitan areas, the complexion of cities and their economies changes each day. The Brookings Institution has a new way of classifying these megacities, using various economic indicators. According to its analysis, here's what differentiates the seven types of global cities. Important note: This isn't intended to be a "ranking" of cities. On the infographic, cities are sorted by GDP per capita within each typology and numbered based on where they stand on this metric. This is intended only to show how wealthy the average citizen is in each city; it is not a broader indicator of the success or overall ranking of a city.

1. Global Giants
These six cities are the world's leading economic and financial centers. They are hubs for financial markets and are characterized by large populations and a high concentration of wealth and talent. Examples: New York City, Tokyo, London

2. Asian Anchors
The six Asian Anchor cities are not as wealthy as the Global Giants, but they leverage attributes such as infrastructure connectivity and talented workforces to attract more Foreign Direct Investment (FDI) than any other metro grouping. Examples: Hong Kong, Seoul, Singapore

3. Emerging Gateways
These 28 cities are large business and transportation hubs for major national and regional markets in Africa, Asia, Latin America, and the Middle East. 
While they have grown to reach middle-income status, they fall behind other global cities on many key competitiveness factors such as GDP and FDI. Examples: Mumbai, Cape Town, Mexico City, Hangzhou

4. Factory China. These 22 second- and third-tier Chinese cities rely on export manufacturing to power economic growth and international engagement. Although Factory China displays a GDP growth rate well above average, it fails to reach average levels of innovation, talent, and connectivity. Examples: Shenyang, Changchun, Chengdu

5. Knowledge Capitals. These 19 mid-sized cities in the U.S. and Europe are considered centers of innovation, with elite research universities producing talented workforces. Examples: San Francisco, Boston, Zurich

6. American Middleweights. These 16 mid-sized U.S. metro areas are relatively wealthy and house strong universities, as well as other anchor institutions. Examples: Orlando, Sacramento, Phoenix

7. International Middleweights. These 26 cities span several continents and are internationally connected by flows of human and investment capital. Like their American middleweight counterparts, they have seen growth slow since the 2008 recession. Examples: Vancouver, Melbourne, Brussels, Tel Aviv
Submitted by David Galland via The Passing Parade

One of the more interesting mental exercises related to predicting the future involves trying to fathom the impact the rise of robots will have on humanity. We can be quite sure that in the proverbial blink, robots will be doing all the war fighting. After that, what’s the point? But does that then lead to the sort of robotic apocalypse so well envisioned in Terminator? I also suspect it’s only a matter of time before the idea of sex bots goes from being an “eew” sort of thing to a household appliance. Well, at least in some households. After all, we already live in a world where every possible iteration of sexual proclivity is not just accepted but celebrated. So, who’s to deny the unmated a good snogging from the Yabadabdo Sexbot 2000? In fact, in a recent survey, 1 in 4 adults aged 18 to 34 said they would “date” a robot. But what will the impact of bionic sex partners be on society—or birth rates, for that matter? It’s all but impossible to see through the fog to the answers. We already have robo news reporters (you didn’t actually think humans write the crap passed off as news these days, did you?). Of course, as the news-writing programs become more and more sophisticated, might the algorithms be tweaked to influence the masses to buy an advertiser’s product or, more onerously, to create a desired political outcome? You know, kind of how Google tried to get Hillary elected? In terms of managing money, we already have robo traders and robo advisors. But what happens when these technologies become self-learning? Will the competing programs become so adept at exploiting kinks in the armor of Mr. Market that they will effectively nullify each other? It’s also abundantly clear that self-driving cars will become the norm within the next decade. As someone who hates driving, that is a development I eagerly await.
But imagine the sweeping changes self-driving cars will have on insurance, road building, car manufacturers, trucking, energy usage, the urban landscape, the taxi industry, government and regulations (will we still need driver’s licenses?), senior mobility, etc. It’s staggering to contemplate, and it’s just over the horizon. I could continue, but as I am preparing for a trip to Tafí del Valle in the neighboring province of Tucumán here in Argentina tomorrow morning, I’ll shuffle toward the featured article of this week’s musings—a look at the impact of automation on the structure of the workforce by friend and associate Stephen McBride. This is a particularly interesting topic on many levels. What percentage of the workforce is at risk of being replaced by automation? Where will the displaced find new jobs? What job skills will remain largely immune to automation? How will the US government, which is funded to the tune of 92% by income-related taxes, replace the lost revenue… a robot tax? It’s a big topic, too big for a single Parade, but we must start somewhere. And with that, I turn the podium over to Stephen.

How the Coming Wave of Job Automation Will Affect You
By Stephen McBride

The 227,000 jobs added to the payroll in January marked the 76th straight month of expansion. The headline number is impressive. But if you dig a little deeper, you’ll find these jobs “aren’t what they used to be.” Since 2000, the creation of full-time positions has slowed significantly. The private sector used to add full-time jobs at 2–3% per annum. In 2000, that number fell below 2%. Since 2008, it has been below 1%. The majority of positions created since 2010 have been temporary. Around 20–50% of employees at the likes of Google and Walmart now fall into this category. With the explosion of contract workers, “workforce solution” firms now generate an estimated $1 trillion in revenue every year. The declining quality of jobs has caused many to stop looking for work.
The labor force participation rate is near the lowest level since 1978. Hordes of Baby Boomers retiring skews the data somewhat, but the rate for workers in their prime isn’t pretty either. Almost 12% of men aged 25–64 aren’t in the workforce—a near five-fold increase in 60 years. So what has caused this shift? Automation Annihilation Steven Berkenfeld, a managing director in the investment banking division at Barclays, summed up the thought process of companies hiring today: “Can I automate it? If not, can I outsource it? If not, can I give it to an independent contractor?” Hiring an employee is the last resort. Over the past four decades, millions of jobs have been lost to automation. The manufacturing sector is a prime example. While productivity has increased, employment has fallen. We can see this trend when comparing companies across time. The most valuable US firm in 1964 was AT&T. Then, it was worth $267 billion (in 2016 dollars) and employed 758,611 people. Today, Google is worth $370 billion and has only 55,000 employees. Many workers have already been replaced by machines, but the number is only set to rise. A 2013 study from the University of Oxford concluded that 47% of jobs in the US will likely be automated over the next two decades. And a 2015 report by McKinsey found that the majority of tasks performed in sectors like manufacturing and food service can be automated with currently demonstrable technology. Technological advancement has created more jobs than it has destroyed in the past. However, the big problem is the lag time it takes to forge those new careers. Given the high cost of living in the US today, even a small lag could be financially devastating. Let’s take a look at the implications of job displacement going forward… The Missing Middle Due to an inability to secure a full-time job, McKinsey estimates 20–30% of workers now partake in contingency work to supplement their income. 
Work in the “gig economy” can be fun, but it doesn’t provide a stable, reliable wage. Sure, one can survive on it, but it’s hard to get mortgage approval or support a family with it. One of the reasons the US became an economic behemoth was its large middle class. With the loss of traditional careers, this trend is now in reverse. Over time, employment will likely become polarized as “Middle America” is hollowed out. Lower-quality careers ultimately mean lower pay… and when incomes drop, people have less to spend. Given that consumption now accounts for 70% of economic activity, this is a matter of great concern. As the Fed has stated: Recoveries don’t die of old age. It’s usually falling demand that leads to their death. Many Americans are unable to find full-time employment, but they are spending more trying to attain it. Outstanding student loans now total a whopping $1.4 trillion. This isn’t a problem if individuals have the ability to pay. But with 45% of recent college graduates underemployed and 10% over 90 days late on payments, it’s a big problem. In 2013, the Department of Labor predicted 65% of school children will be employed in jobs that don’t yet exist. Therefore, many of the skills they are learning today will likely be obsolete in the near future. And it’s not only job seekers who are affected. With the ratio of workers to retirees collapsing, who will fund the pensions of the retiring Boomers? Displacement has not only economic consequences but also profound social consequences. A Gallup study found that having a job was the number one social value. Unemployment is linked to increased drug use and depression. It’s also positively correlated with crime. While automation will have a major impact on the future of employment, the outlook is not all bad. Machines may be rendering many skills useless, but creativity is where humans still have an edge. McKinsey listed “managing others” and “applying expertise” as the least susceptible to automation.
Likewise, Deloitte identified cognitive skills as the most important to have going forward. Machines may be advancing, but the future is likely to be one of collaboration, not competition. There will be serious challenges in the near term as many jobs are displaced by technology. But in the end, who would bet against the “ascent of man”?
For many executives, the concept of organization design is an oxymoron. They are so consumed by working in the organization that they lack the patience to work on the organization. They don’t do the intricate, complex work of configuring their organization to execute strategy. Instead, they shift boxes on an organization chart, bolt on more resources that were lobbied for by a zealous executive, or cut costs across the board. They focus on communicating messages more inclusively or reassigning stronger leaders to troubled departments. These are surface-level, counterfeit solutions, and they do more harm than good. And yet when it comes to reorganization, they’re the norm. According to one McKinsey study, the success rate for organizational redesign efforts is less than 25%. It’s much more common for reorg efforts to run out of steam before completion or fail to yield improvements once they’ve been implemented. These numbers reflect a fundamentally flawed approach to thinking about systems. Organization design is not a static, one-time event. It is an ongoing management discipline; as a living, breathing organism, your organization must be continually refined and improved. Intentional design can improve the health of the organization, position your team for success, and make life better for everyone. Such design should be motivated by a desire to:

Realize the benefits of scale, bringing together people who perform similar work.

Improve decision making by ensuring information can move easily across the organization.

Empower people by shaping behavior and motivating them to perform and contribute as the organization requires.

In my experience, the organizations that succeed at organization design tend to do five things: Organize around competitive advantage. Organizations must answer critical questions of identity: What sets us apart? What are our markets? Who is our customer? It may sound obvious, but it’s astounding how often these questions go unasked.
Without critical self-reflection, organizations build silos, bureaucracies, and cultures that impede rather than enable performance. If your competitive advantage is responsiveness or speed, the organization must be built for that. If it’s quality and service, that’s a different configuration. A narrowly defined set of critical choices is the foundation of good organization design. Create boundaries between competitive and necessary work. Competitive work — work that directly drives, or supports, the ability to compete — must be organized for effectiveness and mastery. This is the work you have to be better at than anyone else. On the other hand, necessary work — tasks that you have to do on par with anyone else, or in compliance with regulatory requirements — should be organized for maximum efficiency. Problems happen when competitive and necessary work get too close, and the urgency of the everyday undermines the strategic work of remaining competitively focused. To prevent this from happening, create boundaries around geographies, functions, customer segments, service or business lines, or a combination of them (a matrix). Again, those choices must be driven by strategic requirements noted above. Solid boundaries foster smooth coordination inside those groups. They also deepen expertise and enhance execution of a defined set of activities. The challenge with any set of boundaries is that it creates the need for coordination between groups. Without planning for how work will be coordinated and integrated, the grouping decisions become meaningless. Focus on the seams. The vast majority of an organization’s competitive muscle will reside across units, more so than within them. Great service sits at the intersection of sales, customer service, and supply chain. Product innovation sits at the intersection of R&D, marketing, and business intelligence. 
Where these seams come together, work must be tightly linked to ensure coordination is not encumbered by the boundaries between groups. Repeatable core processes, technology and information sharing platforms, and cross-functional teams are all design options that help create seamless linkages. Hierarchy, as an example, is simply a way to link vertically integrated tasks. Sadly, it’s usually the only thing that changes when people just change the “org chart.” Creating roles that cross organizational boundaries to coordinate with other parts of the organization is also a way to link work. Building effective linkages is one of the strongest ways to ensure organizational changes succeed. Distribute decision rights. How decision rights are distributed through the organization can promote desirable behaviors and avoid negative ones. A good decision architecture helps clarify everyone’s expectations about what they are accountable for. And most reorg efforts never touch it. It is foundational to how an organization works: It’s the set of authority structures, roles, and processes by which critical aspects of the organization are managed. More than just meeting cadences, it includes how strategy is set and prioritized, how resources are allocated, and how performance is measured. It includes the planning and building of P&Ls and budgets, managing the portfolios of products, clients, and talent, and the long-term financial and strategy processes that plan for results. While overhauling all of this is a huge task, it’s vital for making sure any reorg actually sticks. The ability to execute clear decisions is the central activity of an organization design, the activity from which the others all flow. It gives a predictable cadence to a business so that all of the interconnecting gears are working in coordinated fashion and the strategy is being executed and monitored appropriately. 
Fail to reform this, and your organization may simply retrench to silos and border wars, and fail to achieve common goals. Design clear, meaningful roles. It’s common in organizations for people to respond to the question, “So what do you do here?” with something like, “Well, there’s what my job description says, and then there’s what I do every day.” Jobs, like organizations, must be carefully crafted not around people’s preferences or idiosyncrasies, but around needed work and outcomes. As organizations grow, jobs tend to “divide” the way cells do in a growing body. Such mitosis is one of the worst ways to scale an organization. It creates both costly redundancies and soul-sapping jobs. Some people’s jobs may become boring and narrow, while other employees may find themselves juggling dozens of unrelated tasks. In other cases, organizations may bend the necessary work of a role to fit the employee within that role, diluting what needs to be done and settling for what can get done. Roles should instead be designed as broadly as possible so that people are continually challenged and fulfilled. Stretching people’s skills sustains a feeling of personal growth and satisfaction. It also builds organizational breadth, something vital for when people are ready for expanded leadership responsibilities. With all of these approaches, remember to build an organization that you can actually implement. When you go to assign your talent base to your new design, if you are left with too many people “to be determined,” you have built a design that exceeds what you can implement. Designing organizations for “super humans” never goes well. It has to be a design that can stand up in the real world. That’s not an excuse to compromise, take an easy way out, or accept the mediocre talent you have to work with. There has to be a balance. To find that balance, design for your ideal state, and then adjust accordingly.
Designing your optimal organization takes hard work, sacrifice, and significant trade-offs. They must be balanced against the realities and constraints of real life. But thoughtful design work pays great dividends and helps avoid the painful statistic of a failed reorg.
A survey finds that nearly a third of people say they have been less productive since the election.
As the era of artificial intelligence unfolds, its capacity to change the future is drawing contradictory reactions. Bill Gates, co-founder and chairman of Microsoft (NASDAQ: MSFT), calls artificial intelligence the “holy grail” of the computing world. Tesla CEO Elon Musk, for his part, takes a somewhat apocalyptic view of the field, likening the development of artificial intelligence to “summoning the demon.”
Submitted by Michael Snyder via The Economic Collapse blog

What is going to happen to society when robots are able to do just about everything better, faster and cheaper than human workers can? We live at a time when technology is increasing at an exponential pace. Incredible advancements in robotics, computer science and artificial intelligence are certainly making our lives more comfortable, but they are also bringing fundamental changes to the workplace. For employers, there are a lot of advantages to replacing human workers with robots. Robots don’t surf around on Facebook when they are supposed to be working. Robots don’t need Obamacare, lunch breaks or vacation days. Robots never steal from the company and they never complain. Up until fairly recently, human workers could generally perform many tasks more cheaply than robots could, but now that is rapidly changing. For example, a coffee shop has just opened up in San Francisco that is manned by a robot instead of a human…

Tired of your barista misspelling your name on your morning cup of joe? Perhaps a robot could do better. On Monday, Cafe X opened its very first robotic cafe in San Francisco’s Metreon shopping center. Promising “precision crafted specialty coffee in seconds, the way the roaster intended,” Cafe X thinks that anything a human can do, its machines can do better. Specifically, one very special machine. Nicknamed Gordon, after a Cafe X employee, this robot mans, or robots, two standard professional coffee machines in order to serve up espressos and lattes. In the San Francisco location, customers can grab a cup of coffee with beans from AKA Coffee, Verve Coffee Roasters, or Peet’s. While the coffee itself may not make Cafe X stand out from the competition, the startup hopes that the robot’s efficiency will.
If that coffee shop demonstrates that it can be much more profitable than a coffee shop with human employees, it is just a matter of time before human baristas start to be phased out all over the nation. A similar thing is happening in many supermarkets. Personally, I hate the “self-checkout lines”, but you are starting to see them everywhere these days. And according to the Sun, Amazon is playing around with a concept that would employ hardly any human workers at all… In the case of Amazon’s automated retail prototype, a half-dozen workers could staff an average location. A manager’s duties would include signing up customers for the “Amazon Fresh” grocery service. Another worker would restock shelves, and still another two would be stationed at “drive-thru” windows for customers picking up their groceries, fast-food style. The last pair would work upstairs, helping the robots bag groceries to be sent down to customers on “dumbwaiter”-like conveyors, a source said. With the bare-bones payroll, the boost to profits could be huge. Indeed, the prototype being discussed calls for operating profit margins north of 20 percent. That compares with an industry average of just 1.7 percent, according to the Food Marketing Institute. During the recent presidential campaign, much was made of the fact that we have shipped millions of good paying jobs overseas over the past several decades. We can certainly try to make some laws that would keep American workers from losing jobs to foreign workers, but pretty soon workers all over the world are going to be losing millions of jobs to technology, and it is going to be just about impossible to make laws to prevent that from happening. Just check out what is happening in China. 
Many big firms had moved manufacturing to China because labor was much cheaper over there, but now a lot of those cheap Chinese workers are being replaced by robots… Apple’s iPhone manufacturer, Foxconn, in fact, has already begun automating certain work that was previously done by hand. A Chinese government official told a Hong Kong newspaper in May that Foxconn had replaced 60,000 workers with robots at one factory there. And the company is receiving incentives north of Shanghai in the eastern-central Jiangsu Province to accelerate investments in robotics to replace human labor, according to Chinese state media organization Xinhua. Sadly, this is just the beginning. According to one study, 49 percent of all activities currently performed by human workers could already “be turned over to some sort of machine or robot”… About 49% of worker activities can be turned over to some sort of machine or robot, increasingly helped along by artificial-intelligence software, according to consultancy McKinsey. About 58% of CEOs plan to cut jobs over the next five years because of robotics, while 16% say they plan to hire more people because of robotics, according to a PricewaterhouseCoopers survey. And Carl Frey of Oxford University has determined that some professions have more than a 90 percent chance of becoming automated in the coming years… The revelations that dependable office jobs such as insurance workers and real estate agents have a more than 97% chance of becoming computerised could now spark fears among the middle class workforce. ‘While low-skilled jobs are most exposed to automation over the forthcoming decades, a substantial number of middle-income jobs are equally at risk.’ Frey told The Times. Other jobs that feature high on the ‘risk list’ are credit analysts who have a 97% chance of losing their jobs to robots, postal service workers at 95% and lab technicians who have an 89% chance of seeing their role become automated. 
So what in the world are we going to do with billions of human workers around the globe who are no longer needed when technology takes virtually all of our jobs? Some have suggested that the idea of “work” will become a thing of the past, and that society will evolve into a socialist utopia where everything we need is provided for by the government. In fact, the concept of a “universal basic income” is already being promoted in Europe and elsewhere. But others see a dystopian future where the gap between the “haves” and the “have nots” grows greater than ever before. Humanity has always been plagued by poverty and greed, and everyone agrees that the gap between the very wealthy and the rest of us has been growing very rapidly in recent years. Where there is nearly universal agreement, though, is that big changes are coming. Workers are going to be displaced by technology at an accelerating rate in the years ahead, and this will present a tremendous challenge for us all.
The world's largest 123 cities generate an astonishing $36 trillion in GDP per year. This infographic breaks these global cities down into seven typologies. The post The Megacity Economy: How Seven Types of Global Cities Stack Up appeared first on Visual Capitalist.
Companies deliver superior results when executives manage for long-term value creation and resist pressure from analysts and investors to focus excessively on meeting Wall Street’s quarterly earnings expectations. This has long seemed intuitively true to us. We’ve seen companies such as Unilever, AT&T, and Amazon succeed by sticking resolutely to a long-term view. And yet we have not had the comprehensive data needed to quantify the payoff from managing for the long term — until now. New research, led by a team from McKinsey Global Institute in cooperation with FCLT Global, found that companies that operate with a true long-term mindset have consistently outperformed their industry peers since 2001 across almost every financial measure that matters. The differences were dramatic. Among the firms we identified as focused on the long term, average revenue and earnings growth were 47% and 36% higher, respectively, by 2014, and market capitalization grew faster as well. The returns to society and the overall economy were equally impressive. By our measures, companies that were managed for the long term added nearly 12,000 more jobs on average than their peers from 2001 to 2015. We calculate that U.S. GDP over the past decade might well have grown by an additional $1 trillion if the whole economy had performed at the level our long-term stalwarts delivered — and generated more than five million additional jobs over this period. Who are these overachievers and how did we identify them? We’ll dive into those answers shortly. But first, it’s worth pausing to consider why finding conclusive data that establishes the rewards from long-term management has been so hard — and just how tangled the debate over this issue has been as a result. In recent years we have learned a lot about the causes of short-termism and its intensifying power. 
We know from FCLT surveys, for example, that 61% of executives and directors say that they would cut discretionary spending to avoid risking an earnings miss, and a further 47% would delay starting a new project in such a situation, even if doing so led to a potential sacrifice in value. We also know that most executives feel the balance between short-term accountability and long-term success has fallen out of whack; 65% say the short-term pressure they face has increased in the past five years. We can all see what appear to be the results of excessive short-termism in the form of record levels of stock buybacks in the U.S. and historic lows in new capital investment. But while measuring the increase in short-term pressures and identifying perverse incentives is fairly straightforward, assessing the ultimate impact of corporate short-termism on company performance and macroeconomic growth is highly complex. After all, “short-termism” does not correspond to any single quantifiable metric. It is a confluence of so many complex factors it can be nearly impossible to pin down. As a result, despite persistent calls for more long-term behavior from us and from CEOs who share our views, such as Larry Fink of BlackRock and Mark Wiseman, the former head of the Canada Pension Plan Investment Board, a genuine debate has continued to rage among economists and analysts over whether short-termism really destroys value. Academic studies have linked the possible effects of short-termism to lower investment rates among publicly traded firms and decreased returns over a multiyear time horizon. Ambitious work has even attempted to quantify economic growth foregone due to cuts in R&D expenditure driven by short-termism, putting it in the range of about 0.1% per year. Other researchers, however, remain skeptical. How, they ask, could corporate profits in the U.S. remain so high for so long if short-termism were such a drag on performance? 
And isn’t the focus on quarterly results a natural outgrowth of the rigorous corporate governance that keeps executives accountable?

What We Actually Measured — and the Limits of Our Knowledge

To help provide a better factual base for this debate, MGI, working with McKinsey colleagues from our Strategy & Corporate Finance practice as well as the team at FCLT Global, began last fall to devise a way to systematically measure short-termism and long-termism at the company level. It started with developing a proprietary Corporate Horizon Index. The data for this index was drawn from 615 nonfinance companies that had reported continuous results from 2001 to 2015 and whose market capitalization in that period had exceeded $5 billion in at least one year. (We wanted to focus on companies large enough to feel the potential short-term pressures exerted by shareholders, boards, activists, and others.) Collectively, our sample accounts for about 60%–65% of total U.S. public market capitalization over this period. To further ensure valid results and to avoid bias in our sample, we evaluated all companies in our index only relative to their industry peers over several years, and conducted other tests and controls to ensure statistical robustness. One final caveat: While we firmly believe our index enables us to classify companies as “long-term” in an unbiased manner, our findings are descriptive only. We aren’t saying that a long-term orientation causes better performance, nor have we controlled for every factor that could impact the relationship between the two. All we can say is that companies with a long-term orientation tend to perform better than similar but short-term-focused firms. Even so, the correlation we uncovered between behaviors that typify a longer-term approach and superior historical performance delivers a message that’s hard to ignore.
To construct our Corporate Horizon Index, we identified five financial indicators, selected because they matched up with five hypotheses we had developed about the ways in which long- and short-term companies might differ. These indicators and hypotheses were:

Investment: The ratio of capex to depreciation. We assume long-term companies will invest more, and more consistently, than other companies.

Earnings quality: Accruals as a share of revenue. Our belief is that the earnings of long-term companies will rely less on accounting decisions and more on underlying cash flow than other companies.

Margin growth: The difference between earnings growth and revenue growth. We assume that long-term companies are less likely to grow their margins unsustainably in order to hit near-term targets.

Earnings-per-share (EPS) growth: The difference between EPS growth and true earnings growth. We hypothesize that long-term companies are less likely to overindex on analyst metrics like EPS and less likely to consistently take actions (such as share repurchases) that boost EPS.

Quarterly targeting: The incidence of beating or missing EPS targets by less than two cents. We assume long-term companies are more likely to miss earnings targets by small amounts (when they easily could have taken action to hit them) and less likely to hit earnings targets by small amounts (where doing so would divert resources from other business needs).

After running the numbers on these indicators, two broad groups emerged among those 615 large and midcap U.S. publicly listed companies: a “long-term” group of 164 companies (about 27% of the sample), which were either long-term relative to their industry peers over the entire sample period or clearly became more long-term between the first half of the sample period and the second half, and a baseline group of the 451 remaining companies (about 73% of the sample).
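Each of the five indicators is a simple ratio or difference on reported financials. The sketch below shows one plausible computation for a single company record; the field names and sample figures are my illustration of the published descriptions, not McKinsey's actual code or exact definitions, and a real index would also normalize each indicator against industry peers over time.

```python
def horizon_indicators(f):
    """Five Corporate Horizon Index-style indicators for one company record.

    `f` is a dict of illustrative (hypothetical) financial fields.
    """
    return {
        # Investment: long-term firms are assumed to reinvest more.
        "investment": f["capex"] / f["depreciation"],
        # Earnings quality: lower accruals imply more cash-backed earnings.
        "accruals_share": f["accruals"] / f["revenue"],
        # Margin growth: earnings growth outrunning revenue growth can signal
        # unsustainable margin expansion to hit near-term targets.
        "margin_growth_gap": f["earnings_growth"] - f["revenue_growth"],
        # EPS growth vs. true earnings growth (e.g. buyback-inflated EPS).
        "eps_growth_gap": f["eps_growth"] - f["earnings_growth"],
        # Quarterly targeting: landing within two cents of the EPS target.
        "within_two_cents": abs(f["eps"] - f["eps_target"]) < 0.02,
    }

# A made-up company-year record for illustration only.
record = {
    "capex": 120.0, "depreciation": 100.0,
    "accruals": 5.0, "revenue": 1000.0,
    "earnings_growth": 0.06, "revenue_growth": 0.05,
    "eps_growth": 0.09, "eps": 2.41, "eps_target": 2.45,
}
ind = horizon_indicators(record)
```

In this toy record the firm reinvests 1.2× its depreciation and missed its EPS target by four cents rather than engineering a narrow beat, both of which the study's hypotheses would read as long-term signals.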
The performance gap that subsequently opened between these two groups offers the most compelling evidence to date of the relative cost of short-termism — and the real payoff from managing for the long term.

Trillions of Dollars of Value Creation at Stake

To recap: from 2001 to 2014, the long-term companies identified by our Corporate Horizon Index increased their revenue by 47% more than others in their industry groups, and their earnings by 36% more, on average. Their revenue growth was also less volatile over this period, with a standard deviation of growth of 5.6%, versus 7.6% for all other companies. Our long-term firms also appeared more willing to maintain their strategies during times of economic stress. During the 2008–2009 global financial crisis, they not only saw smaller declines in revenue and earnings but also continued to increase investments in research and development while others cut back. From 2007 to 2014, their R&D spending grew at an annualized rate of 8.5%, versus 3.7% for other companies.

Another way to measure the value creation of long-term companies is through the lens of what is known as “economic profit.” Economic profit is a company’s profit after subtracting a charge for the capital the firm has invested (working capital, fixed assets, goodwill). The capital charge equals the amount of invested capital times the opportunity cost of capital — that is, the return that shareholders expect to earn from investing in companies with similar risk. Consider, for example, Company A, which earns $100 of after-tax operating profit, has an 8% cost of capital, and has $800 of invested capital. Its capital charge is $800 times 8%, or $64. Subtracting the capital charge from profit leaves $36 of economic profit. A company is creating value when its economic profit is positive, and destroying value when its economic profit is negative.
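The Company A arithmetic above can be captured in a few lines. The function names are my own; the numbers and the definition of economic profit come directly from the text.

```python
def economic_profit(nopat, invested_capital, cost_of_capital):
    """After-tax operating profit minus a charge for the capital employed.

    The capital charge is invested capital times the opportunity cost of
    capital (the return shareholders expect from similar-risk companies).
    """
    capital_charge = invested_capital * cost_of_capital
    return nopat - capital_charge

# Company A from the text: $100 of after-tax operating profit,
# $800 of invested capital, and an 8% cost of capital.
ep = economic_profit(100.0, 800.0, 0.08)
print(ep)  # 36.0 -> positive economic profit, so Company A is creating value
```

A negative result would mean the firm earns less than its capital charge and is destroying value, matching the rule stated in the paragraph above.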
With this metric, the gap between long-term companies and the rest is even bigger. From 2001 to 2014, those managing for the long term cumulatively increased their economic profit by 63% more than the other companies did. By 2014 their annual economic profit was 81% larger than that of their peers, a tribute to superior capital allocation that led to fundamental value creation.

No path goes straight up, of course, and the long-term companies in our sample still faced plenty of character-testing times. During the last financial crisis, for example, their share prices took greater hits than those of their short-term counterparts. Afterward, however, the long-term firms significantly outperformed, adding an average of $7 billion more to their market capitalization from 2009 to 2014 than their short-term peers did.

While we can’t directly measure the cost of short-termism, our analysis gives an indication of just how much value may be left on the table. As noted earlier, if all public U.S. companies had created jobs at the scale of the long-term-focused organizations in our sample, the country would have generated at least five million more jobs from 2001 to 2015 — and an additional $1 trillion in GDP growth (equivalent to an average of 0.8 percentage points of GDP growth per year). Projecting forward, if nothing changes to close the gap between the long-term group and the others, the U.S. economy could give up another $3 trillion in forgone GDP and job growth by 2025. Clearly, addressing persistent short-termism should be an urgent issue not just for investors and boards but also for policy makers.

Where Do We Go from Here?

Our research is just a first step toward understanding the scope and magnitude of corporate short-termism. For instance, our initial dataset was limited to the U.S., but we know the problem is a global one. How do the costs and drivers differ by region? Our sample consists only of publicly listed companies.
How do the effects we discovered differ among private companies, or among public companies with varying types of ownership structures? Are there metrics that can help predict when a company is becoming too short-term, and how do they differ among industries? Most important, which interventions will prove most effective in shifting organizations onto a more productive long-term path?

On this last point, we and many others have identified steps that executives, boards, and institutional investors can take to achieve a better balance between hitting short-term targets and operating with a persistent long-term vision and strategy. These range from creating investment mandates that reward long-term value creation, to techniques for “de-biasing” corporate capital allocation, to rethinking traditional approaches to investor relations and board composition. We will return to HBR in the coming months with more data and insights into how companies can strengthen their long-term muscles.

The key message from this research is not only that the rewards from managing for the long term are enormous; it’s also that, despite strong countervailing pressures, real change is possible. The proof lies in a small but significant subset of our long-term outperformers — 14%, to be precise — that didn’t start out in that category. Initially, these companies scored on the short-term end of our index. But over the course of the 15-year period we measured, their leaders managed to shift corporate behavior sufficiently to move into the long-term category. What practical actions did these companies take? Exploring that question will be a major focus of our research in the coming year. For now, the simple fact of their success is an inspiration.
Artificial intelligence is playing an ever-greater part in people's everyday lives. Nine statistical indicators bear this out, showing why artificial intelligence is rightly counted among humanity's greatest technological achievements.
Martin Ford's bestseller "Rise of the Robots: Technology and the Threat of a Jobless Future" was named the best business book of 2015 by the Financial Times and the consulting firm McKinsey & Company.