Sunday, January 8, 2017

Erosion in Global Alliances

[Image: Map of NATO nations. Huffington Post]
As Congress begins to address important domestic policies – clearly focused on deregulation, dismantling the Affordable Care Act and tax reform – our military and political allies are reaching some pretty serious conclusions about the likely unraveling or weakening of our mutual defense treaties. With the President-Elect doubling down on his denial of Russian hacking and influence in the recent election, despite unanimous findings to the contrary by all of our intelligence agencies, there is fear among our allies that the US is now following the Russian playbook.
Even after a full intelligence briefing on January 6th, Trump maintained: “There was absolutely no effect on the outcome of the election including the fact that there was no tampering whatsoever with voting machines… China, relatively recently, hacked 20 million government names. [Referring to the Office of Personnel Management breach in 2014 and 2015]. How come nobody even talks about that? This is a political witch hunt.” Tampering with voting machines, which are not even connected to the Web, was never the issue. This was all about a complete Russian effort, under the direction of Putin himself, to discredit Trump’s opponent by leaking both private information and falsehoods that were rapidly picked up by the Web and spread like wildfire. Why was getting Trump into the White House so important to Russia? More on this later.
Having thoroughly trashed our national intelligence services, Trump held out a minor, fingers-crossed olive branch to those agencies, asking them to come up with an anti-hacking plan within his first 90 days in office. Even while denying the impact of the hacking itself, Trump sent shivers of concern through Western leaders over his excessively close relationship with a very threatening modern Russia. “It’s an existential moment for all of Europe’s leaders, most of whom are only just beginning to grapple with the fact that Russia wants to destroy the Euro-American alliance.” Anne Applebaum, Op-Ed, Washington Post, January 5th.
“The US has identified the Russian agents behind alleged hacking ahead of the presidential election won by Donald Trump in November, reports say… The agents, whose names have not been released, are alleged to have sent stolen Democratic emails to WikiLeaks to try to swing the vote for Mr Trump… According to CNN, the Washington Post and NBC News citing intelligence sources, agencies had intercepted communications in the aftermath of the election showing senior Russian government officials celebrating Donald Trump's win over rival Hillary Clinton…
“Mr Trump has repeatedly rejected allegations that the Russian government was behind the hacks. On Wednesday [1/4], he repeated a suggestion that ‘a 14-year-old’ may have been responsible for the breach… On Thursday [1/5], he said he was a ‘big fan’ of intelligence agencies, but later went on to raise questions about how they responded to the security breach…
“The pro-Kremlin media line is that the US authorities have failed to present any evidence to substantiate their presidential campaign hacking accusations… The official Rossiya 24 TV channel says the ‘US secret services have still not supplied a single piece of evidence,’ while the popular Gazeta.ru web site says Washington has ‘still not provided any convincing technical data.’… Opposition websites largely cover what the major US networks are reporting, and all note the Kremlin's denial of involvement.” BBC.com, January 6th. Yet that hard evidence was indeed pouring out of our intelligence agencies.
Many GOP leaders, having spent much of their political careers opposing Russia, were caught between those efforts and a desire to support their president. There was even consternation within the Trump camp itself. “Meanwhile, former CIA director R. James Woolsey Jr., a veteran of four presidential administrations, resigned yesterday [1/5] from Trump’s transition team because of growing tensions over the president-elect’s vision for intelligence agencies.” Daily 202, Washington Post, January 7th. Whether or not Russia is able to change enough votes directly, she very much benefits from the distrust and confusion – a pernicious fog, if you will – which results from her support of hacking and spread of disinformation… along with Trump-like consistent denial of any involvement despite virtually indisputable hard evidence to the contrary.
“Moscow has made information and asymmetrical warfare central to its foreign and military policy. When asserting itself in Georgia and Ukraine, Russia has used a hybrid strategy that involves the funding of local politicians and militias, fake news and cyberattacks. Leading German and Polish politicians assert that Russia has engaged in some such activities in their countries as well. And now there is the apparent involvement in America’s election.” Fareed Zakaria, Washington Post, January 5th. And it’s not as if the United States were singled out for such rather damaging intrusion into the election process. Russia seems to be mounting a full-on assault against Western democracies with a rather thinly-veiled effort either to dissolve NATO or render its policies weak and ineffective.
Thorsten Benner (Co-Founder and Director of the Global Public Policy Institute, in Berlin) and Mirko Hohmann (project manager at the Global Public Policy Institute), writing for the December 16th ForeignAffairs.com, explain the underlying Russian goals in their hacking campaigns: “In recent weeks, politicians and intelligence officials in France and Germany have stepped up their warnings of Russian interference in the national elections both countries will hold next year. In late November, Bruno Kahl, the head of Germany’s Federal Intelligence Service, told the Süddeutsche Zeitung that Germany had ‘evidence that cyberattacks are taking place that have no purpose other than to elicit political uncertainty.’ German Chancellor Angela Merkel has expressed similar concerns, suggesting that Moscow may attempt to influence Germany’s parliamentary elections, which are slated for September 2017. French politicians have been more circumspect about the specific threats posed to their country’s presidential elections, which will be held in April and May. But Guillaume Poupard, the director-general of France’s National Agency for the Security of Information Systems, has indicated that Paris, too, is concerned about the prospect of foreign interference. Western democracies face ‘the development of a digital threat for political ends and for destabilization,’ he told Le Monde in early December…
“The use of incriminating information to publicly discredit opponents is widespread, but Russian intelligence services have a particularly strong penchant for the tactic. During the Cold War, the practice was common enough that the Russian term kompromat (a portmanteau combining the Russian words for ‘compromising’ and ‘material’) entered the Western vernacular…
“Kompromat operations do not always seek to promote particular candidates, even though Russia’s interventions in the U.S. election clearly meant to elevate Donald Trump. (French officials should expect similar moves in support of National Front leader Marine Le Pen in the coming months.) The goal is usually broader: to corrode democratic norms and institutions by discrediting the electoral process and to tarnish the reputations of democratic governments in order to establish a kind of moral equivalence between Russia and the West. From the Kremlin’s perspective, attacks on democratic political institutions are a form of payback for what it perceives as the West’s longstanding attempts to hem in and undermine Russia—most recently, the leak of the Panama Papers, which pointed to the cronyism of Russian President Vladimir Putin’s inner circle and which Russian authorities attributed to Washington, and the anti-government demonstrations that roiled Russian cities after the country’s election in 2011. Putin accused Hillary Clinton, then the U.S. secretary of state, of instigating those protests.”
The problem, however, is not only that Donald Trump is cozying up to master-manipulator Vladimir Putin, but that he is spouting rather clear statements that seem to mirror Putin’s disdain for NATO itself and other comparable alliances. “The ideas that the United States should ‘strengthen democratic nations against aggression’ and maintain a network of ‘free states and free peoples’ around the world led to the [post WW2] creation of powerful alliances in Asia, as well as a whole host of transatlantic and European institutions that have kept Europe safe, free, prosperous and allied to the United States: the North Atlantic Treaty Organization, the Council of Europe, the European Union. Some of these institutions came with costs for the United States, but because they are the bedrock of American power in the world — because America’s allies promote American values and its interests around the world — no U.S. administration in seven decades has ever sought to undermine them.
“When Donald Trump is inaugurated this month, that will no longer be the case. Trump has made clear that he is no longer interested in promoting America’s ‘democratic faith,’ or an America that maintains a special relationship with ‘free states and free peoples.’
“In the past few weeks, some of America’s oldest and closest allies in Europe have begun to fear that Trump’s White House may not just neglect them, which has happened often enough in the past, but will actually seek to undermine them and their institutions. The link between Trump, his senior counselor and chief strategist Stephen K. Bannon and Breitbart News, the website Bannon was running until he went to work for Trump, is what worries them most. Flush from its success in the United States, Breitbart now seeks to monetize anti-immigration and racist sentiment in Europe, too, promoting it, selling it and using it to elect populist politicians who are just as skeptical of NATO as Trump, and who will do their best to destroy the European Union as well.” Applebaum.
Will Trump actually disband NATO or withdraw the U.S. from those treaties? Unlikely, but he may in fact pledge to Russia not to expand NATO further and to reduce U.S. participation in military exercises as well as underlying funding for the organization. This may very well prompt our (former?) allies to seek alternative defensive structures or alliances that no longer involve or depend upon the United States. This stance would undoubtedly lessen American influence in global affairs and may ultimately diminish the role of the U.S. dollar as the global reserve currency, a reality that would create rather severe negative consequences for our economy. Not to mention how quickly China and Russia would react to fill the void at our expense.
I’m Peter Dekom, and a single term from a policy-reversing U.S. president could effect a permanent and irrevocable change in American power, prestige and influence all over the world.

Thursday, January 5, 2017

Lingering Mythologies: Tax Cuts = More Jobs

As I have blogged on so many occasions, when rich folks and well-capitalized companies get tax cuts, they do not knee-jerk to hire more people. In a gig economy, with contract workers and tons of outsourcing possibilities, even where additional staffing may be required, smart companies tend to approach adding permanent staffing with increasing skepticism, particularly when “uncertainty” defines the approaching business landscape. “Temporary” is better than “permanent” these days. Downsizing under pressure is both painful and expensive.
But when there is much economic uncertainty, companies don’t even reach into those “temporary” or outsourced workplace solutions. They use their newfound tax savings for other purposes that could even have the opposite effect: for example, implementing mergers and acquisitions, where job cuts are a natural result of such activities.
The most recent disaster from assuming that tax cuts increase hiring (called “supply-side,” “trickle-down” or “incentivize the job-creators” economics) appears in Kansas, where a Republican governor, Sam Brownback, and a Republican legislature installed massive tax cuts in 2012, expecting a significant increase in new jobs, which, in turn, would generate new sources of income tax to make up the difference. Wrong!
Kansas has since faced a rather large budget deficit with some pretty nasty consequences. A June 14, 2014 Kansas City Star editorial says it best: “Kansas officials badly underestimated the negative effect of the individual income tax cuts that the Legislature passed and Gov. Sam Brownback signed in 2012… Ignore the political spin from Kansas politicians trying to downplay the state’s budget concerns. Focus on the numbers, because they matter more.
“And they reveal the state’s finances are in a huge hole that could get a lot deeper. A state that already is shortchanging its schools, underpaying employees and running disgraceful waiting lists for disabled citizens who need services is likely to experience even more pain…” With a stubborn legislature, Kansas got more pain, as you can see from the picture of Governor Brownback above. He and his GOP legislature seem more like the “Drown-back or Down-back boyz” instead.
As with most such tax cut schemes, the rich do not determine their hiring policies simply based on tax cuts. To add employees, they always need to see how that hiring move would increase their overall business performance and profitability… and that reality is hardly inherent in any basic tax cut analysis.
Enter Donald Trump with all kinds of proposed tax cuts for corporate America and the wealthiest in our land, making the same unjustified statements about how such moves are going to result in the addition of many new, high-paying jobs. Among other proposals, he expects to incentivize corporate America to bring all of their untaxed off-shore profits back home. Wanna make any bets on how many new, solid jobs will result? CEOs, according to Trump, will create more jobs. So what happens if The Donald gets his way?
“[Corporate] boards and executives may have different ideas… They are likely to use much of the estimated $2 trillion held overseas to acquire businesses in the United States, to buy back their own stock or [in anticipation of interest rate hikes] to pay down debt, say advisers of America’s top corporate executives.
“Merger bankers ‘are sharpening their pencils with what types of deals those larger companies can look at,’ said Marc-Anthony Hourihan, co-head of mergers and acquisitions [M. & A.] in the Americas for the Swiss bank UBS. ‘I think M. & A. will be fairly high on the list.’
“American corporations have kept an accumulation of earnings abroad because they would be subject to paying more taxes when they bring it home…. Mr. Trump has said he wants to repatriate such corporate profits with a one-time rate of 10 percent. That is about a third of what is required by the current law, which says companies need to pay up to 35 percent of their earnings to the government, and then get credited for taxes they have already paid overseas, which usually is not much.
“If they were to bring that capital back, those companies could use it to invest in their businesses, which may in turn create jobs. Yet that is only one of several options… If the priority turns out to be deals, that would be good news for investment bankers who generate fees from large advisory assignments. It would be less so for American workers who might get laid off as a result of cost cuts derived from combining two companies.
“Job losses did result the last time Congress initiated a tax holiday, in 2004. The top 15 repatriating companies brought home $150 billion but reduced their work force by 20,931 jobs, according to a 2011 study commissioned by the Senate Permanent Subcommittee on Investigations.” New York Times, December 27th. Another place where facts interfere with mythology and fake news.
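The rate comparison in the Times quote is simple to sketch. The figures below are illustrative only: actual liability depends on foreign tax credits already paid (which, as the quote notes, are usually not much), and the $2 trillion total is an estimate.

```python
# A rough sketch of the repatriation tax math described above.
# Illustrative numbers only; real liability depends on foreign
# tax credits and each company's specific situation.

def tax_under_current_law(earnings, foreign_credits=0.0, rate=0.35):
    """Up to 35% owed on repatriated earnings, less foreign taxes already paid."""
    return max(earnings * rate - foreign_credits, 0.0)

def tax_under_one_time_holiday(earnings, rate=0.10):
    """Trump's proposed one-time 10% repatriation rate."""
    return earnings * rate

overseas = 2_000_000_000_000  # ~$2 trillion estimated held abroad

current = tax_under_current_law(overseas)      # $700 billion at the full 35%
holiday = tax_under_one_time_holiday(overseas)  # $200 billion at 10%

print(f"Current law (no credits): ${current / 1e9:,.0f}B")
print(f"One-time 10% holiday:     ${holiday / 1e9:,.0f}B")
```

The 10% holiday rate is indeed “about a third” of the 35% statutory rate, which is the comparison the quote draws; the point of the 2004 study cited above is that the savings went to deals and buybacks, not hiring.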
So, what else could corporate/wealthy America do with that extra cash – besides blindly spending to hire new workers without a justifiable business plan? They could just declare higher dividends for their shareholders. Or install that “new-fangled” automated equipment, much of it driven by self-adjusting artificial intelligence, to reduce labor costs. More layoffs? The rich get richer? Funny how the Democrats are still smarting so badly from their vicious losses (plural) in November that they have yet to provide a coherent counter to Trump’s “tax cuts that only benefit the rich” plan.
I’m Peter Dekom, and I never cease to be amazed at how so many Americans believe that implementing clearly failed economic theories, under the same assumptions that made them fail many times before, will actually work the next time they are used.

Wednesday, January 4, 2017

The Hidden Politics of Childcare

Having a child in the United States has never been more expensive – measuring childcare costs in hard dollars or as a percentage of income. The average hovers just below 30% of average household income and around 85% of what a minimum wage worker earns. For folks making enough that tax planning makes a difference, being able to deduct childcare costs is an important incentive/benefit. For those whose incomes are very low or have fallen well into lower tax brackets, deductibility is largely irrelevant. Tax credits and school vouchers are incentives for those who (a) know how to use them, (b) have needs that are directly benefited by such incentives and (c) have sufficient incomes to make those benefits matter. Next.
One of the most important determinants of birth rates is the cultural attitude of the relevant social segment toward the cost of what its peer group calls “proper” childcare. The more it costs in a given social segment to raise a child, the fewer children are likely to be born. If your cultural environment does not factor external childcare costs into the decision as to whether or not to have a child, the birth rate is more likely to remain stable or even rise for that cohort.
So it does merit examining the rather upwardly-tending cost of childcare in the United States. The most expensive state, according to Eric Reed at TheStreet.com (December 26th), is Massachusetts, where childcare costs average $21,095 per year (which can rise to $31,827 if a family opts to avoid daycare – which itself averages $13,208 per year – and tend to junior at home).
According to Reed: “Child care in Massachusetts costs 33 percent of the average household income, and 113 percent of a minimum wage worker's pay. This state is the most expensive, on average, in the country, [and] one of the states with the biggest gap between what average and lower-income workers can afford in terms of childcare.” Simply put, a minimum wage worker cannot afford average childcare in Massachusetts. Might as well stay home and rely on welfare.
North Dakota, on the other hand, has the most affordable average childcare cost in the land: $11,886 per annum. Daycare is a reasonable $7,409/year while in-home childcare still hits a significant but closer-to-affordability $27,667 per year. True, wage rates there are lower, but then most costs in North Dakota are lower than what most of the rest of the country pays. Reed notes: “This is, on average, the cheapest state in the country for child care. Childcare in North Dakota costs 20 percent of the average household income, and 79 percent of a minimum wage worker's pay. This makes [the state] one of the most competitive states for working parents, as earnings can still meaningfully outweigh costs.”
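Reed’s affordability percentages are simple ratios of annual childcare cost to annual income. Working backwards from the figures quoted above gives the implied income levels – a back-of-the-envelope check, using only the numbers in this post, not Reed’s actual underlying data:

```python
# Back-of-the-envelope check on the affordability ratios quoted above.
# cost / income = the share of annual income consumed by childcare.

def affordability_share(annual_cost, annual_income):
    """Fraction of annual income consumed by average childcare."""
    return annual_cost / annual_income

# Massachusetts: $21,095/year average childcare, reported at 33% of
# average household income and 113% of minimum-wage pay.
ma_cost = 21_095
ma_household_income = ma_cost / 0.33  # implied: roughly $63,900/year
ma_min_wage_income = ma_cost / 1.13   # implied: roughly $18,700/year

# North Dakota: $11,886/year, reported at 20% and 79% respectively.
nd_cost = 11_886
nd_household_income = nd_cost / 0.20  # implied: roughly $59,400/year

# A share above 1.0 means childcare alone exceeds the worker's entire pay:
print(f"MA minimum-wage share: {affordability_share(ma_cost, ma_min_wage_income):.0%}")
print(f"ND household share:    {affordability_share(nd_cost, nd_household_income):.0%}")
```

A share above 100% is exactly what makes average childcare flatly unaffordable for a Massachusetts minimum-wage worker, while North Dakota’s 79% leaves at least some margin.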
But these are just numbers. This does not measure the psychological toll on those who just cannot spend what they think they need to spend on their kids. It gets tougher when junior heads off to college, an issue I have blogged about so much, it does not merit repeating here. 59% of Millennials have at least some college, reflecting the bare necessity of post-high school education in a fiercely competitive job market. That, of course, means that 59% of families with Millennials have grappled with college costs. Bad numbers reflecting a downward pressure on having children. The Z-generation will have even tougher numbers.
But paying for proper childcare tends to be a priority value only among those in the middle or higher on the economic ladder. They’ve been doing that for years. So as earnings pressures on middle class or recently middle class (and now downwardly mobile) families mount, as real wages continue to stagnate under our new “gig” and more-competitive-wage environment, the birth rate for those used to spending that childcare money has plunged as reflected in my recent The 0.7 Percent Solution blog. For the most part, this is evidenced in traditional white families.
For minorities, who have lived at the bottom of the economic ladder, childcare is handled by extended families with the elderly taking on that role, by welfare of one form or another, or simply by not providing much of that care at all. As my recent blog points out, their birth rates remain much higher than those of their old-world white traditionalist counterparts.
All of which is essential in understanding precisely how desperate that traditional white voting bloc has become. They know their numbers are declining because they cannot afford to have more kids… but with fewer children, the relative number of white traditionalists is falling like a stone tossed off of a high cliff.
Look at the world from their perspective. Not only are most of them unlikely to live a lifestyle equal to or better than that of their parents, too many of them are simply unable to afford to take care of their children at what their value system tells them is the bare minimum level of financial support. Ouch! These trends explain why so many traditional white voters, feeling very victimized by the very system they have supported for decades, are increasingly hell-bent on securing (or re-securing) their control of that system, regardless of the fairness of their dominance over minorities they wish to disenfranchise. This trend is a massive threat to our democratic system of government, so we damned-well better address the underlying issues.
The above amalgamation of pain is just one more way of saying, “It’s the economy, stupid!” to all those politicians with fancy “values of a fair and just society” programs that have become their national priorities. Values which I myself embrace. Whether this stems from a political Maslovian perspective (see Maslow's hierarchy of needs reflected in the above graphic pyramid) or some deep political reality, successful politicians cater to their respective constituents by mirroring the priorities of their voters. As more people fall out of the middle class to “less,” this constituency is only going to get angrier and more desperate.
As callous as this may sound, while “values of a fair and just society” are valid political goals, unless the deeper economic issues are addressed first, solicitations based on pure “goodness” and “fairness” will fall on relatively deaf ears. Likewise, national security values rise with fears of being attacked and killed, real or not.
For political parties trying to figure out their next move, they must first consider their likely constituents’ fears and priorities before substituting those values that they might prefer to prioritize. For most seeking election, there is nothing wrong with fighting for what they deeply believe in, but to get that cherished office, there really is a pragmatic need to place those special issues in the right order of what their constituents really want. Liberal elites, take heed. Hillary Clinton may have lost the presidency for myriad reasons, but prime amongst them was a failure to present a clear and prioritized path to address the economic needs of a constituency they once dominated.
I’m Peter Dekom, and I am trying to “tell it like it is.”

Tuesday, January 3, 2017

Food Flight and Other Lessons of Life

Back on November 17th, I blogged (The Truth May be Hard to Swallow) about how rising food, labor and real estate costs are making operating restaurants – even pricey ones – increasingly difficult in America’s most expensive cities. I focused on the New York metropolitan area, but clearly San Francisco, Washington, D.C. and Los Angeles are facing the same plight. Storied eateries – like NYC’s famous Carnegie Deli – have simply not been able to create an economic model to keep their doors open… even when they are packed. Indeed, with these changing economics, finding a deli anywhere in New York City may soon be a challenge.
For those restaurants at the top of the food chain, where there is no price issue because of the rich folks still able to afford the best, there may be some marginal profit cuts, but life can still trundle on. Food trucks are filling the void in many urban areas, where real estate becomes a much lower barrier to existence. The growing trend toward Web-driven, gig-based home delivery by a number of restaurants also reduces the dependence on a larger real estate footprint. But for those who still like to go out… what is truly amazing is what is happening to restaurants “in the middle.” Especially food chains.
“During the 1980s and 1990s, many restaurant chains set their sights on the middle class, offering reasonably priced meals that entire families could enjoy and afford. But as the middle class shrinks and inequality becomes more of an issue, these chains are hemorrhaging customers, and it’s ultimately going to doom them. Another product of less-than-ideal economic conditions is rising food prices. There have been a variety of factors behind skyrocketing prices for food, including climatic conditions and tighter regulations. As a result, consumers and businesses are feeling the pinch.
“Many restaurants simply haven’t been able to absorb the costs without switching gears. You’ll notice that certain fast food restaurants have bumped up prices over the past few years, for example. If they crank the dial too much, though, consumers will just find another place to eat if they feel prices are too high. And the cutthroat world of food offers consumers alternatives almost everywhere they look… For example, chains like The Olive Garden and Red Lobster are or looked to be nearing the end of their lifespan.” CheatSheet.com, December 21st.
Indeed, as the new administration seems intent on deporting a whole host of undocumented workers – labor that is heavily associated with kitchen staffing as well as agricultural work (from stoop labor to working in slaughterhouses), keeping those related costs much lower than if U.S. citizens (assuming they would even take those jobs) were employed – it does seem that there will be an even greater upward spike in the cost of eating out. Higher costs with fewer people able to afford to spend more… not a good sign for a work force that is often either an entry-level job market for the young or a last-ditch place to make a buck for the lightly-skilled or unskilled.
We can look at the marginalized middle-aged and older rust belt workers featured in my December 20th Left Behind to Rust blog and look at that group as “them,” but – if you’ll pardon the allusion – they just might be the collective canary in the coal mine for the rest of “us.” They are simply part of the same narrative that will eventually swallow most of us up, sooner most probably than later. We know that those West Virginia coal miners are not going to get back those highly-paid coal mining jobs in any numbers to make a difference. But then, this little story is going to get played out across our entire economy – the world actually – over the next few years.
Technology and global competition are eliminating a lot of domestic manual labor, from manufacturing to harvesting/mining. Shifts in global energy demand and environmental requirements (whether or not we maintain our own environmental rules) will also reconfigure our workforce, whether we like it or not. As automation gets increasingly complex, as artificial intelligence (with self-learning capacity) becomes increasingly entwined within our job market, those unemployed coal miners might soon be joined by sales clerks, stock brokers, doctors, lawyers, accountants, financial planners/advisors, etc., etc.
About a third of existing jobs in the United States will be replaced by technology within the next decade or so. Will there be a set of new jobs, with equal or better pay, replacing them? I am very skeptical. Self-driving cars and trucks, anyone? Clearly, without the earning power of this cadre of better-paid workers, “little luxuries” like dining out are slowly going to get priced out of reach for increasing numbers of Americans. Those jobs will vaporize, creating a vicious spiral where those who own the machines make money and those who don’t no longer do. For a while, anyway.
Yet everything in our modern economy is predicated on having enough people working to make enough money to support the business of business. 70% of the American economy is consumer driven. No or fewer consumers and even those who own the machines will be hurting. Under a pure capitalist system, that’s just the way it is. People will adjust under market forces… but can they? We certainly have neither the governmental structure, the leadership nor the will to deal with what appears to be inevitable. Job scarcity that virtually no level of training will overcome.
We are excellent at kicking the can down the road and passing laws based on slogans without practical value. We are even better at using tax dollars to spend on costs without an economic rate of return (like a fat military that has not won a major conflict since WWII) but terrible at supporting investments in our future: better education, infrastructure and scientific research. While the president-elect promised major infrastructure investments, he has yet to convince his own party to get behind that effort. And meanwhile, he has appointed cabinet heads committed to downsizing the federal role in public education and scientific research.
If anything, we are making the inevitable worse, probably a lot worse, by pretending we can kill off these measures of progress and turn back the hands of time to an era when those impending forces either just didn’t matter or did not exist. We can’t, populist demands to the contrary. But someone in this ugly chicken coop better get to planning how to run an entire society on a new economic model that will be shoved down our throats, like it or not.
I’m Peter Dekom, and true leadership – which seems to be entirely lacking in our country – deals with what is and what inevitably will be, to the benefit of the entire society.

The Internet of Hackable Things

Trucks ramming into pedestrians. Bombs planted at public events. Random shootings and stabbings. These are easily observable acts of terrorism for which the perpetrating organization often takes clear credit. But as we progress through an increasingly automated world, one where e-communications of one form or another often determine policy and reaction and our real world systems are run by smart machines, we face an escalating threat of cyberattacks where the perpetrator often prefers to remain cloaked and untraceable. Bots. Malware. Data breaches. Diverting control. Yeah, that stuff.
If we are to believe the CIA, NSA and the FBI, Russia (with the full complicity of its strongman/president Vladimir Putin) pretty clearly actively intervened in American election-related communications to tilt the electorate away from Hillary Clinton and towards Donald Trump. Their method: pay a cadre of clandestine misfits a lot of cash under the table to implement the government’s policy directives – adding a layer of separation and deniability to their efforts. Unfortunately for Russia, some of those misfits spilled the beans.
Such threats strike at the core of our system of government, and I can easily envision where a rogue president might someday declare the American election system to be so untrustworthy that one or more entire election cycles are simply cancelled pending correction to such rogue president’s satisfaction. Fake news is apparently more relevant today than facts.
That is political terrorism, well within the bounds of foreseeability. But there is another form of cyberterrorism that will simply kill people through pre-programmed “accidents” inflicted by hacking systems governed by artificial intelligence or automation. It is invading that nebulous arena, very common today, where machines communicate with other machines: the Internet of things.
Look at some of what could happen: a driverless car or truck that turns into a crowd of pedestrians; a complete shutdown of a power grid; the destruction of our electronic financial infrastructure; the intentional meltdown of a nuclear power facility; the disruption of GPS controls; an oil storage/refining facility that suddenly dumps thousands of gallons of flammable liquid across a large area before igniting the mess; medical implants, controlled by outside monitors, instructed by hackers simply to stop working; or any of the thousands of potential industrial accidents that could take the lives of operators of automated assembly lines… well, you get it.
If the news weren’t full of a litany of serious data breaches, we could just write all this off to a bad science fiction movie. But we know better:
“In mid-December, Uber put its first fleet of self-driving cars on the road in its hometown of San Francisco. The online transportation network company did so in the face of state regulators who say the company needs a permit to keep the vehicles on the road. By noon that first day, a video surfaced online showing one of Uber's Volvo XC90s, equipped with their ‘state-of-the-art self-driving technology,’ running a red light.
“Just a year prior, Chrysler issued recalls for 1.4 million hackable cars after a vulnerability in several models was discovered which gave hackers the ability to control vehicles remotely… ‘It will take some kind of major event to push this type of industry,’ Marshall Heilman, VP of Mandiant Consulting at cybersecurity firm FireEye, told AOL.com… ‘If you look at the automation of cars, obviously the government has to have some type of legislation and mandate to secure that environment. Otherwise, could you imagine if hackers were able to take over a bunch of cars and drive them around; that would be extremely bad,’ said Heilman… ‘Some type of event I think is going to have to occur before the government actually gets involved and sets those particular standards.’
“Heilman sees the trajectory of safety in the automation industry as analogous to the oil and gas industry. ‘Safety is the biggest thing that they worry about now,’ said Heilman of oil and gas companies. ‘And that's because since they've had a number of accidents over the years, the government has stepped in, and there are now all types of mandatory safety requirements and legislation around that particular problem…I expect to see a same thing in the automation industry,’ said Heilman…
“One field where security has seemingly yet to catch up to its innovation is the medical industry. While there have been considerable breakthroughs in defibrillator and other implantable technologies, research suggests these advancements may come with a price. ‘There are certain medical devices that are implanted in human beings that can possibly be hacked,’ said Heilman.
“In October of this year [2016], cybersecurity firm Bishop Fox backed an original report from short-selling firm Muddy Waters which claimed to find a critical and life-threatening vulnerability in St. Jude Medical Inc cardiac implants. If compromised, the report states that hackers could convert the company's Merlin@home patient monitoring devices into ‘weapons’ with the ability to cause cardiac implants to stop providing care, and even deliver shocks to patients… St. Jude has strongly disputed these claims, which are currently under investigation by the U.S. Food and Drug Administration.” AOL.com, December 26th.
We are constantly reminded by various governmental agencies that the United States remains exceptionally vulnerable and unprepared for these increasingly sophisticated cyberattacks. We also know that the United States itself engages in such clearly intrusive behavior, both defensively and offensively. It’s just the way it is. 2014 was a horrible year for data breaches. 2015 was worse. 2016 achieved new depths of hacking, breaking records. 2017 is upon us. How bad does it have to get before we realize how important protecting our digital world has become?
I’m Peter Dekom, and I suspect that the United States might need to reallocate massive portions of its proposed military budget to focus much more heavily on the one vulnerability that could shut down the entire United States: inadequate cybersecurity.

Sunday, January 1, 2017

That Dog Don’t Hunt

It is most interesting to watch the evolution of English as an international language. Until the Internet revolution, for example, sophisticated English-speakers in India – reflecting British rule, which ended in 1947 – spoke with a decidedly upscale London twang. But as the United States began relying heavily on India-based call-centers – “Hello, my name is Bob” – the accents rapidly gravitated toward that flat mid-Atlantic Americanese. There were diction and language schools all over India teaching that American accent. They went for “neutral” American, and today you can almost separate educated Indians by generation simply based on their choice of dialect when speaking English. Older – British. Younger – American.
But even as English (mostly with an American lilt) has become a must-speak for international business, assuming that we remain the major force in global markets (not a certainty anymore), it takes a lot more than an accent to make oneself understood overseas. Americans lace their conversations with lots of idioms and metaphors that just plain don’t travel. The title of this piece is a southern expression for “that’s really not a good idea.” It travels abroad almost as well as “it’s fourth and inches” or “that’s a home run.” Uniquely American sports just don’t carry much meaning overseas.
To most non-North American English-speakers, “football” still means soccer, and if you are referring to the Denver Broncos or the Seattle Seahawks, that’s “American football,” if they even get that allusion at all.
While there are strange pockets of native English-speakers in countries like Singapore (where English is one of three legal languages, along with Mandarin and Malay) or the Philippines (English and Filipino/Tagalog are official languages), local expressions have created dialects that are just plain indecipherable to Americans. Singlish is almost a language unto itself, adding words and phrases (often abbreviated or adapted) from other local languages. An “ah beng” is a badly-dressed hick. Even seemingly pure English phrases have different meanings to Singaporeans. For example, “ah then?” is the sarcastic response given to glaringly obvious questions or statements.
In India, where there are dozens of local dialects not remotely understood by folks from other parts of the Subcontinent, English has become a “subsidiary official” language, one that all educated Indians speak; it is the language of Indian commerce and national political communication (English crosses over local dialects). While 41% of Indians speak Hindi, the linguistic diversity pushes English to the fore for anyone doing business across the country. But there are words that baffle outsiders. They tell you they are speaking English, but... If you’re “out of station” in India, you are out of town. A “dacoit” is a criminal miscreant, while a “crore” is ten million; a “lakh” is a mere hundred thousand. Read the local papers in English and prepare to be most confused.
For native speakers, slipping into idiomatic English can seriously undermine an international dialog; some of us simply need to learn how to edit those idioms out of such conversations. When Americans fall back on their idiomatic habits, they may think they are being clear, but as often as not they simply are not being understood.
With non-native English speakers now vastly outnumbering native speakers, it’s up to the latter to be more adaptable, says Neil Shaw, intercultural fluency lead at the British Council, the UK’s international educational and cultural body. About 1.75 billion people worldwide speak English at a useful level, and by 2020 it’s expected to be two billion, according to the British Council.
“In the Council’s new intercultural fluency courses launched in September, native English speakers in countries from Singapore to South Africa have been prompted to rethink how they communicate. ‘It’s a bit of a revelation to many of them that their English isn’t as clear and effective as they think it is,’ Shaw says.
“Increasingly, English is being used as a lingua franca. ‘It’s not an exotic thing anymore to be working in a global, virtual team,’ says Robert Gibson, an intercultural consultant based in Munich, Germany. ‘It’s everyday life for many people and it’s quite stressful and difficult.’…
“It can be a culture shock for native speakers to encounter new varieties of English…‘The English language is changing quite radically,’ says Gibson. ‘The trend is not to have one or two clear standard Englishes like American English and British English, but to have a lot of different types of English.’
“Chinese English, known as chinglish, and German English, called denglish, are examples, he says. ‘English is also developing within organisations. In companies, they have their own style of English which is not necessarily understood by native speakers. We are getting away from saying that there is a standard English you need to conform to [towards] saying that there are different standards of English for different situations.’
“Mother-tongue English may not even be an advantage anymore, says Dr. Dominic Watt, sociolinguistics expert at the University of York in the UK… ‘It’s not necessarily in your interests to be a native speaker of English because you haven’t had to go through the same learning process that the non-natives have. So they’re all on the same page and it’s the native speakers who are the odd ones out,’ Watt says.” BBC.com, December 16th.
In the end, international English is reaching toward a neutral commonality. It requires those participating in international communications to edit their writing and spoken phrases into expressions that do not require any special regional knowledge (where idioms are born) and that embrace a pretty linear, dictionary-literal meaning. And it varies from country to country. Think you can play that game? We Americans are spoiled, because English is still the go-to international common ground… well, sort-of English. But we need to try harder to speak to be understood. And then: Will French, Spanish or Mandarin displace English over time?
I’m Peter Dekom, and finding common ground in a world that seems to be splintering apart does require the ability at least to speak with each other.