Friday, July 30, 2021

The Secret (Pandemic) Life of Plants


As COVID rises and falls only to rise again, we are deeply focused on human survival and resistance. When nature’s harshness otherwise distracts us, it is generally over the horrific and never-ending aspects of climate change and the disasters that strike human communities: fires and floods, desertification, coastal erosion and severe tropical storms. The subtler changes, not really that subtle, include the migration of disease-carrying insects into warming regions and, perhaps, the migration of human beings away from increasingly barren land.

Global warming also moves plant diseases and harmful, plant-destroying insects into ecosystems that have absolutely no resistance to these rising destructive challenges. Plant killers are moving into unprepared “neighborhoods.” If human beings can be devastated by roiling disease, the same can be said for plant life, on land or in the sea. Many trees, for example, weakened by accelerating heat and dryness, become particularly vulnerable to insect “predators,” fungi and bacteria. Whole forests and jungles can quite literally be eaten alive, destroying the very CO2-absorbing greenery that fights the flood of greenhouse gases and generates precious oxygen for us all.

For selfish humans, arrogantly believing that they have a right to survive above all other life, the issue generally devolves to agriculture: food production. Writing for the July 21st FastCompany.com, Jacqueline Heard (CEO of Enko, a company providing safe and sustainable solutions to farmers facing crop threats) lays it on the line: “Across farms worldwide, there are now warning signs of a pandemic in food crops. More than 600 pest species have developed some form of resistance to pesticides, which causes $10 billion in losses in the United States alone each year. Climate volatility intensifies these threats, and many crops are already suffering—citrus blight and banana fungus wreak havoc for growers and supply chains. With global food supplies vulnerable and food prices at their highest in almost a decade, a plant pandemic could push more people into poverty and cause social unrest.”

Indeed, given the relative inaction in response to the COVID devastation (mostly post-outbreak reaction, much of it slow and ineffective), can we expect much in the way of global preparation for the rising assault on food production, perhaps even a series of plant pandemics that might mirror humanity’s struggles with the COVID virus? Or will we have to face mass starvation before we begin to take necessary corrective steps? What do we face and what can we realistically do? Heard continues: “Agricultural research and development pipelines are long: it takes an average of 10 to 12 years for scientists to discover and develop new products that will help farmers protect their crops from emerging pests, weeds and diseases.

“Growers around the world can’t afford to wait a decade for solutions to the problems they face. Historic droughts in the American West could wipe out a tenth of the San Joaquin Valley’s acreage for agricultural production in the coming years. In Ecuador, a fungus outbreak threatens the banana crop on which 17% of Ecuadorians’ livelihoods depend. Climate change will worsen these problems: as global temperatures rise, pests will emerge in new geographies and new diseases will infect crops.

“The agricultural industry can develop solutions now by investing in cutting-edge technologies and prioritizing safety in their design. For example, CRISPR—which could revolutionize human healthcare—will play a key role in crop health too. Scientists are already using gene editing to develop more resilient seeds and plants that can sequester more carbon. Other technologies that originated in pharma, like targeted protein degradation, also have promising applications in agriculture, from helping control weeds to addressing how intrusive plants are becoming resistant to current agricultural methods. Tech governance should strike a balance between prioritizing safety and supporting these innovations, not dampening them. New technologies designed with safety in mind will meet regulatory standards sooner and get into growers’ hands faster.

“Just as the scientific community tackled COVID-19 with a toolkit of treatments, vaccines, and preventative measures, the ag industry can develop a set of safe, effective resources with which growers can rapidly respond to emerging threats.

“This is already happening: just as the Food & Drug Administration granted emergency use authorization (EUA) of COVID-19 vaccines, the Environmental Protection Agency has granted EUA to treat plant diseases like allowing chemical fungicides for use against coffee rust in Hawaii. Stopgap measures like these prevent existing problems from getting worse, but they can also force growers to use older, more toxic ingredients because there are no other options available.” It can become a vicious spiral, as that increasing toxicity adds a litany of additional challenges to human health. Heard also argues fiercely for greater transparency, with nations cooperating to find solutions rather than hiding their severest problems.

“During the pandemic, unclear communication from health authorities and misinformation led to widespread distrust of COVID-19 vaccines. As a result, the U.S. is still struggling to contain the virus even though vaccines are available to almost everyone.

“The agricultural industry has faced similar challenges. Its lack of transparency with consumers about concepts like organic farming and GMOs has led to misunderstanding and distrust with serious consequences. Some organic farms are causing more environmental harm than good, and the backlash against GMOs has undermined progress toward more resilient crops.

“Rebuilding consumers’ confidence in its innovations requires the ag industry to trust that the public can handle the facts. That means sharing more information on the safety and environmental impacts of its practices than regulators require.” Heard.  The “one true thing” in this arena is not “if” but “when” and “how much.” Waiting and not engendering global cooperation just might kill hundreds of millions of people… and more than a little dependent plant and animal wildlife along the way. 

I’m Peter Dekom, and the threats to plant life, especially to agriculture, are here now and getting worse, so we can simply suffer and react or begin to anticipate and prepare.


Thursday, July 29, 2021

Oh, Do We Owe


There are so many issues with healthcare coverage in the United States, starting with a dramatic lack of uniformity from state to state. In the developed world, the US is a definite outlier; as we all know, we are the only developed nation on earth without universal healthcare. Universal healthcare in Germany, for example, covers just about any medical condition; it even routinely covers dental work, vision care and hearing aids, elements excluded from all but the most expansive and expensive private insurance policies in the United States. The German program, administered by private insurance companies, charges a percentage of income for the care, not atypical of such programs. Germany caps monthly prescription charges at €10, and the other minor charges amount to almost nothing. Americans also pay double or more per capita in annual medical costs compared with the average for the rest of the developed world.
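To make the “percentage of income” model concrete, here is a minimal sketch in Python. The contribution rate, employer split and income ceiling below are assumptions for illustration, roughly approximating the German public system around 2021; they are not official figures.

```python
# Minimal sketch of an income-based premium, loosely modeled on the
# German public system described above. All constants are illustrative
# assumptions, not official figures.
CONTRIBUTION_RATE = 0.146   # assumed share of gross income
EMPLOYEE_SHARE = 0.5        # employer assumed to pay the other half
INCOME_CEILING = 58_050     # assumed annual income cap (EUR); income above it is not assessed

def monthly_premium(annual_income_eur: float) -> float:
    """Employee's monthly health premium under the assumed rules."""
    assessed = min(annual_income_eur, INCOME_CEILING)
    return assessed * CONTRIBUTION_RATE * EMPLOYEE_SHARE / 12

if __name__ == "__main__":
    for income in (30_000, 60_000, 120_000):
        print(f"EUR {income:>7,}/yr -> EUR {monthly_premium(income):7.2f}/mo")
```

Note how the assumed ceiling makes the premium flat above a certain income, quite unlike the deductible-and-co-pay pricing typical of US private policies.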

And while the Affordable Care Act (aka “Obamacare”) was passed in 2010, between Congress and the Supreme Court its coverage has been left underfunded, and its additional costs remain a burden to many. One aspect of the ACA was the ability of states to expand their Medicaid programs to cover a larger swath of lower-income participants. The costs to the states are relatively small, but almost all the red states, already vehemently opposed to government’s providing healthcare to everyone, failed to participate.

What is Medicaid anyway? “Medicaid is a health insurance program offered in the United States. It is jointly funded by the federal government and each state. It is provided for individuals who qualify based on income, age, or health need. Those who receive Medicaid are usually children, adults with a disability, or low-income older Americans.” Vittana.org. And obviously, it is the states that set the eligibility rules. Medicare, by contrast, is the medical coverage provided to the elderly as part of the return on a lifetime of payroll contributions.

The United States is home to so-called “medical bankruptcy,” which reflects not so much income loss from disability as defaults on medical bills, generally resulting from lack of coverage, deductibles, exclusions and co-pays. Just having insurance does not mean a covered individual has no large medical bills. Even Medicare, with its “doughnut hole,” doesn’t cover the full cost of prescription drugs, and Social Security benefits get taxed in part to cover Medicare. You’d think that we would get these costs under control, but instead of watching medical debt subside, we are watching the opposite. It’s not just that Americans are unable to pay their medical bills (a number, unsurprisingly, significantly higher in red states that have not expanded Medicaid); much of that debt is in the hands of collection agencies, which can be vicious in their pursuit, pushing many over the edge into bankruptcy.

“New research published Tuesday [7/20] in JAMA finds that collection agencies held $140 billion in unpaid medical bills last year. An earlier study, examining debts in 2016, estimated that Americans held $81 billion in medical debt.

“This new paper took a more complete look at which patients have outstanding medical debts, including individuals who do not have credit cards or bank accounts. Using 10 percent of all credit reports from the credit rating agency TransUnion, the paper finds that about 18 percent of Americans hold medical debt that is in collections.

“The researchers found that, between 2009 and 2020, unpaid medical bills became the largest source of debt that Americans owe collections agencies. Overall debt, both from medical bills and other sources, declined during that period as the economy recovered from the Great Recession…

“The $140 billion in debt does not count all medical bills owed to health care providers, because it measures only debts that have been sold to collections agencies. The increasing number of lawsuits that hospitals file against patients to collect debt, which can lead to legal fees or wage garnishments, are not included in the figure. Nor are the medical bills that patients pay with credit cards or have on long-term payment plans. Some of the difference between the new estimate and the older, smaller one may reflect differences in how different credit rating agencies categorize debts… The new paper does not include data during the coronavirus pandemic, which is not yet available.” New York Times, July 20th.

Why? Uncapped and uncontrolled mark-ups, staggering profits and an attitude of pressing the market as far as it can be stretched. The Obama administration had to make major concessions to pharmaceutical companies and insurance carriers to end their massive lobbying efforts against the bill, even when it was presented to a Democrat-dominated Congress. It’s not as if healthcare lobbyists only have Republicans in their pockets (which they definitely do), but their dollars also fund a number of Democrats desperate for campaign funding in a highly expensive and competitive field. There are so many well-funded and greedy players in healthcare solidly arrayed against a healthcare system that could make economic sense.

Everybody knows the system does not work. But when 20 red-state attorneys general, with the full cooperation of the Trump Department of Justice, attempted to throw the entire ACA out the window as unconstitutional, a move that would have cut millions and millions of Americans off from insurance coverage, you just know those monied interests are not going to allow “reasonable” to prevail without a bloody fight. Ever the populist, during his initial presidential campaign, Donald Trump promised a replacement for the ACA that would extend greater coverage to more people at a lower price. In the entire Trump term, there was never a single attempt to articulate what that coverage would look like; the entire effort was focused on eliminating the paltry coverage that existed, a challenge the U.S. Supreme Court rejected anyway.

There’s just too much bribe money… er… unlimited Citizens United vs FEC campaign contributions funded by stakeholders in the healthcare industry primarily to generate profits. There are so many massive issues in the United States that can only be solved by concerted governmental action. Tax cuts for the rich have created massive federal deficits and cuts to state programs, depleting funding for so many necessary solutions to dire issues. Healthcare. Childcare. Unaffordable college tuition and resulting staggering student debt. Quality public education in general. Climate change. Infrastructure. Gun violence. Homelessness. Social safety nets. Pandemic recovery. But the players with the most money, unleashed to spend it to proselytize why hog-slopping richness is good for the country, are fighting back. You know it when a very sane and reasonable policy is labeled “creeping socialism.” We should also know that phrase is a buzzword deployed against fair taxation of the mega-rich to solve the direst problems this nation has faced since WWII.

I’m Peter Dekom, and have we become so completely used to advertising manipulation that we make our most important decisions based on labels without looking at the substance?


Wednesday, July 28, 2021

Charge!


It is the Achilles heel of electricity-powered distance travel, and the reason I bought a hybrid instead of an all-electric vehicle: limited range. Setting aside a few very expensive mega-battery-capacity cars and trucks, the average electric vehicle seldom realistically achieves more than 200, maybe 250, miles of travel, no matter what the brochure claims. Charging stations still cannot “refuel” an electric car as quickly as a traditional gas/diesel filling station, and there remains a major dearth of charging stations within reasonable distance of one another. In plotting an upcoming drive from Los Angeles to Santa Fe, NM, I was shocked that even some reasonably upscale hotels lacked any vehicle charging capacity.

Even as Biden’s infrastructure dreams envision a massive network of accessible and well-placed electric vehicle charging stations, we are a very long way from that reality. For most of us, commuting and running errands generally require no more than 40 miles for a round trip. Thus, if local driving is what is needed, electric cars are more than adequate. For longer distance travel, not so much.

There are some general principles at work here. Cars with empty or near-empty batteries charge much faster, soaking up current like electrical sponges. The higher the voltage of the charger, the faster the charge, assuming that the vehicle can accept those exceptional voltages. So here are some basics from PluginCars.com (presented by Brad Berman, 4/24/19):

The slowest form of electric car charging, at 120 volts, is called Level 1. And mid-range 240-volt charging is called Level 2. But ultrafast charging that can deliver juice between 50 and 350 kW is not called Level 3 as you might expect. According to official terminology from the Society of Automotive Engineers, it’s called DC Quick Charging or DC Fast Charging—sometimes abbreviated as DCQC. The decision not to use the term Level 3, and to call it "quick charging" instead, makes some sense for two reasons:

 

  • First, in Level 1 and Level 2 charging, common alternating current (AC) electricity is fed to the car where it is converted by an onboard charger to direct current (DC) before going to the battery pack. In DC quick charging, the charger is located outside the vehicle. This large piece of equipment handles the AC-to-DC conversion and supplies DC electricity to the battery at a much higher rate.

  • Second, while all electric cars can accept the first two levels of charging, only EVs that carry special quick charging equipment can take advantage of those walloping big jolts of electric fuel. Quick charging is a different animal than Level 1 and Level 2.

 

On a technical basis, you could look at the voltage as an indication of charging speed. Each level of charging essentially doubles the voltage. Jumping from Level 1 to Level 2 means an increase from 120 volts to 240 volts. Likewise, quick charging doubles voltage once again to 480 volts, which is often rounded off to 500.

A more useful metric is the number of miles of range that are added for every hour of charging. Before firing off an angry email, please know that these numbers represent a general rule of thumb rather than any guarantee. With Level 1 120-volt charging, you can add about 4 miles of range every hour. That’s slow, just the way a car that drives 4 miles per hour is barely moving. Level 2 240-volt charging adds around 25 miles of range in an hour. That’s a better speed, just the way traveling 25 mph in a car is good for many city situations.

But quick charging theoretically increases things to Autobahn speed: 100 miles, or more, of added range per hour. The even faster DC quick chargers emerging in 2019 will slice that time down to 15 minutes or less. In practice, due to many factors, the speed of quick charging does not run at a steady pace.
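Taken together, those rules of thumb invite a quick back-of-the-envelope calculation. The Python sketch below uses the per-hour range figures quoted above to estimate how long each charging level needs to add a given number of miles; the trip sizes are hypothetical examples, and real-world charging tapers as the battery fills, so treat the output as rough guidance, not a guarantee.

```python
# Back-of-the-envelope charging-time estimates, using the rule-of-thumb
# miles-of-range-per-hour figures quoted above from PluginCars.com.
# Real charging speed varies with battery state, temperature, and the
# vehicle's onboard limits; these numbers are illustrative only.
MILES_PER_HOUR_OF_CHARGING = {
    "Level 1 (120V)": 4,     # ordinary household outlet
    "Level 2 (240V)": 25,    # home/public AC charging
    "DC Fast (~480V)": 100,  # quoted as "100 miles, or more, per hour"
}

def hours_to_add(miles_needed: float, rate: float) -> float:
    """Hours of charging needed to add `miles_needed` miles of range."""
    return miles_needed / rate

if __name__ == "__main__":
    # Hypothetical cases: the ~40-mile daily round trip mentioned
    # earlier, and a 200-mile leg of a longer road trip.
    for miles in (40, 200):
        print(f"Adding {miles} miles of range:")
        for level, rate in MILES_PER_HOUR_OF_CHARGING.items():
            print(f"  {level:>16}: {hours_to_add(miles, rate):5.1f} hours")
```

On those numbers, the daily commute is an overnight Level 1 affair, while a 200-mile leg is realistic only with DC fast charging, which is exactly why road trips remain the pain point.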

We also know that whether or not an electric car is truly emission-free depends on where and how the electricity at the charging station is generated. If the electricity comes from a coal-fired plant, well, the car may itself be emission-free, but the effective “real rate” of effluents is hardly “carbon neutral.” As the power supply behind these alternative vehicles evolves, the future may reverse that risk, perhaps justifying the cost and the effort.

But as Mark Wilson, writing for the July 14th FastCompany.com tells us, we better get used to this ascending reality: “By 2030, an estimated 20% of all cars sold will be electric vehicles. And by 2035, states like California will ban the sale of gas-powered vehicles. Soon, the gas station, as we know it, could be extinct, replaced by an exponential demand for electric vehicle charging stations. And that will change the way we go on road trips forever.” Concomitant with the switch to electrically powered vehicles is the parallel introduction of self-driving control systems, which certainly allow older infrastructure to carry more traffic.

Wilson also notes that since the rollout of charging stations will be exponential, fixed signs pointing the way to nearby charging stations may be replaced with digital signs capable of reflecting the accelerating, constant addition of new stations; the information is simply updated as needed. “If your car has a self-driving vision system, it could even be programmed to recognize a code on this sign and route your GPS right to the site. Meanwhile, a ring of LEDs around the display could glow red, green, or any other color to indicate if the attraction is open or closed.” Wilson.

And while engineers are developing larger-capacity batteries, perhaps with faster refueling times, maybe even without relying on expensive and environmentally toxic rare earth metals, electric vehicles on longer trips face serious time constraints for the foreseeable future: “It takes just 2-3 minutes to gas up a car. Meanwhile, the common DC Fast Charging standard used by electric cars today requires 20-40 minutes to juice up the vehicle. That means people traveling across the country will have serious time to kill, the sort of time that would be wasted looking through mystery meats spinning on the truck stop’s roller grill.

“Given the fact you’ll need to spend half an hour charging, no matter what, why not take 5-10 minutes to see a new place?” says Gadi Amit, founder of NewDealDesign [which is creating those new digital signs].” Wilson. As they say, get used to it.

I’m Peter Dekom, and since we are moving into this alternative fuel era, I thought describing some of the variables might be useful to my readers.


 

Tuesday, July 27, 2021

American Economic Steps and Missteps – Defining Us Today


Huge monetary and fiscal decisions in our modern era have pretty much twisted and squirmed to set our current economic stage. We’ve ignored general economic principles – perhaps none greater than the “guns or butter” maxim (read: you can have lower taxes or widespread military conflict, but not both at the same time) – and paved new economic ground. I generally treat “modern” as post-World War II.

The post-WWII United States benefitted greatly from the pre-war New Deal focus on infrastructure, particularly our overbuilding of hydroelectric generating capacity with a surfeit of dams. After the war, we were the only large, developed nation left largely unscarred by bombing and artillery damage. Our factories went into overdrive, returning GIs had “benefits” that generated affordable home ownership and college educations, and unions provided workers paid well enough to become middle-class consumers. While most of the rest of the world was engaged in rebuilding, Americans were digging into consuming. In 1956, President Eisenhower (R) signed the National Interstate and Defense Highways Act, which began our incredibly significant, economically productive Interstate Highway System.

The Korean War could have been a huge drain on our economy, but it lasted only three years, ending in 1953. While the Vietnam War technically began two years later, American involvement did not “escalate” significantly until 1964, and it did not end until the fall of Saigon eleven years later. But we began some nasty economic habits that became routine: government domestic entitlements and military costs rose, but taxes were cut. Nixon’s Tax Reform Act of 1969 did not pare a lot of income tax – indeed, his alternative minimum tax even taxed a few rich folks more – but it still represented a cut at a time when we were still at war. Butter and guns. Vietnam began our slow march toward massive federal deficits. But Nixon also implemented a policy that changed the economic complexion of global trade forever.

We take a global economy for granted today, but… “At the end of the Second World War, there was literally no functioning global economy, so nations got together to create a new trading system and a new monetary system. That monetary system was devised in a town in New Hampshire called Bretton Woods, so it was called the Bretton Woods Agreement. One of the key elements was that the dollar would be pegged to gold at $35 an ounce. Other central banks could exchange the dollars they held for gold. In that sense, the dollar was as good as gold. Every other currency had a fixed exchange rate to the dollar.

“They established the dollar-gold standard to create some predictability and stability for global commerce. For the next 25 years, it was a tremendous success… When the Nixon administration came into office in 1969, they realized that the world economy had grown very, very big. Everybody wanted dollars, so the Federal Reserve was printing lots of dollars. As a result, there were four times as many dollars in circulation as there was gold in reserves.

“The rate of $35 for an ounce of gold was good in 1944, but it hadn’t changed, so by 1971 the dollar was really overvalued. That meant imports were very cheap, and exports were very expensive. We experienced our first trade deficit since the 19th century. We were experiencing employment problems. For the first time, the U.S. started to talk about losing competitiveness…

“On top of all that, there was the beginning of inflation. If it continued long enough, dollars would be worth less than they were before. The Nixon Administration was afraid that other countries were going to ask for gold and the U.S. wouldn’t have it. That would have been an enormous humiliation and a breaking of their commitment to exchange gold for dollars…

“In August 1971, President Nixon took his top economic advisors to Camp David. Over three days, they made the radical and momentous decision to cut the dollar loose from gold. In the process, they unilaterally changed the whole global monetary system… Nixon masterfully created a situation where suddenly countries understood that they needed coordinated policies to deal with finance, trade, energy, and food. We entered a period of enormous international cooperation on the heels of this very tough decision that Nixon made at Camp David.” Author and Yale School of Management Dean Emeritus Jeffrey Garten, in the July 19th Yale Insights. But delinking the dollar from a tangible store of value also enabled a new era of deficit spending.

As the Soviet Union began to crumble, President Ronald Reagan began his supply-side/trickle-down tax policy, a platform that, even when thoroughly discredited, would remain an immutable core of the Republican Party for all the years that followed, well into the present day. “In 1980 Reagan promised those cuts and over his next 2 terms, he cut taxes to the lowest since the 1920s when the top Personal Income Tax rates were lowered from 73% to 25% in the Revenue Act of 1921, the Revenue Act of 1924, and the Revenue Act of 1926. When the tax cuts were finally put into the tax code, one of the longest peacetime expansions in history began… [while t]he budget deficit increased from $74 billion in 1980 to $221 billion in 1990.” Wikipedia. The tax code needed adjusting, and the efforts seemed to pay off… at first. But we were effectively living on borrowed money. America was beginning an addiction to quick fixes that would have serious repercussions for years to come.

Debates over “entitlements” and “upgrading and expanding our infrastructure” fell victim to keeping taxes low for the well-heeled. “While the average income grew 75% and the median income grew 10% from 1980-1989, this disparity shows that while top earners made huge gains, bottom earners' income grew slower than the inflation rate for the same decade… The US Federal Tax Revenue as % of the GDP decreased from 18.5 to 17.4 from 1980–1990.” Wikipedia. The long march to severe income inequality had begun. We also learned that wealthy taxpayers do not instantly create new jobs and invest in research and development when they receive a massive tax cut windfall. A rising tide definitely did not float all boats. But that notion is still the most basic Republican economic platform. Even though it has never ever worked, it just sounds so right.

But Democrats also got it wrong, very, very wrong. Passed as part of the 1933 Banking Act, the Glass-Steagall provisions were intended to separate commercial banks, which accepted deposits, made loans and were insured by the FDIC, from investment banks, which brought securities issues to market, acted as brokers and traded previously issued securities. Responding to pressure from the financial sector, which felt that the separation put it at a competitive disadvantage, President Bill Clinton led the charge to repeal those restrictions in November of 1999. Effectively, as commercial banks, investment banks and trading institutions merged, they gained access to absurdly inexpensive Federal Reserve bank rates, which, instead of creating more liquidity for the rest of America’s economy, generated cheap loans for investment banks and brokerage houses to invest on their own account.

Mired in a world of derivatives, including bundles of subprime mortgages, our leading financial institutions began overleveraging based on profoundly overvalued assets. The massive economic fall, as institutions like Lehman Bros. and Bear Stearns collapsed and Merrill Lynch survived only because of a forced merger, hit well after Clinton left office: 2007-2010. Bailouts and emergency federal support squeezed “salvation” from a system that simply proved that “too big to fail” financial institutions could never be trusted to self-regulate.

I’ll end this diatribe with the US proclivity to have both guns and butter, most severely illustrated by the 2017 Trump corporate tax cut, one that created no major shift to more solid jobs but simply inflated the asset values owned by the mega-rich at the expense of the rest of us. It came right in the middle of our multiple-trillion-dollar “longest wars ever” in the Middle East and Central Asia. Deficits soared. Income inequality went on steroids.

Deficits in times of crisis, like WWII and the recent/current pandemic, are normal. Since global crises of this kind spread the economic nasties everywhere, the deficits generated within the US to right our ship do not put us at a competitive disadvantage against the rest of the world. But whether we invest in our future or simply continue to coddle the rich will clearly define who we are for the foreseeable future. Too many Americans simply believe the slogans of politicians seeking office, assuming economics is too complicated to attempt to understand. Not really!

I’m Peter Dekom, and once in a while we need to step back and understand our current economic realities and how we got here.

Monday, July 26, 2021

Unsustainable, A Perfect Storm


The tea leaves are accumulating, as we struggle through the “pandemic of the unvaccinated,” suggesting that our explosive economy of soaring values just might not be sustainable. Aside from our own issues with particularly virulent COVID outbreaks, among those too young yet to be vaccinated or those too stubborn to get vaccinated and understand the risk they pose to themselves and the rest of us, we must realize that a largely unvaccinated world “out there” is a breeding ground for toxic COVID variants. And viruses do not respect international borders. Sooner or later…

There is a confluence of variables, beyond the above health issues, that also portends a dramatic fall in values, from stocks to real estate. Unfilled jobs tell some economists that there is plenty of growth left in the system, but most of the workers so affected are in the lower-paying labor force. Right-wing proselytizers tell us that if we pay them too well, if we provide universal health and childcare, the economy will collapse. You can always tell when rich people do not want to elevate those at the bottom or middle rungs of the economic ladder; they always refer to “creeping socialism.” They want more money for themselves. They simply focus on the word “social” to conflate “socialism” with “social programs.” By way of example, public primary and secondary schools are just “social programs”; government ownership of businesses and real estate is true “socialism.”

But well beyond the war of words is the level of disruptive polarization that has ripped this nation apart. And as any economist will tell you, given enough destabilizing disruption, sooner or later the underlying market confidence will erode, taking the economy with it. We have culture wars. Almost a third of our nation believes that Donald Trump actually won the 2020 presidential election. Taking the ability to vote away from liberal-leaning minorities is now a mandate for populist conservatives who do not see any other way to hold power. And these are simply the political factors that suggest a severely contracting economy is in the wind. Our youngest citizens entering the workforce face staggering student loan repayments and home costs that they cannot afford.

While the seeds of this collapse were born of the slam of the pandemic combined with the ineptitude of the Trump administration, the fall will likely occur during the Biden years, and he and the Democrats will inevitably be blamed for it. Trapped in the middle of this concern is the Federal Reserve, dealing with issues like interest rates and money supply. If it allows interest rates to rise to a more natural level, highly leveraged corporate America and an overheated real estate market, priced on monthly carrying costs rather than true values, will face a plunge in share prices and real estate values. The Fed cannot keep interest rates at this level forever. But…

When Trump’s 2017 slash of corporate tax rates from 35% to 21% pumped one of the biggest windfalls to corporate America in history, companies simply bought back their own shares or effected mergers and acquisitions, a bidding war that exploded the value of the stock market while failing to create greater spending on job growth or research and development. Abnormally low interest rates further fueled this fury. In short, share prices increased solely because the government gave companies a whole lot more cash. There was not so much “there” there.

When the pandemic hit, some businesses (those dependent on public attendance) faced a slump, while others found a mechanism to pare their labor force (they claimed the cuts were temporary, but all too often they were permanent) and used the downtime to install job-replacing, artificial-intelligence-driven automation. That moved the stock market even higher. Income inequality got so much worse; the United States now had the highest level of such inequality in the entire developed world. By a huge margin. Shareholders and senior management were earning embarrassing and unprecedented levels of income and compensation. Workers, less.

As COVID restrictions eased, pent-up consumer demand exploded, but can this represent anything more than a temporary market condition? And exactly what does the litany of climate-change-caused natural disasters – from desertification to coastal erosion to wildfires to flooding – tell us about our economic future? How about housing costs?

Low interest rates pushed the real estate market into overdrive. Decades of underbuilding also dropped housing inventory through the floor. Short supply and lower rates priced housing not on the true value of “house plus land” but on affordability: people made “buy” decisions based on monthly carrying costs, not price. “Over asking price” sales were now common. For those who owned a starter home, now was the time to capitalize on that home’s hyper-growth in value and step up to the next level before the inventory contracted even more. Frenzy. For renters, as values soared, rents went up. Pandemic layoffs and vaporizing small businesses added defaults and evictions to the litany of problems. Governmental protections and stimulus checks would obviously run their course. And then what?

“According to [their Chief Economist Mark] Fleming, in April, First American Data & Analytics’ nominal house price index increased 16.2% year over year, the fastest pace since 2005, and rapid appreciation is driving declines in affordability, despite rising incomes and lower mortgage rates.

“‘Nationally, according to our Real House Price Index, housing affordability declined in April on a year-over-year basis by 7%, the most since December 2018,’ he said. ‘Furthermore, homes typically remained on the market for 17 days in April, a record low. Multiple-offer bidding wars are common across the full spectrum of home prices.’” DSNews.com, June 28th.

The market continues its overheated ways, and that is likely to persist for quite a while. The reality today is that the average American can no longer afford to buy a home in 70% of the United States, particularly in the hot-job-market cities. Rising rental rates combined with increasing gentrification are pushing too many Americans into exceptionally long commutes; they are forced to move farther away from job-convenient housing that is no longer affordable. While homelessness is often a product of factors that include mental illness, disability and addiction, increasingly it is a direct result of housing costs.

So here we sit, watching a series of contradictory forces push against each other as jobs go unfilled, housing prices continue to soar and share prices increase. Yet the last time we experienced such political polarization, the nation erupted into the Civil War. The pandemic, which was beginning to fade in this country, is coming back with a vengeance in the form of the Delta variant. Jobs go unfilled mostly because workers want a livable wage. With the polarization of home ownership and income inequality at unprecedented levels, we seem to have sacrificed upward mobility and, with it, hope. We really need a big “adjustment” based on a harsh reality check. When it comes, unfortunately, history tells us that those of us in the middle and at the bottom of the economic ladder will bear the brunt of that contraction.

I’m Peter Dekom, and who among us truly believes that these purported “good times” are completely sustainable for the long haul?


 

Sunday, July 25, 2021

Magnanimity It is Not


We know that Fortune 100 CEOs generally earn over 300 times the pay of their average workers; Fortune 500 CEOs earn over 200 times. The ratio is up from 120-to-1 in 2000, 42-to-1 in 1980 and 20-to-1 in 1950. The numbers are obviously staggering, and the system feeds on itself: companies, believing they must compete for top executives by lavishing absurd compensation packages, have simply bid these pay levels up from merely outrageous to utterly unjustifiable. During the pandemic, ordinary employees who were not furloughed or permanently discharged faced average pay cuts of between 13% and 35%. That was a real hit. As we shall see, for top corporate executives, life remained good, very, very good.
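For a sense of scale, here is a minimal Python sketch of what those cited pay ratios imply. The $60,000 average worker salary is purely a hypothetical round number for illustration; the ratios are the ones quoted above.

```python
# Illustrative arithmetic only: what each era's CEO-to-average-worker
# pay ratio implies, assuming a hypothetical $60,000 worker salary.
AVG_WORKER_PAY = 60_000  # hypothetical round number, for illustration

# Ratios cited above (Fortune 100 today; historical figures per the post).
RATIOS = {
    "1950": 20,
    "1980": 42,
    "2000": 120,
    "today (Fortune 100)": 300,
}

for era, ratio in RATIOS.items():
    print(f"{era:>20}: {ratio:3d}-to-1 implies CEO pay of ${ratio * AVG_WORKER_PAY:,}")
```

At that hypothetical worker salary, the same job that implied $1.2 million in CEO pay under the 1950 ratio implies $18 million today, which is the “feeds on itself” dynamic expressed in numbers.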

Even before the pandemic, compensation was rising across the board, though disproportionately. “The Economic Policy Institute, a Washington think tank, studied 281 large firms and found that total CEO compensation rose 16% from 2019 to 2020 while the annual pay for the average worker in those companies rose only 1.8% in that same period. The study attributed the executive compensation growth to a rebound of the stock market in 2020 after the initial pandemic slump… An analysis by the Wall Street Journal in April concluded that the median pay for the CEOs of more than 300 of the largest U.S. public companies rose to $13.7 million in 2020, up from $12.8 million a year earlier.” Hugo Martín, writing for the July 16th Los Angeles Times.

But when the pandemic started shutting down or contracting businesses, embarrassed by their riches, well-heeled CEOs took to claiming they too would share in the pain with substantial cuts to their base salaries. “Most of the base salary cuts made by executives in 2020 were ‘a public relations gimmick’ and ‘almost inconsequential,’ said Lawrence Mishel, a distinguished fellow at the Economic Policy Institute who has studied executive compensation.

“The announced cuts, experts say, could have been made to show shareholders, board members and employees that they were sharing the pain or to send a message to lawmakers who were being lobbied to approve federal grants and loans to airlines, hotels and other struggling businesses.

“David Larcker, an accounting professor and executive compensation expert at Stanford University, described salary cuts by executives as ‘a good gesture’ but added that it ‘doesn’t mean that at the end of the day these guys are earning less compensation.’” LA Times. Huh? Because “base salary” for those at the top usually represents less than 20% of their total compensation. It does not include bonuses, which are often tied to share price, options or stock appreciation rights. It also excludes the alarming level of perks that did not cease during our medical catastrophe. Spin, not substance.

“The base salaries cut by executives represented a small portion of the overall compensation packages. The Economic Policy Institute study called such cuts ‘largely symbolic.’… Calculating an executive’s total compensation is not simple because most of the pay comes in the form of stock and options that only come into play if the company meets certain financial milestones over several years. This creates an added incentive for the executives to increase the value of the company’s stock — and thus increase their own compensation. The stocks and options are reported to the SEC based on the value on the day they are awarded, not on the value they may achieve when they are vested in the future…

“Representatives for several executives whose compensation packages increased in 2020 despite announced reductions in pay defended the company leaders, saying they had no role in causing the financial crisis but volunteered to cut their base salaries to demonstrate leadership amid layoffs or furloughs… They also say that the stocks and options awarded to the executives in 2020 may drop in value in the future, depending on stock market conditions and whether their companies meet specific financial milestones.

“But the Los Angeles Times analysis found three companies — including Hilton — that modified payout rules or adopted new awards in 2020 to offset the loss in compensation that their executives faced because the companies failed to meet milestones. These changes were made to retain executives or reward them for leading the company through the difficult pandemic, the companies said.” LA Times. When these executives are making tens of millions of dollars a year in total compensation, when is enough, enough?

When the Trump-led 2017 corporate tax cut dropped the rate from 35% to 21%, creating a massive and continuing trillion-dollar-plus annual deficit, income inequality soared to unprecedented levels. The reality of high-income excess was simply glossed over, and any attempt to equalize this rising egregious anomaly was simply dismissed by Republicans as “creeping socialism.”

This nation needs a major infrastructure overhaul, an economic eradication of a severely polarized spread of wealth (where the top 1% owns the same wealth as the entire bottom 60% of the nation combined) and an introduction of what is standard in the rest of the developed world: programs like universal healthcare, childcare, affordable education and housing, and job-creating research. If you think taxing the rich more is unfair, remember how much less the same levels of executives earned as a multiple of average pay (as noted above) in 1950, 1980 and 2000. It’s time to get real and tax accordingly.

I’m Peter Dekom, and while I have no issue with those who are able to get rich, I have a very big issue with a governmental system that is so clearly biased in favor of the rich at the expense of just about everyone else.