A Brief History of Offshoring
The Industrial Revolution marked a drastic change in the makeup of the American economy. Agriculture, once considered the key economic driver, gave way to industry seemingly overnight. While the Industrial Revolution was arguably the most significant economic shift in history, this pattern of adaptation and transformation has continued, with recent changes driven largely by technology and a continuing shift in customer demand. Businesses are constantly evolving, whether adapting to the current environment or pursuing the next opportunity, and each shift in the economy's drivers brings a period of adjustment. As industry evolves, businesses are forced to do the same.
After the Industrial Revolution, one of the next major changes to the economic makeup of the manufacturing industry started to take hold in the 1970s. While some companies had dabbled in it earlier, it was around this time that businesses began to explore the concept of outsourcing more aggressively. With a goal of improving efficiency and reducing costs (thereby increasing profitability for stakeholders), shifting production overseas seemed to be the perfect solution to meet many business needs and overcome some of the major challenges companies were facing. It offered:
At the time, most considered labor costs to be the most important factor, and that remains the case today. With minimum wage on the rise, and the need to provide health benefits, pensions, and other forms of compensation to their employees, corporate profits were starting to dwindle. China, seeing this as an opportunity to grow its own economy, became one of the most sought-after destinations for companies to outsource their work. Not only did it offer lower labor rates, but it had an established, skilled workforce and the facilities to accommodate the work and support the process. As a result, the term “offshoring” became a staple of the American economy.
The lessons learned during the Industrial Revolution show that those who are able to adapt, and who have the foresight for change, are usually the ones who differentiate themselves from their competitors and, ultimately, succeed. The digital revolution that began in the late 20th century continues to gain momentum. Over the past decade, the business world started to acknowledge a new era of innovation, now known as Industry 4.0. Combining hardware and software innovations with cyber-physical systems, this new wave is built on communication and connectivity. Terms such as the Internet of Things (IoT), robotic process automation (RPA), and bots began working their way into thought leadership conversations. This new revolution harnesses big data for faster, more informed management decisions, and emphasizes speed and accessibility through dashboards, KPIs, and data links. Enterprise resource planning (ERP) systems continue to become more robust and automated today. This does not necessarily mean that human labor will no longer be required; rather, the skills and job tasks are changing.
The Tax Cuts and Jobs Act of 2017 (TCJA) offered several incentives for businesses to bring manufacturing jobs back to the United States. The jobs the TCJA was looking to onshore are jobs that have been shipped overseas since the early 1970s (earlier, for some). Despite all the incentives to do so, the process comes with significant challenges. Think of the manufacturing industry and its related supply chain as a large vessel that has been underway, in one direction, for over 30 years. During this time, companies have been making substantial investments in overseas infrastructure. Ships don't have brakes; for them to turn, all the oars need to be pulling in the same direction. All of this starts with the financial impact: the move needs to make economic sense and maintain measurable shareholder value. Companies considering onshoring face the following challenges:
One of the areas that could be drastically altered by the COVID-19 pandemic is the commercial real estate market. With offices closing due to CDC guidelines, forcing companies into a remote workforce, many businesses learned they might not need as much space as they currently have. Having invested heavily in the ability to work remotely, companies will not only want a return on that investment, but will be looking to cut back in other areas of their operating budget, like rent. There could be a significant decrease in square footage leased by companies as leases begin to expire, thereby increasing the supply. Basic laws of supply and demand dictate that as supply increases, prices will decrease, making it more affordable for companies to buy or lease manufacturing space in the U.S.
Interest rates remain at an all-time low. This can provide many businesses the opportunity to make an investment in the equipment necessary to bring manufacturing back to the U.S.
Due to CDC guidelines, all companies need to be able to do more with less. As such, they will look to invest in smart equipment and other infrastructure to limit the number of workers needed on the shop floor. Under Industry 4.0, the skills required in the industry were already shifting toward more of a supervisory role, and with several companies unable to survive this pandemic, or forced to scale back substantially, there could be more talent available for those companies looking to expand or bring work back to the States.
For some, onshoring their manufacturing operations could mean abandoning prior investments in overseas infrastructure. While there are Opportunity Zones and other economic incentives to invest in real estate for such activities, the question becomes: is it worth it?
2020 started with overwhelming optimism. Building off a fairly strong 2019, the economy was stable with projected growth.
As the effects of the COVID-19 pandemic started overseas in China, worked their way through Europe, and eventually took hold in the United States, a new reality started to set in. Brick-and-mortar retail came to a screeching halt. While customer demand remained high, nearly all purchases shifted online. Those who had e-commerce channels set up prior benefited, while those who did not scrambled to get themselves established. The good news (if there ever was such a thing during a pandemic) was that most manufacturers were deemed essential and were still able to operate. However, CDC guidelines forced them to seriously alter how they did business. Workers were required to maintain social distancing, reducing the number of workers available on the floor. Even if new orders were slow, many manufacturers were at least able to catch up on backlog.
Profitability was tough to come by as demand for nonessentials initially slowed. Companies still needed to be mindful of the financial constraints of extended trade terms with their customers while still having to pay their employees and vendors. The Paycheck Protection Program loan helped some, but not all, as those who could not attest to economic need could not qualify. So how can anyone find any light at the end of the tunnel?
Some states, like New York, have maintained relatively low levels of new cases and have seen many businesses start to come back throughout the summer. The roads seemed crowded again, with consumers showing optimism about a return to some version of normal. Nevertheless, the questions remain: what are the long-term effects of this? What is temporary, what has the potential to be changed forever, and how does this affect manufacturing in the United States?
Citrin Cooperman is focused on your M&D business's concerns, assisting you in guiding and planning for the unknown yet to come.