January 2, 2013

The Paradox of the Wage Slave

Once upon a time, there was no such thing as the hourly wage. If you were an independent farmer, you'd sell your grains, cows, pigs and vegetables at the market, and in general would try to stagger these so that you could have money coming in most times of the year. If you were a tenant farmer, when the landlord sold the goods you produced, you'd get a percentage of the sales based upon the amount of land you farmed. A smith would negotiate by the piece or the lot, and usually took a down payment to cover the costs of the materials. Farmhands and soldiers would be paid a set amount each week, usually at the end of the week after the work in question was done, but might also get a certain proportion of their wages from a share of the harvest or a chance at the spoils. Sailors would get a share of the shipping proceeds (or plunder if the ship in question was a pirate or privateer vessel), plus a stipend for completed voyages and occasionally a small signing bonus.

In general, the weekly payments were intended to keep the laborer involved until the final payout - in effect the laborer was either part of the venture and would share in the rewards, or was paid by the piece, with just enough advanced to cover the artisan's or tradesman's costs and basic sustenance.

Industrialization changed that, along with the arrival of the mechanical clock. People have always had the ability to tell approximate time via candles or hourglasses, but because such resources were both expensive and required maintenance (and were at best very approximate), most timekeeping was managed by church bells sounding the times of worship. With the advent of the clock, however, it became possible to measure tasks more precisely, and as a consequence to break up time into discrete units during the day.

The machine paradigm also broke the normal agricultural rhythms of working at dawn, getting a big breakfast, working until the sun reached its peak, taking a short siesta, then working until near dark. Instead, you worked to the clock. In the factory paradigm, it made less sense to pay the workers a small initial payment and then a share of the proceeds after the project was done, because there was never a "done" point - the machines ran twelve hours a day, every day. Because industrialization was going on in tandem with the breakup of the feudal tenant farm system, there were a lot of laborers available for factory jobs, and consequently factory owners could limit the laborers to hourly stipends without any hope of final remuneration. This was also the stage where factory labor diverged from trade or artisanal labor, although the former also depressed wages for the latter.

In the 1940s, in both the US and England, most able-bodied young men went to war, where they learned regimentation, and where both the officer and enlisted classes became intimately familiar with command and control structures. The military had standardized on hourly wages, and on the concept of a fixed work week for those not in theater, in order to simplify wage accounting. In practice, that meant that you got paid for 48 hours of work a week, period. Senior grades had a higher pay structure per hour, and officers made more than enlisted for the same number of hours of service.

When the war ended, the officers went into the newly booming corporations as managers as those corporations switched over from wartime to peacetime production of goods, while the enlisted went into the factories as foremen and line managers. The terms "white collar" and "blue collar" jobs reflected this - naval officers' daily uniforms were white cotton, while ratings and seamen wore blue chambray shirts.

Wages began going up, both because of increased demand for skilled workers and because the management class was now also drawing wages - they were still hirelings of the rentier or investor class, but because they were doing management-type activities they typically had far more involvement in the longer-term success or failure of the company. Moreover, much of that management was involved with sales, which paid a commission on top of wages, boosting the income of the management class significantly in the years after World War II.

Meanwhile, unions, which had struggled during the Depression and World War II, exploded in popularity in the 1950s and '60s, in part because there was a massive demand for people in the building trades - skilled carpenters, electricians, plumbers, and so forth, who until then had perforce taken temporary jobs on an as-available basis - and in part because of manufacturing, where again high demand for labor meant that a system that both guaranteed competence and provided an environment for younger union members to gain experience was attractive. Many of the companies involved were comparatively weak, and their management was unable to stop this phenomenon; they needed people too much not to concede to labor demands.

By the 1970s, labor unions had become very pervasive, and arguably had become too powerful, at least from the perspective of corporations that were now facing increasingly severe headwinds. In the 1950s, the United States was effectively rebuilding both Europe and Asia. By the 1970s, however, these economies had recovered, and were increasingly competing against the United States in critical areas. Additionally, the Bretton Woods agreement of 1944, which had established a global reserve currency (the US dollar) and pegged that dollar to gold, was seen more and more as a burden by the US, since it meant that US banks were very limited in the amount of money that they could loan out. When French President Charles de Gaulle demanded that the US make payments to France in gold, not dollars (as the French were concerned about the Americans' depreciation of their currency during the 1960s), President Richard Nixon severed the tie between gold and the dollar. This had the immediate effect of causing the oil-producing states of the Middle East to band together in order to raise prices in response, which in turn began an inflationary spiral that didn't really end until Federal Reserve Chairman Paul Volcker raised interest rates to nosebleed levels.

The massive spike in inflation caused demand for American-produced goods to fall dramatically, exacerbating the problems that the unions faced. With reduced demand, corporations were able to close plants with impunity. People paid into unions because unions had been successful in raising wages and work standards (including reducing total work time to 40 hours per week), but as manufacturing jobs disappeared, so too did the clout of the unions, because there were far more people competing for jobs than there were jobs available. This has always been the Achilles' heel of the union movement. Ironically, those places where unions have remained strongest are also those where educational requirements and continued training have been the most stringent - teachers, nurses, engineers, fire and police professionals.

It's also worth noting the distinctions between types of inflation. Talking about a broad "inflation" rate is misleading, because in general, inflation is the rise in the price of labor or resources relative to the nominal price of other resources. Wage inflation occurred in the 1950s and early '60s relative to commodities, energy and finished goods because labor was comparatively scarce for many jobs. Wages have largely stagnated since about 1971, but there has been massive inflation in managerial salaries and dividends. Energy has inflated relative to wages since '71, commodities inflated during the period from 1998 to 2008, and real estate inflated dramatically from about 2000 until the market collapsed in 2008.

Most corporate managers and rentier-class investors prefer it when labor costs fall while finished goods inflate (which increases their profit), but fear it when labor costs rise and raw materials inflate (which can squeeze margins at a time when the economy is tight). Not surprisingly, when the mainstream media discusses the desire of the Federal Reserve to increase inflation, what they are usually referring to is the inflation of finished goods (from cars and houses to computers, packaged foods and so forth) rather than wage inflation, even though in this case wage inflation is precisely what needs to happen relative to other asset classes.

In the late 1970s, a new class of business consultants such as Peter Drucker began making the argument that the primary purpose of a corporation was not to create goods and services but to maximize shareholder value. This credo was part of a shift in thinking pushed largely by the Chicago School of Economics and the monetarists, led by Milton Friedman. Along with this came the belief that the senior management of a corporation, such as the CEO or CFO, should be incentivized to increase stock value (which was widely seen as a good proxy for "shareholder value") by giving them options to purchase stocks at a greatly reduced price.

With skin in the game, these senior managers would then have more reason to keep stock prices up. In point of fact, all this did was transfer a significant amount of wealth from the employees (who were not similarly compensated) and the investors to the managerial class. Ironically, this has served in the long term to significantly reduce shareholder value, while at the same time making such managers largely unaccountable, as they ended up stacking boards of directors with their cronies. Weighed down with expensive senior management contracts, many companies ended up reducing long-term wages for employees who weren't critical to success in order to compensate. Additionally, because stock price became the only real proxy for a corporation's value, corporate raiders emerged who would push the stock value of a company down through market manipulation, buy it out, reward the senior managers and fire the labor force, often gorging on pension funds and patents in the process.

The rise in unemployment that resulted was partially masked by the rise of the IT sector. The information technology revolution started in the 1970s with big-iron systems that began to reduce accounting staffs, but it was only with the marriage of the personal computer and networking technology in the 1980s that things began to change dramatically. One of the first things to happen was that, as software reached a critical threshold in the mid-1980s, it began to erode the last real bastion of wage employment - the non-managerial white (and pink) collar jobs that had been indispensable to the command and control corporate structure.

The creation of presentations provides an interesting illustration of the impact this had. Until the mid-1980s, many corporations had graphic design departments. If a manager needed to make a presentation, he would work with a designer to design the slides, who would in turn work with a typesetter, a graphic illustrator and photographer to create the slides, a copywriter, and possibly a printer - a process that would often take a month of lead time. With the introduction of presentation software such as Harvard Graphics and later PowerPoint, the manager could do all of these jobs himself, eliminating these positions and drastically reducing the time to do this work. Adaptable artists and designers did eventually go to work for themselves to provide such services, but for every person who became successful in this milieu, three or four did not, and in the process it caused a shift away from the monolithic culture toward more of a freelance and studio arrangement.

Ironically, such a process served to hinder the women's movement for at least a few decades. Falling real wages coincided with a rise of women's empowerment to bring a whole generation of women into the corporate workforce as secretaries, which often provided a stepping stone into mid-level management (typically office management or administration). The introduction of personal computers into the corporate workforce actually initially proved beneficial to secretaries, because they were often the first to get access to these typewriter-like devices and consequently ended up getting a leg up on their male managerial counterparts. However, as more people began using PCs in the work environment, it also radically thinned the number of secretaries required in an organization (although in a fitting twist of irony it also had the same effect on mid-level managers a few years later). This is part of the reason that there's something of a gap between older and younger women in most organizations, especially as IT itself became increasingly seen as a specialized domain for nerdy young men.

For manufacturing, however, the IT revolution was devastating for workers. Once you networked computers, it became possible to distribute your workforce, and from there it was a short step to moving work outside the US, particularly to countries with low labor costs, low taxes and lax regulatory regimes. Standardization of shipping containers made it easier to ship raw goods to these external factories for processing and send the finished goods back, and new telecommunication systems made it easier to coordinate production eight to ten hours ahead of or behind you globally. This injected huge amounts of money into the Asian economies, which had the unintended effect of raising the wage levels of Chinese, Indian, Japanese and Korean workers dramatically. This outsourcing drained manufacturing from the US, leaving much of the Midwest and Mid-Atlantic dotted with derelict ghost towns.

This also had the effect of reducing the overall import costs of foreign goods, which companies such as Walmart took strong advantage of. The outsourcing of manufacturing not only eliminated manufacturing jobs, but also had an adverse effect on the many service jobs that supported them, driving down wages in these areas and giving rise to the McJob - part time, no benefits, paying minimum wage, offering little opportunity for advancement and providing too little money to keep up with steadily rising food and housing prices. Automation generally affected service economies less directly - services almost by definition require either human intervention or human knowledge - but it did mean that mid-level management jobs (which typically provided a career path for people in these sectors) disappeared, leaving fewer ways for a person to break out of the "wage-slave" trap.

Dramatic rises in energy and commodity prices, due both to scarcity and to a growing realization on the part of countries that they were being pillaged by Western corporations, caused the machine to falter even more. As the opportunities for the giant petrochemical companies to get access to foreign oil at highly profitable rates disappeared, cries for energy independence began to arise in the US. Energy independence in this context should be read, however, not as an increase in the use of alternative energy sources (which currently receive a very small subsidy from the US compared to the oil companies) but as increased drilling for shale oil, offshore oil and natural gas deposits via rock fracturing (aka fracking). These deposits were long considered less economical (in part because of the remediation and political costs) than foreign oil and natural gas, but at this stage there are considerably fewer alternatives left to the oil companies (in 1960, oil companies owned roughly 85% of all oil deposits globally; by 2010 that number was closer to 10%, as most of those deposits had been nationalized by their respective governments).

This has led to an increase in the number of hydrocarbon engineering and maintenance jobs in the US, but this is a labor market that runs hot and cold. The jobs will be around until the fields play out, then will be gone - this will likely happen within the next decade.

We are now in what has been described as a bubble economy - government stimulus is frequently needed to create a temporary market, but these markets, unregulated, quickly grow to a point where they are oversupplying the available demand, attracting parasitic speculators who then cause the system to collapse, producing inflation in that sector followed by rapid deflation and despoiled ecospaces. This happened in IT in 2000 and in housing in 2008, and will likely happen in education and energy production in the next couple of years. The housing collapse in particular is still playing out, primarily in Europe, and it has left a legal tangle of housing ownership that will take decades to untangle, if it ever is (I expect that ultimately much of this will end up being written off as uncollectable).

It is against this backdrop that it becomes possible to understand what will happen to jobs over the next couple of decades. There are two additional factors that play into the picture as well. The first is demographic. People born in 1943, which I consider the start of the Baby Boom, turn seventy this year. In the depths of the recession that started in 2008, when this group reached 65, many of them went back to work - and for a while it was not at all uncommon to see a lot of low-wage jobs being held by people in their seventh decade. However, even given advancements in gerontology, the ability of people to work into their seventies deteriorates dramatically. The Boomer generation peaked around 1953. If you assume that only a comparatively small fraction of those aged 70 or above remain in the workforce, this gray workforce will fade fairly quickly over just the next five years. This will also have the effect of clearing out a large proportion of upper-level management, which has been heavily dominated by Boomers given their sheer numbers.

GenXers are a trough generation - as a group there are perhaps 65% as many of them as there are Boomers. These people are now entering policy-making positions in both government and business, but because of the numbers, the Boomer peak for leaving the workforce hits at approximately the bottom of the GenXer trough for entering senior management and senior professional positions. This translates into a relative scarcity of executive and professional-level talent by 2020, now only seven years distant. GenXers, for the most part, are engineers. Many of them, in their 20s through 40s, were responsible for the low-level implementation of the web in the 1990s and 2000s. A large number were contractors, people who generally benefited far less monetarily from the emergence of the computing revolution and the web, and as such they see far less benefit in large-scale corporate structures.

Indeed, the GenXer view of a company is increasingly becoming the norm. It's typically small - under 150 people, in many cases under twenty. It's distributed and virtual, with the idea of an "office" as often as not being a periodically rented room or a Starbucks, and with people working on it from across the world. Participants are often shareholders without necessarily being employees. Their physical facilities are in the cloud, and staffs usually consist of two or three people devoted to administration while the rest are "creatives" - engineers, developers, artists, videographers, writers and subject matter experts. The products involved are often either virtual or custom as well, and tend to have a comparatively short life cycle - often less than six months. This could be anything from software to customized cars to media productions to baked goods.

In effect these microcompanies are production pods. They may be part of a larger company, but they are typically autonomous even then. They can be seen as "production houses" or similar entities, and they often perform specialized services for a larger entity - a digital effects house for a movie, a research group for a pharmaceutical company, a local food provider, specialized news journalists. When they do have physical production facilities, those facilities may be shared with other microcompanies (the facilities themselves are essentially another company).

One of the longer-term impacts of ObamaCare is that it becomes possible for such pods to enter into group arrangements with health insurers, and easier for people to participate in such insurance systems without necessarily being tied to a 40-hour paycheck. Health insurance was once one of the big perks of the more monolithic companies, but until comparatively recently changing companies typically meant changing insurance companies as well, a process that could become onerous and leave people with gaps in coverage that could be devastating if a worker or her child was injured. As command and control companies put more of the costs of insurance on the employee, the benefit of staying with that employer diminishes.

The same thing applies to pension plans - it has become increasingly common for companies to let go of employees who are close to cashing out their pensions for retirement, often leaving them with little to nothing to show for years of saving. The younger generations are increasingly skeptical of trusting large companies to manage their retirement, usually with good reason, especially since the average 40-year-old today may have ten or more companies under their belt since they started work, and can expect to work for at least that many more before reaching "retirement age". This means that GenXers and younger (especially younger) are choosing to manage their own retirement funds when possible, independent of their employer.

Once those two "benefits" are taken out of the equation, the only real incentives that companies can offer are ownership stakes and salaries. As mentioned earlier, salaries are attractive primarily because of their regularity - you have a guarantee that you will receive X amount of money on a particular date, which becomes critical for the credit/debit system we currently inhabit. Ownership stakes are riskier, but they constitute a long-term royalty, which can be important because it becomes a semi-reliable revenue stream in its own right. If you receive royalties from three or four different companies, this can go a long way toward not having to be employed continuously.

The GenXers will consequently be transformers, pragmatists who are more interested in solving problems than dealing with morality, overshadowed by a media that is still primarily fixated on the Boomers, quietly cleaning up the messes, establishing standards, and promoting interconnectivity and transparency. Many of them now are involved in the technical engineering involved in alternative energy and green initiatives, next generation cars, trucks and trains, aerospace technologies, programming, bioengineering, information management and design, and so forth. While they are familiar with corporate culture, they find the political jockeying and socializing of the previous generation tedious, and though they are competent enough managers, GenXers generally tend to be more introverted and less entrepreneurial. Overall, as they get older, GenXers are also far more likely to go solo - consulting or freelancing. They may end up setting up consulting groups in order to take advantage of the benefits of same, but there is usually comparatively little interaction between consultants - they are more likely to be onsite with a client troubleshooting.

From a political strategist standpoint, one of the great mysteries of the modern era has been the disappearance of the unions. Beyond the strong automation factors discussed earlier as well as a politically hostile climate to unions, one factor has always been generational. GenXers are probably the most disposed personality-wise to being union members, but because unions generally gained a blue collar reputation, many GenXers (who in general see themselves more as engineers and researchers) have tended to see unions as being outside their socioeconomic class. Moreover, the consultant or freelancer mentality is often at odds with the "strength in numbers" philosophy of most unions.

I expect this generation to also end up much more in academia, especially on the technical and scientific side, or to migrate towards research, especially by 2020 or so as they finally reach a point where passing their knowledge on to the next generation outweighs any gains to be made by consulting. As is typical, the relatively inward looking GenXers will lay the groundwork for the very extroverted generation following after them - the Millennials.

Millennials were born after 1982, with the peak occurring in 1990, and are the children of the latter wave of the Boomers (many of whom started families comparatively late - in their very late 20s - and had children until their late 40s). However, there's also an overlap with the children of the GenXers that creates a double-crested population hump, with a trough in 1997 and then growth until 2007 (which actually exceeded the number of births per year of the Baby Boomers). After that, however, there's been a sharp drop-off, to the extent that in 2012 the number of births is expected to approach the trough levels of 1971. For all that, the Virtuals (those born after 2000) will likely be a fairly small generation, given the drop-off (most likely due to the economy's collapse).

The oldest Millennials are now thirty years old. Displaced by the gray workforce and facing the hardships of the economy after 2007, many started work four or five years later than in previous generations, had more difficulty finding work, and were often forced, when they could find work, to take McJobs. They are distrustful of corporations, and are in general far more bound to their "tribes" -- connected over the Internet via mobile phones and computers -- than they are to work. Their forte is media - writing, art, film production, music, entertainment programming, social media - all of which lends itself well to the production house model, and which will likely mean that as this generation matures, it will end up producing the first great artists of the 21st century.

What it won't do is make them good workers in the corporate world, or in traditional blue collar positions. Overall, high school math and science scores plummeted for the Millennials during the 1990s and 2000s, and enrollment in college STEM programs declined dramatically after 2000 (when the Millennials started into college). Most Millennials are very good at communicating within their generation - this is the most "connected" generation ever - but overall they tend not to communicate well with authority figures outside that demographic. (I've discussed this in previous essays.)

While I've seen some commentators who are critical of the Millennials because they see them as spoiled and entitled, I'd argue instead that these characteristics are more typical of a generation that overall is just not heavily motivated by financial factors. Most have learned frugality after years of having minimal jobs. They will likely marry later and have fewer children than any generation before them, and their social relationships may actually prove stronger than their marital ones. On the other hand, they will also likely focus more strongly on their craft because of these factors, which means that as they age, they will prosper because of their innate skills and talents.

Temperamentally, the Millennials will tend to act in concert to a far greater extent than the generations before them. They will not join unions, but they will end up creating constructs very much like them. Moreover, they will be inclined to follow authority, but only if that authority is roughly in their generation. Consensus politics will be very important to them, and this will be the first generation that really employs a reputation economy as currency.

Given all this, it is very likely that the nine-to-five, five-day-a-week job is going the way of the dodo. It won't disappear completely for quite some time, but the concept of a salaried employee will become increasingly irrelevant as the production house model obviates the command and control corporate structure. If you're still learning, you would get paid at a fixed rate for your time, but once you reach the point where you add significant value to a project, you would get points in the project toward a royalty on its returns. Service jobs, similarly, will likely revert to a work-for-hire basis, coupled with some profit sharing. Manufacturing is shifting to a combination of insourcing with pod companies and artisanal production. Legal and accounting services, where they haven't already shifted to web-based delivery, are pretty much already done on a work-for-hire basis, with partners getting profit shares.

The biggest changes that are taking place are in the sales sector. The rise of eRetailing is beginning to hit brick-and-mortar businesses hard. Christmas hiring at physical retail stores has been dropping consistently over the last five years, even as the economy itself has begun to recover. This is primarily because more and more retail is shifting online, to the extent that it accounts for nearly half of all retail activity in the United States during the last three months of the year. Mobile continues to drive that as well, as it becomes far easier to "impulse buy" when your computing platform is constantly by your side.

The only real exception to this trend is in groceries and restaurants, though even there online purchases are accounting for a larger percentage of sales than a few years ago. Many grocery chains now offer online ordering and delivery services for nominal fees, up from only a couple specialized services a few years ago. Supermarket shopping is perhaps more ingrained in people than other retail shopping, so it is likely that this trend will take longer to play out there, but it is happening, especially in cities where grocery shopping is more complicated than it is in the suburbs.

Ambiance stores and restaurants are perhaps the only ones truly bucking the trend, and this has to do with the fact that most restaurants ultimately are as much about entertainment as they are about food. It's why there's been a slow death of the fast food industry, but places such as Starbucks do quite well. They are the modern day equivalent of pubs.

Note that I do not believe that such service jobs will go away completely, but they will diminish, and at some point it often becomes more profitable for a company to be virtual only and not carry the costs of storefronts. No storefronts means fewer stores in malls, and already many malls are closing or being converted to other purposes, while very few new mall or strip mall projects are starting. Similarly, the number of "big box" stores has been declining as well. On any given day, go into an Office Depot or Best Buy, and what's most remarkable is how little traffic there generally is. Yet people are buying from their online sites, and the stores stay open increasingly to keep the brand alive in people's minds. At some point I expect these expensive "advertisements" to finally close down or turn into general distribution points, with only token merchandise on the floor.

This brings up the final paradox of the wage slave. The number of jobs being created is smaller, by a considerable margin, than the number of jobs going away, even in a "healthy" economy. These jobs are not being outsourced; they are being eliminated due to automation. The jobs that are being created generally require specialized skills, skills which used to be acquired via "on the job training", but increasingly the low- and mid-tier jobs that provided such training are the easiest to automate, and hence are going away as well.

It is possible to teach people some of these skills in the classroom, but the 10,000-hour rule of mastery generally applies - in order to understand a particular topic or acquire a given skill, it usually takes 10,000 hours' worth of study, experimentation and practice to truly become competent in that area. In practice, this usually correlates to about ten years of fairly rigorous work with the topic. This means that while education is a part of the solution, the time required to impart that education can often make these skills obsolete.
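To make that correlation concrete, here is a minimal back-of-the-envelope sketch; the twenty hours per week of focused practice is purely an illustrative assumption, not something specified by the rule itself:

```python
# Rough check of the "10,000 hours ~ ten years" correlation.
# The hours-per-week figure is an assumed, illustrative number.
HOURS_TO_COMPETENCY = 10_000
FOCUSED_HOURS_PER_WEEK = 20      # assumption: deliberate practice, not total time at work
WORKING_WEEKS_PER_YEAR = 50

years = HOURS_TO_COMPETENCY / (FOCUSED_HOURS_PER_WEEK * WORKING_WEEKS_PER_YEAR)
print(f"~{years:.0f} years to competency")  # prints "~10 years to competency"
```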

The upshot of this is pretty simple - eventually, you end up with a large and growing percentage of the population that simply becomes unemployable. They are not lazy - most of them had positions until comparatively recently, but those positions are now gone. Meanwhile, the profits made from automation do not go to the people who lost the jobs, but to the people who purchased the automation, and from there to the people who commissioned the creation of that automation in the first place. Put another way, productivity gains over the last fifty years were privatized, while the corresponding unemployment was dumped onto the public. That unemployment in turn created emotional and financial hardship, foreclosures, a rise in crime and, in the end, a drop in the overall amount of money in circulation.

This last point is worth discussing, because it lies at the crux of the problem. In a capitalistic society, the velocity of money is ultimately more important than the volume of money in circulation. When money moves quickly through the system, more transactions take place, and in general more value is created in that economy. When money ceases moving, no one buys or sells, no investment takes place, no jobs are created (though many may be lost), and money becomes dearer, because you have a fixed amount - you can't count on additional money coming in, you can't get loans, even the simplest economic activity stops. This was close to what happened in 2009. As automation replaces work, billions of man-hours of wage payments disappear - money that would have gone to labor instead goes to the investors, who generally contribute far less acceleration to the global economy than middle and working class individuals do in the aggregate. The wage-hour ceases being an effective mechanism for transferring wealth in society.
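One way to see why velocity matters at least as much as volume is the textbook equation of exchange, in which total nominal spending is roughly the money stock multiplied by its velocity. The sketch below uses entirely made-up numbers simply to illustrate that point:

```python
# Equation of exchange: nominal spending = M (money stock) x V (velocity).
# The numbers are purely illustrative, not estimates of any real economy.
def nominal_spending(money_stock: float, velocity: float) -> float:
    return money_stock * velocity

# A smaller stock of money that changes hands often...
fast_economy = nominal_spending(money_stock=1.0, velocity=8.0)
# ...supports more transactions than a larger stock that mostly sits idle.
stalled_economy = nominal_spending(money_stock=2.0, velocity=2.0)

print(fast_economy, stalled_economy)  # 8.0 vs 4.0 - velocity dominates
```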

Eventually, a tiny proportion of the population ends up with most of the money in that society, and there is no way for the rest of the population to get access to that money to get the goods they need. We're not quite there yet, but the imbalance is already sizable and only getting worse.

One solution to this problem is to tax wealth that's not in use. This transfers money from wealthy individuals to the government, but given that the government has become increasingly captured by those same individuals, the results of those taxes end up as corporate kickbacks to the same rentier class in the form of subsidies. Taxes can be reduced on low-income individuals, but for the most part, low-income individuals already pay little in the way of income taxes, though they do pay hidden taxes and fees arising from having to buy the smallest units of finished goods and services, which generally carry the highest per-item cost. Money can be distributed to everyone to spend, but the benefits of such stimulus tend to be short-lived, because the amounts are too small to make an appreciable difference while the same extractive mechanisms still exist in society.

Government-mandated minimum wage floors can be set, but while this will help some, it is precisely these jobs that are most heavily impacted by automation. Moreover, the same corporate capture of the government provides a chokehold on the ability to impose such requirements on corporations. In effect, the oligarchs who control the government continue to pursue policies that locally increase their profits, but at the systemic cost of destroying the consumer base upon which those profits depend. It is, in many respects, yet another example of the tragedy of the commons.

In many respects this is what the end state of a capitalistic society looks like - stalemate. Fewer and fewer jobs exist. Money becomes concentrated not in the hands of those who have jobs, but in the hands of investors, yet investment money is seldom sufficient to create a market, only to bring a product or service to that market. Wages become two-tiered - bare subsistence level or below, and lavish for those with specialized skills, but only at the cost of continuous learning and training, and the concomitant loss of expertise as skilled workers choose not to share their skills at the risk of losing their marketability.

Because needs are not being met in the formal market, an informal or gray market emerges that is outside the control of both the government and the corporatocracy, one with lax quality controls and little legal redress in the case of fraudulent transactions, and consequently one where organized crime can play a much larger role. While this may seem like a Libertarian wet dream, the reality of such markets is typically that of Russian markets in the aftermath of the fall of the Soviet empire, in which crime lords created monopoly markets where basic goods were only available at high prices or through coercive acts, and where legislators and activists who tried to bring such crime lords under control were regularly assassinated.

So how does a society get out of this trap? My own belief is that in the end, it decentralizes. Power production shifts from long pipelines of petroleum-based fuels to locally generated power sources - solar, wind, geothermal, hydrothermal, small nuclear (such as small thorium reactors), some local oil and natural gas production - intended primarily to achieve power sufficiency for a region, with enough to handle shortfalls elsewhere in a power network. This provides jobs - both constructing such systems and maintaining them - and ensures that energy profits remain within the region.

Establish a minimal working wage but also provide mechanisms for employees to become participants through profit-sharing and royalties, rather than options and dividends.

Make healthcare and retirement saving affordable and universal, rather than as a profit center for insurance companies and pharmaceuticals.

Tax financial transactions in exchanges, and use the proceeds to provide a minimal payment to individuals as a way of redistributing the costs that automation (and financial malfeasance) impose on employment.

Eliminate the distinction between salaried and hourly workers in the tax code, which has created an artificial two-tiered system designed primarily to make it possible for unscrupulous employers to have a person work up to 39 hours a week and still not qualify for benefits.

Eliminate the 40-hour workweek - it's an anachronism. Instead, establish a government base payment that provides a floor for subsistence living for everyone, coupled with wage payments from jobs to fill in towards a production royalty payoff that provides wealth for people willing to put in expertise and effort.

Eliminate the income tax, and replace it with a value-added tax. The Federal income tax has in general been a disaster: it has increased class warfare, has often been used punitively by various administrations to favor one group or another, is extraordinarily complex, requires too much record-keeping effort from independent workers and small businesses, and is usually easily subverted by the very wealthy, putting the bulk of the burden on the middle class. A value-added tax, while somewhat regressive, is generally easier to administer, does not require that employees maintain detailed records, can be automated easily, and can in general be fine-tuned to encourage or discourage consumption of certain things within the economy.

Tax employers for educational support. Too many corporations want their workers to have specialized knowledge or skills, but in general do not want to pay for the training. Some of that tax can be paid as in-kind knowledge transfer from people in those corporations who do have those skills, at which point the corporation pays for that employee/contributor to teach.

Similarly, tax employers for infrastructure support that directly or indirectly benefits them. Much of the last half century has seen the maxim of privatizing profits and socializing costs become almost holy writ, but this has generally resulted in ghettos and gated communities that benefit a few at the expense of millions.

Encourage telecommuting and virtual companies, while taxing those corporations that require large numbers of employees onsite at all times. If telecommunication tools were good enough to outsource to China, they are good enough to provide telecommuting. This generally has multiple benefits - less need for infrastructure, far fewer carbon emissions, less energy consumption, less time wasted in traffic, fewer monster skyscrapers serving as corporate shrines.

These changes (and others like them) are feasible, but in general will only work if they are attempted locally - at the state or even city level. These are transformative changes - as different regions attempt them, the facets that work and those that don't will emerge, and local variations will no doubt come about based upon cultural temperament, but overall success will beget success. Demographic changes, as discussed in this essay, will accelerate this process - those regions that are already investing in twenty-first-century technologies are already doing a number of these things and seeing benefits, while those that are heavily petroeconomically bound will resist them. The irony is that in these latter areas the wage-slave paradox will only get worse, and the economy will become more dysfunctional over time.

It is likely that thirty years from now the economy of the United States will look very different - mass customization through additive printing techniques; millions of virtual pod corporations, each numbering only in the dozens of people, distributed all around the country (and probably the world); cities in a state of controlled disintegration, powered locally and with much more local autonomy; and the rise of a strong creative class supported by an elderly engineering class and a youthful research cadre. None of this will happen overnight, nor will it happen uniformly, but I feel it will happen.

10 comments:

Anonymous said...

well thought out and i like it cheers

Anonymous said...

A great article, some very interesting analysis, to the extent I've bookmarked it, which I rarely do :) However, I think you overemphasize/stereotype the different generations. There are, and always have been, people who hold the attitudes you mention, in all generations :)

Anonymous said...

"establish a government base payment that provides a floor for subsistence living for everyone, coupled with wage payments from jobs to fill in towards a production royalty payoff that provides wealth for people willing to put in expertise and effort"

This will drive down wages. See Polanyi's "The Great Transformation" and his remarks on models in England purportedly designed to deal with a changing economic regime when the rural economies became dominated by urban trade economies and the new factory modes of production.

Unknown said...

Re: generations. These are definitely generalizations, though not necessarily stereotyping. I'd be more inclined to say that there are certain characteristics that tend to be more common within a given generation than another (and even within that generation those characteristics will change as you near either end around the peak) and it is those characteristics that, taken in the aggregate, are more likely to shape both the economy and social mores. There are many entrepreneurs within the GenXer generation, for instance, but as a generation, it was shaped by certain significant factors, such as the rise of personal computing and the Internet (which occurred during those years that the GenXers entered the workforce).

You can obviously slice up such generations differently. If you went from trough to trough, for instance, rather than midpoint to midpoint, you'd get a different set of characteristics, in the aggregate, but there would still be evident characteristics.

Unknown said...

Re: Driving down wages - yes, it will, but not uniformly. This is the fundamental argument for or against the welfare state, an experiment that I believe was aborted by the US before it really had a chance to be proved out.

We live in a regime where the number of available job "slots" is dropping dramatically, and only some of that can be attributed to a poor economy. Automation and information technologies are, together, making a wide swathe of jobs obsolete, and these will not come back as the economy improves. Meanwhile the number of new jobs that are generated by this same automation is considerably smaller. This means that the capacity of the US economy to generate jobs in the first place is diminishing.

This is in fact causing wages to drop pretty much across the board, except among the rentier class who benefit from the continued automation. The problem is that the rentier economy by itself is too small to continue to generate profits ... over time, it too will sputter out, leaving a country where most people have too little to live on, which will in turn generate a gray market economy outside of any control.

The goal of a minimal payment is to provide a mechanism for allowing people to develop alternative skills and experiences. It would allow interns to work for companies or startups while not being dependent upon student loans, and it would let people hone their writing or programming or (fill in the blank) skills. It would allow people to innovate and build up intellectual capital that could be essential for starting their own companies, and it would reduce the overall wages required by such startups until such time as they become profitable.

This differs significantly from the English Industrial Revolution, which required large numbers of laborers in the factories - effectively the opposite of the situation that exists now. Then, the number of jobs was growing faster than the population of the cities could successfully satisfy, which forced the mercantile classes of the time to migrate outward to the countryside and to offer wages in the first place (keep in mind that prior to that, much of England was essentially a barter economy).

Now the number of jobs is diminishing relative to the overall populace, and it is diminishing systemically. The credit economy was built on the industrialization concept and economic growth, but now we are faced with diminishing jobs where those jobs have a higher degree of specialization.

Overall, the increased velocity of money in a society with a floor may in fact lower the wages of some specialists, but it will have the largest impact upon those reliant exclusively upon royalty incomes, capping those potential gains. Currently this is recovered by the government via capital gains taxes, but too much of those gains in turn goes to fund expensive military budgets (primarily complex weapon systems) that end up circulating these taxes back within a much smaller economy. [Note: I'm not being anti-war here, just noting that wages for most military personnel below the rank of General have declined even more than civilian wages have, so the problem is not in military pay, save at the most senior levels.]

Anonymous said...

Golly, so many generalizations without any references.

Each paragraph both arguable and worthy of reflection.

Interesting for what has been left out, along with the tidbits worthy of reflection.

C-

Ed Hubbard said...

Really excellent piece. I think your solutions are interesting, but I think that we are unaware of how the one fundamental will change: money itself. How we exchange value is changing, and that makes a huge difference. Check out Toffler's Revolutionary Wealth for that final key.

Unknown said...

Ed,

I absolutely agree with you on the change in the nature of money - I'm actually writing a post on that very thing right now.

My primary problem with the Tofflers is that I believe they've lost their impartiality. Perceiving trends generally requires that you can step outside of the overall framework and see it from the outside, which is usually easiest when you haven't already adopted a specific perspective within that framework. In practice it's not always easy - you will always be perceiving from one perspective or another, so it is generally best to cultivate several. The Tofflers now tend to look at the world through the lenses of their primary clients, who are CEOs, financiers and heads of state, and this particular perspective is notable for the degree to which it wraps the participants in a bubble. The world ALWAYS looks better if you have several million in the bank, but for futurists such as Alvin and Heidi Toffler, it can also mean that they don't see the bigger negative trends.

Stefan said...

Nice article.

What about defining the terms of "local" a bit more? For instance, in his book "A Pattern Language" (Oxford U. Press, 1977), Christopher Alexander calls for subdividing the world into about 1,000 independent regions of about 2~10 million inhabitants each, with evenly distributed towns of various sizes. So how will locality work in your vision?

Also, could try thought experiments with a few, more pessimistic scenarios. Resistance to wealth redistribution, for instance.
