Our modern-day workers compensation system has a long, dark history dating back to the advent of written history itself in ancient Sumeria. Sumerian civilization (c. 5000-1750 BCE) created many aspects of modern society, including timekeeping, literature, geometry, agriculture, wheeled vehicles, medicine, dentistry, and mathematics; it established schools, crafted artwork, and built some of the first cities in the world. The workers compensation legal concept evolved shortly after the advent of writing itself: it is preserved on an ancient stone cuneiform tablet containing the Code of Ur-Nammu, widely regarded as the world’s oldest surviving set of written laws. The Code’s intent was to establish “equity in the land,” and it required monetary compensation for injuries to body parts, including fractures, caused by another. It applied to all aspects of daily life. The loss of a worker’s thumb, for example, was deemed to be worth half the value of a finger, and the loss of an eye was valued at 30 silver shekels.
Later in history, a harsher system evolved in Babylonia’s Code of Hammurabi (c. 1792-1750 B.C.), known for its “eye for an eye” decrees, under which careless and neglectful behavior carried severe penalties. For example, if an unskilled surgeon caused the loss of life or limb of a wealthy patient, his hands were cut off. Scholars point out that the Code also contained provisions under which compensation could be awarded to free citizens, based upon a schedule, if the underlying injury resulted from negligence. Unintentional injuries generally merited payment of physician fees only, and workers in high-risk jobs like construction were generally entitled to no legal protections at all, since most were considered property (slaves) within that social system.
Nevertheless, the common denominator of these early workers’ compensation schemes spread into Arab, Greek, Roman, and Chinese laws. The injured worker’s compensation continued to be based upon the loss of a body part or its function, and virtually no body part was excluded. These early schemes assigned specific compensable values to the loss of even a portion of one’s anatomy. The system was based solely on a monetary award, reflecting its focus on “impairment” (the loss of a body part) versus the modern-day concept of “disability,” which measures one’s ability to earn wages and sustain an occupation.
By the Middle Ages, feudalism gradually became the primary structure of governance, and whatever benefits may have been rendered to a feudal lord’s injured serfs were entirely discretionary. Then, in the 17th and 18th centuries, a surprising new source of prescribed benefits for injured workers emerged from Caribbean pirates and privateers, like Captain Henry Morgan. To recruit crew members, each shipmate was guaranteed payment in pieces of eight for various body parts lost during their dangerous trade: 100 pieces of eight for the loss of a finger or an eye, 400 for the loss of a left leg, 500 for a right leg, and 600 for the loss of a right arm. Since 100 pieces of eight was roughly the equivalent of a year’s wages in colonial America, an award could amount to as much as six years of lost salary. A spare plank of wood might be carved into a prosthetic device for a lost arm or leg. In addition, most pirates with less than life-threatening injuries were assigned lesser tasks, if able, such as galley cook or deck scrubber, and continued to receive their share of the bounty via a primitive version of return-to-work or modified duty.
Funds for pirate injury payments came from the ship’s collective pillaged profits, with one catch: the injured had to live to get paid. Since there were no death benefits, most pirates wore pierced earrings of solid silver or gold, often engraved with their home port, to cover the cost of transporting their body back home. Shipmates wanted to avoid burial at sea if they could afford to.
From a historic perspective, it’s important to understand the early distinction between “impairment” (a degree of loss of function of a portion of anatomy) and “disability” (the limitation of one’s ability to perform specific tasks and/or lead a productive life). The prevailing thought in early workers compensation models was that once you were paid for your missing or damaged anatomy, you were good to go. Disability awards, which today determine the level of one’s capacity to work in a given profession, did not exist in antiquity. The legal distinction between these two terms still comes into play in analyzing current workers compensation benefits. States vary widely in their interpretation of disability and the associated benefit schedules, based upon factors such as the injured worker’s profession, age, and type of medical impairment. For instance, the loss of a thumb in South Dakota entitles any worker to 50 weeks of compensation regardless of his or her disability.
Written laws regarding these early forms of workers compensation disappeared with the rise of the monarchies of the Middle Ages and with feudalism. The arbitrary benevolence of the feudal lord determined which serf injuries, if any, garnered sympathy or recompense, though there is evidence that some of the more benevolent lords provided injured servants with quasi-impairment compensation for disabling physical conditions.
The Middle Ages and Renaissance then gave way to the birth of English Common Law as enslaved and indentured labor diminished. These evolving standards provided the framework of labor law that persisted all the way through the Industrial Revolution and into the early 20th century. English Common Law, which spread to roughly a third of the world, offered injured workers little protection: employers could defeat their claims through what became known as the “unholy trinity of defenses.” Furthermore, while injured employees could sue their employers, companies would often require a worker to sign a waiver of the right to sue before taking an exceptionally dangerous job. These waiver agreements became known as “death contracts.”
At that time, workers injured on the job still had to sue for damages arising from a work injury by overcoming the three Common Law defenses used to determine employer liability. The burden was clearly on the injured worker. Referred to at the time as “the unholy trinity of defenses,” these consisted of the following legal concepts:
- Contributory Negligence – Was the injured worker in any way responsible for the injury?
- The “Fellow Servant” Rule – Was the injury due in any part to the action or negligence of a fellow employee?
- Assumption of Risk – Employees were expected to know the hazards of the jobs they signed up for and were presumed to assume any risks those jobs carried.
Filing a personal injury lawsuit required money and time – commodities few workers had. Nevertheless, by the late 1800s, employers were battling to protect themselves from increasing litigation by disgruntled injured workers and their unions.
The watershed moment for our modern-day workers comp system originated with Prussian Chancellor Otto von Bismarck, whose political strategy was to undercut social democracy by introducing social legislation of his own. In 1871, Bismarck created the Employers’ Liability Act and later, in 1884, Workers’ Accident Insurance covering workplace accidental injury and illness. While the Chancellor was far from a great humanitarian, his rivals were mostly die-hard Marxists with a determined socialist agenda. So, with the unification and growth of modern Germany as his main objective, he sought to appease his political rivals by instituting a set of laws to protect workers.
This was the first government-administered system to provide injured workers with medical and rehabilitation benefits as well as prescribed monetary compensation. It was also the first set of laws to shield employers from civil lawsuits: the exclusive remedy doctrine, protecting employers from worker lawsuits in exchange for prescribed benefits, was born. The British Empire in the late 19th century was responsible for half of the world’s production across a third of the globe’s land. Spurred by the great factory-reform labor movement, the U.K. instituted its own workers compensation laws with the Factory and Workshop Act of 1895 and the Workmen’s Compensation Act of 1897. Although employers’ liability for industrial accidents was limited and arose regardless of fault, companies were initially left to arrange their own insurance to pay the cost of these claims.
Back in the U.S., Congressional politicians of that era had no desire to federalize workers comp. The Socialist Labor Party of America (SLP) had been established in 1876, declaring its primary goals to be the collectivization of the means of production and the eradication of capitalism. By 1901, immigrants had created the Socialist Party of America (SPA), which grew to 118,000 members by 1911. Socialists were elected mayor in 56 cities around the country, and the party counted 320 sympathetic newspapers. The SPA journal “Appeal to Reason” was selling 700,000 copies per week. Amid strikes and incidents of violent protest, the reformists created widespread fear of Bolshevism and anarchism spreading into the American labor movement.
It was a chaotic time in our history, known as the First Red Scare, and Congress clearly wanted to steer away from a federalized workers compensation system. The desire of Congress and the Republican presidents of that era (Theodore Roosevelt, 1901 to 1909, and William H. Taft, 1909 to 1913) was to let the states deal with the issue of workplace injuries arising from growing industrialization and largely unregulated, often unsafe working conditions. Keep in mind that Social Security wasn’t implemented until 1937, and the nation’s Occupational Safety and Health Act wasn’t signed into law until 1970.
Employers in the early 20th century were still heavily shielded from worker lawsuits, since it took expensive, drawn-out litigation for a worker to ever win in the tort system. But populist sentiment began to grow with the literary “muckraker” movement. The most famous product of that movement was Upton Sinclair’s compelling 1906 book The Jungle, which detailed the horrific working conditions and poverty-stricken, exploited life of a Lithuanian immigrant family in Chicago’s meatpacking plants. Ironically, the short-term swell of public outcry led to the Pure Food and Drug Act of 1906, not to workers’ rights and protections.
Nevertheless, the courts began siding more frequently with injured workers as the public grew alarmed at the harsh working conditions and debilitating injuries many, including women and children, endured. In 1908, President Theodore Roosevelt signed the Federal Employers Liability Act, one of the first viable compensation laws, to protect interstate commerce workers such as railroad employees. Other federal laws addressed longshore and harbor workers. But there was no appetite to federalize workers compensation for all workers.
Four states tried to implement workers compensation laws from 1902 to 1910, but the laws were struck down by the courts as unreasonable and deemed unconstitutional violations of the 14th Amendment’s “due process” requirement. Ironically, many unions at the time were not early supporters of state workers compensation programs: they feared that state control of workers’ benefits would reduce the need for, and popularity of, unions and threaten their collective bargaining power over wages and working conditions.
Following form, on March 24, 1911, the New York Court of Appeals declared the state’s 1909 compulsory workers’ compensation law unconstitutional. Then, a day later, a horrific fire ripped through the Triangle Shirtwaist Company in New York City, killing at least 146 garment workers, mostly young women. Heartless managers had locked exit doors to prevent workers from taking breaks. The public was further outraged when only 23 families of deceased workers received $75 each as a result of a civil lawsuit. The disaster galvanized labor and led to many reforms in state safety, health, and labor laws. New York ultimately amended its constitution and implemented workers compensation in 1914.
Arising from this tragedy was a groundswell of movement in many states to pass laws intended to give injured workers “prompt and dependable” compensation for on-the-job injuries in exchange for giving up their right to sue their employer. Workers and their employers wanted a no-fault system of compensation free from costly and drawn-out litigation; in return, employers received exclusive remedy. Known originally as “the Great Compromise” or “Grand Bargain,” these early workers compensation laws were intended to provide expeditious, though partial, wage replacement and appropriate medical care for injuries or illnesses incurred in the course and scope of employment.
The first comprehensive state workers comp law was passed in 1911 by Wisconsin, with nine other states following suit that year. It’s significant to note that these states initially made workers compensation a voluntary offering. That remained the law of the land until 1917, when the U.S. Supreme Court ruled that states could legally require employers to provide workers compensation without violating the 14th Amendment. Some 36 other states passed workers comp laws by 1920, with Mississippi the last state to mandate the coverage, in 1948. Texas remains the only state to grant its employers the choice of providing employees (1) workers compensation; (2) an ERISA-governed injury benefit program; or (3) nothing. The latter two options provide no exclusive remedy protection for employers.
Some states, such as Ohio, Wyoming, Washington, and North Dakota, act as the exclusive, or monopolistic, insurance provider. Only Texas provides its employers with alternative choices. Interestingly, after 100-plus years, several state workers compensation systems are again being challenged in the see-saw battle over their constitutionality, including in Pennsylvania, Alabama, and Kansas – with Kansas, and in theory New Jersey, even providing employees or employers the choice of opting out of workers compensation and waiving exclusive remedy protection.
Today’s workers compensation laws are all modeled loosely on the Prussian system and, despite countless revisions, remain so. The cumbersome bureaucratic system of workers compensation – required in all states except Texas – is complex and arcane, and it often leads to litigation and protracted disputes that defeat the system’s original purpose. Unintended economic hardship, distrust, and frustration are commonplace as injured workers are forced to navigate each state’s disputatious workers’ comp system of ever-changing rules and regulations – especially employees who sustain a serious injury or illness. Sadly, some workers end up with battling doctors disputing diagnoses and treatment plans; unsuccessful or repeated surgeries; addiction to opioid painkillers; and an inability to return to a productive work life. Furthermore, each dollar of benefit to an injured worker with a lost-time injury typically costs $2 to $3 or more in frictional or litigation costs, so inefficiencies still abound.
The U.S. workplace today is highly modernized, with robotics employed for many heavy-lifting, repetitive, and dangerous tasks. In addition, technology, safety practices, and groundbreaking medical achievements continue to evolve in innovative ways that protect most of today’s workers from the serious types of accidents and injuries incurred by workers in the last century. Furthermore, since the Americans with Disabilities Act of 1990, there has been a massive campaign to provide workers with disabilities or special needs a “reasonable accommodation” so they can be productive in the workplace and earn a livelihood.
Therefore, for the sake of argument, if we had a clean slate to imagine a streamlined, 21st-century injury benefit program for today’s workforce, how would we design it? How would we ensure excellence, responsiveness, and a high degree of consistency in benefits? How would we handle disputes or appeals? How could we avoid injured workers’ fears and anxiety, or diminish the layers of bureaucracy, the disproportionate benefits produced by 50 separate state plans, administrative delays, and litigation? What specific objectives or outcomes would we seek to achieve for both employees and employers?
Texas is currently the only state in the union that provides eligible employers the opportunity to cover their workers with a comprehensive injury benefit program governed by Texas law and the Department of Labor’s ERISA rules and regulations. The cost-effectiveness, job satisfaction, and competitiveness achieved over the past three decades under Texas employers’ right to choose between workers compensation and an injury benefit plan are unique and impressive. Some 20% of Texas employers achieve timely, responsible medical and return-to-work outcomes that exceed the highly regulated workers’ comp alternative. These employers have transparent, well-communicated plan benefits that ensure immediate wage replacement and personal post-accident relationships. Their goal is quite simply to take care of injured employees and their families through whatever effective and innovative means possible and to return them to work as quickly as possible. Keep in mind that these employers can be sued for liability or pain and suffering, since participating injury benefit program employers do not have “exclusive remedy” protection like workers’ compensation.
Texas has one of the nation’s leading economies and continues to draw new businesses and workers from across the country, and the choice many employers have made to utilize the innovative injury benefit program has been a win-win for both employers and their employees.