When Did Food Labeling Start? A Historical Overview

Understanding when food labeling started requires a journey through decades of evolving consumer needs, scientific advancements, and regulatory responses. While rudimentary forms of informing consumers about food contents existed earlier, the structured nutrition labeling we recognize today began taking shape in the latter half of the 20th century. This article explores the key milestones in the history of food labeling, focusing on the pivotal moment when providing nutritional information became a recognized necessity.

Early Stages: Special Dietary Uses

Before the late 1960s, food labels provided minimal information about nutrient content. Between 1941 and 1966, the Food and Drug Administration (FDA) treated foods carrying calorie or sodium content information as foods for “special dietary uses,” intended for people with specific dietary requirements arising from physical or medical conditions. Nutrition information, in other words, was regarded as relevant only to a narrow group of consumers rather than to the general shopper.

During this period, most meals were prepared at home using basic ingredients, reducing the perceived need for detailed nutritional information on packaged foods. However, the rise of processed foods in the marketplace led to increasing consumer demand for more product transparency and comprehensive food information.

The White House Conference and Voluntary Labeling

A significant turning point occurred with the 1969 White House Conference on Food, Nutrition, and Health. A key recommendation from the conference was that the FDA should develop a system for identifying the nutritional qualities of food. This recommendation encouraged manufacturers to provide accurate nutritional information to help consumers make informed dietary choices.

In response, the FDA started developing various approaches to nutrition labeling. In 1972, the agency proposed regulations specifying a format for nutrition information on packaged food labels. Initially, this inclusion was voluntary unless nutrition claims were made on the label, in advertising, or when nutrients were added to the food. In such cases, nutrition labeling became mandatory, grounded in Section 201(n) of the Federal Food, Drug, and Cosmetic Act of 1938.

When finalized in 1973, these regulations required that, whenever nutrition labeling appeared, it include the number of calories; the grams of protein, carbohydrate, and fat; and the percentage of the U.S. Recommended Daily Allowance (U.S. RDA) for protein, vitamins A and C, thiamin, riboflavin, niacin, calcium, and iron. Manufacturers could also list sodium, saturated fatty acids, and polyunsaturated fatty acids at their discretion. The U.S. RDA values were based on the Recommended Dietary Allowances (RDAs) set forth by the National Academy of Sciences (NAS) in 1968, and were generally the highest value for each nutrient given in the RDA table for adult males and non-pregnant, non-lactating females, with some exceptions, such as calcium and phosphorus.

The Rise of Undefined Claims and Health Concerns

After 1973, the scientific community significantly advanced its knowledge of the relationship between diet and health. Consumers, increasingly aware, began seeking more information on food labels, particularly for processed and packaged foods.

Food manufacturers responded by putting new, often undefined, claims on product labels. Terms like “extremely low in saturated fat” became common ways of attracting consumers’ attention. This proliferation of ambiguous claims led to concerns about misleading and deceptive marketing practices, and to a growing perception that the government was tolerating claims that were “at best confusing and at worst deceptive economically and potentially harmful.”

Some food manufacturers also started making health claims about the benefits of their products, even though, since the passage of the FD&C Act in 1938, FDA regulations had prohibited explicit discussions of disease or health on food labels. A pivotal shift occurred in 1984, when the Kellogg Company, in collaboration with the National Cancer Institute, began linking fiber consumption to a possible reduction in the risk of certain cancers. This campaign dramatically changed food labeling and marketing, and other companies followed with similar claims in the absence of regulatory action.

Initiatives to Standardize and Require Nutrition Labeling

In the summer of 1989, recognizing that food labeling practices of the time were inadequate, Dr. Louis W. Sullivan, then Secretary of the U.S. Department of Health and Human Services (HHS), directed the FDA to undertake a comprehensive revision of food labels. He warned that the grocery store had become a confusing “Tower of Babel” in which consumers needed to be linguists and scientists to understand the labels.

This initiative began with an advance notice of proposed rulemaking in August 1989, seeking public comment and public hearings on the content and format of nutrition labels, ingredient labeling, and both nutrient content and health claims.

By July 1990, the FDA had published proposed rules for the mandatory nutrition labeling of almost all packaged foods. The FDA also proposed replacing the U.S. RDAs and establishing regulations for determining serving sizes. In replacing the U.S. RDAs, the FDA aimed to base new values, known as Reference Daily Intakes (RDIs), on the most recent RDAs. New values, known as Daily Reference Values (DRVs), were also proposed for food components important for good health (fat, saturated fatty acids, unsaturated fatty acids, cholesterol, carbohydrate, fiber, sodium, and potassium).

Passage of the Nutrition Labeling and Education Act (NLEA) of 1990

Years of discussion and debate culminated in November 1990, when Congress passed the NLEA. This landmark legislation amended the Federal Food, Drug, and Cosmetic Act, granting the FDA explicit authority to require nutrition labeling on most food packages and specifying the nutrients to be listed.

The NLEA mandated that nutrients be presented in the context of the daily diet, that serving sizes represent amounts customarily consumed, and that standard definitions be developed for nutrient levels. It also provided for a voluntary nutrition labeling program for raw fruits, vegetables, and fish. The requirements of the NLEA closely followed the FDA’s 1990 proposal, with complex carbohydrates and sugars included in the list of required nutrients.

On January 6, 1993, final regulations were published mandating nutrition labeling in the form of a Nutrition Facts panel on most packaged foods. The Nutrition Facts panel included calories, calories from fat, total fat, saturated fat, cholesterol, sodium, total carbohydrate, dietary fiber, sugars, protein, vitamins A and C, calcium, and iron.

Evolving Nutrition Labeling

Nutrition labeling continues to evolve to incorporate scientific advancements and changes in consumer behavior. Current initiatives focus on giving calories more prominence, amending serving-size regulations, and establishing new reference values. The ongoing debate about front-of-package labeling demonstrates the continuing search for approaches that help consumers make healthier dietary choices. The answer to the original question, “When did food labeling start?”, is therefore less a single date than an ongoing process: one that took its modern shape with the voluntary regulations of 1973 and the NLEA of 1990, and that continues to develop today.
