INTRODUCTION

Stardust and Sunshine

Over two thousand years ago, the Greek philosopher Empedocles proposed his theory of roots: that everything is composed of four elements. Fire, air, water, and earth. He wasn’t too far off. Only four elements—oxygen, carbon, hydrogen, and nitrogen—make up almost all the body’s mass and most of what we eat.

Imagine a countryside garden, filled with a colorful patchwork of edible plants—tomatoes, cucumbers, strawberries, and corn. To one side stands a chicken coop. Inside and out, the birds cluck and jerk, occasionally pecking on grains. In a bright pasture nearby, cows graze on green grass. As daylight breaks, the chickens lay big, brown eggs that will help give form to cakes, omelets, and custards. The cows are milked, and their milk will become butter, yogurt, and cheese. All of this—everything contained in the garden, and all its bounty—is mostly made from different configurations of those four elements: oxygen, carbon, hydrogen, and nitrogen. Except for the hydrogen, these elements were all created by previously existing stars that exploded as supernovas.

The ability of every living thing in and around the garden to harness energy from another star, the sun, is what made their living possible: the strawberries, the corn, the cows. When sunlight hits a plant, it sets off a chain of chemical reactions, helping it convert light energy, along with water, nitrogen, and carbon dioxide, into glucose, protein, and oxygen gas. The plant grows and what it breathes out, the animals breathe in. When animals eat a plant, or other animals that ate plants, they indirectly tap into the sun’s energy, too, using the microcombustions of metabolism. These go off in each cell, continuously, for every millisecond that an animal is alive. We humans, like everything else in the garden, are nothing but stardust powered by sunshine.

Long after Empedocles’s time, the science of nutrition and metabolism was established, a project dedicated to figuring out how all of this works: what makes our food and how it, in turn, makes us. Only recently has the project changed and the question transformed. How and why is our food undoing us and devouring the planet?
NUTRITIONAL WISDOM

The animals in the garden know what and how much to eat to meet their needs. They were born with the ability to quickly learn which foods would best serve their bodies: how much grain, carrots, and grass they’d have to consume to live and grow. When they aren’t getting enough of a particular nutrient, their preferences shift. They instinctively seek out foods that contain more of whatever is missing. Eating behavior is a biologically controlled phenomenon, much like breathing. At the same time, precisely what each animal eats is ultimately determined by what is around them: their food environment.

The same is true for us. We’re born with an intelligence that helps us learn what food is, and what we need to eat for survival. From the earliest moments of life, eating helps us learn about our environment[*1] and when we eat, our environment becomes us—building every cell and tissue of the body, transforming into blood, guts, eyeballs, hair, and skin, powering our every movement. For most of human history, when our environments provided us with fresh food, we did a reasonable job of eating just enough—the right blend of nutrients and energy to ensure our survival, and the survival of the species. Diet-related diseases, such as type 2 diabetes[*2] and obesity—now defined as excess fat that interrupts the function of the body’s organs and tissues—were rare.

This rapidly changed in the twentieth century. Many of us started to eat too much, and the wrong things, even when we didn’t want to. Obesity rates began rising, first in rich, Western, industrialized countries such as the United States, then elsewhere. Between 1980 and today, obesity prevalence doubled in more than seventy countries around the world. In the United States, all age groups experienced the surge simultaneously, suggesting a common environmental cause. Some tried to fight the tide with policy, with astoundingly little success so far. Today, 70 percent of American adults are classified as either being overweight or having obesity. The same is true for a third of U.S. children.

With rising obesity came increases in the prevalence of associated ailments: type 2 diabetes, cardiovascular disease, and fatty liver disease. Each year, these diet-related chronic diseases result in the deaths of more than a half million people, going a long way to explaining why life expectancy has recently been decreasing for the first time in decades. They also cost more than a trillion dollars. The health declines coincide with the period during which food became more available, and increasingly processed. Many of us lost our connection to what we eat—how it’s grown and produced, how it’s cooked. We moved away from the garden, even out of the kitchen. If what we eat is ultimately determined by our food environment and our environment becomes us, humans increasingly resemble factory-produced, fat-, sugar-, and salt-filled organisms.

The wellness industry, another invention of the twentieth century, rose alongside the growing rates of chronic disease, with annual global markets above $6 trillion as of 2023. Influencers and gurus peddled supplements, diets and diet books, exercise equipment, and cleanses. They stoked the “diet wars,” fights about which foods or nutrients are most responsible for our health problems. They sold the idea that the problem is us: that we can control what we eat and how healthy we can become. That we must do better. And we believed them.
As we were putting the finishing touches on this book, we commissioned the polling company Morning Consult to survey a representative sample of Americans. We asked people, “Who should be held most accountable for the quality and healthfulness of the food on your dinner plate?” By far, the most popular response was “myself.” Instead of helping, the diet industries, and their debates, distracted us and obscured important truths about food. They put the focus on questions like: What’s the best diet? or How can I lose weight? Those questions miss the point. Instead, we should ask: Why do we eat what we eat? A shared obsession with that question prompted us—a scientist and a journalist—to write this book.

THE SCIENTIST

The scientist (Kevin) and the journalist (Julia) met one summer day in 2015. Researching a story about the effects of exercise on body weight for the American news outlet Vox, Julia went to interview Kevin in his book-lined, fourth-floor office at a sprawling U.S. federal research hub on the outskirts of Washington, D.C. Kevin wasn’t wearing a white lab coat, but jeans and sneakers. The lone physicist conducting human research on obesity at his agency, Kevin brought an unusual tool kit to his lab—a place where mathematicians and physicists use their quantitative skills to study biology. He explained to Julia that he’d stumbled into a focus on nutrition and metabolism by accident.

As a kid growing up in Canada, Kevin liked to help his dad in his workshop, where he seemed to constantly be repairing household appliances and machines by taking them apart, fixing what was broken, and putting them back together. He especially liked repairing engines and thought he wanted to become a mechanic. By the time he finished high school, questions about the fundamental nature of the universe took hold of his attention. What is heat, really? What is matter made of? Physics called. Between stints as a bass player in alt-rock bands with names like the Gropetoads and Wazzo, Kevin became the first person in his family to attend college, eventually earning a PhD from McGill University; his doctoral research focused on building mathematical models to better understand the drivers of abnormal heart rhythms.

After graduation, he went to work at a Bay Area, California, biotech start-up to do more mathematical modeling—this time focused on simulating what happens inside the human body after people with and without type 2 diabetes eat. That was when he had a humbling realization: He knew the ins and outs of how electrons behaved in atoms and how electrical activity coursed through the heart with each beat, but nothing about the chemical structure of a carbohydrate or how digestion works. Even with his advanced degrees in science, he didn’t have the faintest whiff of what happens to food inside our bodies after we take our first bite.

So Kevin studied some more and had another epiphany: Figuring out how our bodies use food, breaking things down and rebuilding ourselves, was way cooler than the engines he used to take apart as a kid. That fascination led him to his current job, where he began mathematically modeling how different diets affect the human body over periods of months, zeroing in on metabolism and the question of what regulates body weight. Perhaps because he entered the health field as an outsider, with uniquely quantitative training and no medical background—or maybe because he’s from Canada—Kevin saw things differently from others Julia had run across in the nutrition space in America.
Kevin wasn’t selling anything. He didn’t push a pet theory. Preternaturally analytical and curious, he seemed to be keen on doing what scientists were meant to do: designing experiments to test ideas, following the data. That day in his office, he explained why exercise—while great for overall health—wasn’t a weight loss elixir. He urged Julia to write about bariatric surgery, which even in the Ozempic era remains the most durable and effective means of losing weight, though it got almost no public attention at the time. In the years of conversations and interviews that followed, Kevin never promoted easy answers or fast fixes. Foods weren’t singled out for being uniquely evil, even ones many popular diet peddlers pile on, like potatoes or ultra-processed foods. He never advocated for particular diets, and he certainly never claimed that any supplement or food could burn belly fat or boost metabolism. Instead, he pointed out areas of uncertainty in the science. He spoke from evidence dispassionately—his own studies, and those of other trusted scientists. Most importantly, he constantly asked fundamental questions that many seemed to take for granted, questions like: What really happens inside the body when people go on a diet?

THE 3,500-CALORIE RULE

One morning in 2007, Kevin shadowed a dietitian who was reviewing the food diary of a patient struggling with weight. He was eager to learn about how people with obesity were being counseled and whether the mathematical models he was building could help them. The dietitian examined a food diary that her patient had been keeping. She told the man that simply replacing his daily intake of full-calorie soda with diet soda would lead to about 50 pounds of weight loss in a year. The claim sounded incredible.

Kevin wondered where the dietitian’s advice came from. It was based on something called the “3,500-calorie-per-pound rule,” she explained. Cut 500 calories from your daily diet and after seven days you’ll lose 1 pound. After four weeks, 4 pounds, until more than 50 pounds melt away a year later. The idea was that eliminating “empty calories”—soda, candy, your daily glass of wine—would eventually add up to massive weight loss, a promise that could motivate patients.

The 3,500-calorie-per-pound rule had formed the bedrock, ground zero, for obesity science. It was adopted by leading health organizations around the world, including the National Institutes of Health in the United States and the National Health Service in the UK. It appeared in official treatment guidelines, nutrition textbooks, countless websites, dietetics licensing exams around the world, and even in top medical journals. Not only was it used to predict individual weight loss, the rule was also used to explain how seemingly small increases in calorie consumption resulted in the rise of population obesity prevalence and how policy interventions could mitigate the “epidemic.”

But was it true? When Kevin asked his colleagues where the claim came from, no one could give him a straight answer. He started digging. It turned out this omnipresent, near-universally adopted weight loss assumption was based on an estimate from the 1950s. Max Wishnofsky, a medical researcher, wanted to gauge how many calories were stored in a pound of human fat tissue. You guessed it: roughly 3,500. How did this basic biological result get translated into a dubious weight loss rubric? People assumed that altering the balance between calories eaten and calories burned leads to a constant deficit or surplus.
Calories in, calories out—the body as a zero-interest bank account with no fees. Simply adding deposits or subtracting withdrawals in the form of food would lead to a constant rate of change, more or less weight. Unfortunately, metabolism is more complicated. The body reacts dynamically when we make changes to our diet, like eating fewer calories. In response, calorie burn does not stay constant. Shortly after we start to eat less, our metabolic rate slows down, the calories used in physical activity drop, and, as weight loss progresses, calorie burn can slow even more—so that our bodies use fewer calories over time, even for the same activity. What’s more, most people don’t just cut calories and keep them low, in part because weight loss seems to send a signal to the brain: “Eat more!” This means that diet calories usually creep back upward, mostly unconsciously, despite the best efforts of dieters. Eventually weight loss plateaus when the calories burned equal the calories eaten—typically within a year.[*3] The body settles at a new size after losing much less than the 3,500-calorie-per-pound rule would predict.

Put simply, cutting calories in the diet—just like burning more calories by doing more physical activity—sets off a cascade of little-appreciated effects inside us, which can hamper weight loss efforts (though it doesn’t make weight loss impossible—more on that to come). Unless we interfere with the fundamental biology that controls our appetite or metabolism, our bodies resist efforts to lose weight. All of this meant, Kevin discovered, that the 3,500-calorie rule drastically overpromised weight loss. Kevin spent years refining a more accurate rule, and by 2012 his research prompted the American Society for Nutrition to officially abandon the 3,500-calorie rubric. After decades of getting it wrong, many other professional health organizations have since adopted Kevin’s weight loss model instead.
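To make the gap concrete, here is a toy back-of-the-envelope simulation. It is not Kevin’s published model; the parameter values are illustrative assumptions chosen only to show the shape of the difference between the static 3,500-calorie rule and a dynamic version in which calorie burn shrinks as weight comes off.

```python
# Toy comparison of the static 3,500-calorie rule with a dynamic energy-balance
# sketch in which daily calorie burn falls as weight is lost. Illustrative only;
# the parameter values below are assumptions, not measured constants from the study.

KCAL_PER_KG = 7700   # roughly 3,500 kcal per pound of body tissue
ADAPTATION = 25      # assumed kcal/day of expenditure lost per kilogram lost

def static_rule(daily_cut, days):
    """3,500-calorie rule: weight falls at a constant rate, forever."""
    return daily_cut * days / KCAL_PER_KG          # kilograms lost

def dynamic_model(daily_cut, days):
    """Each kilogram lost shrinks calorie burn, eroding the original deficit."""
    lost = 0.0
    for _ in range(days):
        deficit = daily_cut - ADAPTATION * lost     # today's effective deficit
        if deficit <= 0:                            # plateau: burn equals intake
            break
        lost += deficit / KCAL_PER_KG
    return lost

cut, days = 500, 365
print(f"Static rule:   {static_rule(cut, days) * 2.2:.0f} lb lost in a year")
print(f"Dynamic model: {dynamic_model(cut, days) * 2.2:.0f} lb lost in a year")
```

With these assumed numbers, the straight-line rule promises roughly 50 pounds in a year from a 500-calorie daily cut, while the adapting body delivers closer to 30, and the gap widens further once diet calories start creeping back up, as described above.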
The rule is not dead, of course. Calories-in, calories-out logic still permeates policy and the public discussion about body weight and obesity. If people could just exercise a little bit more and eat a little bit less, they’d be skinnier. If we just taxed soda or labeled menus, we might reverse population obesity. If only we took the stairs instead of the elevator every day…. It can take years for evidence to make its way into policy and practice, but hopefully change is finally coming.

Change starts by asking questions, doing experiments, and gathering data. Until Kevin questioned the rule and examined it critically, it was just another dubious idea in a sea of food and weight loss myths, parroted over and over until it hardened into what appeared to be truth. This made Kevin wonder about all the other things we take for granted about eating and overeating—what else we might be overlooking or getting wrong about our food.

Soon, Kevin’s bosses asked him to test the predictions of his models in humans. That was when he took on a role he never imagined for himself. He started running clinical studies (i.e., studies in humans). Kevin’s more recent randomized controlled trials show that food environments that contain lots of ultra-processed foods—those factory-made concoctions of unfamiliar ingredients and additives in frozen pizza, sugary cereal, and soda, which are now most of what people eat in industrialized countries like America—seem to affect the biological control of food intake we share with other animals. They reset whatever it is that used to help us regulate our body weight at lower levels.

Over two decades, Kevin’s work has amounted to a stress test of popular twenty-first-century ideas about nutrition and metabolism. For his contributions to science, he has won numerous accolades and been called “one of the major figures of twenty-first-century nutrition science,” despite never training in nutrition. Kevin’s research on ultra-processed foods has been referred to as among the most influential studies in nutrition, shaping health policy and the conversation about how we eat globally.

Still, Kevin has more questions. The groundbreaking research on ultra-processed foods was done in adults—might the effect be different in kids? The studies were carried out in a hospital setting for one month. What would happen if they ran longer, under real-world conditions? How exactly do ultra-processed foods affect how we eat? There’s so much more to explore about our food and how it’s changed us. We’ve only scratched the surface of the full complexity of why we eat what we eat.

THE JOURNALIST

Since Julia and Kevin met, they’ve talked—a lot. Many of their conversations circled around fundamental questions: What is metabolism? What do different nutrients do inside the body? What controls eating behavior? What is food, actually? The dialogue was a refreshing break from a news cycle that promulgated nutrition fads and mistruths. Across peddlers and platforms, Julia and Kevin detected a pattern: The cause of our health woes is always singular, easy to pinpoint, and often fringe—a certain “truth” that’s been withheld from the public by a corrupt and untrustworthy “establishment.” Solutions are neat, tidy, and painless—fully within our control—and typically available for purchase. No testing of the fringe theory or peddled products required. The peddlers weren’t necessarily ill-meaning. Many wanted to help people and were sure of their solutions. But their competing advice in the marketplace amounted to a lot of noise about individual fixes, and a lot of confusion. More than in any other corner of science, the public seems to be repeatedly told that everything they thought they knew about nutrition is wrong. Just do this—or cook this, or take these, or buy that—instead.

After many talks with Kevin and other scientists, it was clear to Julia that the root causes of people’s ill health had nothing to do with the things we usually dwelled on in the public conversation or the media, nor would they be fixed with the popular solutions. The root causes were clearly structural. People wanted to eat better. They wanted to be stronger and leaner. And yet they struggled. During reporting trips to the sickest counties in America—places riddled with obesity and diabetes—Julia saw how easy it was to get burgers and fries, ultra-processed products, even drive-through daiquiris, while fresh food was almost impossible to come by.

Julia could empathize with those who struggled to eat well. She had been one of them. A child of the 1980s, she was born into a calorie glut—the moment pre-prepared, packaged, ultra-processed, and convenience foods were overwhelming North America. Her family, immigrants and descendants of immigrants to Canada from postwar Italy, blended the Italian eating obsession with the North American embrace of junk food. By the time Julia reached age ten, her pediatrician had already shamed her about her body size, and her mom—herself a recovering calorie counter—tried to help by sending Julia to dietitians.
By age twenty, Julia had gobbled up metabolism-boosting supplements; used meal replacement shakes and bars; joined, quit, and rejoined Weight Watchers several times; and attempted the Zone and Atkins diets, among others, feeling equal parts ashamed and like a failure. Her weight, meanwhile, fluctuated along an upward trend line until it reached nearly 70 pounds more than it is today. She blamed herself, giving little thought to the food environment and system she’d found herself in.

At some point, in one of their many chats, Julia and Kevin talked about the books they were thinking about writing. Kevin was getting requests for a book about his groundbreaking research on obesity and diets. Julia was being asked to write debunking books about the diet gurus she’d covered. But Kevin really didn’t want to write the expert tome about the One True Way to eat, divorced from the work of his predecessors and colleagues. Julia didn’t want to do yet another book telling people that everything they knew about food was wrong. Together, Julia and Kevin thought they could write something more helpful than anything they could do alone—a book that was proudly devoid of false fixes and misleadingly simple explanations. A book their years of conversations, research, and reporting suggested was sorely missing.

WHY WE EAT, DECODED

This is a book about the fundamental, often overlooked, and always enthralling science of nutrition (the chemicals and energy we get from food) and metabolism (how our bodies use food)—covering what we know and the history of how we came to know it, up to the frontiers of research into the invisible forces that really shape eating behavior. Based on hundreds of interviews with researchers, scientists, analysts, advocates, archivists, historians, entrepreneurs, patients, clinicians, and officials around the world; hundreds of scientific papers and books; and hundreds of hours of conversation between Kevin and Julia, the book will take you on a sprawling tour of centuries of science into the wonders of food and the marvelous ways our bodies use it, for better and worse health. We will weave Kevin’s insights, and the scientific quest that led him to study ultra-processed foods, with storytelling from the labs of other scientists, and Julia’s search for answers on behalf of patients.

Lurking behind most wellness products and nutrition trends is the idea that a lack of willpower, gluttony, and sloth have caused obesity and type 2 diabetes rates to increase, when in fact the scientific evidence overwhelmingly points to the contrary: It was never about us as individuals. Our food environment is wrecking us. It’s the reason so many of us are struggling with weight and diet-related disease. It’s the reason so many of us struggle to eat the right things. We will argue that while blame is typically placed on individuals, diet-related diseases like obesity and type 2 diabetes are a direct result of food systems working as designed.

Before we get there, we will walk you through the science of how food works. We will break down what we eat into its constituent parts, looking at what we know about each of them, and explaining why those parts can’t tell us about the whole. We will show how, with just about every component and feature of food and its impact on our health, advocates and peddlers, many of them well-meaning, have rushed to market with untested ideas, claiming they had the answers, and perpetuating misinformation that has reverberated and confused people through centuries.
An industry has been built around protein supplementation, even though loading up on extra protein isn’t necessarily healthy or helpful. Carbohydrates and fat may be diametrically opposed in the diet wars, but they really work more like great colleagues inside us, standing in for each other when one is out. And while a low-carb obsession still permeates Western diet culture, low-carb diets don’t outperform low-fat diets when it comes to losing body fat.

We will reveal how the discovery of vitamins helped showcase the amazing chemical complexity of food, part of the “nutritional dark matter” still being uncovered. We will explore what really shapes our eating habits, tracing how food has changed to rewire our brains and bodies. We will cover the latest neuroscience illuminating an invisible symphony of internal signals that regulate body weight and nudge us to eat, and the newest thinking on how ultra-processed food environments seem to alter and disrupt those signals, leading to weight gain and disease.

Finally, we will reframe our food challenges at the societal and planetary levels. We will show how, in solving food problems that have made famine and vitamin deficiencies rare in industrialized nations, we created, or exacerbated, new ones: the warming of the planet, loss of biodiversity, and widespread obesity and metabolic diseases such as type 2 diabetes. Despite the nostalgia of popular food commentators, we will argue that we’ve actually never had an agricultural food system that worked well for humans. Sure, eat like your great-grandmother—but there’s a good chance she spent hours in the kitchen every day or maybe struggled with hunger and nutrient shortfalls. At some point in the very near future, we will have to find a way to feed ourselves sustainably and equitably, to meet the needs of a growing world population on a warming planet. And we’ll have to do it in a way that doesn’t harm our bodies.

Throughout the book, we will show how old, unproven ideas and outdated policies continue to guide our current thinking and approaches to food. We won’t advocate for any one diet or pretend to be “clean eating” gurus ourselves. Ultra-processed foods are a staple of Kevin’s diet, and Julia still eats too much sugar. After all, we too are humans, living in the toxic food environment.

Reading this book, we sincerely hope, will make you see the stuff on your dinner plate as more than stuff on your dinner plate. We hope to simultaneously ramp up your sense of awe about food and your body and decrease the chances of being swindled by influencers, while helping you to be kinder to yourselves and each other. We might answer questions you never knew you had about your body and what happens after you take your first bite. This knowledge won’t necessarily fix our food landscape or reverse obesity, but it is, we think, the first step to truly intelligent eating.

A SHORT NOTE ON VOICE, STORIES, AND SCIENCE

Science is an iterative process undertaken by humans. It takes time and many studies to get to the truth of the matter. Journalism is often referred to as the “first draft of history.” We wanted to write this book in a way that reflected these processes—and our collaboration, as a scientist and a journalist, groping at the truth about food and nutrition, from two different, and complementary, vantage points. From here on out, we will blend research and reporting, telling the story from the point of view of the coauthor who was doing the observing. Sometimes it’ll be Kevin. Other times, Julia.
On occasion, we will also revert to we. As for stories and science: Humans tell stories. It’s how we understand the world. Facts[*4] are okay, but they don’t have the motivational or emotional heft of a great story. Indeed, facts on their own are uninterpretable without a story that weaves them together. Science aims to reveal stories that help us understand the world and make accurate predictions. Scientific stories are called models because scientists know that they are merely representations of the world that help us make sense of observations and tie them together. Science progresses by critically testing models and improving upon them. Better models explain more of the world and make more accurate predictions.

Sometimes incorrect models are more compelling. This is especially true when a scientific-sounding story is told by a charismatic person who convincingly presents data that seems to support what they’re saying and uses double standards of evidence to downplay challenging research. Once you believe that your way of seeing the world is correct, it’s easy to find data that supports your belief. Those who don’t share your enlightened view seem ignorant and the evidence offered opposing your model, weak. You might even suspect ulterior motives in those who disagree with you.

We need to recognize that all models in science are wrong because our feeble brains are incapable of fully understanding the world. Scientific models have varying levels of uncertainty and are supported by data from studies with different kinds of limitations. We’ll talk a lot in this book about observational studies, which look at associations between phenomena like eating particular foods or diets and the occurrence of diseases. We’ll also talk about randomized trials, which try different interventions on two or more randomly assigned groups; because the groups were randomly assigned, any difference in health outcomes among them is likely due to the different interventions. Nevertheless, we will also use scientific storytelling because that’s the only way to make sense of the experimental and observational data accumulated so far. If the models we present are proven wrong, we hope it’s because of a future discovery unknown to us as we write. We will do our best to convey varying degrees of certainty when putting together our understanding of the state of the science and what it means for our health.

Skip Notes

*1 When one of our kids was a baby—we won’t name names—he learned the hard way that rocks aren’t food after ending up in the ER with symptoms that passed just as soon as the rock did.

*2 Type 2 diabetes is far more widespread and very different from type 1 diabetes, which is an autoimmune disorder. References to diabetes in this book will mostly be related to type 2.

*3 Unless you’re talking about GLP-1-based drugs and bariatric surgery. In these cases, the plateau takes about two years because these interventions weaken the feedback control of appetite that occurs with weight loss.

*4 Facts aren’t easy to establish in science because they result from reproducible and accurate observations in well-designed and controlled experiments. Easier said than done!

CHAPTER 1

The Biggest Losers, the Slowest Metabolisms

It was an unusual laboratory for a metabolism and weight loss study: a sprawling ranch, the former home of the Gillette razor-blade tycoon, nestled in the dusty green Santa Monica Mountains near Malibu, California.
A group of people milled about a hallway in the ranch’s white, Spanish Colonial–style mansion early one morning before breakfast—and they were not exactly chipper. Six weeks into season eight of the reality TV program The Biggest Loser, the contestants were tired. Their bodies ached. Days on the ranch had been spent mainly in the gym and what amounted to a sports rehab facility, where health professionals attended to blisters, stress fractures, and sore muscles from working out, sometimes plunging the contestants into a plus-size ice bath designed for racehorses. Despite all that exercise, they’d also drastically cut their calorie intake.

The contestants were motivated to carry on anyway because of the strange weight-loss-Olympics premise of the show they were starring in. Whoever lost the greatest percentage of their original body weight after thirty weeks, the “biggest loser,” went home with a $250,000 cash prize. In addition to all those hours training and recovering, and stints in front of the cameras, the sixteen men and women had agreed to submit to the demands of the scientists.

That day, starting around five a.m., Kevin and his colleagues welcomed the contestants, one by one, into a dimly lit room in the mansion—a makeshift metabolism laboratory. For forty-five minutes, each Biggest Loser took a spin in a “metabolic cart.” The person would lie down on a bed, and the cart’s domelike bubble would be placed over their head to read their resting metabolic rate—or how many calories their bodies burned while they were doing next to nothing. When the day’s data gathering was done, the contestants went back to working out. The show was an extreme televised version of a “fat camp,” and exercising was pretty much all they were allowed to do. Their phones had been taken by the show’s producers and they had no access to TV or the internet.

Kevin, meanwhile, walked around the ranch, touring the Biggest Loser gym, which was filled with state-of-the-art exercise equipment and banners featuring motivational quotes from the show’s celebrity trainers. The conference room, where contestants deliberated over who would be sent home each week, was run-down. There was no personal chef working behind the scenes, just a simple kitchen where the show’s participants made their own small meals, often using food from Biggest Loser partners such as Jennie-O low-fat turkey. Everything looked a lot less glitzy than it did on TV, Kevin thought.

Mostly, Kevin felt a little out of place. Not only was this a strange research setting, but the study was among the first in humans he’d ever run. The higher-ups at his agency, where Kevin had his own lab, didn’t want the reputation of their prestigious institution tarnished by association with a reality TV weight loss contest. But he’d found a way to make the research happen anyway because he was sure of one thing: Peering inside the contestants’ bodies as they crash-dieted and exercised their way to thinness would help unlock mysteries of metabolism and obesity.

Weight loss of the magnitude the contestants were experiencing rarely happened outside of bariatric surgery. Trying to induce it for the purposes of a study today would never get the approval of a research ethics board. The bodies of the Biggest Losers, unbeknownst to them, were a “natural” experiment, brimming with original data, and the scientists were capturing it all—before, during, and long after the competition was over. In the end, the show contestants delivered new insights, as Kevin expected.
What he didn’t anticipate: The findings would run entirely contrary to the way many of us think about metabolism and weight loss. The discoveries even challenged the most long-standing and persistent assumption of all: that the speed of a person’s metabolism determines their body size, and a faster metabolism means a thinner body.

METABOLISM MYTHS

If you read just about any lifestyle magazine or newsletter, or follow the shenanigans of self-proclaimed wellness experts on social media, you might believe metabolism has something to do with how much you can eat without gaining weight, a knob inside the body that can be finely tuned with dedicated “metabolism boosters.” Some people are blessed with a fast metabolism. The rest of us can try chili peppers, cold exposure, green tea extracts and other supplements, or muscle-building exercise routines. The implicit message in all the products and protocols is that you can do certain things to speed up your metabolic rate, or calorie burn—and that the increased speed is desirable because it helps with weight loss. On the flip side, the influencers tell us, weight loss can cause the metabolic rate to decline perilously, making it even more difficult to keep fat off.

This metabolism-boosting subsection of the weight-loss industrial complex has a surprisingly rich intellectual history. One of the most effective weight loss drugs ever invented sped up metabolism. Factory workers in France during World War I were using a chemical called 2,4-dinitrophenol, or DNP, to manufacture explosives when they started to report fever, sweating, nausea, and vomiting—as well as rapid, unintentional weight loss. When researchers at Stanford University learned about the chemical’s slimming effects, they saw an opportunity. DNP could be used as a “metabolic stimulant” for weight loss. After testing in animals, then in humans with obesity, the scientists boasted about the drug’s “power to increase metabolism to very high levels” and promoted DNP as a safe and effective tool for treating obesity. Within a year after they shared their initial findings, in 1934, at least 100,000 Americans had reportedly taken DNP. One brand’s packaging featured testimonials about “literally burning the fat away,” while also warning consumers “DO NOT BECOME ALARMED” if rashes, eye and skin discoloration, or burning developed. That the drug originated with toxic side effects during explosives manufacturing didn’t seem to deter consumers; such is the desire for fast fat loss.

By World War II, a foundational study in nutrition science bolstered the related idea that a metabolic slowdown—meaning a person burns calories more slowly than expected for their body size—kicked in with weight loss. The legendary American physiologist Ancel Keys gathered thirty-six lean and healthy young men at the University of Minnesota and slashed their dietary calories in half for six months. The study’s participants—all conscientious objectors eager to serve their country—had volunteered to go hungry with a compassionate goal: figuring out how best to rehabilitate famine survivors and malnourished people after the war. Over twenty-four weeks, the men wandered the university campus and town, often sitting in restaurants watching other people eat, while mostly abstaining from food themselves. As they withered away, dropping a quarter of their body weight, they grew tired, feeble, cranky, and cold. They also saw their metabolic rate cut by almost 40 percent.
The findings of the “Minnesota starvation experiment” built on fasting research, which had already demonstrated that as we cut calories, the body senses the new low energy state and, much like a smartphone or laptop running short on battery, goes into power-saving mode. The decrease in calories burned was often greater than what could be accounted for by a person’s new, smaller body size. (Smaller bodies, contrary to popular wisdom, generally have lower metabolic rates than larger bodies.) Putting these findings together with the DNP example suggested metabolism was important not only to body weight; it could accelerate weight loss if properly manipulated.

It didn’t take long before cases of blindness caused by “dinitrophenol cataracts” and deaths linked to DNP turned up in humans. By 1938, the drug attracted the scrutiny of newly empowered federal food and drug regulators in the United States and was banned after being deemed “extremely dangerous and not fit for human consumption.” Using a drug to speed up one’s metabolism suddenly seemed like a very bad idea. Yet the link between metabolism and body fat had been established in the marketplace of weight loss ideas. Products continue to trickle out to this day (including illegal DNP-laced supplements, which are still sold for weight loss and bodybuilding over the internet; as of 2011, over sixty related deaths were reported in the medical literature). In addition to the metabolism-boosting supplements to buy, health and fitness influencers promote exercise and muscle building as natural metabolism enhancers during weight loss—claims that Kevin often wondered about.

Kevin had studied the details of the Minnesota experiment. The brains of the conscientious objectors focused almost exclusively on food. Eating was all they could think about. If they couldn’t eat, they wanted to be around or see pictures of food. “I was one of the many,” one participant later recalled, “that mentally was transfixed on cookbooks. And I collected probably a hundred cookbooks…I would read cookbooks like you would read Reader’s Digest.” As they ate less and lost weight, they became so weak that they could barely muster the strength to open a door, let alone work out.

THE BIGGEST EXERCISERS?

Something quite different happened on every TV episode of The Biggest Loser. For two years before Kevin’s study trip to the ranch, the show had been a guilty obsession of his, but not because of its absurd spectacle. Kevin’s research focused on figuring out how what people eat and how much they exercise affect metabolism and body weight. He had studied these phenomena using data from hundreds of patients, collected by other researchers over many decades. Never, ever had he seen people lose weight at anything close to the rate of the Biggest Losers—not even during the Minnesota study.

The contestants screeched, cried, vomited, and panted their way through grueling workouts in front of toned and tanned celebrity trainers. One was hospitalized with rhabdomyolysis, a sometimes fatal condition caused by overexertion. Others starved and dehydrated themselves by spending lots of time in the sauna and subsisting only on low-calorie foods—vegetables and sugar-free Jell-O. In one case, a contestant reported that he’d been urinating blood, a potential sign of kidney damage. The people on the show seemed to be eating far less, and moving far more, than the Keys research participants. How was this possible?

One morning, more than a year before his visit to the ranch, Kevin got Robert Huizenga, known as Dr. H, on the phone.
A chiseled “workout fanatic” and son of a noted athlete and Manhattan Project physicist, Dr. H had long been a TV fixture, as well known for his role as the former team physician for the Los Angeles Raiders as he was for public appearances about the health of his Hollywood patients.[*1] Dr. H was also The Biggest Loser’s medical consultant.

On the call, the doctor was forthcoming about his weight loss strategy. Like the show’s episodes, he emphasized all the exercise the contestants were doing. They were basically subjected to the training regimen of professional football players, he explained, an approach he picked up working for the L.A. Raiders. He had noticed that linemen participating in “two-a-days”—two workouts per day—couldn’t help but lose weight. In his past life, this had been a problem; the largest players needed to maintain a high body mass to effectively play the game. He couldn’t get them to eat enough food to prevent weight loss. That experience left him wondering: What if we could get people with obesity, who wanted to lose weight, to work out like football linemen? That was what unfolded every week on TV. The contestants weren’t put on any specific diet—just told to eat at least a minimum number of calories, depending on their body size, and no junk food—and then strongly encouraged to exercise for several hours per day. Exercise, he said, was the main driver of the weight loss. Amazing transformations followed.

During the call, it dawned on Kevin: While scientists had known for years about the metabolic slowdown after cutting calories, no one knew what happened to metabolism during weight loss with lots of exercise in the context of obesity. Maybe larger bodies registered the surplus of energy they had stored in the form of body fat, and unlike the lean conscientious objectors in Minnesota, the Biggest Losers would continue to burn calories as usual? Maybe all that working out built muscles and prevented a drop-off in their metabolic rate, as metabolism-boosting exercise advocates often promised? The Biggest Losers could help solve these mysteries. There was just one problem: Kevin’s bosses told him to drop the idea.

Shortly after the call with Dr. H, Kevin headed to a scientific conference, where he ran into Eric Ravussin, a colleague with a lab at the Pennington Biomedical Research Center, a world-renowned obesity research institution in Louisiana. Kevin told Eric about his idea and the difficulties getting buy-in from his bosses. Maybe Pennington could lead the study? Eric was game. He’d already been concerned for years that the show could mislead the public on obesity management. The series’s fat-shaming premise had been dubbed by critics “maybe the most damaging television show ever,” one that pivots between “sadism and empathy at whiplash-inducing speed.” (The Biggest Loser creator maintains the show prioritized the health of the contestants, which is why it had a medical team in place. He also said they’ve addressed bullying on the show in more recent years.)

What followed was the first reality TV show turned metabolism experiment, and the first study to follow people with obesity as they lost and regained exceptional amounts of weight. The findings were so odd, Kevin had his study equipment examined because he didn’t believe the results.

SIX YEARS LATER

Over more than six years, the scientists gathered hundreds of data points about how the contestants’ bodies were transformed.
Contrary to the stereotypes of people with obesity, they documented enormous, outrageous levels of willpower. During their time at the ranch, the Biggest Losers drastically reduced their calorie intake—by an average of 65 percent—while burning about 4,500 calories each day, including three hours (often more) of vigorous exercise. After thirteen weeks on the ranch, when the remaining contestants had returned to their homes to await the final episode of the show, they continued to burn around 3,000 calories per day, exercising for more than an hour every day on average. By thirty weeks, they had lost an average of 130 pounds—roughly the size of a twelve-year-old boy. The winner of season eight, Danny Cahill, shed 239 pounds, more than halving his body weight.

In most weight loss diet studies, people struggle to lose even 5 or 10 percent of their body weight. The Biggest Losers were outliers. They’d dropped 40 percent on average. The starving Minnesotans lost far less—25 percent. And unlike the men in the Minnesota experiment, whose weight loss was mostly from muscle and other lean tissue, about 80 percent of the weight loss in the Biggest Losers came from body fat.

The show demonstrated that people with obesity can be incentivized to exercise intensely while eating very little for many months. With the carrot of a quarter million dollars and the admiration of a TV audience, they’ll get trimmer. Dramatically so. But what drove the weight loss wasn’t merely exercise, as implied on TV. Kevin and his team found that diet was at least as important as working out during the competition. The Biggest Losers who cut the most food calories lost the most weight. There was no such relationship with physical activity. The kind of calorie cutting it took to get Biggest Loser–sized results was almost unthinkable. They had reduced their food intake even more than the Minnesota participants. Cahill, for example, recalls working out constantly while, at the lowest point, eating only 800 calories spread over four small meals. This created a calorie deficit that Kevin estimates would have killed the average Minnesota starvation experiment subject in less than six weeks.

Despite all the muscle-building exercise, despite the many pounds of fuel the contestants had been carrying around in the form of fat at the outset of the contest, by the show’s finale, their metabolic rates had dropped spectacularly. They were now burning several hundred fewer calories per day than would be expected for their new, smaller bodies, a drop far greater than their weight loss alone could explain. The Biggest Loser contestants switched into power-saving mode, just like the lean Minnesota participants. The contestants with the greatest weight losses at the end of the competition also had the most exaggerated metabolic slowing. These findings contradicted the idea—promoted in countless fitness books and magazines—that working out to build and maintain muscle “boosts” metabolism during weight loss.

Perhaps most surprising of all: These effects on metabolism persisted long after they left the ranch. Six years after season eight wrapped, the people with the “broken metabolisms” flew in from across America to Kevin’s lab outside Washington, D.C., to have their health assessed—a reunion that would allow the scientists to peer inside their bodies one last time. The burning questions now: Did the people with the greatest metabolic slowdown at the end of the contest go on to regain the most weight? Did their metabolic rates rebound?
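A note, before the answers, on how “metabolic slowing” is scored: the measured resting metabolic rate is compared with the rate predicted for a person’s current body. The study derived its prediction from the contestants’ own baseline data; the sketch below swaps in the widely published Mifflin-St Jeor equation and a made-up contestant purely to illustrate the arithmetic.

```python
# Rough sketch of how metabolic adaptation is quantified: measured resting
# metabolic rate (RMR) minus the RMR predicted for the current body. The
# Mifflin-St Jeor equation is used here only as an illustrative stand-in for
# the study's own baseline regression; the example person is hypothetical.

def predicted_rmr(weight_kg, height_cm, age_yr, is_male):
    """Mifflin-St Jeor estimate of resting metabolic rate, in kcal/day."""
    base = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age_yr
    return base + (5.0 if is_male else -161.0)

def metabolic_adaptation(measured_rmr, weight_kg, height_cm, age_yr, is_male):
    """Negative values mean the body burns less than its size alone predicts."""
    return measured_rmr - predicted_rmr(weight_kg, height_cm, age_yr, is_male)

# Made-up contestant after large weight loss: measured RMR of 1,500 kcal/day
# versus roughly 1,950 kcal/day predicted for the new, smaller body.
print(metabolic_adaptation(measured_rmr=1500, weight_kg=100,
                           height_cm=180, age_yr=36, is_male=True))
```

A result of about minus 450 kcal per day, as in this hypothetical example, is the kind of gap the chapter describes: burning several hundred calories a day less than the new body size alone would predict.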
Just like old times, they took a spin in the metabolic cart and submitted to a bunch of tests—blood draws, urine sampling, and X-rays to measure body fat. Outside the lab, the former reality TV stars toured the U.S. capital and caught up over dinner at a restaurant near the White House before heading back home. Fierce competitors at the ranch, they now found comfort in each other, basking in the warmth of old friendships. “The best way I can describe it is we all went through what was like a train wreck together,” one season eight contestant, Sean Algaier, told us, “and we all trauma bonded with each other really deeply, and so when we see each other, it’s like seeing an old home again.”

When Kevin and his team crunched the data, they were so taken aback by what they saw in the numbers, they thought some sort of measurement error must explain them—that the metabolic cart at their lab was somehow different from the one they’d used at the ranch all those years earlier. So before moving forward, Kevin frantically tracked down the original device, now at the Pennington Biomedical Research Center in Louisiana, and had it flown to his lab for analysis. Both carts gave the same results. There was no measurement error. The finding was real: Even though the ex-contestants regained an average of about two-thirds of the weight they’d lost, at the six-year mark, their average metabolic rate was no different than it had been at the show finale.

AWAY FROM THE CAMERAS

When the contestants of season eight left the ranch and went back to their regular lives—filled with work, family obligations, and other daily stresses—exercising for several hours each day while eating very little was… difficult, to say the least. Most of them regained much of the weight they’d lost. Even Cahill, the season’s winner. His victory launched a motivational speaking tour that lasted for four years, taking him on roughly one hundred flights per year across America. But soon enough, “That took a toll,” he told us. He wasn’t seeing his wife and kids as much as he wanted to, and after exercising more in a few years “than most people do in a lifetime,” his knees, ankles, feet, and joints “were breaking down.” “You can only keep motivation up for so long but eventually you get tired,” he told us. He returned to his former job, as a land surveyor, and to his old body size.

Even when Cahill and his peers were once again much larger on average, and no longer dieting and exercising all day, their power-saving mode persisted, which is not at all what the scientists predicted. They expected just the opposite to occur—that their metabolic rate would tick back up.

There were other oddities that defied the researchers’ expectations. People whose metabolisms dropped the most at the end of the contest—also those who had the greatest weight loss at the show’s finale—did not regain the most weight. Instead, they appeared to be just about average in terms of their weight regain over the six years. A “slowed metabolism” didn’t prevent weight loss and it also wasn’t a predictor of future weight gain, findings that have since been corroborated by other scientists. Even more intriguing, those with the greatest metabolic slowing at the six-year mark were now the people who had maintained the most weight loss. Put another way: The biggest weight losers consistently boasted the slowest metabolisms. Before these studies, it seemed logical that if a person started to burn fewer calories than expected when they lost weight, they’d be predisposed to regaining fat.
But once again, widely believed weight loss dogma turned out to be wrong when put to the test. Now just the opposite was true: The slower the metabolic rate, the greater the weight loss success.

There was a final surprise. The behavior most linked to maintaining lost weight at six years was different from the one that drove weight loss during the TV show. This time, there was no relationship between how many calories people cut from their diet and their ability to keep weight off. Now, exercise was key. The people who had increased their physical activity the most had the best odds of avoiding weight regain. That suggested physical activity, while not a predictor of who got slimmer during weight loss, correlated with weight maintenance years later—another trend that’s been observed by other researchers.

Tracey Yukich, the contestant who was airlifted to a hospital with rhabdomyolysis, was a case in point. Because of her health scare, she was allowed only light-intensity exercise during the competition—walking, swimming, and physiotherapy. Still, she shed 118 pounds, nearly halving her body size, by eating so little she became lightheaded and anemic. At six years, she was among the most successful at fending off weight regain—and one of the few contestants exercising heavily. In both cases, she showed some of the greatest metabolic slowdowns.

OUR ENVIRONMENT MEETING OUR BIOLOGY, AGAIN AND AGAIN

We asked Yukich and several other Biggest Loser contestants what they thought of the research findings. They told us they were comforted by them. In the results, they saw proof of their biology fighting back against their attempts to change their body size—a reminder of how difficult it is to lose weight and keep it off, and of how hard they had to work to lose it in the first place. “After those numbers came out, I had to realize that even though I did all that process of exercise and losing weight, I’m still me inside,” Yukich said.

In one sense, they’re right: Our bodies react to the lifestyle changes we make—exercise more, sleep less, eat less. Whatever you do, your body registers those differences and compensates, outside your conscious awareness. This influences not only metabolism but appetite and whether we’re more likely to, say, eat an avocado or a cookie, or walk to work or take the car.[*2]

But the metabolic slowdown was only a small part of the story, and it didn’t at all explain the struggle people experienced in the long term. A slow metabolism wasn’t the “life sentence” even the Biggest Losers believed it to be. Something more interesting was going on.

Kevin thinks maybe a metabolic slowdown is a reflection of hard work, not destiny—like the tension on a spring. The harder you pull, the more the spring stretches, and the greater the tension. The more intense the intervention, the greater the weight loss, but also the more slowing of metabolism. For the Biggest Losers, maybe all the exercise drove the persistent power-saving mode. Unlike people in other weight loss studies, the reality TV stars were different in one major way: They had chronically and substantially increased their physical activity—moving much more at six years than they were before the show. The change was likely great for their overall health but may have led to the surprising, long-term effect on metabolism.

This gets at an idea from evolutionary ecology called “life history theory.” Energy is a finite resource in the body, and our bodies make tradeoffs to optimize our ability to reproduce.
So doing a lot of physical activity costs energy and may cause other biological processes to shut down—like when female athletes start missing menstruation. The resulting energy savings could show up in a reduced metabolic rate.[*3] Scientists still don’t have a very detailed understanding of which processes power down when the resting metabolic rate slows—how our bodies subtly shift their energy budgets for different purposes.[*4] But the phenomenon may help explain the metabolic drop-off on the show and years later. Altogether, Kevin’s study uncovered a pile-up of compelling stories about metabolism and weight loss that did not pan out when put to the test. Exercise is the key to weight loss, in part because it prevents the slowing of metabolism. Those who exercise the most lose the most weight. People whose metabolisms slow the most lose the least weight. Those with the greatest slowing of metabolism regain the most. People with persistently slowed or “broken” metabolisms are the worst off weight-wise in the long run. Wrong, wrong, wrong, wrong…and wrong. The metabolic slowdown didn’t seem to determine anyone’s ability to lose weight or keep it off in the short or long run. Put more succinctly: It’s not metabolism, stupid. And just because something sounds like it makes scientific sense doesn’t mean it’s correct. Just because an idea has kicked around for decades, even centuries, doesn’t mean it’s been properly tested. We will see this again and again in this book. The media coverage of the Biggest Loser studies missed these subtleties. The popular narrative that emerged about Kevin’s research was simple: The reason people struggle with weight loss is that they’ve destroyed their metabolisms. In a story on the front page of The New York Times, the study helped explain “why so many people fail to keep off the weight they lose.” The results seemed to affirm that old belief: that a slow metabolism was the probable driver of many people’s struggles with weight[*5]—and that something should be done to obtain speed. It was as if people interpreted the findings to fit their preconceived narrative about metabolism—a view that could be traced back to the World War I DNP research. Julia’s own story about the study perpetuated the idea, while experts around the world weighed in with tips for how to reverse metabolic slowing and supercharge metabolism. A sluggish metabolic rate causes fatness, the logic went, and fatness breaks the metabolism even more. Or, as another New York Times article put it, “When you have been fat, your body doesn’t behave the way a thin body does, even when you become thinner.” In the aftermath of the intense media coverage of the study, Dr. H expressed frustration about the data and told us he still doesn’t believe the results. He wrote a letter to the scientific journal that published Kevin’s paper and complained that the metabolic slowing could have been an artifact of erroneous measurements. Kevin and his colleagues responded to Dr. H’s letter, explaining why that argument had little merit. Unfortunately, it will be almost impossible to replicate the findings. Following seventeen seasons, The Biggest Loser was dropped from the broadcast TV network’s lineup the year after Kevin’s study was published. Instead of a slowed metabolism, what caused the Biggest Losers’ fat to come back was the pressures of the real-world food environment and real life outside the ranch. 
The food environment interacting with our biology— which we will unpack in this book—explains not only why the contestants regained their weight but why most people do in just about any weight loss boot camp, fat camp, crash diet, or health spa humans have ever attempted. People go home and return to their old routines and ways, and the fat comes back. The contestants had delivered a far more interesting and important story, one that had gotten lost—perhaps with good reason. One evening at home in Maryland, after reading dozens of confused stories on his Biggest Loser research, Kevin leaned back in his office chair and thought: Wow. No one knows what the hell metabolism is. So let’s move on to, you guessed it, what the hell metabolism is and why its connection to weight loss has been a great big distraction. Skip Notes *1 In 1995, he served as a defense witness in the murder trial of the former football star O. J. Simpson, and two decades later, he sat next to Charlie Sheen while the actor publicly revealed he was HIV positive. *2 On the flip side, the same thing happens when we eat too much. Overfeeding studies show that our bodies can also resist weight gain, and how much weight people gain when they overeat is highly variable. *3 These ideas were popularized by anthropologist Herman Pontzer in his book Burn. He makes the point that increases in physical activity above a certain point cause decreased energy expended for other processes. To test this model, we still need more long-term research. *4 While it’s relatively easy to gauge changes in one’s overall metabolic rate, it’s more difficult to home in on what’s driving the changes. *5 Some data suggest that people who have an abnormally low energy expenditure for their body size have increased risk for weight gain, but even then it seems that increased food intake is still the main contributor to weight gain. CHAPTER 2 The Fire of Life One sunny summer day, instead of enjoying the beautiful weather, a reporter is enclosed in a dim, hermetically sealed 11-by-11½-foot room on the outskirts of Washington, D.C. This metabolic chamber at the National Institutes of Health (NIH) Clinical Center, the largest hospital in the world focused on scientific research, is considered the gold standard for measuring metabolism. The reporter is there because she is among the metabolically confused. The only way to understand metabolism is to finally see how it works. This meant volunteering for a study involving a twenty-four-hour chamber stay. The chamber isn’t quite the silvery space-age contraption that the reporter, Julia, had imagined but a sparsely furnished cube—with an exercise bike, a toilet, and a bed. Julia’s every move is being monitored through a plexiglass window as well as a video camera and an infrared motion detector in the ceiling. She wiles the time away by taking a few spins on the exercise bike, reading, working on her laptop, and eating. But she can’t eat whatever she wants whenever she feels like it. Like a prisoner in solitary confinement, she receives her food on plastic trays at organized intervals, passed to her via a small, air-locked opening in the wall. Julia sends every leftover scrap back through the wall, and it’s taken to a subbasement kitchen where dietitians tally up the calories she ate. Just before lunch arrives, a scientist visits Julia—via the plexiglass window and a phone—to inquire about how she’s doing. Julia takes the opportunity to ask him how the heck this room is possibly looking inside her body. 
She learns that measuring metabolism has nothing to do with the gadgets—a heart monitor and three accelerometers—that are currently hooked up to her wrist, waist, and ankles, nor the infrared motion detector, as she’d assumed. The real action is taking place in the chamber’s ceiling, where an array of discreet metal pipes suck air out of the room. The best way of measuring metabolism today is by measuring a person’s breath. WHERE DOES BODY HEAT COME FROM? Survey your friends or family and ask them what metabolism is, as we have done repeatedly in the process of writing this book. After mumbling something about body weight, you’ll probably get a funny reply if you push for specifics. On where metabolism is located, one economist friend told Julia, “It’s in the digestive system,” pointing at his belly. Kevin’s son, age nine, offered, “It’s something about food.” Even without knowing exactly what, or where, metabolism is, most people have an intuitive sense that it has to do with “burning” calories. It was the ancient quest to understand the origins of our internal fire—body heat—and how that warmth related to breathing that led to the unraveling of one of the most important, and elusive, vital processes known to humans, the key reason why we eat. Before scientists had the periodic table of the elements, before they discovered the atom or the cell, and long before the term metabolism was coined, poets and philosophers knew that breathing and bodily heat were essential for sustaining life, and somehow linked. In Genesis, when God animated Adam, he “breathed into his nostrils the breath of life.” The Greek physician Galen preached that life derived from pneuma, the ancient word for breath. “Heat combined with moisture so conceives and life results from these two things,” reads Ovid’s Metamorphoses. “For though the flames may be the foes of water, everything that lives begins in humid vapour.” The ancients also observed that animals, including humans, are warmer than their environments, and that the temperature difference quickly dissipates with death. Aristotle had an explanation for what was happening: The heart produced the “vital” or “innate heat” of life, while the breath from the lungs cooled the heart. In the eighteenth century, the French chemist Antoine Lavoisier began to study how breathing and body heat were related chemically. Born into an aristocratic Parisian family, Lavoisier invested his personal fortune in a “tax farming” institution that profited off the public collection of levies on behalf of the Old Regime. With that money, he built one of the most impressive laboratories in Paris just as chemistry was emerging from alchemy and starting to formalize. In his spare time—when he wasn’t directing France’s tobacco and gunpowder industries, or acting as commissioner of the national treasury, among his numerous official posts— he managed not only to describe and name oxygen[*1] but to deliver a related insight that takes us to the cutting-edge research at NIH’s metabolic chamber. In his palatial lab near the banks of the Seine in Paris, Lavoisier fitted his assistant—another nobleman, Armand Séguin—with an airtight brass mask, and using putty, he affixed Séguin’s lips to a rigid tube connected to a series of vessels. Through the tube, Séguin inhaled air, and everything he exhaled ran into a vessel filled with water, for analysis. 
Lavoisier tracked Séguin’s pulse and breathing—noting how the air he respired changed during eating, pumping a foot pedal, and exposure to various temperatures. For another test, Séguin donned a rubber-coated taffeta suit that sealed in air and humidity. This time, the aim was to track how much water Séguin lost through his skin and exhalations, and how that linked up with any fluctuations in respiration.

Lavoisier discovered something curious. The composition of the air Séguin breathed in and out shifted after he ate and when he was exposed to cold; his respiration changed even more during physical activity. There appeared to be a direct relationship between the work Séguin was doing and the air he respired. But he realized there was something potentially even more interesting going on inside Séguin. The harder he worked, the more heavily he inhaled and exhaled, the faster his heart beat, the more water he lost through his skin and exhalations—and the hotter he became. Séguin’s breathing was connected to his body heat.

To further investigate, Lavoisier invented the ice calorimeter, a device with three concentric compartments: two outer layers packed with ice, and an inner chamber where tubes were fitted to pump fresh air in and draw out used air. Inside the device, Lavoisier combusted a lump of charcoal, tracking how quickly the burning charcoal melted the ice and how much carbon dioxide was produced. Later, he placed a guinea pig in the calorimeter. His remarkable observation was that the ratio of ice melted to carbon dioxide produced was almost identical between the breathing guinea pig and the burning charcoal. Breathing and burning both consumed oxygen, released carbon dioxide, and generated heat in similar proportions. Respiration wasn’t just metaphorically like combustion; it was chemically equivalent, Lavoisier reasoned. What was happening inside the body of a living animal was a sort of fire, just one that roared gently. Or as he and Séguin put it, “breathing animals are actual combustible bodies that are burning and wasting away.”

Lavoisier didn’t have a word for what he was measuring, but he knew he was homing in on that mystery that had captured the imaginations of humans for millennia. “Fire taken from the heavens, this flame of Prometheus, not only represents an idea that is ingenious and poetical but it is a true picture of the operations of nature on behalf of animals who respire,” he wrote. “One can say with the ancients that the fire is lighted the moment a baby takes its first respiration and is not extinguished until its death.”

The scientists thought respiration was the source of not only animal heat, but human bodily heat, too. Food, they argued, turned into a carbon-hydrogen fuel, like charcoal, in the blood. The blood then carried the fuel to a place in the body where it burned to warm us. The site of combustion, they proposed, was the lungs. Lavoisier and Séguin wound up being wrong about where respiration and combustion happened inside the body—a reminder that even the best scientists come up with plausible theories that don’t pan out. But their sense of food’s purpose wasn’t far off the mark. We eat, they guessed, to replenish fuel that gets burned up through breathing. Without food, animals eventually waste away, “perish, just as the lamp goes out when it lacks fuel,” they wrote. When Lavoisier and Séguin shared the observations with the French Academy, more than two hundred years ago, it was the first attempt to explain what we now refer to as metabolism.
METABOLISM, EXPLAINED Scientists now know that respiration doesn’t just happen inside the lungs. It happens inside almost every cell of the body. [*2] And the respiration in our tens of trillions of cells is part of the complex web of chemical reactions that collectively make up metabolism—an uninspiring name for what is arguably one of the most awe-inspiring and beautiful processes humans have ever discovered. Food molecules—either from our last meal or those stored within the body’s tissues and released for use between meals—are carried in the blood, along with dissolved oxygen from the breath, into our cells. Our cells are like minuscule hives of chemical and electrical activity. They take in the digested food particles, along with molecular oxygen delivered by red blood cells, and feed it all into a series of carefully controlled chemical reactions that power life itself. When scientists like Kevin talk about metabolism, what they’re really talking about is this: The chemical reactions that bring together the stuff we eat and breathe and, through choreographed biochemical waltzes, transform it all into the energy and building blocks we require for life. These almost invisible waltzes are everything that happens between “ashes to ashes” and “dust to dust.” In humans, they’re behind our every heartbeat, every thought, every blink of the eye. They’re the reason we can walk, heal injuries, grow muscles and hair, play music, fight infections, and have children who can do all those things. These reactions, and the sequences in which they unfold—the bane of every high school science student—are not only beautiful and life-giving, they’re also common to every living thing, from bacteria to humans. Nature solved the problem of metabolism once, and used the solution over and over and over again. After studying metabolism for years, what Kevin loves most about it are the nerdy intricacies of how nature managed this feat—how cells transform the energy in our food into a universal form of chemical energy, ATP, that can power every cell of the body. Kevin likes to think about making ATP as something like swapping partners at a waltz. Some of the partners are electrons, and others are protons.[*3] Atomic elements share their electrons when they combine to form molecules. And when molecules break apart, they release their electrons, which seek new dancing partners. When the bonds among the atoms in food molecules break down within our cells, smaller molecules get shuttled into subcellular structures called mitochondria. There, they unite with other molecules as part of a roundabout series of chemical reactions that produce carbon dioxide. These reactions, known as the Krebs cycle, are like a circular waltz, with dance partners coming in and going out. The carbon dioxide they produce gets transported out of the cells into the blood and exhaled in our breath. Back inside the mitochondria, electrons get passed through a series of four proteins called the electron transport chain. Together they create a tiny electrical current to charge mitochondrial batteries by pumping protons across a membrane. Electrons at the end of the electron transport chain are accepted by molecular oxygen from our breath and combined with protons to form water, which comes out in our sweat, our urine, and the moisture of our exhalations. The mitochondrial batteries are charged with an electric field strength equivalent to a lightning bolt and power tiny molecular motors that spin more than 100 revolutions per second. 
On every rotation, they make one ATP molecule by adding a phosphate ion to a molecule called ADP. If you made it through the last few paragraphs, take a deep breath.[*4] The upshot is that the chemical energy in our food is used to make ATP out of ADP, to power the innumerable cellular functions that keep us alive. In powering those functions, ATP gets turned back into ADP so that it can return to the mitochondria and keep all the waltzes going—the partner swapping of protons and electrons that makes life possible. The average person turns over some 50 kilograms of ATP and ADP every single day.

Metabolism doesn’t only explain how we’re able to go about our lives. Metabolic reactions also give off the heat that warms us. As Lavoisier observed, the process unfolds gently, stopping short of exploding into flames. That’s because every one of the chemical reactions of metabolism is precisely controlled by an enzyme. Specific enzymes regulate each chemical reaction, allowing metabolism to burn fuel slowly and in coordination with other cellular processes. Our enzymes take the place of the spark required to ignite a fire outside our bodies. And these specialized molecules, like everything else inside us, are ultimately made from the food we eat. As City of Hope metabolism scientist Charles Brenner puts it, “Metabolism converts everything we eat into everything we are and everything we do.”

In a time before the function of the cell was discovered, Lavoisier figured out that breathing is chemically linked to the “burning” of metabolism in a process that also explains bodily warmth. Even more remarkably, he realized that quantifying body heat—a warmth we now know is radiating from the metabolic reactions in our trillions of cells—is as simple as measuring our breath and how it changes the air. Both the chamber Julia had stepped into and the metabolic cart Kevin and his colleagues fitted on the Biggest Losers were mere upgrades of the centuries-old tube-and-vessel contraption Lavoisier used on Séguin. Simply measuring the breath—the rate of oxygen consumed and carbon dioxide produced—could reveal how much heat a person produces. And since calories are units of heat,[*5] quantifying the warmth from the metabolic reactions in our bodies also reveals how many calories a person is burning at any moment, their metabolic rate.

As the ancient Greeks and poets sensed, heat is fundamental to life. Without heat, living things could not exist. According to the second law of thermodynamics—the physics of how heat is related to other forms of energy—the universe tends to become more random and disordered over time. It’s the idea that you can’t unscramble an egg or unbreak glass. Heat, or thermal energy, is the vibration of molecules in a substance that makes it more random. Emitting heat is what allows living things (which are highly organized systems) to obey the second law: The heat produced from metabolism increases the randomness of the universe more than it creates the complex and highly ordered structures of life. Life self-organizes by harnessing a continuous flow of matter and energy derived from food and breath, and that flow gives off heat. Metabolism is that flow—sustaining life, building and repairing us, and powering everything we do, all the while warming us. Metabolism is so essential that some form of it is thought to be present in not just terrestrial but extraterrestrial life, too.[*6]

METABOLISM DISTRACTION

The myth of the slow metabolism died in the chamber for Julia.
More important, her perspective on metabolism shifted. Before she entered the study, she thought of metabolism mainly in its relationship to body weight, and she was sure that she’d managed to slow an already sluggish metabolism through dieting. As it turned out, her metabolic rate was completely normal for someone her age and size. She didn’t have an unusually “slow metabolism.” Most people don’t, Kevin told her when they went over her results. The cause of her weight struggles lay elsewhere (more later in the book). The focus on metabolism for weight loss has been a big distraction, not only for people seeking real help and answers but from the insights that careful science, over centuries, has uncovered. Metabolism wound up as the body’s problem child, instead of being celebrated for what it is: core to the very existence of life. That this mysterious chemical process gives us the energy for life is just part of what metabolism does; it’s also the engine that builds and rebuilds our bodies. We will turn next to what in our food provides some of the most important building blocks of life, long considered the “one true nutrient” that people still obsess about eating more of: protein. Skip Notes *1 Joseph Priestley is often credited with being the discoverer of oxygen, even though Lavoisier named it. Before Priestley, the Swedish chemist Carl Scheele also discovered oxygen but he only published his findings after Priestley. Priestley called the gas “dephlogisticated air,” fitting his finding into a popular theory of the day. The “phlogiston theory” held that burning, corrosion, and respiration were all chemical processes that released something into the air—an entity called phlogiston. This “fire matter” was present in everything but only visible when it escaped an object during burning. In other words, Priestley was so invested in the compelling narrative of phlogiston, he couldn’t see what he’d actually discovered—a trend we’ll see on repeat. *2 Red blood cells don’t respire, maybe because they are responsible for delivering oxygen to the other cells of the body. *3 A friendly reminder that all matter is made from atoms, which are in turn made from uncharged particles called neutrons, and charged particles called protons and electrons. *4 Fun fact: When you lose weight, you’re burning body fat, which is mostly made of carbon and hydrogen. And the by-products of this fat oxidation are the carbon dioxide you breathe out and the water in your urine, sweat, and exhalations. *5 More specifically, according to the most widely used definition, a calorie is defined as the amount of heat needed to raise the temperature of one kilogram of water by one degree Celsius. *6 NASA scientists used this idea to test for life on Mars, measuring whether basic nutrients were metabolized in Martian soil samples. Surprisingly, the results of this 1976 experiment turned out to be positive and some scientists still believe they support the existence of Martian microbial life. CHAPTER 3 Protein, the “Only True Nutrient” We’ve just covered how metabolic reactions inside every cell transform food into fuel and body heat. Let’s move on to what exactly in our food fuels us while also building and repairing our bodies. That’s the job of the macronutrients: protein, fat, and carbohydrates. In this chapter, we’ll look at the macronutrient that was, and perhaps still is, considered more important than the others, the only one that’s skated along since its discovery mostly unscathed by the diet wars: protein. 
It may feel like everywhere you turn, you’re told you need to eat more of it. The obsession with protein maximization has a long history. It’s as old as the word protein itself. It’s also an indicator of the extent to which many of us don’t understand protein—why we need to eat it, how best to get it, and how our bodies use it. But it’s no wonder we’re confused. We’ve been misled for centuries. A VERY OLD OBSESSION The man is shirtless and he’s sitting in front of a chopping block piled with meat, raw and cooked. He speaks in a husky voice, through a wild, caveman beard that’s curling out in all directions. As he describes what he’s about to eat, the bulging muscles of his abdomen and arms ripple. He points to a bloody hunk of flesh at the center of the table. “Liver King is having liver— surprise, surprise—for dinner today. And its little cousin, don’t forget, chicken liver, cooked.” The man, Brian Johnson, is an American social media influencer who goes by the name Liver King. Through his social media channels, he tells his ten million followers how to abide by nine “ancestral tenets” that’ll help them reach their “highest and most dominant form.” Their highest form is ripped. The foods that’ll deliver muscles like Johnson’s are certainly not plants. The camera pans across the heaps of meat to two black packages: protein supplements Johnson sells on his website. The products “build muscle easily and unearth the warrior beast within.” They also purport to answer a contemporary problem: “Modern-day protein powders and shakes have left us malnourished,” according to Johnson’s website, which features information on his so-called “science and mechanisms,” a section devoid of any scientific research or detailed explanations. His supplements, he claims, can fill in the gaps. Johnson is far from the only protein and meat enthusiast. He’s certainly not the first. In 1853, a caricature in a Parisian newspaper depicted vegetarians as so weak, they had to be carried on stretchers into a restaurant. By the early twentieth century, young men were advised to eat their fill, especially from animals, if they wanted to be competitive in the workplace and on the battlefield. In the 1980s, as a scrawny prepubescent, Kevin devoured the autobiography of five-time Mr. Universe winner Arnold Schwarzenegger— Arnold: The Education of a Bodybuilder. “The secret of rapid weight gain is a high-protein, high-calorie diet,” Schwarzenegger and his co-author wrote. Though Kevin dreamed of gaining brawn like his favorite actors and superheroes, he didn’t go quite so far as guzzling raw eggs in the mornings. By the 1990s, when Julia was a chubby preteen, amping up one’s protein intake was being touted as the route to fast fat loss. Toned celebrities, like Friends star Jennifer Aniston, extolled the high-protein Zone diet, which promised that if people got more of their calories from protein, they’d slim down. In the decades since, as populations alternately eschewed fat and carbohydrates for better health, protein hung on to its exalted status. Google searches for “high protein diet” reached their highest point ever in January 2025, according to The Economist. On social media, claims abound that extra protein can help you do just about everything: live longer, maintain a healthy metabolism, suppress hunger. In grocery and health food stores around the world, you can find protein-boosted versions of, well, everything. 
Kevin and Julia had a contest—who could spot the most absurd protein-enriched foods in their local supermarket. They tied with “protein water” and pancake mix. As of 2024, the global protein supplement business was worth $28 billion, almost double the size of the worldwide olive oil market. Is there anything protein can’t do? Protein’s prominence can be traced back to an ingenious and bombastic chemist with a knack for self-promotion. Justus von Liebig had some very innovative, and plausible, theories about protein’s role in our diet and our bodies. Instead of testing them in experiments, though, he rushed to market with books and products, promoting protein as the only nutrient that really mattered and animal products as the best way to get it. Liebig also created the mold for the contemporary diet pseudoscientist peddler and health guru. His ideas not only shape how we think about the nutrient to this day, they helped form the modern food system.[*1] But the productive and ambitious scientist had only begun to scratch the surface of why we eat protein. Many of his grand theories, including some that persist, turned out to be dangerously flawed. WHAT KEEPS US FROM WASTING AWAY Liebig was born in 1803 to hardware merchants in Darmstadt, Germany, who mixed and sold household paints and varnishes. Twelve years later, when Liebig was barely a teen, the eruption of Indonesia’s Mount Tambora, the largest in recorded human history, drove temperatures in Europe to unseasonable lows, leading to the “year without a summer.” Crop failures and soaring food prices followed, exacerbating famines that had already emerged during the Napoleonic Wars. Liebig’s family, like many in his time, constantly strained to have enough to eat. By the time he’d established himself as a world-famous scientist, Liebig helped sow a meat panic and protein obsession that endures. The panic was rooted in chemistry. An early interest in chemicals, honed in the laboratory of his parents’ shop, led Liebig in 1822 to Paris, where, during his studies at the Faculty of Science, he experienced a “metamorphosis” from “an unsophisticated bucket chemist to a natural philosopher,” writes W. H. Brock in his biography Justus von Liebig: The Chemical Gatekeeper. Chemistry had only recently turned from questions about nonliving matter—like describing newly discovered elementary gases, such as oxygen—to analyzing the “organic” materials derived from living things. Liebig—handsome, with a mop of dark hair and piercing eyes —promptly landed a job with one of the world’s most important chemists, Joseph Louis Gay-Lussac, a compatriot of Lavoisier’s. There, Liebig picked up the latest in organic chemical analysis and took that knowledge back to Germany, where he invented the contemporary research lab: A single senior scientist leads a group of trainees through a research program. He also perfected methods for measuring the chemical composition of hundreds of organic substances—animal organs, bones, blood, milk, manure—and thanks to his lab, filled with talented acolytes, did so at a rate then unrivaled in the history of science. With such immense productivity, he not only was laying the foundations of organic chemistry, he built the case for why protein is crucial in the body and the diet. Liebig observed the deep connection between what went into an animal in the form of food, water, and air, what was absorbed into and retained by the body, and what came out in the sweat, urine, breath, and feces. 
This investigation into the chemistry of living things helped him unravel a fundamental mystery: how food is assimilated, or “animalized,” to form our blood and tissues. Before Liebig, even before Lavoisier, the ancients observed that humans seemed to slowly waste away when we weren’t eating, and food appeared to be the thing that replenished us. In the Hippocratic corpus, nutrition derived from a single, universal “nutriment” in food. Digestion’s purpose was only to extract the nutriment and replace whatever got used up between meals. The loss of bodily material—in a process known then as “insensible perspiration”—remained a mystery. But while it was clear that animals had to eat the nutriment to survive, and that they’d perish without it, no one knew what it was. By the 1820s, the English physician William Prout noticed that all food contains three organic, or carbon-containing, nutrients in different proportions: the oily (fats), the saccharine (carbohydrates), and the albuminous (proteins). The trio, which makes up most of the mass in food after water, became known as the “macronutrients.” [*2] Protein appeared to be unique among the three. While carbohydrates and fats are formed of different configurations of carbon, hydrogen, and oxygen molecules, protein was the only macronutrient that also contained nitrogen. Scientists had already figured out that dogs eating foods lacking in nitrogen became sickly and promptly died, so protein had to be an essential element in their diets. A Dutch chemist, Gerardus Johannes Mulder, discovered the seemingly identical chemical composition of protein in animals and plants and concluded that plants were the gateway to nitrogen for animals. In other words, protein in animal flesh came from either eating plants, or eating animals that had eaten plants. Mulder named the nitrogen-containing substance protein after the Greek word proteios, meaning “of first rank” or “primary.” With the name, he was signaling that protein was the bedrock from which all animal life sprang.[*3] Fascinated by Mulder’s discoveries, Liebig realized that protein was the long-sought-after “only true nutrient” that humans had been wondering about for thousands of years, the ingredient that becomes us and keeps us from wasting away. Like today’s social media influencers, he rapidly popularized his ideas without bothering to test them. In his book Animal Chemistry, borrowing from Mulder’s work, Liebig claimed that plants make protein, which is then, with little modification, taken into animals. Animals, eating plants or other animals, used the digested protein to form their own flesh. Since animal flesh is rich in protein compared to plants, Liebig valorized meat as the ultimate replenisher, the substance that best built the body and enhanced human performance. Protein breakdown inside muscles not only animated us, it fueled our muscular work. Adopting Prout’s classification of the macronutrients without crediting him, Liebig argued that carbohydrates and fats played no role in the chemical reactions that formed animal tissue or provided energy for movement. Instead, they were mere “respiratory fuels,” which Lavoisier had discovered could provide energy to heat the body. If the protein we eat was the critical ingredient, essential for life, the question on Liebig’s mind was how to deliver more of this miracle substance to people, especially those who couldn’t afford meat. His answer: a protein-boosting food supplement. 
DISCOVER, PACKAGE, SELL In 1865, Liebig launched one of the world’s first mass-marketed processed foods. One pound of Liebig’s Extract of Meat, a dark brown syrup, sold in cream-colored jars, was reduced from 34 pounds of lean raw meat, Liebig claimed. By adding water, one could use meat extract as a quick base for a broth, also known as “beef tea.” Proponents argued that beef extract had medicinal value, helping convalescing patients regain strength, and that the salts and creatine that gave the broth its flavor could create the equivalent of body-building animal protein in a much cheaper product. Liebig pushed the product in numerous letters to scientific journals. And at first, the scientific and medical communities embraced Liebig’s ideas. The year meat extract hit store shelves, the British Pharmaceutical Conference featured “a meat extract panel,” where speakers proclaimed there was “probably no food available” that could repair the tissues of sick people the way Liebig’s product could. Even the founder of modern nursing, Florence Nightingale, promoted meat extract as “perfectly invaluable” in tending to sick soldiers. The great chemist had successfully packaged the idea that meat was protein, and protein was king. If you couldn’t buy meat, you could get similar nutritional benefits eating his extract with vegetables, making their plant proteins more meatlike. You wouldn’t know it from Liebig’s advertising campaign, nor from how readily his peers hopped on the meat extract bandwagon, but his claims were based only on plausible theories with little evidence to support them. When put to the test, it turned out meat extract delivered practically none of the protein found in beef or chicken, nor could it help vegetable protein build muscle. Similarly, Liebig never tested his protein theories in physiological experiments (and only conducted one unrelated study in humans).[*4] By the time other researchers ran their own studies to investigate, they found that protein was not the main, or even most important, energy source powering the movement of the body. Rather than accepting the findings, Liebig tried to swat them away with tactics that have been used by influencers and peddlers ever since. He and his Munich colleagues claimed that the results could be reinterpreted to support Liebig’s ideas. He also used his house journal, Annalen, to savage colleagues who disagreed with him, accusing them of plagiarism or belittling them, in one case, as “cocks crowing on a dunghill.” The counterattacks were so brutal, one of his students lamented that his mentor had fallen in love with his debunked theories and had “forgotten, to the sorrow of those who know and value his service to science better than do his flatterers, that these are all mere ideas and possibilities, whose validity has to be tested by actual animal studies.” Despite the controversy, Liebig’s ideas had been so forcefully stated by a scientist who had risen to the top of his field that they continued to set the research and policy agendas for decades. The flawed protein theories— including the identification of meat eating with vigor—had taken on a life of their own. Eventually, the protein obsession Liebig launched was exported to America and helped form the food system we have in industrialized countries to today. MORE = BETTER First came the dubious diet advice. In 1877, Carl von Voit, one of Liebig’s protégés, published an influential recommendation that working men required 118 grams of protein per day. 
The recommendation wasn’t based on physiological research; Voit simply asked what men in Munich would like to eat if they could eat anything. This was another “example of a standard that the public believed to be authoritative and derived from the scientific work of an expert although, in fact, the standard and the work had little or no connection,” wrote nutrition scientist Kenneth Carpenter in his comprehensive Protein and Energy. The idea that strong working men needed a lot of protein to fuel their productivity had devastating knock-on effects. Women made a habit of eschewing protein-rich foods for the men in their families, a practice that left them to subsist mainly on cornmeal and flour, a diet so deficient in niacin that it could bring on the deadly nutrient deficiency pellagra, Carpenter argued.

A student in Voit’s lab who became one of the most influential nutrition voices in America transported the protein zeal across the Atlantic. When Wilbur Atwater wrote America’s first nutrition guidelines, he recommended a protein intake for physically active men—125 grams a day—that was even higher than Voit’s recommendation because he thought Americans not only ate more but were more vigorous than Europeans.

In Atwater’s time, meat was still expensive and hard to come by. As countries like the United States and England industrialized and workers moved to cities, the demand for animal products surged ahead of the supply—a gap that stoked panic. In Britain, breathless worry over a “meat famine” read like reports of climate change today. The answer to the problem was establishing a cold chain that would allow for the safe transport of meat and other animal products from slaughterhouses and the countryside into cities. In the United States, the cattle population more than doubled, from fifteen million in 1870 to thirty-five million in 1900. In the twentieth century, humans found ways to continue boosting the supply of animal protein, pumping cows and chickens full of antibiotics, engineering and subsidizing crops to feed the animals, and expanding the cold chain so that not only eggs, beef, and milk could be transported across the country but also avocados from Mexico and kiwis from New Zealand.

Thanks to the push to feed everyone animal protein, not only did meat become cheaply available and omnipresent, but so, too, did bananas, tomatoes, and tuna, the journalist Nicola Twilley writes in her tour de force, Frostbite, about how refrigeration changed humanity and how we eat. “It’s worth remembering that all of these far-reaching and often unexpected consequences of refrigerating meat were spurred in part by a nutritional fallacy: the mistaken conclusion that protein from flesh foods was the only essential nutrient,” Twilley writes. “If chemists had come down in favor of grains and beans instead, the world might have looked very different.” This vast “artificial cryosphere” we’ve built for our food—which is estimated at 5.5 billion cubic feet in the United States alone for refrigeration and air-conditioning—accounts for one-tenth of global greenhouse gas emissions. While all that food certainly helped humans grow larger and taller, it’s also contributed to food waste and a system that doesn’t work optimally for human, animal, or planetary health (we’ll return to this later in the book). But as we already saw with DNP and metabolism, once ideas are out there, they can live on in our medicine cabinets, hospitals, dinner plates, and food systems for a very long time.
Only at the end of his life did Liebig admit that he was wrong about protein’s role in fueling working muscles. Nevertheless, sales of his meat extract continued to grow, and within a half century of founding his company, Liebig had become one of the world’s largest cattle farmers. Meat extract made Liebig a wealthy man, further distancing him from the childhood hunger he’d worked a lifetime to overcome. In 1865, Liebig wrote in a letter that he hoped the meat extract would one day reach “every household.” In a sense, it did. The ready-made bouillon many of us use to flavor everything from soups to risottos can be traced back to Liebig’s company, which, under new ownership, eventually introduced the Oxo-branded bouillon cubes still in production. He’d also established misleading relationships between meat and protein, and protein and strength, that continue to shape how we eat—not to mention the enduring template for rushing to market with untested nutrition ideas.

WHAT YOU DON’T USE, YOU LOSE

Let’s be clear: Liebig’s errors still distort the way many of us think about nutrition, even now. Eating vast quantities of protein is not necessarily useful, or even beneficial, to health. We certainly do not need to derive our protein from meat. Consuming extra protein alone will not build muscle, nor is protein what powers muscles. Contrary to Liebig’s theory, the other two macronutrients, carbohydrates and fat, provide almost all of the energy necessary for both the physical and chemical work of life. So carbs and fat, not protein, are the main fuels for the metabolic reactions we learned about in the last chapter. On an essential protein fact, however, Liebig was correct: Protein is the stuff in food that builds and repairs us, or as he’d apparently put it, “the stuff of life itself.”

We’ll unpack all this, but first, let’s look at how protein gets onto our dinner plates and into our bodies in the first place. The process starts out of thin air. Most of the atmosphere is made of nitrogen, but even Lavoisier noticed that animals don’t absorb this gas from the air they breathe. Instead, nitrogen makes its way into the proteins of plants—and then into us—by way of microorganisms in soil. These bacteria “fix” the nitrogen from the air, converting the nearly inert gas into volatile ammonia molecules. The ammonia derivatives get taken up by plants and combined with carbohydrates (made by photosynthesis from carbon dioxide and water). The result is the amino acids—the nitrogen-containing constituents of proteins. The amino acids are combined to build the proteins in plants, which are then eaten by animals to become part of the animal body. Without the nitrogen that plants and bacteria bring into the food cycle, animals like us couldn’t exist.

Rather than plant proteins being absorbed into animals unchanged, as Liebig thought, proteins do something far more amazing once inside us. After we eat, proteins get broken down during digestion into their twenty different amino acids. The amino acids are absorbed into the blood and taken up by our cells, where they are recombined, in myriad ways, to form the tens of thousands of proteins that make our skin, bone, hair, heart, eyeballs, fingernails, and yes, our muscles. Not only do proteins make you, they make you uniquely you. The genes encoded in your DNA provide the instructions for stringing together the precise sequences of amino acids that form each of the proteins in your body. The body is made of protein along with water, fat, carbohydrates, and minerals.
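To make the idea of genetic “instructions” concrete, here is a toy sketch in Python (not anything from Kevin’s research or this book) of how a short stretch of genetic code spells out a sequence of amino acids. The codon table is a tiny excerpt of the real 64-entry genetic code, and the example sequence is invented purely for illustration.

```python
# Toy illustration only: a tiny excerpt of the genetic code, mapping three-letter
# codons in messenger RNA to amino acids. The real table has 64 entries.
CODON_TABLE = {
    "AUG": "Met",   # start codon, methionine
    "UUU": "Phe",   # phenylalanine
    "GGC": "Gly",   # glycine
    "AAA": "Lys",   # lysine
    "UGA": "STOP",  # stop codon
}

def translate(mrna: str) -> list[str]:
    """Read the sequence three letters at a time, stopping at a stop codon."""
    amino_acids = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue == "STOP":
            break
        amino_acids.append(residue)
    return amino_acids

# An invented snippet of "instructions" and the amino acid chain it specifies.
print(translate("AUGUUUGGCAAAUGA"))  # ['Met', 'Phe', 'Gly', 'Lys']
```

In the cell, of course, this lookup is done not by a dictionary but by ribosomes and transfer RNAs, and the finished chain folds up into a working protein.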
What makes a liver cell different from a muscle or skin cell is that each cell type produces a unique complement of proteins in different amounts. To explain how this happens, Kevin thinks of proteins as Lego structures, with the twenty amino acids constituting different colors and shapes of Lego bricks. After we eat a Lego structure, we break it down and absorb its bricks to build the different structures that make us. But how does the body get the Lego bricks in the colors and shapes we need for our cellular buildings? We have an amazing power to turn some Lego bricks into other shapes and colors. Inside our cells, nitrogen-containing amino groups from some amino acids break off and get attached to newly built organic acids that don’t contain nitrogen. The end result: different amino acids. We can’t do this to make all twenty, however. We have to get the nine “essential amino acids” from our food.

To keep them in top shape, all of the body’s proteins constantly undergo repair and maintenance work, also known as “protein turnover.” So, in addition to the Lego bricks delivered to our cells after we eat protein, and the old bricks that were transformed into new Lego bricks, the full complement of colors and shapes also comes from the breakdown of existing structures in the body. Bricks go in, bricks come out. We can repair Lego structures that are broken. This happens continuously, which may sound inefficient. But think about the proteins that make our heart muscles. They can get damaged over time—and it is essential that they do not fail. So older protein structures get replaced with newer ones, and everything hums along nicely. Different proteins in the body turn over at different rates, but the total amount of protein changed every day is typically several times the amount of protein we eat. That’s how we almost always have all the Lego bricks we need.

Protein turnover is part of the constellation of chemical reactions that make up metabolism. In addition to powering and warming us, metabolism fuels the building and rebuilding of proteins that make the body. Almost every biochemical reaction, including those that build a new protein, is controlled by a different enzyme, which is also a protein made from amino acids.

Yet as important as protein is, there’s a limit to how much our bodies need. Unlike carbohydrates and fat, we don’t have a dedicated storage form for excess protein, so we lose the amino acids we eat that aren’t used immediately. The nitrogen from the amino acids gets stripped off to form ammonia that is converted to urea and excreted from the body in urine. The carbon skeleton of the former amino acid becomes an organic acid that enters the metabolic waltz we described in Chapter 2. As Stuart Phillips, a leading protein and energy researcher at McMaster University in Canada, puts it, “You can eat a ton of protein. You can digest a ton of protein. But the key question is: How much can you use? And we don’t have a way of stocking away extra amino acids.” This means that chasing down a meat meal with protein shakes, as Liver King suggests, isn’t necessarily going to help you build protein stores, nor will it build more muscles. You need some other stimulus. In the case of the Liver King, an investigation revealed that his muscular physique was built by an $11,000-per-month supply of anabolic steroids, a finding that prompted a $25 million class-action lawsuit alleging he misled his followers.
The case was ultimately dismissed; Liver King did not respond to our attempts to contact him for comment. Of course, Liver King did not advertise his use of dangerous steroids, just the protein supplements. Exercise is safer than steroids, and more effective than protein supplements, for muscle building. Phillips looked at the effects of protein supplementation on muscle growth across studies. He found that supplements alone had very little impact; it was resistance exercise that made the difference.

Even for weight loss and satiety—some of social media’s favorite protein claims—it’s not entirely clear that extra protein helps. Diets with higher protein have been shown to reduce energy intake. But there’s an open question about whether it’s the protein content per se driving these effects or something else that differs between the diets, like the types of foods people eat. In one study, Pennsylvania State University nutrition researcher Barbara Rolls covertly varied the protein content of the same foods and failed to show any differences in appetite or calorie intake. In Kevin’s research, when people were asked to eat whatever they wanted, the higher the protein in the meal, the greater their energy intake.

Then there’s the question of how to eat your protein. Despite the link between protein and meat that Liebig promoted, animal products are not the only sources of usable dietary protein. With relatively few exceptions, just about everything we eat contains protein and the full complement of amino acids, even “carbs” like bread, which has long been known to deliver protein (mostly in the form of gluten). This means the nine essential amino acids can be found in legumes, nuts, dairy products, and good old fruits and vegetables. The nuance here is that animal products contain greater concentrations of protein and the proportions of amino acids that are closest to what humans need compared to plant proteins, as Liebig had suspected. Plant-based eaters need to work harder to make sure they’re covered. But if you get enough calories from a diverse diet, even without animal products, you should be getting enough of the essential amino acids, no protein powder or “meat extract” required.

Another nuance: Enough is not the same as optimal, and we don’t really have clear evidence of the optimal amount of protein to eat in each subgroup of the population and for different purposes. The general advice of the Recommended Dietary Allowance (RDA) for protein, published by the National Academy of Sciences, is 0.8 grams per kilogram of body weight per day in adults. That’s about 65 grams for a 175-pound person—roughly the same amount of protein in a 9-ounce steak, or 3 cups of lentils. The RDA is based on research that tracks the amount of protein we need to eat to prevent a loss of body protein over a relatively short period of time. It’s then set at two standard deviations above this estimated average requirement to make sure almost everybody is covered. Even so, scientists aren’t clear on the exact point at which extra protein goes from useful to useless for different end points. Still, Phillips doesn’t worry about the protein intake for most people in industrialized countries, especially not bodybuilders and athletes. But for older folks who have likely been losing muscle mass for decades, Phillips suggests twice the RDA based on his research showing health benefits up to—but not beyond—that point, as well as the associations between higher total protein intake and healthy aging.
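For readers who like to see the arithmetic, here is a minimal sketch of those figures. The function name and the pounds-to-kilograms conversion are ours, for illustration only; the 0.8 grams per kilogram RDA and the twice-the-RDA suggestion for older adults come from the text above.

```python
# A minimal sketch of the RDA arithmetic described above (illustrative only).
LB_PER_KG = 2.2046  # standard pounds-per-kilogram conversion

def protein_target_grams(weight_lb: float, multiplier: float = 1.0) -> float:
    """Grams of protein per day at `multiplier` times the 0.8 g/kg RDA."""
    weight_kg = weight_lb / LB_PER_KG
    return 0.8 * weight_kg * multiplier

print(round(protein_target_grams(175)))       # ~64 g/day, the "about 65 grams" above
print(round(protein_target_grams(175, 2.0)))  # ~127 g/day, twice the RDA for older folks
```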
Most Americans eat more than the RDA, but elderly people and those who live in institutional settings, like care homes and hospitals, are at the greatest risk of undereating the nutrient. There’s another reason most of us don’t need to obsess over protein. The body may also be quite adaptable over periods of many weeks and longer. Around the beginning of the twentieth century, Yale University physiological chemist Russell Chittenden showed that after an adjustment period of a few weeks, young men could adapt and even thrive for prolonged periods on diets that provided about half the amount recommended by Voit and Atwater. Over time, the body accommodates by downregulating processes that use protein, Phillips explained. So again, whether low- or high-protein diets are optimal probably depends on what is being optimized. It also depends on what the body wants.

PROTEIN WISDOM

The focus on figuring out how much protein to eat overlooks a fundamental piece of our biology, one that’s common to humans and just about every other species that scientists have ever looked at. We may have an innate ability to control our food intake in order to meet our protein needs, a phenomenon called “protein leverage.” Nutritional biologists David Raubenheimer and Stephen Simpson, who’ve studied protein appetite in more than fifty species, learned just how common it is when they put cockroaches on an ultra-low-protein diet. Cockroaches are seemingly indiscriminate creatures when it comes to food. Notorious for surviving in just about every climate imaginable, roaches can live off sewage, decaying wood, even hair. If they also exhibited a protein appetite, then there was a good chance that it was universal, part of the nutrient balancing system that seems to be common to all animals.

Ideas about nutrient balancing emerged in the 1930s, with a clever American biologist named Curt Richter. In experiments that spanned decades, Richter figured out how to make rats do things like lose sodium and calcium from their bodies at a deadly rate. The animals responded by simply choosing foods that helped them eat more of the nutrients they’d been losing, and surviving as a result. Richter concluded that animals have an inborn ability to sense their nutrient needs and find ways to make up for shortfalls by balancing their diets. How the craving for protein fit into the nutrient balancing system was less clear until Simpson and Raubenheimer came along.

Enter their cockroach experiment. Raubenheimer and one of his students put cockroaches on three different diets over two days: low-protein and high-carb, high-protein and low-carb, and a balanced mix of protein and carb. After that, he offered the insects a buffet of all three diets, allowing them to choose what they wanted to eat. What happened next stunned them. The lowly roaches that had been eating the protein-restricted diet quickly made up for the shortfall during the buffet phase of the experiment. The reverse was true for the high-protein-diet roaches: They quickly sought out more carbs. “The term nutritional wisdom was at the time in vogue. Here we had seen nutritional genius,” the scientist duo recounted in their enlightening book, Eat Like the Animals. Over decades, they’ve looked at locusts, fruit flies, elephants, moose, humans, even slime molds—an acellular species without a brain.
They’ve demonstrated that all share the capacity to control food intake to ensure they eat enough protein—but not too much.[*5] Perhaps that helps explain why, for such an important nutrient, we can’t store protein in the way we can carbohydrate and fat.[*6] It may also explain why the amount of protein humans eat has remained remarkably stable over decades and across populations, unlike carbohydrate and fat. “What people actually eat,” Simpson told us, “and all around the world, is a diet that contains about 15 percent of calories as protein. No human population consumes fewer than 10 percent of their calories as protein. No human population consumes more than about 25 or at most 30 percent of its calories as protein. It’s a very narrow range.” If the protein appetite explains the floor on protein intake, what sets this relatively low ceiling? Is it just the absence of a protein appetite above a certain level, or have we evolved to also limit our protein intake for health reasons? We don’t really know, but there’s emerging evidence that above a certain level, protein is no longer helpful—and can be harmful. A trade-off between lifespan and reproduction has become an organizing biological theme across species: Low protein intake increases lifespan but reduces the rate of reproduction, whereas higher protein intake reduces lifespan but increases the rate of reproduction. Scientists recently discovered that genes associated with early and increased reproduction in humans are also associated with reduced lifespan. A wide variety of species that eat diets with protein levels far lower than are optimal for reproduction have longer lifespans. It’s not clear why, but the effects might be related to specific amino acids—like methionine and branched-chain amino acids— whose restriction has been shown to increase longevity in mice. Whether reduced-protein diets have an antiaging effect in humans, and whether we should limit our protein intake as a result, are still unknowns. Some observational research suggests that lower animal protein intake is linked to reduced mortality and disease risks, and that people who derive most of their protein from plants as opposed to animal sources have better health. This may be due to differences in overall dietary patterns, including increased fiber, which is absent in animal sources of protein, rather than the protein type per se. But it’s possible that future dietary advice may be just the opposite of Liebig’s own: Less protein is more. So what to take away from all this? Your protein needs shift over the course of your life, but our bodies seem to be adaptable to different levels of protein intake and have in-built protein-seeking machinery that helps guide our appetite. Getting enough is something most people don’t need to worry about (unless you’re in one of those high-risk groups, or your doctor says otherwise). At a certain point, more protein just means more waste. Though it is easier to meet your protein needs with animal products, you certainly don’t need to eat meat to get your amino acid fix. A varied diet with enough calories to meet your energy needs should have you covered, even one that’s plant-based. And while lots of people have been trying to top you up with protein products since Liebig, the latest evidence suggests that megadoses may not be great for health. But let’s not do what Liebig did and pivot from theory to advice before science has had time to catch up. 
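To put that narrow range in everyday terms, here is a rough back-of-the-envelope conversion (ours, not the authors’) using the standard value of about 4 calories per gram of protein and an assumed 2,000-calorie day.

```python
# Rough conversion from "percent of calories as protein" to grams per day.
# Assumptions (ours, for illustration): a 2,000-kcal diet and the standard
# ~4 kcal per gram of protein.
KCAL_PER_GRAM_PROTEIN = 4

def protein_grams(total_kcal: float, protein_share: float) -> float:
    """Grams of protein supplying `protein_share` (0-1) of `total_kcal` calories."""
    return total_kcal * protein_share / KCAL_PER_GRAM_PROTEIN

for share in (0.10, 0.15, 0.30):
    print(f"{share:.0%} of 2,000 kcal is about {protein_grams(2000, share):.0f} g of protein per day")
# 10% is about 50 g, 15% about 75 g, 30% about 150 g
```

On a typical intake, then, the band Simpson describes works out to something like 50 to 150 grams a day, with the 15 percent norm landing around 75 grams. This is an illustration, not a recommendation.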
Try to remember all this, especially if you walk into a supplement shop, as Julia recently did in Munich, not too far from where Liebig spent the last years of his life. It contained a world that Liebig wouldn’t recognize, but his fingerprints were everywhere. Instead of meat extract, there was floor-to-ceiling, wall-to-wall protein-delivering…everything. Candy bars, cookies, pasta, peanut butter, muesli, sauces, spreads, even “energy cakes.” There were jumbo plastic bags and huge canisters boasting thousands of grams of “designer protein,” “hemp protein,” “whey protein,” “oat protein,” even “extreme hard gainer” protein. The message was clear: Eat lots of prepackaged protein supplements to be fit, lean, muscular, and healthy. The obsession with protein that Liebig helped launch more than a century ago lives on in the place where he died, and all over the world. With all the focus on protein as the king of the macronutrients, we’ve barely broached its other two, supposedly inferior, counterparts. Carbs and fat play a more important role than Liebig realized—but what is it? How do they move through our bodies? What effects do they have on our health and what we’re made of? Diet experts have already taken sides—alternately portraying one as an elixir and the other a destroyer of health and driver of weight problems. As you might have guessed by now, the truth is once again more complicated, and far more interesting. Skip Notes *1 Liebig also advocated searing meat as a way to seal in the nutritious juices, and drinking meat broth when we’re sick. And yes, in case you were wondering, he’s the namesake of the Liebig condenser you may know from your high school science lab. *2 Alcohol is another macronutrient—it’s a source of energy but delivers nothing essential for health. *3 Or was that bedrock actually air, as the Greeks suspected? Air provides humans with not only the oxygen they need to survive but the nitrogen that plants use to make protein. *4 Liebig had estimated the relationship between food consumption and carbon dioxide output among soldiers. *5 In case you’re wondering how this squares with the fact that elderly folks undereat protein, the researchers also found that the protein appetite seems to fail with age, at least in mice. At the same time, Simpson and Raubenheimer pointed out, it doesn’t help that protein is expensive and not always generously offered in institutional settings. *6 On the other hand, if there are specific carbohydrate and fat appetites, they appear to be much weaker than the appetite for protein. Recently, researchers discovered one way the protein appetite works. When protein is cut in the diet, the liver increases its production of a protein called FGF-21. Secreted in the blood, FGF-21 travels to the brain to increase our preference and motivation to eat foods higher in protein. CHAPTER 4 Flex Fuel: Carbs, Fat, Calories The most vicious battle zone in the diet wars has centered on the question of whether diets high in carbs or fats are more fattening. Think of the turn against butter in the 1980s and 1990s, on the basis that high-fat foods caused weight gain, giving way to the low-carb craze of the 2000s, which is arguably ongoing. Even if you’re not a keto adherent, you may well have the impression that carb-rich foods are especially fattening and to be avoided if you’re watching your weight. And yet the reality is that inside the body, carbs and fats act more like helpful colleagues, standing in for each other when one goes on holiday.
If protein is what builds and repairs us, carbs and fats are the body’s fuels, and they’re practically interchangeable, through feast and famine. We do such a good job adapting to whatever fuel combination we eat—high-carb, low-carb, high-fat, low-fat—that, so long as we eat the same number of calories, the result is almost no difference in body fat. In this chapter, we’ll explain why, despite the countless diet books and products suggesting otherwise, the idea that our body fat depends meaningfully on eating some ideal macronutrient ratio independent of calories consumed just isn’t supported by science. This message got drowned out in the rush to promote untested and deeply held beliefs about carbs and fats, and to sell weight loss books and diet plans. The relentless focus on weight loss has also been a distraction from the truly fascinating effects different diets have on our bodies. STEAK AND POTATOES On a cool November evening, Andrew Taylor grabbed a beer from his fridge and took it back to his couch, where he was struck with a realization. As he sat down and cracked the can open, he thought, Oh—this is the same thing. The way an alcoholic would treat the beer in my hand, that’s how I’ve been treating food. Thus began Taylor’s transformation from couch potato to potato-only eater. Taylor was the kind of person who obsessed over the cakes he saw in Melbourne’s pastry shop windows. Just the night before the beer epiphany, he had binged on pizza after a few weeks of attempting healthier eating. A former marathon kayaker, Taylor had for years masked the problem with his athleticism, which helped keep his weight down. But after his first child arrived in 2013, the demands of parenthood meant fitness “went out the window,” he told us. Food addiction, he theorized, took over. “The reason [I kept] choosing the donut instead of the apple is addiction.” By the end of 2015, Taylor weighed 334 pounds—which made him more than 100 pounds overweight, and the heaviest he’d ever been. With the extra weight came high levels of blood cholesterol, high blood pressure, joint pain, depression, anxiety, and difficulty sleeping. Food was the root cause of his problems, he reasoned. So he resolved to approach food as he thought an alcoholic would wine and beer. He wanted to get as close to quitting as possible, finding a single food he could live on for at least a year. Googling “what is the perfect food,” Taylor quickly discovered the path of the potato in a video featuring Dr. John McDougall, the late low-fat vegan diet advocate. A physician and the co-author of The Starch Solution, McDougall used case studies to make the point that potatoes had a deep, if little appreciated, history as a health food. There was a 1920s study of two people who thrived eating only spuds. There was the modern potato farmer who lost weight, you guessed it, on potatoes. “All large populations of trim, healthy people, throughout verifiable human history, obtained the bulk of their calories from starch,” McDougall repeated, his modest Midwestern manner buoyed by the zeal of an evangelist. Digging into the research, Taylor—a vegan for ethical reasons—quickly learned that potatoes deliver roughly 90 percent of their calories from carbohydrates, 10 percent from protein, and less than 1 percent from fat, and that a combination of sweet and white potatoes could meet nearly all his body’s needs. He’d get his essential amino acids, minerals, and a good dose of vitamin C.
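As a rough check on that split: assume a medium plain potato with roughly 37 grams of carbohydrate, 4 grams of protein, and 0.2 grams of fat (approximate, typical values, not numbers Taylor reported), and use the standard rounded calorie factors of about 4 kcal per gram of carbohydrate or protein and 9 kcal per gram of fat. Then

\[
37\,\mathrm{g} \times 4 \;+\; 4\,\mathrm{g} \times 4 \;+\; 0.2\,\mathrm{g} \times 9 \;\approx\; 148 + 16 + 2 \;=\; 166\ \mathrm{kcal},
\]
\[
\tfrac{148}{166} \approx 89\%\ \text{from carbs}, \qquad \tfrac{16}{166} \approx 10\%\ \text{from protein}, \qquad \tfrac{2}{166} \approx 1\%\ \text{from fat},
\]

which lands close to the breakdown Taylor read about.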
The only thing spuds couldn’t deliver seemed to be vitamin B12, an essential nutrient that he could take as a supplement. Starting on January 1, 2016, Taylor began his experiment, baking and boiling 3 to 4 kilograms of sweet or white potatoes—enough to meet his energy needs—for his meals the next day. And every day, he ate as many potatoes as he needed to feel satisfied. No other veggies or fruits of any kind. No juice. No meat. Not even butter, oil, or sour cream. Just potatoes, with a few spices and sauces for variety, and the odd Friday night beer. When he shared his potato plan with others, he was met with equal parts skepticism and worry. Near the height of the low-carb craze, his weight loss plan was almost heretical. Former New York Times columnist Mark Bittman had recently declared, “Butter is back,” a response to the low-fat heyday of the 1980s, when a specific concern about the health effects of saturated fat morphed into a generalized panic about all fat. Carb-rich bread, rice, and pasta had gone from being staple foods that had nourished humans for centuries to public health enemies allegedly causing obesity—and potatoes were singled out as one of the worst offenders. Nutrition experts warned Taylor he’d get fat, grow weak, or suffer a protein deficiency. Taylor’s wife, preternaturally supportive, was on board—but insisted he be monitored by a doctor. Within a week, Taylor’s cravings subsided. Cramps he’d initially suffered, possibly due to large increases in fiber intake, melted away. His depression and anxiety lifted. Soon his sleep improved. Taylor started exercising again. The excess weight began dropping off. One year into the potato diet, Taylor had lost 114 pounds. On every indicator monitored by his doctor and a University of Adelaide research team that took on his case, his health had improved. Blood pressure, blood sugar, blood cholesterol—all better than pre-potato. McDougall’s larger message had been that a low-fat, starch-based vegan diet would not only help with fat loss but could prevent heart disease, diabetes, and high blood pressure. Rice, pasta, bread, and potatoes were among the best possible foods one could eat for body weight and overall health. According to Taylor’s experiment, McDougall was spot on. When we last talked to him, eight years post-potato, he’d kept the weight off, he told us. He maintains that spuds curbed his food addiction. Though he’s back to a vegan diet involving only the odd potato, he now eats mostly whole foods instead of donuts, cakes, and pizza. He believes in the potato so deeply that he, too, launched a weight loss program and co-authored a book called Spud Fit. Yet for every low-fat, vegan success story like Taylor’s, there are examples of people who swear by diets diametrically opposed to his for health or weight loss. Take a cruise through TikTok, X, even the medical literature, and you’ll find them. Maybe you’re one of them. Paleo, carnivore, low-carb, keto, gluten-free—members of diet tribes who eschew carbs, eat lots of fat and often meat, and thrive. People like the Liver King today, or more famously in the medical literature, the meat-eating Arctic explorer Vilhjalmur Stefansson. An assistant instructor in anthropology at Harvard, Stefansson completed three expeditions to the Arctic, from 1906 to 1918, which spanned ten winters and thirteen summers. During his travels up north, he’d live “exclusively on meat and water,” he wrote in a 1935 Harper’s article.
Boiled fish (his favorite) along with seal, caribou, whale, and polar bear—three-quarters of the calories came from fat. The all-meat diet left him “more optimistic and energetic than ordinarily,” Stefansson continued, echoing Taylor’s sentiments. When he returned to North America and clued colleagues and friends into his diet odyssey, they alternately didn’t believe him, argued that the cold explained his vigor up north, or attributed his newfound well-being to all the exercise he got exploring. Given this, Stefansson eagerly accepted an offer to have his health monitored by a New York City gastroenterologist. What followed was probably the first and only nutrition experiment involving Arctic explorers. Starting in February 1928, Stefansson and a colleague moved into the city’s Bellevue Hospital. They started on a typical, varied American diet so the researchers could gather baseline data on their health. Then they switched to a meat-only diet for a year—beef, lamb, veal, pork, and chicken. In 1929, the results of the study were published. Like Taylor, the explorers appeared to be perfectly fit. They, too, lost weight, their blood pressure was normal, and they had no symptoms of dietary deficiency. Even Stefansson’s constipation improved. As his doctor put it, “living on nothing but meat and fish (including fat) his health was at its maximum.” So how is it that humans can survive, even flourish, on diets as distinct as Taylor’s high-carb, low-fat vegetarian protocol and Stefansson’s high-fat, almost zero-carb, meat fest? The reason is this: Carbohydrates and fats are practically interchangeable fuels for the body, and we seem to be incredibly adaptable to using almost any combination. An obsession with the human capacity for flex-fueling—how we can get by on different macronutrient mixtures—led Kevin unwittingly into the diet wars, and into the center of one of the great controversies of twenty-first-century nutrition science. A CONTROVERSIAL PREDICTION The project started out innocently enough, in a small seminar room on the ninth floor of a huge redbrick building overlooking the research campus in Bethesda, Maryland. Kevin, standing at the front of a room filled with colleagues, was feeling nervous. A few years into his job, in 2007, it was time to review the scientific work he’d been producing to decide whether he was on track for tenure. Success meant the guarantee of a stable salary and research support for the rest of his career. Failure meant packing up and looking for a new job. Kevin, heart pounding, presented a computer model he’d been working on for the last three years. It was designed to explore the question Taylor and Stefansson had lived: How does the human body adapt to diets of different macronutrient compositions? How does flex-fueling work, and what impact does it have on body fat? Kevin was interrogating the idea that “a calorie is a calorie” when it came to body fat. The low-carb diet proselytizers, like the cardiologist Dr. Robert Atkins, claimed that people could lose body fat, even if they ate more calories, so long as they cut the carbs in their diet. He called this the “metabolic advantage” of very low-carb diets. Building on Atkins’s ideas in a widely shared New York Times Magazine article at that time, science journalist Gary Taubes argued that carb calories were especially fattening as compared to fat calories. Maybe a calorie wasn’t a calorie after all. 
Nutrition scientists often dismissed the claims of Atkins, responding with the “calorie is a calorie” refrain that they believed to be a law of physics. The first law of thermodynamics, on energy conservation, holds that energy can only be transformed from one form to another but not created or destroyed. Kevin’s physics training helped him see that this apparent refutation of Atkins wasn’t quite right. Carb-heavy diets could, for example, be a more efficient fuel for the body and result in more energy being available for fat storage. No creation or destruction of energy, and no violating the laws of physics, required. Low-carb diets might lead to more fat loss. Maybe Atkins’s promise—that eating low-carb was “the high calorie way to stay thin forever”—had some basis. But first, Kevin wondered: Where did the idea that “a calorie is a calorie” even come from? A CALORIE IS A CALORIE? Carbs and fats are each composed of carbon, hydrogen, and oxygen atoms.[*1] Different configurations of the three elements give rise to everything from the glucose, lactose, and fructose that sweeten honey, milk, and fruit, to the oily triglycerides, made of fatty acids and glycerol, in a marbled steak or a creamy slab of butter. But even with these common elementary roots, at a chemical level, there’s an important distinction between carbohydrate molecules and fat molecules. Carbs carry a lot more oxygen than fats. This difference led to the unraveling of how carbs and fats fuel us, how they affect body fat storage, and how we can live on diets as distinct as Taylor’s and Stefansson’s. When we metabolize fat, the chemical reactions require more oxygen from our breath to produce carbon dioxide (single carbons bonded to two oxygen atoms) compared to when carbohydrates are metabolized. So measuring the breath—the respiratory quotient, or the ratio of carbon dioxide produced to oxygen consumed—reveals the mixture of carbohydrates and fats that an animal (or human) is using to fuel metabolism at any moment. Researchers first observed this in animals in 1849. A former student of Liebig’s, Henri Victor Regnault, along with another French chemist, Jules Reiset, studied the respiratory quotients of different species, including reptiles, birds, dogs, and rodents. All had different respiratory quotients when they ate their usual diets, composed of different proportions of carbs and fat. But the French chemists also discovered that changing the diets of the animals changed their respiratory quotients. Shifting a dog’s diet from fat-rich meat to carbohydrate-heavy bread increased the carbohydrates the dog burned relative to fat—and how much carbon dioxide relative to oxygen it respired. Eating mostly carbs caused the animal to use mostly carbs for fuel, and less fat. What’s more, when a dog didn’t eat for several days, its respiratory quotient appeared similar to when it ate only meat. The fuel of starvation, it seemed, was the meat of the animal itself as it wasted away—burning mostly stored body fat for energy. All of this meant that how animals used fuel wasn’t an inborn trait, as people had suspected until then; they could adapt to whatever was available to eat. In the late nineteenth century, another two ex-students of Liebig—Carl von Voit (one of the protein pushers we met in Chapter 3) and Max von Pettenkofer—explored how feeding starving dogs different amounts of carbohydrate and fat slowed the loss of body fat. What amount of carbs and fat would cause the same body fat change?
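To see why the breath gives the fuel mix away, it helps to write out the two combustion reactions. The stoichiometry below is textbook chemistry rather than anything specific to Regnault and Reiset’s measurements, and tripalmitin is used only as a stand-in for dietary fat in general:

\[
\mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2} \rightarrow 6\,\mathrm{CO_2} + 6\,\mathrm{H_2O}, \qquad \text{respiratory quotient} = \tfrac{6}{6} = 1.0
\]
\[
\mathrm{C_{51}H_{98}O_6} + 72.5\,\mathrm{O_2} \rightarrow 51\,\mathrm{CO_2} + 49\,\mathrm{H_2O}, \qquad \text{respiratory quotient} = \tfrac{51}{72.5} \approx 0.7
\]

A respiratory quotient near 1 means the body is burning mostly carbohydrate; a value near 0.7 means mostly fat; anything in between reflects a mixture of the two. Which brings us back to Voit and Pettenkofer and their starving dogs.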
They noticed that the animals had to eat far more carbs than fat—just over double the grams—to stall body fat loss to the same extent. Why was dietary fat so much more potent than carbs when it came to preserving body fat? Max Rubner, a student of Voit’s in Munich, solved the mystery. An obsessive quantifier of all things, he filled his pockets with barometers, hygrometers, and pedometers. In his lab, he deployed a bomb calorimeter to quantify facts about the energy in different foods. Modeling his device after the one Lavoisier had used a century earlier to discover that combustion and respiration were equivalent chemical processes, Rubner burned different foods and measured how much heat the burning produced by tracking the temperature changes in the calorimeter’s water chamber. Burning fat produced the same amount of heat as did a little more than double the mass of carbohydrates. Similarly, Rubner found that for dogs to have the same body fat change on an all-carb or all-fat diet, they had to be fed about double the mass of carbohydrates as compared to fat. Carbs and fat could be exchanged in the diet with the same effect on body fat—so long as the total number of calories didn’t differ. It wasn’t the ratio of carbs to fat, or their weight in grams, that determined the amount of body fat loss. It was their calories that counted. Rubner called this his isodynamic law, which became “a calorie is a calorie.” He went on to use the calorimeter to establish the calorie values of protein, fat, and carbs, standards that are still in place today. He also showed that the physical law of energy conservation—that energy can’t be created or destroyed but only transformed—held true in a dog living inside a metabolic chamber. Any changes in energy in a dog’s diet could be accounted for by changes in body fat and the heat it produced. Animals obeyed the laws of physics, just like inanimate objects. Another of Voit’s students with Rubner, the American nutrition researcher Wilbur Atwater, set his sights on understanding how Rubner’s findings applied to humans. After he returned to the United States, Atwater built a larger metabolic chamber—the first in America that could house humans. Soon, he confirmed that what was true for Rubner’s dogs was also true for people. Calories from fat and carbs appeared to be interchangeable inside the body—how many calories people ate determined how much body fat they lost or stored. Under the spell of Liebig and Voit, Atwater was already a “protein enthusiast,” as we saw in the last chapter. Now his study results convinced him that, after providing people with enough protein, food should also be viewed as a means of giving the body enough calories. By 1887, Atwater became an administrator in the U.S. Department of Agriculture, with the ability to influence what and how Americans ate. The son of a Methodist minister, he was particularly focused on the home economics of food, advising the public about how to affordably feed their families, eventually pioneering what became known as “scientific eating.” To that end, he and his colleagues measured the amount of protein, carbohydrate, and fat in a variety of foods and examined how cooking affected the digestibility of each.[*2] These measurements were the basis for the nutrition facts panel on almost every packaged food or beverage, not only in the United States but around the world. 
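To put rough numbers on Rubner’s isodynamic law, borrow the calorie values that eventually grew out of this line of work, the Atwater factors described in the chapter notes: roughly 9 kcal per gram of fat and 4 kcal per gram of carbohydrate (standard rounded figures, not values Rubner himself reported). The “little more than double” he kept measuring is then simply

\[
\frac{9\ \text{kcal per gram of fat}}{4\ \text{kcal per gram of carbohydrate}} = 2.25,
\]

so swapping a gram of dietary fat for about two and a quarter grams of carbohydrate leaves the calories, and with them the effect on body fat, essentially unchanged.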
In Atwater’s vision, people would use the information to make decisions about how to give their families the right mix of protein and calories—the original “balanced diet.” It wasn’t long before calories became a middle-class preoccupation and the dominant way of thinking about weight loss. A diet advocate seized on Atwater’s ideas to promote “calorie counting” as the secret to slimming. In 1918, the American physician and socialite Lulu Hunt Peters published the runaway bestseller Diet and Health: With Key to the Calories, providing a long list of 100-calorie portions of a wide variety of foods. Eat whatever you wanted—if you kept your calories down, body fat would melt away. After all, that was what Rubner and Atwater had discovered: A calorie is a calorie. OR MAYBE NOT In the seminar room, Kevin started to describe the findings of his model. He predicted that the body would select fuels for metabolism in a way that caused body fat loss to vary only a little, regardless of the proportion of carbs or fat a person was eating. Cutting carbs from a balanced diet caused the body to shift toward burning fewer carbs and more fat after several days. But surprisingly, reducing dietary fat by the same number of calories didn’t seem to change the mixture of carbohydrate and fat the body burned. The net result was that both diets led to similar body fat losses, but with a slight difference that contradicted the popular claims of low-carb acolytes like Atkins. The reduced-fat diet, Kevin’s model predicted, led to a little more body fat loss compared to the reduced-carb diet. Maybe a calorie wasn’t exactly a calorie, Kevin told his audience. But the difference was in the opposite direction from the one claimed by the low-carb diet camp. Kevin’s reviewers weren’t impressed that he had published less than a handful of papers in the three years since starting his job, but they loved Kevin’s attempt to mathematically model the way all three macronutrients worked inside the body, something that had never been tried before. Kevin riffed about the work that could be done to test the model at his institute’s soon-to-be-launched “metabolic clinical research unit,” a ten-bed hospital ward designed for human studies of nutrition and metabolism. By picking apart a controversial and long-standing law of nutrition, Kevin hoped he could entice the clinical researchers in the room to bring his model’s predictions to life. Instead, something happened that kept Kevin up that night. The scientific director of his institute, who was in the audience, suggested that Kevin take the lead and conduct the human study himself. A degree in physics didn’t exactly qualify him for clinical research. Learning how to run a clinical trial, with its many stringent layers of oversight, seemed like an impossible leap—and a huge risk for his career. If anything went wrong, he could kiss his job goodbye. At the same time, if there ever was a moment to take a risk, Kevin realized that the time was now. He was going through a divorce and had no kids and no responsibilities. And wasn’t designing experiments to test models what science was all about? Over the next several years, Kevin and his team recruited nineteen people with obesity to see what would happen to their metabolism and body fat as they shifted among three different diets. The study participants moved into the metabolic unit for a month, bringing laptops, video games, and the odd guitar along with them. 
Every participant followed a “baseline” diet, designed to match their calorie burn with a macronutrient mix that the average American eats. That lasted for five days. Then everybody in the study saw their calories slashed by about a third, with half the participants randomized to a diet that cut calories exclusively from carbs while the others cut calories solely from fat. After two weeks in the hospital, the study participants returned home for a break, and then they checked back into the metabolic unit for another two weeks. This time, they repeated the five-day baseline diet, then followed the calorie-restricted diet they hadn’t tried in the first round. Along the way, they had their every move and every morsel of food meticulously measured and spent ten days inside the metabolic chambers to measure how many calories their bodies were using and how much of that energy was derived from burning fat versus carbs. If Atwater and Rubner were right, cutting carbs would lead to the same amount of body fat loss as cutting fat—a calorie is a calorie. The study participants would see the same body fat loss with both diets since their calorie intake was identical. If the low-carb believers were right, cutting carbs would lead to more body fat loss than restricting dietary fat. But if Kevin’s model was correct, the low-fat diet would have a small edge on fat loss. In 2015, just before Taylor kicked off his potato year, Kevin published the results in a scientific journal. This was a rigorous human test of the low-carb diet claims that were filling newspapers, books, and podcasts at the time, and the findings again ran contrary to the popular pronouncements. While cutting carbs did drive the body into fat-burning mode, the low-fat diet caused slightly more body fat loss. Kevin’s model was materializing in people. But the measured difference in body fat loss was a trifle—a mere 40 grams per day, or the weight of a large strawberry. Not meaningful to health, and certainly not a big edge in the diet wars. In the end, Rubner and Atwater were in fact nearly spot-on: A calorie was just about a calorie when it came to body fat changes.[*3] When you keep calories constant, varying carbs and fat by a lot has almost no effect on how much body fat changes. FLEX-FUELING The diet wars have drowned out a stunning feature of our physiology. Different macronutrients, with entirely distinct chemical structures, enter the body in their mind-boggling variety in the food we eat. We then strip the chemicals down to equivalent units of energy and burn them to fuel metabolism—or store them as body fat for use when we’re not eating. But as elegant and prescient as Rubner’s and Atwater’s experiments were, they still didn’t explain how the body manages to accomplish this caloric equivalency, how we can get by eating only meat or only potatoes. The first peek inside the process happened in 1921, in a Toronto lab far removed from the diet wars, where Frederick Banting and his student assistant Charles Best discovered the hormone insulin. The medical researchers noticed that despite high levels of glucose (a carbohydrate) in their blood, people with type 1 diabetes were hardly able to use carbs for fuel. Instead, their bodies used copious quantities of fat or fat that had been turned into ketones—another fuel source for the body. When the patients were injected with insulin, they suddenly began using glucose while their fat burn slowed down and blood ketone levels dropped.
Their work resulted in a miraculously effective treatment of diabetes in children, and in 1923, Banting won the Nobel Prize in medicine. (Best, a medical student at the time, was overlooked; Banting shared his prize money with him anyway.) Later, researchers observed that in people without diabetes, eating carbs made insulin levels go up, and again the body turned to carbs for fuel. Eating fat hardly raised insulin levels at all, and the body fueled itself with fat. Insulin seemed to be the internal messenger that told the body which fuel to use, and when. If insulin ran low, the body used mostly fat to fuel itself. When insulin ran high, the body turned to carbs. One hundred years later, we have a fuller picture of the physiology behind calorie swapping—and it’s amazing. To explain, Kevin uses the metaphor of a flex-fuel vehicle. Inside each of our cells are tiny generators, in the mitochondria, that constantly need fuel to run and keep us alive—much as an engine requires fuel to keep a vehicle going. To make the metaphor more accurate, imagine a vehicle capable of running on arbitrary mixtures of fuel—diesel, ethanol, even electricity. Its driver can fill up with whatever is available, and the car will move with practically identical efficiency. The vehicle itself is mostly built out of the fuel, too, and its component parts are constantly being taken apart and rebuilt. Metabolism is the process that uses the food we eat to fuel our bodies, as well as to build and repair them. Again, the primary fuels are carbs and fat; protein’s major role is body building and repair.[*4] We can get by on carb-heavy potatoes or fat-rich meat or myriad mixtures in between: The body’s fuel tank seems to be agnostic. STEAK AND POTATOES, REVISITED To see calorie swapping in action, let’s look at what happens inside your body when you eat a potato versus a steak. The body’s responses to these two foods couldn’t look more distinct in terms of how each organ behaves, the hormonal signals in the blood, and how the different fuels transform—and yet the net result is practically the same each time. Excess energy becomes body fat. When you wake up in the morning, assuming you haven’t eaten in many hours, your low blood insulin levels tell your fat cells to increase the breakdown of stored fat, and to release fatty acids into the bloodstream to fuel the metabolism of tissues like your muscles. But the brain needs something different to fuel itself: glucose or ketones, both of which are released to the blood by the liver when insulin levels run low. The fast is over: You start eating potatoes. Everything changes. During digestion, the starchy carbohydrates in the potatoes get broken down into glucose. When the glucose reaches the small intestine, specialized glucose transporters shuttle it into the blood supply of the liver,[*5] which is a fundamental hub of metabolism—the gatekeeper for what the rest of the body receives after you eat. That includes how much glucose from the potatoes gets to other places in the body. Within minutes of eating, any glucose that wasn’t absorbed by the liver to replenish its own energy stores passes into the blood. Time for your pancreas to respond to your rising blood sugar levels. It spills insulin into the blood supply of the liver, which helps control how much of the hormone flows to the rest of the body.
Your brain keeps extracting about the same amount of glucose from your blood, but for other body parts the rise in insulin acts like a traffic cop, directing the flow of fuels to and from your various organs. It tells your muscles to take up more glucose and your liver to stop releasing glucose and instead absorb it and store it for use between meals. While insulin is busy shuttling glucose from the potatoes out of the blood, it also tells your fat cells to slow their release of fatty acids, and to start taking up the glucose and fat circulating in your blood. At this point, the level of fatty acids in your blood will run low and levels of glucose and insulin will run high. Your muscles switch over to burning more carbs and less fat, sparing fat that would have otherwise been used up. Your liver and fat cells also build fat from carbs—a process called de novo lipogenesis, which Liebig discovered. It’s not a major source of body fat in most humans eating a normal diet. But eating very large amounts of carbohydrates can ramp up de novo lipogenesis significantly. In short, on the potatoes, your body uses mostly carbs to fuel itself, and whatever’s left over gets stored as body fat. Okay, your potatoes are done. Your insulin levels drop back down again hours later, and the fuels in your blood shift. Your liver returns to spilling out glucose and your fat cells go back to releasing fatty acids to power you. Now let’s eat a steak. During digestion, you break the steak’s protein down into those all-important amino acids, which get absorbed into the blood. Your liver again gets first dibs at taking up amino acids, just like the glucose after eating the potatoes. After passing into general circulation, via the liver, amino acids send a signal to the pancreas to start spilling out more insulin, this time in combination with another hormone called glucagon, which gets released in response to the steak’s fat and protein. The uptick in glucagon tells your liver to perform an amazing switcheroo: It transforms some of the amino acids into…glucose. So even without eating carbohydrates, your body makes them in a process called gluconeogenesis. That’s why there’s no minimum dietary requirement for carbohydrates: We can build carbs from some amino acids as well as the glycerol part of fat. (And while we can make fat from carbohydrates via de novo lipogenesis, we can’t make the essential fatty acids that are needed in the diet in small amounts—less than 2 grams for omega-3 per day and 10 to 20 grams for omega-6, depending on your age.) Your newly elevated glucagon levels prevent insulin from completely shutting down the liver’s release of glucose. Now there’s enough glucose to fuel your brain. If insulin levels don’t increase by too much, the liver also turns some of the fatty acids released from fat tissue into ketones for your brain. Meanwhile, your muscles mainly run on the fatty acids provided by your fat tissue—not, at first, from the food. That’s because, unlike carbohydrates and protein, which get absorbed into your liver’s blood supply within minutes after eating, dietary fat takes the scenic route through the body before it can be used as fuel. The intestines repackage the fat you ate and release it into the body’s lymphatic system, where it takes hours to enter your bloodstream in the form of triglycerides. The triglycerides in the blood tell your fat cells to stop releasing fatty acids and push their way into the fat cells to be stored for future use. 
So it’s not just insulin that slows the rate of fat being released from fat cells. After the steak, your body was fueled mainly by fat. But the net result is the same as after the potatoes: Any fat that wasn’t burned by your body gets stored as body fat. This happens in about the same proportion as after eating an equivalent number of calories in the form of potatoes. Rubner would have been amazed to find out about the different hormonal and metabolic dances taking place among organs depending on the fuels we eat. Evolution created an intricate system to stockpile body fat when we eat excess calories, no matter their source. Even without understanding these processes, a moment’s thought suggests that the idea that dietary carbs are uniquely fattening makes little evolutionary sense. That would mean our ancestors wouldn’t have been able to store a lot of body fat during the feast after a successful big game hunt. But evolution didn’t have to result in a perfect calorie equivalence between dietary carbs and fat; it just had to be good enough to avoid selection pressures to evolve a more perfect system. The system that evolution devised, Kevin was discovering, seemed to be precisely good enough. DIET WARS, GROUND ZERO In the aftermath of his study, Kevin received criticism from low-carb diet advocates, like science journalist Gary Taubes, who had written books arguing that cutting carbs was the key to losing weight. In the opinion pages of the New York Times, Taubes argued that the study was too short-term and, because calorie intake was strictly controlled, the research ignored the effects of hunger. If people were left to eat as much as they wanted on a low-carb diet, they’d eventually experience less hunger and eat fewer calories than low-fat dieters, he argued.[*6] Others were similarly critical, pointing out that the reduced-carb diet in Kevin’s study wasn’t really low-carb enough for the liver to produce a substantial amount of ketones. Maybe the critics were onto something. Kevin decided to run additional experiments, longer term and even lower carb. The first focused on seventeen men who were overweight or had obesity. They stayed in the hospital for two continuous months to test the effects of eating the very low-carb, high-fat ketogenic diet following a period on a moderate-carb, high-sugar diet. Both diets had the same number of calories and the same amount of protein. This time, the men experienced a tiny increase in the number of calories their bodies burned after transitioning to the low-carb diet. But they lost slightly less body fat during the keto diet despite their insulin secretion being half as much as during the high-sugar diet. Again, the differences in fat loss were almost a wash. A calorie seemed to be more or less a calorie when it came to body fat changes—a fact that has been repeatedly confirmed in other studies since. The low-carb camp was not happy with Kevin. They accused him of spinning his studies to suit a low-fat diet narrative. So in hindsight it was probably predictable that Dr. John McDougall, the low-fat vegan diet advocate, would invite Kevin to present his results at one of his “advanced study weekends” in California, aimed at members of the public following plant-based diets. It was there that Kevin met the potato eater Andrew Taylor, who had also been invited to speak. Just after Kevin’s talk, McDougall ushered him to a private room for an on-camera interview.
The doctor kicked off the video by expressing his exasperation with the renewed interest in low-carb diets, and thanking Kevin for his recent studies countering the ideas that the opposing diet wars camp promoted. McDougall was convinced that Kevin’s data supported his low-fat, high-carb way of eating, and he tried to get Kevin on the record supporting this view. Kevin didn’t take the bait. The edge on fat loss from a low-fat diet had been marginal. Kevin also reminded McDougall of the limitations of his studies—they were short-term and featured hospitalized participants eating a fixed number of calories. As Taubes had pointed out, maybe a low-carb diet would reduce hunger and cause people to eat fewer calories. If that happened, low-carb eaters might well lose more body fat in real life. Kevin set up another clinical trial to investigate. Now the focus was on differences in appetite, and any resulting changes in body fat and weight, between two minimally processed diets that varied widely in carbs and fat. Both diets had a common foundation of nonstarchy vegetables, but one was plant-based, McDougall-style: high in starchy carbs and low in fat. The other was an animal-based ketogenic diet. The twenty study participants spent four continuous weeks in the hospital, eating each diet for two weeks in random order, with simple instructions to eat as much or as little as they wanted. The participants were not trying to lose or gain weight. They didn’t know the purpose of the study other than that Kevin and his team were interested in learning how the different diets affected various health markers. Yet again, the low-fat diet came out with an advantage. Each person consumed fewer calories eating low-fat, despite much higher levels of insulin after meals and more insulin secreted throughout the day compared to the low-carb-diet period. They also lost a little more body fat, again despite the markedly higher insulin levels. And once again, people who were primed to see the results favorably embraced them without question. McDougall—who had appeared frustrated with Kevin by the end of their video interview—was delighted. “I am pleased to see you working towards what I understand is true,” he wrote Kevin in an email. On the other side, those whose worldview was not confirmed found fault in the study. Low-carb devotees, for example, were not as happy, to say the least. Just as Liebig had done with his critics more than a century earlier, they dismissed Kevin’s findings or reinterpreted them to justify a continued belief in their theories. To this day, Kevin and Gary Taubes have agreed to disagree on what to make of Kevin’s research. Taubes maintains Kevin’s studies are limited—too short-term for the low-carb diet’s effects to really kick in. “Can the body’s response to the diet over six days or two weeks or a month be assumed to represent what happens over longer time periods?” he asked. He pointed to other research groups who report finding increased calorie burn on a low-carb diet—a result that Kevin believes is more likely explained by methodological issues. “It’d be almost a miracle for all these hormonal effects to balance out such that the only thing that mattered is calories,” Taubes told us. If Kevin or another research group ran a controlled feeding trial that lasted for at least several months, comparing a low-fat and a low-carb diet matched for calories in patients with obesity, Taubes predicts the metabolic advantage of low-carb would emerge.
“At six months, you’ll see something very different than at two weeks,” Taubes told us. Perhaps such a study will eventually be run, overturning more than a century of research that suggests a calorie is just about a calorie. But as of now, it looks like pundits on both sides of the diet wars got excited about the rewards of their chosen approach, and wanted to spread the good news to others, while also finding flaws in contradictory science. And that’s not because anyone was seeking to mislead people. It’s because this is how humans are wired: We latch on to evidence that supports our worldview, and discount data or experiences that don’t. The low-fat versus low-carb debate is no exception. As Taubes says, his writing is “filled with caveats and sentences that make it very clear that what I’m saying is what I believe to be true, believe it enough that I feel a book should be written about it, but that’s all I can say. Laying out an argument and the evidence in support of that argument is not prematurely arguing a stance. It’s just arguing a stance.” And the way he sees it, we’re also arguing a stance here. The fact that Kevin was undertaking trials of diet trends, with results that challenged some of those ideas, garnered a lot of media attention. The flex-fuel studies were covered in hundreds of stories, by journalists around the world, including Julia. The responses to Julia’s stories were more heated than to any other topic she has ever written about, including abortion, gun policy reform, and maternal death. One “investigative blogger” obtained Kevin and Julia’s email correspondence from his employer under the Freedom of Information Act and published a blog post insinuating that they were colluding in an anti-low-carb conspiracy. The low-fat camp was similarly ruthless, but they had different targets. McDougall told the media that Dr. Robert Atkins, popularizer of keto diets for weight loss, was “grossly overweight” at the very end of his life. Kevin and Julia were learning the hard way a fact about food that persists: Perhaps even more so than politics and religion, diet is controversial dinner table fare. OTHER EFFECTS Does all this mean you shouldn’t try a keto diet or that low-fat veganism is the one true way to eat? Absolutely not. Both diets can be effective for weight loss and helpful for a variety of other reasons, as Stefansson and Taylor demonstrated. We don’t doubt that those experiences are real, as are the experiences of countless others who’ve had tremendous successes on these and other diets, and clinicians and advocates who’ve seen people’s lives transformed by changes in what they eat. We also don’t doubt that many food pundits may be well-intentioned, truly believing their theories and diet plans will help. Our point is this: Neither low-fat nor low-carb diets have been shown to have a big fat loss advantage as proponents on either side claim. Even the low-fat edge Kevin kept finding was marginal. Over the longer term, randomized controlled trials show that weight loss is virtually indistinguishable between the two diets. Of course, these are averages, which hide lots of variation, including all the Stefanssons and Taylors out there. But most of us will cluster around that dismal middle and experience a weight loss wash. Weight loss is only one metric, and the “carbs versus fats for weight loss” debate has been a distracting sideshow. 
The focus on body size and the products of low-carb and low-fat peddlers yet again distracted us, not only from the wondrous things our bodies can do with different kinds of food but from the potentially far more interesting effects of these diets. As we described in this chapter, changing the proportion of carbs to fat shifts the body’s hormonal and metabolic milieu, and we’re only beginning to understand the effects of these shifts. Very-low-carbohydrate ketogenic diets have been used to treat epilepsy since the 1920s, when researchers noticed that people who fasted experienced fewer seizures. It’s not clear why the diet works in some people, even where standard medication fails, but possible reasons include making neurons more resilient during seizures and altering gut bacterial populations, known as the microbiome. Whether diets varying in the ratio of fat to carbs have other effects on the brain, perhaps influencing behavior or mental health, is another fascinating question. Kevin’s research recently found that reductions in dietary fat, but not carbohydrates, led to increased levels of the neurotransmitter dopamine in brain regions that may affect food choices and motivation. For three days after the controlled reduced-fat diet, people chose to eat more foods high in both fat and sugar as compared to what they chose to eat after the reduced-carb diet. This could mean that the early phases of low-fat dieting are more difficult. Other studies have suggested that ketogenic diets can potentially help alleviate alcohol cravings in alcoholics, and help treat nonalcoholic fatty liver disease, a problem that’s now estimated to affect about a quarter of American adults. Ketogenic diets also show promise for managing type 2 diabetes, a disease diagnosed by excess blood glucose, and one that arises when the body’s organs become resistant to the effects of insulin. Some patients with diabetes who eat a ketogenic diet reduce their average blood glucose levels, or HbA1c, as well as their reliance on medication, but the benefits only persist as long as people adhere to the diet—as with every diet, this is not easy—and it’s not clear how much of the beneficial effects of keto are due to the weight loss people experience. The most interesting research focuses on how the body’s immune system responds to different eating patterns, possibly because of alterations to the microbiome. Kevin recently published a study showing that shifting between a plant-based, low-fat diet and an animal-based, ketogenic diet led to rapid changes in the gut microbiome and immune system. The low-fat diet enhanced the innate immune system, which is the body’s first line of defense against a wide array of pathogens. The low-carb diet stimulated the adaptive immune system, which creates a tailored response to eliminate specific pathogens. What exactly this means for our health, no one knows yet. Researchers are even exploring keto for treating or preventing cancer. The idea is that cancer cells detect elevated levels of insulin and growth factors, which increases the activity of a molecule called PI3-kinase. PI3-kinase helps the uptake of nutrients that are used to synthesize components of growing and proliferating cancer cells. Inhibiting the activity of PI3-kinase is therefore a promising strategy for cancer drugs. But there are a few wrinkles. PI3-kinase inhibitors work on all insulin-sensing cells, causing them to be resistant to the effects of insulin.
This increases blood glucose and stimulates the pancreas to produce more insulin, partly counteracting the effect of the drug. Ketogenic diets reliably lower insulin levels. One study, published in Nature, tested whether PI3-kinase inhibitors would perform better in mice when they also ate a keto diet. The combination shrank twelve types of tumors in the rodents—while causing leukemia to worsen. It’s not yet clear what this means for mice, let alone humans. We’re still “in the Stone Age” of understanding how to eat for each type of cancer, a researcher once told Julia for a Vox story. On the flip side, plant-based diets that are rich in fiber—the kind of eating pattern McDougall promoted—have been linked to a range of health benefits, including improved gastrointestinal health and a reduced risk of heart attack and stroke, as well as type 2 diabetes and several types of cancer. Fiber seems to nourish the gut microbiome, feeding the health-promoting bacteria that live inside us. That’s not to mention the planetary benefits of avoiding meat and other animal products (stay tuned for more on this later). So if the goal is just weight loss, and you want to try a low-carb or low-fat diet, go ahead with this new appreciation of how your body will select fuels on each one, and how any fat loss that happens would probably be similar if you ate the same number of calories on the opposite diet. For some of these other end points—cancer, diabetes, immune system function, gut health—the diets may perform differently, and which one is the best for your health goals is a question for your doctors. But even they probably won’t have a clear answer because the science just isn’t there yet. If you stopped reading the book now, you might think the only things we need to get from food are protein and energy. We’ll get to why that’s not the case soon enough. Until then, the debate over cutting carbohydrate versus fat calories for weight loss not only missed the most interesting things about the diets but also cast them as mere strategies for getting rid of body fat—the metabolic hero that’s been hiding in plain sight. Skip Notes *1 Even more fundamentally, plants produce carbohydrates by using the energy of sunlight through photosynthesis to combine the carbon dioxide in the air with water. These carbohydrates can then be turned into fat. Or plants can combine carbohydrates with nitrogen captured from the air through soil to make proteins. This is how the three macronutrients end up in our diet when we eat plants or animals that ate plants. Kudos once again to the ancient Greeks for their elements: It’s ultimately sun, water, air, and earth that feed us. *2 There has been a lot of discussion about how the calories in food as measured in a calorimeter differ from the calories actually available to the body via digestion, absorption, and metabolism. Atwater was aware of these discrepancies, which was why he performed detailed measurements, collecting all the urine and feces from volunteers eating controlled diets for extended periods. The average results of these painstaking experiments led to the Atwater factors relating calories to consumed macronutrient grams (4 kcal, 9 kcal, and 4 kcal per gram of carbs, fat, and protein, respectively), which remain the standard values to this day. In other words, Atwater accounted for many of the things people complain are missing in calorimetry measurements. *3 Low-carb diets do lead to greater weight losses early on because the body loses more water weight.
So when people measure body weight on their bathroom scale soon after cutting down on bread and pasta, they might get the impression that the approach is working well. If they had the ability to make precise measurements of their body fat, they might think otherwise. *4 We do use protein as a fuel source, but it’s a minor player relative to carbs and fat. We use it mainly through the conversion of amino acids to carbohydrates—a process called gluconeogenesis. *5 The body is mathematically donut shaped, with the donut hole being the tube connecting your mouth to your anus. So food is only inside you after it is digested and absorbed into the topological donut that is your body. Imagine that. *6 Before the experiment was over, Kevin asked Taubes to predict the result. Body fat accumulation, in Taubes’s view, was driven by the hormone insulin. Eating carbohydrates clearly stimulated insulin secretion, whereas eating fat did not. Ergo, carbs were especially fattening. Taubes said, given the calories in the two diets were equal, his theory would predict decreased insulin secretion, less hunger, greater calorie burn, and more body fat loss eating a reduced-carb diet. Kevin’s study found the opposite for all but hunger, which wasn’t measured. Despite no measurable reduction in insulin secretion, cutting dietary fat led to a little more body fat loss than cutting the same number of carb calories even though reducing carbs led to significantly decreased insulin secretion. CHAPTER 5 Ode to Body Fat We’ve been dancing around a subject that’s key to this book: body fat. It provides the energy to fuel metabolism when you don’t eat. It expands when you consume a surplus of calories. An obsession with losing it fanned the trillion-dollar diet and exercise industries and spurred millions of people to seek out surgery or medication—Wegovy and Mounjaro today, fen-phen and DNP in the past. Here we’re going to explain body fat’s critical role, what it does for health, and why excesses can also diminish health in some people. We will get into all this by way of efforts to remove it, and the people who can’t make enough. We will argue that people often malign body fat because they don’t understand its function and think about it backward when it comes to metabolic disease. By the end of this chapter, we hope you will see body fat for the marvel it truly is. A DOCTOR WENT TO SEE THE POPE… In 1958, a Jewish doctor in Rome made his way to the Vatican for a private audience with the pope. The doctor, Arpad Fischer—imposing stature, round face, thinning slicked-back hair—had recently returned to Italy after sixteen years in exile since World War II. In New York City, he’d studied the emerging art of cosmetic surgery under one of the city’s top surgeons, perfecting facelifts and nose jobs. Now that he was back home, the expertise he’d picked up in America wasn’t welcome. Medical clinics were overseen by the Catholic Church. Cutting into the body to alter what God created for cosmetic purposes was considered sinful, which made practicing plastic surgery in Italy impossible. Arpad wasn’t a religious man, but he figured he had no choice. If he wanted to share what he’d learned in America with his patients in Italy, he needed the pope’s blessing. During his meeting, Arpad made a moral case. Not only was cosmetic surgery helpful for people who had been disfigured because of injury, he said, it could heal those whose physical defects socially isolated them. 
He wasn’t sure what to think when he left the Vatican that day—maybe he’d need to pursue a new specialty?—but the pope seemed open-minded and Arpad was an optimistic man. Soon enough, he got the blessing he’d hoped for. “If the end is good,” the pope decreed in a papal dispensation, “cosmetic surgery is commendable.” Arpad’s practice began to flourish in Rome. He ingratiated himself with the Italian jet set, counting politicians and film stars among his clients. Soon his son, Giorgio Fischer, joined the bustling practice. The junior doctor quickly picked up his father’s techniques in facelifts and nose jobs. But it frustrated Giorgio that he couldn’t easily help patients with something many routinely complained about: unsightly body fat. At the time, the only way to remove fat was to cut into the body in open surgery. Not only were the operations high risk, they left behind terrible scars. One day, a solution presented itself during a visit to California. Giorgio noticed an oil drill burrowing into the ground. Watching black oil spew out, his mind wandered to human body fat. What if he used a similar principle to suck the fat out of places where patients didn’t want it? Back in Rome, he invented a new surgical instrument: a long, thin cannula, attached to a vacuum with a sharp blade inside that could be switched on or off. This allowed the surgeon to easily travel inside the fat tissue, avoiding the need for major surgery, while suctioning out fat tissue destined for the garbage, where surgeons like Fischer thought it belonged. After the father-and-son duo demonstrated the invention in Paris at an elite plastic surgery clinic, media outlets around the world reported the story. Dramatic before-and-after photos appeared in newspapers and magazines. What they depicted almost seemed like a miracle. Go into an operating room and leave a skinnier version of yourself, no dieting or exercise required. Love handles, saddle bags, and big butts be gone, with only a tiny scar left over. By 1986, 100,000 people in the United States had signed up for the operation, making it the most popular cosmetic procedure in America. It is now one of the most popular around the world. This was the birth of modern liposuction. If there’s a single goal of all diet culture, it’s encapsulated by liposuction: Get rid of body fat, as quickly as possible. Not only is fat bad, unwanted, an object of derision, it’s also long been considered unhealthy. In the Hippocratic Corpus, obesity was defined as a surplus of humors and linked with infertility and early mortality. According to early Roman medicine, “The obese, many of them, are throttled by acute diseases and difficulty breathing; they often die suddenly, which rarely happens in the thinner person.” By the twentieth century, statisticians at insurance companies quantified obesity’s mortality risks for the first time, in Metropolitan Life Insurance Company reports on the link between body weight and health. In 1937, the company’s vice president, Louis Dublin, came out with tables based on data from four million policyholders. Those who managed to maintain what Met Life determined was an ideal body weight relative to their height were the least risky, lived the longest, and cost the insurance company the least. By the 1970s, as obesity rates were rising, body mass index (BMI) started to be used in epidemiological studies to track population trends. 
The calculation—a person’s weight in kilograms, divided by the square of their height in meters—had been introduced by an early-nineteenth-century Belgian mathematician, who was interested in finding the characteristics that defined the “average man.” (Someone who weighs 80 kilograms and stands 1.75 meters tall, for example, has a BMI of 80 ÷ 1.75², or about 26.) Studying data on white European males, Adolphe Quetelet discovered that weight typically increased not in proportion to height, but to height squared. BMI was the relationship between weight and height that varied the least among people and had the expected bell curve distribution in the population. To be clear, BMI wasn’t meant to be a diagnostic tool for disease. It wasn’t even invented as a measure of body fat. Efficient and easy to calculate—and a reasonably good correlate of body fatness—BMI was adopted for diagnosing obesity and predicting health risks anyway. People with a BMI of 30 or over were determined to have obesity. Outside the clinic, BMI became a gatekeeper for gauging risk for life and health insurance policies—who should get access, and who should be shut out. Most fundamentally, the widespread reliance on BMI reinforced the ancient association: the bigger the person, the more body fat, the unhealthier. If that was correct, removing fat through liposuction should improve health. That was the hypothesis in the early 2000s, when physician-scientist Sam Klein decided to track the changes to the health of patients before and after their surgeries. Klein, a graduate of MIT with a quick wit and dry sense of humor, had grown curious about liposuction’s health impact by way of the existing research on dieting or exercise. In those studies, body fat loss always seemed to lead to cardiovascular and metabolic health benefits, such as reduced heart attack risk and improved insulin sensitivity—or how well the cells in the muscles, fat, and liver respond to insulin. But no one knew whether the lifestyle changes that typically accompanied fat loss—exercise, a healthier diet—drove the health improvements, or whether they were caused by the loss of fat itself. Older studies of liposuction drew mixed conclusions. But they weren’t rigorous, Klein explained. Patients in existing studies, for example, would “become religious” about lifestyle changes after surgery, eating less and moving more. “Maybe that was why some of those reports showed a beneficial effect,” Klein said. So the question remained: Could people get the health benefits of weight loss with liposuction? To investigate, Klein and his colleagues meticulously tracked the health of fifteen women with obesity before and up to twelve weeks after major abdominal liposuction. They made sure all their study subjects had a stable body weight going into the operating theater, and prevented them from losing additional weight afterward. Every woman had about the same volume of fat removed. The researchers saw no health gains on any of the parameters they checked—blood sugar levels, insulin sensitivity, blood pressure, blood cholesterol. And their surgeries weren’t minor: They each had about 20 pounds of fat sucked out of them—known as large-volume liposuction. “There was great depression in the lab,” Klein told us. He had hoped the surgery would turn out to be not just a cosmetic procedure but a treatment for metabolic disease.
While Klein had managed to answer one question—liposuction couldn’t deliver the health gains of traditional weight loss—his research raised another: How could it be that a higher BMI is closely associated with a greater risk of disease and mortality, and losing weight is associated with health improvements—but removing body fat surgically does nothing to boost health? It turns out the subcutaneous fat tissue that plastic surgeons focus on getting rid of is actually a vital and dynamic organ—it’s our safe harbor for storing excess energy, designed to sustain us when we’re not eating. The padding on the butt, arms, or thighs is where we want our stored fat to be. Fat that’s less likely to meet a plastic surgeon’s scalpel—the visceral fat that encircles abdominal organs, or ectopic fat that develops inside the cells of the organs themselves—is the kind that’s more closely linked to health complications. Yet cutting out even visceral fat doesn’t seem to bring on metabolic health gains, just as liposuction fails to deliver on the benefits of traditional weight loss. This is part of the reason why excess body fat is increasingly seen as a sign of an underlying physiological problem, not necessarily a problem in and of itself. It’s also why, after a century of relying on BMI, medicine is finally moving away from the simple size metric to diagnose the disease obesity. Instead of the idea that it’s the quantity of fat that matters most for health, the focus now is on the quality and function of a person’s fat tissue—an idea that turns how many of us think about fat on its head. These insights compel us to take a hard look at a part of the body many of us have been going to great lengths to get rid of.
RECYCLE, REUSE, REDUCE
Compared to other apes, humans store far more energy in the form of body fat. With access to fresh drinking water, most of us can get enough fuel from our fat tissue to survive weeks or even months without food. The Guinness World Record for the longest survival without eating is 382 days. Angus Barbieri, the son of a fish-and-chips shop owner from Edinburgh, Scotland, weighed 456 pounds at the start of his medically managed weight loss in 1965. More than a year later, he was down to 180 pounds. When he decided to start eating again, Barbieri told the press he’d forgotten what food tasted like. Photos of him smiling, about to tuck into his first meal in more than a year—a boiled egg, buttered toast, and black coffee—show a bony figure, eyes bulging out from a narrow face, in a turtleneck sweater he can hardly fill. He’s unrecognizable from the cherubic, double-chinned man who’d checked into the hospital over a year earlier. Nearly forty years after Barbieri’s fast, in 2003, the magician David Blaine survived without food for forty-four days. The fast was shorter, but its setting was more dramatic than Barbieri’s. Dangling above London’s River Thames in a small plexiglass box, Blaine was embarking on a public display of starvation. His performance harked back to late nineteenth-century and early twentieth-century “hunger artists,” who’d abstain from eating for the amusement of others. He emerged from his self-imposed isolation 54 pounds lighter, stoking speculation about how he could have lost all the weight by doing nothing but sitting in a box.
What Barbieri, Blaine, the hunger artists, the Minnesota starvation subjects, and the millions of others over time who have survived fasts or famines have in common is this: They started out with enough energy tucked away on their bodies to keep going, fat that “fed” them—and their metabolically expensive brains—while they had no food to eat.[*1] This fuel storage capability is what has allowed humans to migrate around the globe and survive in environments as varied as the frozen Arctic, the arid plains of the savanna, and modern cities like Melbourne—with similarly varied diets. If you could peek beneath your skin, you’d see subcutaneous fat tissue. It looks like a net brimming with glistening pale-yellow fish eggs popping through. The net is a matrix made from collagen and other molecules, there to give structure to the mass of fish eggs—our fat cells. We each have billions of fat cells, or adipocytes, in our body. The vast majority are white adipocytes,[*2] each filled with a single droplet of fat made up of about ten trillion molecules of triglyceride. Our excess calories are stored as triglycerides, each composed of a glycerol backbone attached to three fatty acids. The triglyceride-dense fat droplet—about 90 percent of the fat cell’s volume—is so large, it squishes the nucleus against the side of the cell membrane, making the cell’s other machinery appear superfluous. This form hints at fat tissue’s main function. Fat cells can balloon in volume by several thousandfold, storing massive quantities of energy in their triglyceride molecules. Fat contains more than twice as many calories per gram as carbohydrate or protein. And because fat doesn’t mix with water, it can be stowed away without additional water weight—unlike glycogen, the storage form of carbohydrates, found mainly in the liver and muscles. Hence, fat tissue is our primary fuel storage tank, and triglycerides are the body’s storage form of fat. The result: Even people without a lot of fat, like Blaine, can hold almost 200,000 calories on them. The fat tissue of people like a pre-fast Barbieri can hold almost a million calories. Thank you very much, fat. The way our bodies store and use body fat involves processes so astonishing, so ingenious, they fill us with awe. It’s easy to imagine that the fat cells just, well, hang about, passively stockpiling energy until we need it, while making it difficult to fit into nonstretch blue jeans. Look more closely, and you’ll see something amazing happening. Fat cells are simultaneously building and breaking apart triglycerides all the time. The fat droplet inside each fat cell is coated in special proteins that act in response to signals from the rest of the body. Hormones and the levels of triglycerides in the blood act as messengers in a complex fuel supply operation that runs inside us all the time, controlling when the fat cell breaks its triglycerides apart—into fatty acids and glycerol—and releases them into the bloodstream. To make sure every bit of the body gets the fuel it needs at the right time, starvation becomes nature’s ultimate expression of “reduce, reuse, recycle.” When we’re not eating, and levels of insulin and triglycerides in our blood go low, the fat cells get the message to break apart their triglycerides faster, spilling out their contents into the blood for uptake by cells in other tissues—a stream of fuel for the body from our fat.
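For a sense of how much fuel that stream can draw on, here is a rough back-of-the-envelope calculation, using the standard figure of about 9 calories per gram of fat:
200,000 calories ÷ 9 calories per gram ≈ 22,000 grams, or roughly 22 kilograms (about 49 pounds) of stored triglyceride.
1,000,000 calories ÷ 9 calories per gram ≈ 110 kilograms, or roughly 245 pounds of stored triglyceride.
The numbers are approximate, but they show why fat, and not glycogen or protein, is the body’s long-haul fuel tank.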
As you’ll recall from the last chapter, the body resorts to burning fat stores when there’s no new fuel coming in. The building up of new triglycerides inside fat cells slows and, if this happens for a long enough time, the fat cells noticeably shrink in size, leaving the netting of our fat tissue depleted. By outward appearances, slimming has begun. After about two days without food, the body starts to get crafty about getting energy. That’s when the “reducing” kicks in. The metabolic rate of the cells in most tissues (though probably not the brain) drops off to increase the “mileage” we can get from our stored fuel. Just as we saw with the Biggest Losers and the Minnesota starvation subjects, this is the body going into power-saving mode. Every bit of energy we have stored on us will last longer. Since not all our tissues can rely directly on fat for energy, other forms of fuel get mobilized. Some rely on stored glucose—glycogen. The energy we put away as glycogen is minuscule in most people compared to fat.[*3] A lean adult holds about a hundredfold more energy as triglycerides than as glycogen, which is mainly tucked away in the liver and muscle. But glucose is an important source of fuel, not only for the brain but also for the red blood cells. So the body does whatever it can to spare and make glucose. Once we stop eating and insulin levels drop, the brain and liver get the signal that there’s no new glucose coming in. The liver mobilizes its limited supply of glycogen and releases that glucose into the blood while simultaneously ramping up another process to make more glucose—gluconeogenesis. This is where the recycling and reusing starts up. In gluconeogenesis, our body turns the glycerol part of our stored triglycerides into glucose to get the brain and red blood cells energy they can use. So far, we have fat being released from the fat cells to fuel the body. We have the metabolic rate slowing so the available energy lasts longer. We have glucose being released from storage or made from parts of fat, for the bits of the body that can’t rely directly on fat for fuel. And after a few days of fasting, the liver starts recycling and reusing in other ways: It turns some of the fatty acids into yet another type of fuel—ketones. The brain, which can’t use fatty acids as fuel, can use ketones. This might seem complicated, but don’t despair; the upshot of all this is quite easy to grasp. Any body part that can use fat directly as fuel burns that. The body parts that can’t—like the brain—use glucose or parts of the stored fat that got converted into glucose and ketones. The muscles also do their part to recycle and reuse. Like the liver, they mobilize glycogen to fuel their own work in the first few days of starvation. Meanwhile, they feed the liver with lactate that can be converted into glucose through gluconeogenesis. But the muscles have another incredible life-extension trick. They take in fatty acids, a bit more than they need for fuel—a surplus that eventually interferes with insulin’s ability to tell the muscles to take up glucose from the blood. With that insulin signal now muffled, the muscles burn less glucose, relying instead on ketones and fatty acids. The resulting insulin resistance in the muscles is a survival mechanism: The muscles spare glucose for use by the brain and red blood cells. How very generous of them.
If starvation lasts longer than a week or two, the muscles continue their contribution to the body’s survival quest by carrying out additional reducing, recycling, and reusing tactics. First, their uptake of ketones drops, causing the levels of blood ketones to rise. This helps the brain get more fuel and, again, saves glucose for the red blood cells. As we saw in Chapter 3, protein in the muscles is always breaking down and rebuilding. During starvation, the muscles export amino acids into the blood for other tissues to use. Some of these amino acids also give the liver yet another source for gluconeogenesis—this time, turning protein’s constituents into glucose. So that’s protein becoming carbohydrate for the brain and red blood cells. Amino acids from the muscles also supply the body with building blocks to rebuild vital proteins that may be damaged. With no food coming in, this can only go on for so long. This is because we don’t really have a storage form of protein. If protein is the “only true nutrient,” the molecules that make life possible, reducing our body’s protein to critically low levels is a signal that the end is near. When body fat gets depleted, protein sparing can’t go on. The so-called pre-mortal uptick in nitrogen excretion in urine reflects a dramatic increase in protein breakdown as a last resort to fuel metabolism before death. Of course, people who went into a fast or starvation with more body fat can avoid that fate for longer. Someone like Barbieri would outlive Blaine in a head-to-head starvation contest, thanks to all that extra body fat. Body fat also has a lot of other functions besides energy storage. It’s the padding on our bodies that helps us sit comfortably; it insulates us and some kinds of fat tissue even warm us through a process called thermogenesis; it produces hormones that send messages around the body about everything from our immune system status to how much energy we’re storing; a small amount lines everything from our brain to our intestines and nerve endings; it’s intimately linked to our reproductive health; and we could go on. The bottom line: Fat is amazing! So when exactly does body fat transform from resourceful friend to health foe? It’s complicated because it seems that each person may be able to pack away different amounts of triglyceride in their fat tissue before it affects how the rest of the body functions. THE BODY AS A HOUSE To explain, Kevin likes to use a simple analogy. Imagine a house whose closets and cupboards start filling up. The more full they become, the harder it is to put things away. When the storage spaces are brimming, the objects inside them start spilling out into the rooms around them. The clutter all over the floors causes problems in the rest of the house—the rooms become uninhabitable, maybe the floorboards underneath all the stuff creak or crack. Every new delivery of more things into the house makes everything worse. The clutter builds, and so, too, do the clutter-related problems. The house is the human body. The cupboards and closets are our fat cells. The stuff inside them is our triglycerides—the stored fat itself. During our evolutionary history, our bodies generally did a good job of putting away surplus calories in our fat cells whenever we had a lot of food around. Now that so many of us live in a state of constant calorie abundance, we eat more than we can safely store on a regular basis (for reasons that we will start to unpack in the next chapter). 
Our fat tissue does its best to absorb the excess, but gradually the fat cells get stuffed full. Fat starts building up in other organs—like ectopic fat in the muscles, liver, or pancreas, places where fat wasn’t meant to be stored. The solution to the clutter is not to demolish the closets and cupboards, just as liposuction doesn’t fix the underlying problems that caused fat to be put away in unhealthy regions. Removing the body’s cupboards and closets eliminates our safe energy storage system. Just as the overall size of the house doesn’t determine where clutter is located or whether it’s a problem, body size alone doesn’t necessarily indicate whether there’s too much fat in unhealthy places. What defines excess fat varies from person to person. Some of us can store a lot of fat in our fat tissue without cardiovascular or metabolic problems. Others can’t. The relative amount of fat stored in different locations in the body has a genetic component related to sex hormones and fat cell development that also predicts one’s risk of metabolic and cardiovascular diseases. For example, your genes and hormones determine whether you have a pear-shaped body, with most of your fat located just under the skin around your butt and upper thighs, or an apple-shaped body, where fat is centered on your belly, indicating that your liver might be stuffed with ectopic fat. Some people are genetically destined not to have enough room to store fat. Their bodies are like small houses with hoarders living inside. Rather than boxes, clothes, and pets, they’re packed full of fat. NATURE’S EXPERIMENT Sonia Rehal’s body is like one of those small, cluttered houses. You’d never find her standing in line for liposuction or going on a weight loss diet. She’s been told throughout her adult life that she’s “lucky” she’s so thin—a comment that always makes her shudder. Rehal can’t remember a time when she didn’t long for more fat. As a kid, she was bony and birdlike, and remained so even as all the girls in her class in Montreal, Canada, developed “boobs” and “butts,” Rehal recalls. Her shapelessness wasn’t lost on her peers. Bullies shoved her into a locker so hard, she needed stitches. “You’re a guy. You’re not a girl,” her schoolmates would say. It wasn’t as if Rehal didn’t eat enough. She was hungry—all the time— and at every meal she says she binged to the point of sleepiness. “I could put down three or four Big Macs, easy. If you gave me pasta, I would just continue to eat plates full of pasta until I was literally nauseous…. The off switch doesn’t exist.” There were other signs that something was off about Rehal’s metabolic health. Before she was five, dark and velvety patches popped up on her face and body. Her parents, immigrants from Punjab, India, tried to scrub them off. But the spots didn’t go away and her doctor eventually realized they were a sign that Rehal was extremely insulin resistant, leaving lots of glucose circulating in her syrupy blood. By age thirteen, she was diagnosed with type 2 diabetes and put on insulin medication. Because her body was so impervious to the effects of insulin, the drug had to be injected at doses a thousand times higher than is typical for people with type 2 diabetes. Diagnoses of high blood pressure and dyslipidemia followed, and Rehal started taking pharmaceuticals to manage the problems—ACE inhibitors, statins, and fenofibrates. The bullying worsened. Rehal went on antidepressants. 
Now in her forties, she’s still on all those drugs and more, after multiple strokes and a major heart attack in 2015. Rehal has all the medical complications associated with someone with a great deal of excess fat—fatty liver, metabolic syndrome, diabetes—but she’s tiny. At five feet, five inches, she weighs 118 pounds. Her BMI, 19.6, sits near the bottom of the normal range, not far above the cutoff for underweight. Her arms and legs look sinewy, sculpted. Her cheeks, hollow. “People look at me, and they’re like, ‘Oh, she’s this fit girl. She’s slim. She’s athletic. There’s nothing wrong with her.’ And yet, there’s a storm behind that.” The storm is lipodystrophy—“one of the most elegant experiments of nature,” she told us. A rare and sometimes deadly genetic syndrome, lipodystrophy messes with the body’s ability to store fat normally. People with lipodystrophy can’t make the big lipid droplet in the white adipose tissue of their subcutaneous fat. They’re missing physiology’s specialist at storing calories for the reduce, recycle, and reuse program of fasting or starvation. The fat that enters Rehal’s body in food, or that’s formed after eating carbs, has nowhere to go that’s safe. Instead, it’s turned her liver and muscles into the human equivalents of foie gras and marbled steak. Her blood is a milky white color because of the sky-high triglycerides secreted by her fatty liver. Like people during prolonged starvation, or those with a lot of body fat, patients with lipodystrophy are also resistant to insulin. In other words, Rehal has many of the complications of severe obesity. The similarities between these diseases hint that perhaps they’re linked by a common cause: the inability to declutter, to safely store fat, which in both conditions messes up insulin signaling.
INSULIN RESISTANCE AS THE VILLAIN
Just like body fat, insulin resistance is usually thought of as the one-dimensional villain in the feature film about our health. But as we’ve now seen, insulin resistance can also help us survive long periods without food while continuing to fuel our brains. In obesity and lipodystrophy, insulin resistance manifests outside the normal physiological context of starvation. There, it transforms from life-giving savior to life-taker. There are lots of ideas about why insulin resistance develops in obesity. One is that as the demands of fat storage grow, subcutaneous fat tissue remodels itself, attempting to increase its storage capacity by killing and removing old, large fat cells and recruiting new, small fat cells to take their place. During these processes, coordinated by the immune system, fat tissue’s blood supply can become choked off, meaning it doesn’t get enough oxygen. The tissue starts to malfunction, becoming inflamed and resistant to the normal effects of insulin. This makes it harder to store more fat. The inflamed fat tissue sends out molecular signals of distress. These, along with more and more fatty acids spilling out, get delivered to tissues like the muscles and liver. Along the way, insulin signaling is muted.[*4] Whereas starvation-induced insulin resistance occurs when insulin levels run low, during obesity and lipodystrophy it’s the opposite: The pancreas compensates for insulin resistance by spilling out more insulin, which can worsen insulin resistance. This kicks off a vicious cycle that is thought to eventually exhaust the pancreas. Then the failure to produce enough insulin leads to poorly controlled blood sugar and the development of type 2 diabetes.
But even if overt diabetes doesn’t set in, insulin resistance provokes a cluster of dangerous metabolic problems. Muscle insulin resistance keeps blood glucose levels high after meals. The insulin-resistant liver spills out too much glucose between meals. Insulin-resistant fat tissue fails to curtail its release of fatty acids and glycerol into the blood. The liver takes up these substrates and creates triglycerides. The buildup of fat in the liver leads to fatty liver disease—which results in more insulin resistance and inflammation. The insulin-resistant liver secretes some of its excess fat, driving up blood triglyceride levels. It also slows the removal from the blood of the low-density lipoprotein (LDL) cholesterol particles that cause atherosclerosis—plaque buildup in the arteries. If fatty liver persists, it can progress into its more serious form where the organ becomes inflamed: metabolic dysfunction-associated steatohepatitis (MASH). Altogether, this increases the risk for complications such as heart attacks, strokes, cirrhosis, and cancer. When people are being treated for the metabolic and cardiovascular complications of obesity, they’re ultimately being treated for knock-on effects of insulin resistance. The idea that fat tissue’s limited capacity eventually drives insulin resistance and metabolic disease is still just a theory. We need more research to see if it holds up and to better understand how it works. But already, a mouse model of lipodystrophy supports the idea. Mice were genetically engineered to prevent the development of the cells that evolved to store fat. Like patients with lipodystrophy, the mice have severe metabolic abnormalities and insulin resistance. But after the lipodystrophic mice received a transplanted subcutaneous fat pad from a normal mouse, many of their metabolic problems resolved. Another hint comes from a class of diabetes drugs that cause subcutaneous adipocytes to proliferate and store more fat. These medicines have been shown to improve the body’s sensitivity to insulin, even in some lipodystrophy patients. In both examples—the fat pads on the mice, the diabetes drugs—it seems adding fat cells to the body improves the body’s ability to properly respond to insulin, the equivalent of a home renovation that adds closet space. In these cases, increasing body fat storage capacity equals more health. All this research points to an important, still underappreciated truth about body fat, one Rehal and many of the researchers we spoke to wish people could grasp: A person’s body size is not necessarily a good shorthand for how healthy they are. Rehal’s thinness is a symptom of illness. A larger person with well-stored fat may be healthy—sometimes referred to as metabolically healthy obesity—while another may be sick because they couldn’t store even more fat in the places you can see. “We default to what we can see with the visible eye to make diagnoses, but you really can’t get the full story,” Rehal said. “There’s so much more to being human than just our physical look, or physical attributes.” There’s also so much more to health.
RETHINKING FAT
Yet we still use how people look to gauge their health, and their worth, all the time. Thin is good, virtuous, salubrious; fat is bad, lazy, morbid. Doctors are not immune to this shorthand.
Some of obesity’s health consequences are attributed to stigma and discrimination, especially on the part of healthcare providers who undertreat patients with obesity, pegging medical issues to excess weight even when they have other causes. The social pressures around body size can be nearly as devastating as the effects of too much body fat.[*5] The status quo is especially risky for non-Caucasians, who are less likely to be accurately diagnosed by BMI and whose obesity-related health complications are more likely to be overlooked. Drawing attention to the costs of weight stigma and bias, while trying to decouple health and body size, is what movements like “Health at Every Size” got right. It’s true that being large doesn’t always mean being unhealthy, nor does being small equate to health. It’s also true that it’s unfair that society expects—demands—thin, and that people (particularly women) pay a price when they’re not. At the same time, these movements tend to gloss over the real health consequences of having too much body fat, “too much” being the key phrase and, again, unrelated to body size, or the size of the house. This complicated science of body fat is finally starting to get through, at least in the clinic. As we were putting the finishing touches on this book, medical groups in the United States and Europe had put forward statements questioning the medical community’s reliance on BMI as a diagnostic tool for obesity. One of them, a commission organized by the Lancet medical journal, offered an alternative: a definition of clinical obesity—how the disease manifests in patients, body system by body system—and urged clinicians not to rely on BMI as the diagnostic factor. Think obesity the disease, instead of obesity the health risk. The bottom line is that whether a person has the disease obesity should not be based on body size, or even the amount of body fat a person has, but on whether they have signs and symptoms of problematic fat. The tricky part is that we have only indirect ways to measure the healthfulness of our body fat. The most accessible is the waist-to-hip ratio. An elevated measure can signal someone has excess visceral and liver fat, likely meaning they have a limited healthy fat storage capacity. But like BMI, it’s not meant for diagnosis. Problematic body fat can also show up in blood tests for the markers of metabolic syndrome—high levels of LDL cholesterol, glucose, insulin, and triglycerides in the blood. Because of the lack of good alternatives, we suspect body size measures like BMI and the waist-to-hip ratio will die a slow death—which is why even the Lancet group proposes using them, but only as screening tools. Plus, a high BMI can point to the need for more investigation and potential health risks, including the mechanical problems that arise because of excess weight—orthopedic issues, sleep apnea, or difficulty with breathing or mobility. But screening by BMI will also overlook the millions of people who aren’t overweight at all yet harbor unhealthy fat. The best way to avoid these complications is not to gain too much weight in the first place. Love the body fat you have but make sure it has plenty of expansion capacity. Live in a house that is comfortable, not overly cluttered. The next best thing is to lose fat if you gain too much, removing the clutter from the body and restoring the functionality of fat tissue.
When we lose weight, the body starts to use up the ectopic and visceral fat stores, in addition to decreasing the size of the fat cells in the malfunctioning and inflamed subcutaneous fat tissue—other reasons why the health benefits of diet- and exercise-induced weight loss don’t translate to liposuction.[*6] But as many of us know from personal experience, losing weight and keeping it off isn’t easy. The slim among us may take credit for our thinness or blame those who gain body fat for what amounts to a moral failing: gluttony. Those with obesity may blame themselves. All of us, thin or fat, overestimate how much control we have over our eating behavior. What and how much we eat, and how our food shapes our bodies, is determined by forces that go way beyond conscious decision-making. The next chapter is about those mysterious forces.
*1 Our muscles make up 30 to 40 percent of our body weight but only burn about 20 percent of our total calories at rest—muscles are super efficient when they aren’t moving. Despite being only about 2 percent of body weight, the brain uses about 20 percent of our resting energy expenditure, and it’s relatively constant regardless of whether you are thinking hard or bingeing on Netflix.
*2 There’s also brown fat. Brown adipocytes have multiple smaller lipid droplets and more mitochondria, which give them a brown appearance. Unlike white adipocytes—again, whose main function is to store energy—brown adipocytes mainly generate heat, and their role in human health is hotly debated.
*3 Glycogen attracts a lot more water than fat, making it relatively heavy and therefore an inefficient way to carry around fuel.
*4 Mitochondrial abnormalities in these tissues may exacerbate the progression of insulin resistance in some people because of a decreased ability to burn fat and a more rapid accumulation of fat-derived molecules that interfere with insulin signaling.
*5 We think it’s a safe bet the GLP-1-based drugs wouldn’t be the blockbusters they are if they only improved metabolic health but did nothing for body weight.
*6 Physical activity and fitness—even without weight loss—can cut the risk of some of the metabolic complications of too much body fat and reduce inflammation, with profound health benefits no matter a person’s size.
CHAPTER 6 The Conductor
In the book so far we have detailed how food’s components are transformed inside us to fuel metabolism and to build and rebuild our bodies, including our body fat stores. Now we are going to turn to what orchestrates our eating, providing what’s needed for all that fueling, building, and rebuilding. Our entire lives could be thought of as a series of periods of hunger, followed by food seeking, eating, and satiety—and sometimes eating in the absence of hunger or continuing to binge even when we’re stuffed. But how does all this happen? How did we manage to meet our nutrient needs in a time before calorie-counting apps and nutrition facts panels? Why do some of us eat even when we’re full? Why do we eat what we eat? Too many of us have been operating under the illusion that we consciously determine every morsel and type of food that enters our mouths—an idea that the wellness and diet industrial complex has certainly helped perpetuate. What and how much we eat is a matter of personal responsibility, wellness influencers tell us. They say we can take full control. We, in our conscious brain, can decide not to eat the dessert in front of us, or to cut our carbs or calories.
But these ideas represent yet another disconnect between the best available science and the popular understanding about how people interact with food. We’ve been misled about how much willpower and conscious control we really have over our long-term food intake. Scientists have known for more than a century that eating behavior is a biologically controlled phenomenon. Our food choices are guided by signals from within (think blood hormones) interacting with signals from our environment (the smell of dinner, the beautiful pastries on display in the bakery window, the donuts in our morning meeting). All this shapes our food choices in ways over which we have much less control than we like to believe. Take a moment to let this sink in. It took a while for Julia, the reformed dieter, to grasp this fact. The idea that what we eat is entirely the sum of deliberate, personal choice is so pervasive that any challenge to that seems absurd. We constantly play up individual decision-making and play down physiological control and our environment when it comes to food. We should do the opposite. To start to unravel why, let’s take a trip to a restaurant. DINNER IN PARIS Pretend you’re in Paris, city of culinary dreams and gastronomic delights. It’s a classic bistro—those dim, cozy spots serving traditional French fare. You haven’t eaten for several hours and you’re ravenously hungry. You’ve been touring all day. You’ve walked the banks of the Seine, maybe even climbed the Eiffel Tower. Now you need rest. Even more, you need to sate the enormous appetite you’ve worked up. You know what to do. You scan the menu. Make a choice. The choice tonight seems obvious. They have your favorite French food on special—maybe it’s a salty steak frites or a crisp salade Niçoise. Whatever it is, you’re anticipating its delicious flavor as you make your order. While you wait, you try to distract yourself from thoughts of food and the people around you already eating. You talk to the friend you’ve met for dinner. You check your phone. But you can scarcely take your mind off eating. Your dinner companion is talking about her work. You’re not really listening. How could you? You’re focused on the meal that should arrive at any moment. You start to feel uncomfortable, anxious—the word hangry comes to mind. You breathe in the smells coming from the kitchen. Your eyes wander over to the waiter who keeps swirling by with wonderful dishes. Is that plate he’s carrying yours? It looks like it is! Finally! You’re about to eat. You start to feel better at the mere sight of your plate. But it isn’t your meal. Those plates are for the people seated next to you. Didn’t they arrive after you? The hanger roars back. A few minutes later, your actual dinner arrives. The plates are set down in front of you. Without thinking, you pick up your fork and knife, hanger rapidly dissipating. You’re chewing. You feel relieved, maybe even happy, and finally able to focus on something other than food. After the first few bites, your attention shifts back to the conversation your friend has been valiantly trying to carry on while all you could do was think about being served. The meal is sumptuous. The food in Paris is a delight! Fast-forward to the last bites of your meal. By the time you’re nearly done, you notice the food is not quite as tasty as those initial bites. It’s almost as if you’ve turned your back on the dish you’d been so eager to consume. You finish the plate anyway, comfortably full. The waiter comes back. 
“Dessert?” he asks, in the colluding tone in which the question is usually posed. Even though you’re pretty much stuffed, there’s always room for dessert, as they say. Hadn’t you made a deal with yourself earlier today that you would be as indulgent as you like with a main course, but that you’d have coffee and no dessert afterward? And yet “Oui,” you tell the waiter. Cream-stuffed profiteroles, perhaps? Caramelized apples in a tarte tatin? You order. Your anticipation rises again, but this time despite the slight cognitive dissonance you experience at the memory of telling yourself you wouldn’t eat dessert, and the fact that you’re not actually hungry anymore. The dessert arrives. You take a bite. You’re seduced by pleasure and take another. Then another. Before you know it, your plate is almost empty. The last scrapes of the dessert are less desirable than when you started. You eat them anyway. Now you’ve really had enough. The waiter brings chocolates with the check, and of course, you manage to eat those, too. It’s time to go. You pay and head back to your hotel. Tomorrow morning, like every morning, the process will begin again. Find food, eat, repeat. You don’t have to be a neuroscientist to understand that your brain orchestrated what happened at that restaurant: what you chose to eat, how much you ate, even how you felt about it. Food deprivation stirred a feeling of hunger—a signal from your body about your nutritional status, which your brain turned into an all-encompassing desire to eat. When your food arrived and you started eating, your body sent signals back to the brain, changing your sensations about the food, including how much pleasure you got, and altering what and how much you chose to consume. We may take them for granted, but we have these sorts of experiences all the time. We feel things like craving and hunger. Yet we have less control over what we put in our mouths than we think. To help explain, Kevin offered an analogy: Eating is like breathing. As you read this paragraph, try to focus on your breath. You may notice that you’ve taken control of it. In most situations this is easily done. As you think about your breath, you can choose how deeply or how rapidly you suck air in and push it out. Yet as soon as you get distracted, your breathing doesn’t stop. It returns to its natural rhythm without any thought. Your lungs will supply your blood with the necessary oxygen to fuel metabolism and expel its resulting carbon dioxide. Breathing is subconsciously controlled by the brain in response to changes in things like blood oxygen levels. Some people breathe faster, others more deeply, but each breath provides what the body needs for metabolism. Biology takes over: no conscious thought or action required. Like breathing, we can exert control over an individual meal or snack. While most breaths happen below our conscious awareness, almost every eating occasion provides the perception of control and choice. But even when we pay close attention to our food, it’s often only for a short duration, and many aspects of our eating behavior still fly under our conscious radar. Did we really notice the portion size? How large were our bites? How quickly did we finish the plate? At what point did we change our mind about not ordering dessert? Day to day, what and how much we eat can fluctuate wildly, reflecting different situations, with little correlation to nutritional need. 
Yet, mysteriously, over the long haul, food intake is gradually and subconsciously controlled by our biology. If we’re in a food-secure environment, our calorie, protein, mineral, and vitamin needs generally get met, as do many of our cravings and desires. When a deficiency arises, this subliminal control of eating kicks into high gear. Kevin saw this play out in one of his studies. People with type 2 diabetes received either a placebo or a diabetes medication—canagliflozin—that caused them to lose an extra 360 calories per day through their urine. Not surprisingly, the group that received the medicine lost weight because of the extra calories they excreted. Incredibly, they gradually started eating more food, eventually compensating for calories they didn’t even know they’d lost. The progressive increases in calorie intake were undoubtedly imperceptible, occurring in the background of large day-to-day fluctuations. But they slowly accumulated over months and in proportion to the amount of weight they’d dropped. It seemed their food intake was being subtly but confidently controlled by a signal that responded to the amount of weight they’d lost. By the end of the year, their calorie intake had increased to compensate for the hundreds of lost calories in the urine, and the participants’ weight loss plateaued. Allow us to throw in one final analogy. The central nervous system—and specifically the brain—acts as the conductor of our nutritional needs and culinary wants. It leads the orchestra (the rest of the body) while listening carefully to the music (signals coming from the body)—a form of feedback from the orchestra. The conductor manipulates the way the musicians play (by controlling what we eat[*1]). The conductor has had years of training that allow her to guide the musicians almost by reflex, just as the brain guides our eating behavior, based on a lifetime of learning about food and nutrients. In the same way we don’t have to think about changing our breathing patterns when our bodies need more oxygen, we don’t have to think about shifting our appetite to meet our nutritional needs and wants. It just happens, over periods of many weeks. We still have the conscious experience of making choices whenever we eat, but most of the steps before that moment, and most of what went on inside the body during eating and after, required little or no conscious thought. This can feel deeply counterintuitive. Researchers started working to unravel how all this works nearly two hundred years ago. That’s when, in a case report about a German woman with obesity, they got the first hint that body size might be under the influence of something outside our conscious control.
A HORMONE CONNECTING BODY AND BRAIN
Elisa Moser, a gardener’s wife, started complaining to her doctor of frequent headaches and dizziness. Soon memory loss, vision loss, and periodic blackouts followed, which meant she could barely do her housework, let alone carry on with her regular life. By the time she was admitted to the hospital in October 1839, the right side of her body had grown weak, and she experienced tingling in her fingertips. “Involuntary defecation and urination were also present,” her doctor, Bernard Mohr, reported in a scientific paper the following year. Along the way, Moser had developed “uncommonly extreme obesity.” By December, she blacked out and stopped breathing. She was fifty-seven. An autopsy revealed that a tumor about the size of a golf ball had grown inside her head.
In his paper, Mohr described the phenomena—the tumor and the weight gain—but didn’t conclude that one caused the other. Sixty years after Moser’s death came the Austrian case of a twelve-year-old boy, called R.D. The boy had been suffering from terrible headaches, so intense he’d vomit. The pain usually radiated from the left side of his skull, where his vision had also begun to fade. Two years later, he returned to the doctor, and by this point R.D.’s weight had begun ballooning—to 54 kilograms, “excessively well nourished,” his doctor, Alfred Fröhlich, reported—at least 15 kilograms more than the average for a child his height and age. Fröhlich guessed that the symptoms, including the weight gain, were caused by a tumor at the base of R.D.’s brain. Because of the child’s deteriorating condition and vision problems, Fröhlich reasoned, the tumor must be near the region where the optic nerves crossed. A few years later, Fröhlich was proved right: X-rays and a craniotomy found a precancerous mass in the boy’s head. One by one, more cases like Moser’s and R.D.’s turned up in the literature. Obesity was associated with brain tumors, but it wasn’t clear how the tumors affected body weight, and whether they had formed on the pituitary—a pea-size gland located at the base of the brain that regulates bodily functions through the release of hormones—or within the folds of the brain itself. Once again, experiments in animals helped uncover what was going on. When dogs, monkeys, and rats had parts of their brains lesioned or burned, their appetites shifted. Depending on the location of injury, they either ate more and gained weight—or starved to death. Changing brain anatomy altered what and how much the animals ate, and in turn, their body weight. More specifically, weight gain or loss followed from damage to certain parts of the hypothalamus, an almond-size structure in the deepest part of the brain, behind the nose and connected to the pituitary. Our most ancient hardware, the hypothalamus controls the body’s homeostatic processes—breathing, drinking, temperature, reproduction, all part of the drive to maintain stability and equilibrium. It also seemed to control the drive to eat. By the middle of the twentieth century, there was a slew of competing ideas about what exactly was being regulated in the body—what signal told the brain how eating should be controlled. Scientists had in mind an engineering analogy: a thermostat where the room temperature is monitored and used to send a signal back to the furnace to moderate heat production and keep the temperature within a desired range. Some researchers proposed the glucostatic theory after observing correlations between hunger and food intake when blood levels of glucose dipped too low, a finding that’s been replicated in recent studies. Maybe the need to maintain blood sugar drove food consumption, with eating kicking in to keep glucose from dropping too far. Another group proposed the lipostatic theory—that the body was regulating its fat stores to ensure survival during periods of starvation. The idea of a lipostat piqued the interest of Douglas Coleman, a Canadian researcher working at the mouse-filled Jackson Laboratory, nestled near Maine’s Acadia forests. His workplace was home to thousands of strains of mice—including one called ob/ob, a very round rodent that ate three times as much as your average mouse and stored five times as much body fat.[*2] Another strain, db/db, had both obesity and diabetes.
Coleman became obsessed with a question: Why did these two mice store so much fat while others didn’t? He guessed that body fat might be regulated by a hormone circulating in their blood, and that the hormonal signal was misread in these mice, like a thermostat with a broken thermometer.[*3] To test his hypothesis, Coleman surgically connected the mice by attaching them skin-to-skin so that a tiny fraction of their blood supplies was shared—enough for hormones to commingle between animals but not enough blood for one animal to fuel the other’s metabolism. (Yes, he literally sewed mice together.) When the lean mice were conjoined, they ate and metabolized food normally, and maintained normal body weights. But things got interesting when he attached a lean mouse to db/db or ob/ob. In the case of ob/ob, the body fat of the obese partner went down and its eating behavior normalized. With db/db, the mouse with obesity remained fat while the lean mouse stopped eating and died of starvation. Soon, Coleman came up with an explanation for what was going on. He guessed that normal mice produce a hormone that sends a signal to the brain about body fat stores. The ob/ob mice lacked the functioning hormone. The db/db mice made plenty of the hormone but lacked a functioning receptor, meaning their brains didn’t get the signal. The excess amount of the hormone produced by db/db, reasoned Coleman, was the reason the lean mice attached to it stopped eating and starved—their bodies came to believe they were fat, even though they weren’t. He spent much of the rest of his career searching for the hormone and trying to disprove his theory (as good scientists should), and he failed on both counts. In 1994, geneticists at the Rockefeller University in New York, Rudy Leibel and Jeff Friedman, found what Coleman couldn’t. Friedman named the hormone leptin, from the Greek word leptos, which means “thin,” and soon afterward he characterized its receptors. Leptin is produced by fat cells and shuttled into the bloodstream. The more body fat, the more leptin gets secreted. Its receptors are located in a variety of tissues but are concentrated in the hypothalamus, at the base of the brain. Just as Coleman suspected, leptin sends a message to the brain about how much energy is stored on the body, essentially acting like the thermometer in a body fat thermostat, helping the brain make sure fat stores aren’t becoming dangerously depleted. Lots of fat means higher levels of leptin in circulation. When fat stores start to get used up, leptin levels drop, and the brain gets a message: Eat more! Leptin was the knock-out obesity story, a molecule that finally connected body and brain, and specifically, body fat and hypothalamus, to explain the biological control of eating behavior. The ob/ob mice were leptin deficient. The db/db mice were leptin resistant. Both suffered perpetual hunger. Their brains never received the message that they’d eaten enough. More than a century after Moser’s death, this was the first peek inside the intricate eating machinery that regulated body fat. But that was just rodents. By 1997, researchers at the University of Cambridge—led by Sadaf Farooqi and Stephen O’Rahilly—had discovered the human equivalent of the ob/ob mice in a pair of cousins who couldn’t stop eating. Their fat cells released a mutated, malfunctioning version of leptin, which meant their brains didn’t register all the fat stored on their bodies. What was true of the mouse was also true for people.
With leptin,[*4] the signal is about energy storage; it is one of the body’s many servants of metabolism, directing us to get food. Normally, when leptin levels drop, the brain responds by increasing the sensation of hunger, telling us to eat more. When the leptin signal goes awry—because the part of the brain that gets the signal is damaged, as in the case of the brain tumor patients, or because the signal itself is faulty, as in the case of the cousins who don’t produce a functioning version of leptin—the brain perceives the body as having no stored fat, and behaves as if its keeper were perpetually starving. The leptin discovery created a stir and led to an obesity treatment moonshot. Obesity was already known to have a large genetic contribution. Studies of families, dating back to the 1920s, showed that closely related people shared a similar body size, even when living apart.[*5] In leptin, researchers had found the first hormone whose malfunction could cause obesity. Perhaps people with less severe weight problems than the cousins had genes that hampered leptin’s function rather than a complete loss of the leptin signal. Maybe drugging them with extra leptin could hold the key to permanent weight loss. After all, it had helped the cousins. Farooqi and O’Rahilly found that injections of recombinant human leptin rapidly reversed their patients’ obesity. The drugmaker Amgen made a big bet on this possibility, licensing a patent for a leptin-like drug to the tune of roughly $20 million, a record amount for a deal with an academic institution. The hope was that an extra dose of leptin in people with obesity would help the brain understand that the body had too much fat, resulting in hunger-free weight loss. The possibilities were thrilling: Treatments finally appeared to be on the horizon. Of course, it couldn’t be that simple. It turned out that most people with obesity produced copious amounts of perfectly functional leptin. Adding more led to disappointingly little weight loss. Farooqi and O’Rahilly had managed to solve the problem for the rare patients, like the cousins, who didn’t make enough functional leptin, and for others who’ve been diagnosed with genetic leptin dysfunction, but it remains a vanishingly rare cause of obesity in humans. This genetic disorder affects one in several million people. Even so, the leptin discoveries were relevant to us all. Researchers finally had molecular evidence that could explain what patients like the cousins experienced, why they behaved the way they did: Eating was influenced by biological signals outside our conscious control. Maybe there were many more people like the cousins, people with obesity whose biology persistently nudged their food intake higher. Different food preferences, larger portions, faster eating, bigger bites, all accumulating over the course of the day to hundreds of calories, and over the course of years as obesity, yet flying under the radar of conscious perception and control. “Much of modern science has reminded humans that we are not as much in charge of our destinies as we might traditionally have assumed,” O’Rahilly told us. “That’s what Copernicus did. It’s what Darwin did. And their work made people uncomfortable. Modern neuroscience also tells us that our big clever brains are less under our control than we like to think, too. And many people don’t like that either.”
THE ORCHESTRA
We now know that what and how much we eat is shaped by much more than leptin. We also know that it’s not just our body fat stores that are regulated.
Since the discovery of leptin, researchers have found many layers of signals that work together in still mysterious ways to shape our eating behavior. There are the internal signals—all the hormones, neurons, neurotransmitters, networks, and pathways in different brain and body regions—we’re about to describe. But—and this is crucial—they come together to make it more or less likely that a behavior will happen in ways that are affected by the environment (we’ll explore this in the next section and chapter). When the orchestra tours to a different venue, the acoustic environment changes, and the conductor tries to adapt. What struck us most about the map of signals we have now is that it was put together by mostly disconnected camps of researchers, who each focused on the roles of different parts of the brain in shaping different kinds of behavior. One is the group we’ve just met: the hypothalamus set, interested in signals of hunger and fullness in the context of obesity—how body weight seems to be regulated to stay within a narrow range in a constant environment. The other group fixated on the brain’s reward system —how things like desire, anticipation, and motivation influence learning and behavior more generally, in ways that result in habits and sometimes lead to compulsion and addiction. There were always hints that the two camps were mapping the same territory from different directions. The addiction researchers, for example, were more likely to make an animal compulsively consume cocaine if they first made it hungry. The obesity researchers noticed that animals would gain weight when they were exposed to environments with highly palatable and rewarding foods. Both groups agreed on that basic fact: that a lot of what determines what and how much we eat happens outside our conscious awareness. But the siloed research continued anyway. Today, the homeostatic and hedonic demarcation between brain systems is increasingly seen as an artificial human construct, like the geopolitical borders of a map. Yes, we have different brain regions involved in eating, but they act in harmony to orchestrate our eating behavior. This determines not just our body size, but why we like what we like, why we do what we do, why we eat what we eat, and how the biological, environmental, and social cards we’ve been dealt shape who and what we become. LUNCH IN LONDON This mess of biological signals is hellishly complex. To help make sense of it all, we needed a guide. So Julia went to lunch with Tony Goldstone, a clinician-scientist based at Imperial College London. The main agenda item: matching distinct parts of our biology to the sensations we all experience around food, the kinds that arose during our imaginary Paris supper. Goldstone is a compact man with a neatly shaved head, and modern titanium glasses. He speaks as eruditely about the recent travel photography trips he’s taken as he does about his research on eating behavior, and the patients he cares for. The majority of Goldstone’s patients have Prader-Willi syndrome (PWS). People with PWS can’t sense they’re full, and often feel constant hunger, which, in rare cases, causes them to gorge so much, their stomachs burst open, leading to death. Goldstone was drawn to the work after an interest in hormones, sparked by his own type 1 diabetes, led him to endocrinology, the study of hormones. A London hospital he trained in happened to treat a patient with PWS. Today he runs the largest clinic in the UK for adults with the condition. 
He places his patients at the far end of a spectrum of eating behavior that we’re all on. “We study the extremes in order to try and get information and techniques for the general situation,” he says. “People think they only eat because they want to eat, or they’re cognitively deciding to eat. But much of it is not taking place at that conscious level.” Goldstone would know. He is also a psycho-neuro-endocrinologist. That means he studies how hormones from the body act on the brain to alter behavior, including shaping our food choices. The work sometimes involves feeding people lots of mac and cheese or milkshakes, then using fMRI to see which brain regions light up in response to food photos. He’s found, time and again, that the same brain regions involved in alcoholism and drug addiction are the ones that respond to food cues. This is the brain’s reward system, home to our feelings—desires, urges, cravings. It’s also where we form associations between things in our environment (people, food, activities) and positive or desirable outcomes, usually related to reproductive fitness. “Drugs of abuse hijack a reward system that’s been set up for food, water, sex, and salt,” Goldstone explains, as he and Julia sit down at a flower-filled Indian restaurant near his West London office. All these substances have been extracted from nature—cocaine from coca plants, heroin from poppies, nicotine from tobacco plants—and modified by humans to be addictive. They seem to send the reward system into overdrive, similar to the way some researchers now suspect ultra-processed foods hook us (more on this soon). By the time Goldstone was finishing his MD training and embarking on a PhD in the mid-1990s, leptin had just been discovered. Research of the kind he does today didn’t really exist. The homeostatic system was thought to only control how the body meets its nutrient and energy needs (calories, sodium, etc.), whereas the hedonic system only drove the desire for rewards. Nowadays, more researchers like Goldstone blur the boundaries. “The pleasurable aspects of eating are also under homeostatic control,” explains Goldstone. Is this why, feeling famished, Julia orders a creamy and sweet mango lassi? Maybe, Goldstone says, “we’re attracted to high-energy foods when we’re hungry.” One of his studies looked at how the brain responded to pictures of high-calorie foods (burgers, cakes, chocolate) versus low-calorie foods (vegetables, fish), as well as nonfood photos (furniture, clothing) after people had fasted compared to when they were fed. In the fasted state, when people hadn’t eaten in a long while, their brain reward circuitry was activated more while looking at the burgers and cakes than the fish and vegetables. When the study participants were fed, the high-energy foods didn’t have the same effect because they weren’t hungry anymore. The research seemed to shed light on the cliché that “hunger is the best sauce”— or why the initial bites of a meal taste better than the last, as we saw in our imaginary Paris bistro. The waiter stops by. It’s time to place the food orders—traditional appetizers and a selection of curries and rice. Right now, Goldstone says “everything’s geared up” inside the body to put the focus on obtaining calories and nutrients. “People who are basically in a very hungry, aversive state—they’re going to be attuned to [food],” Goldstone says, bringing up the Minnesota starvation study participants. 
They read cookbooks in their leisure time, and hung out in restaurants to watch others eat. As they wasted away, all they could think about was eating. A range of signals, neural impulses, and hormones from different organs are sending messages that arrive at the brain telling it about the body’s nutrient status. Again, these internal signals don’t cause behaviors per se— they don’t put a gun to your head and force you to eat. They work together to create the milieu that nudges you one way or another when food is for the taking. Most of the appetite-regulating hormones that were discovered since leptin—like ghrelin, PYY, and the hormone du jour, GLP-1—are produced in different parts of the gut. They’re secreted into the blood by the stomach and intestines and some are also sensed locally by nerves that communicate with the brain. When most of them are running low, we feel hungry. The exception is ghrelin, which starts out high before eating and signals hunger. Knowing that our hunger will soon be sated seems like a good moment to ask: What is hunger anyway? There’s no specific mechanism for hunger, Goldstone explains. “We don’t feel hunger in our blood hormones. It’s an internal feeling, which arises from our brain.” The brain signal most associated with initiating hunger comes from a set of brain cells, the AgRP neurons, located in the hypothalamus. AgRP neurons become more active when the body senses that it hasn’t eaten for a while or its energy stores are low. Stimulate AgRP neurons in well-fed mice, and they start eating as if they were starving, continuing to gorge as long as the neurons are stimulated. When AgRP neurons turn on, food seeking goes into overdrive. As eating begins, they shut off in response to nutrients and hormones circulating in the blood. Or that was how we thought AgRP worked. But like most things in science, a more complicated story emerged upon closer examination. Over the last twenty years, scientists developed remarkable experimental techniques—such as optogenetics and fiber photometry—that allowed them to control and measure activity in particular populations of neurons in the brains of mice. Using these methods revealed that AgRP neurons actually cease their firing before animals start feeding—a finding that shocked the neuroscience community. The mice didn’t have to ingest a single nutrient to begin to feel the relief of eating. They just had to see and smell their food. If the food the mice saw and smelled wasn’t quickly eaten—if the researchers did the mean trick of dropping a pellet of food in their cages and taking it out before they could eat, like the waiter who makes you think your dinner is coming and then sets it down in front of someone else—AgRP neurons would begin their firing again. The hanger would return. Julia’s mango lassi arrives. At the sight of the cool, yellow glass, she notices her hunger easing. Maybe that’s the AgRP neurons quieting, Goldstone speculates, telling her body that it’s about to get calories. The rodent findings haven’t yet been translated to humans, but even in mice, the system appears to be ingenious. AgRP neurons shut off in proportion to how many calories the mice will go on to ingest over the next twenty or thirty minutes, this time predicting the amount of eating that’s about to happen. So yes, these brain cells see into the future based on past learning —they anticipate what’s about to happen: Mango lassi in front of you means you’ll eat a certain amount of calories and nutrients very soon. 
A physiological change based on a prediction. Another part of the relief we feel at the sight of food may have to do with dopamine, a key chemical of the brain’s reward system, Goldstone explains. Commonly referred to as the pleasure chemical, “dopamine is more about motivation and drive rather than pleasure as such,” he explains. We need motivation to do things—to get out of bed, to go outside and seek food or a partner. All of that requires a certain “get up and go.” Dopamine’s first known function had to do with the control of movement. Early in the dopamine story, researchers figured out that Parkinson’s—a disease that makes it difficult to move—is caused by a shortage of dopamine. Purposeful movement also requires motivation. Making a mouse deficient in dopamine stirs such apathy, it starves to death because it won’t even bother to move toward food placed directly in front of it. But it will eat— move its tongue and jaws—and survive if food is put in its mouth. Otherwise, it couldn’t care less. Neuroscientists have started unraveling how dopamine motivates us, and helps us learn about cues in the environment that predict the availability of rewards, in different regions of the brain and on different time scales. Going back to your high school psychology class, remember that Pavlov’s dogs learned to associate a particular sound (the beats of a metronome) with an impending meal. Now we know that before learning that association, the meal caused a short burst of dopamine in part of the brain’s reward system —the ventral striatum. But as the association between the sound of the metronome and the meal is made, the dopamine burst shifts to being triggered by the sound. The delivery of the meal doesn’t spark a dopamine increase in that same brain region. Now, another region of the brain’s reward system—the dorsal striatum—sees a prolonged surge in dopamine driven not by the taste of the food but by signals coming from sensing nutrients in the gut and the blood vessel connecting the gut to the liver. After all this learning, dopamine cells start to release the chemical quickly, a rapid burst in the ventral striatum when something in our environment cues us to the availability of a reward. This tells us to get ready: Something good is coming. So we don’t have to eat the food for dopamine to be released in the ventral striatum. This signal heightens attention, desire, and anticipation. It can also set in motion habitual behaviors that drive us to consumption—the actual act of eating, leading to a prolonged dopamine surge in the dorsal striatum. If the predicted reward that we anticipated doesn’t arrive, dopamine activity becomes briefly depressed in the ventral striatum—another chemical response that may help explain why you take special notice when that dinner plate you thought was yours goes to the neighboring table.[*6] Putting all this together so far, there’s the homeostatic system that evolved to drive you to eat by creating an uncomfortable state of hunger you want to get out of. There’s a reward system that drives you to eat by stirring anticipation, desire, and craving, feelings that can motivate our actions and alert us to cues in our environment that predict food availability. The level of hunger encoded in the homeostatic system enhances the signals in the reward system. The two systems cooperate with each other. Dopamine neurons fire at a cue predicting that food is available. AgRP neurons become active when we’re hungry and stop firing when food arrives. 
If we don’t eat, dopamine activity dips. AgRP also turns back on. Dopamine can drive eating palatable food in the absence of hunger, while AgRP can spur eating unpalatable food to relieve hunger. If this all sounds a bit redundant, that shouldn’t be surprising. With something as fundamentally important to life as eating, biology wasn’t messing around. And it doesn’t stop there. There are other systems we’re just beginning to understand that also play a role in regulating eating behavior. One recent discovery came thanks to patients with PWS. Researchers led by University of Pennsylvania biologist Nick Betley compared how the brains of people with and without PWS responded to pictures of food when they were fasted and hungry versus after eating a big meal. Those with PWS responded differently but only in a tiny brain region deep inside the cerebellum—a surprise, since the cerebellum was not thought to be involved in controlling food intake at all, but rather helping to coordinate movement and balance. It’s the part of the brain that tells you when you overshot trying to hit that tennis ball, comparing your last miss to where your racquet needs to be to hit the ball next time. Betley’s best guess now is that the cerebellum works with the other brain systems involved in eating behavior to help you accurately predict how much of a particular food you need to eat to avoid striking out on satisfaction. If you miss the mark, the cerebellum nudges you closer to hitting it next time. As with the cerebellum, researchers initially boxed dopamine into a much smaller role than it appears to play in the body. The more we learn, the more it seems to be involved in just about everything. Consider all it did for you at a restaurant meal. Dopamine was humming along in the background, motivating you to get to the restaurant. Long before that, dopamine was involved in learning that restaurant = food. Dopamine orchestrated how we learned to perform complicated movements that once required lots of cognitive effort, like eating with utensils. It helped turn those movements into habits we don’t have to think about. In the earliest days of life, dopamine helped you figure out what is and isn’t food, and which foods deliver which nutrients and calories. This happens, in part, through gut-brain feedback. Taste and smell give the brain important sensory information about food, which causes short-term dopamine increases in the ventral striatum. But after swallowing, the gut becomes an equally important sensor: It picks up information about the nutrients in the food and sends it back to the brain. Speaking of food, a colorful appetizer is set down on the table: chaat, a mixture of potato, fried bread, coriander, yogurt, and chickpeas. Right about now, Julia’s gut is most certainly talking to her brain. A few minutes into the meal, the brain will start to receive messages that the body is filling up on energy and nutrients, that the brain’s predictions about the meal are being realized. Goldstone suspects insulin—another hormone that regulates eating behavior—will start spilling out of the pancreas, a response to increasing blood sugar levels, to help shuttle glucose into the cells. This will help suppress her appetite, along with other signals. Ghrelin, the hunger hormone that was being secreted from the stomach prior to the meal, will ease up, while hormones like CCK and PYY—from the intestines—will be produced in greater volumes, again, more layers of signals that contribute to our feelings of fullness.
Incredibly, gut signaling to the brain is enough to learn about food rewards and change behavior, even without the ability to taste the food. In one study, researchers used “sweet-blind mice”—rodents that had been engineered to lack the ability to taste sweetness—and left them in cages where they could lap up either a calorie-free sweet solution or an equally sweet solution that contained sugar. The mice still learned to seek out the real sugar water because their guts sensed the sugar, causing increases in brain dopamine. Emerging evidence in mice suggests that the microbiota of the gut may also shape food preferences. This is why researchers increasingly see the digestive tract as another sense organ, much like the nose or tongue. Something similar happens when researchers directly stimulate the reward system of animals: Food preferences change. For example, cats are carnivores that have a mutation in their taste receptors that prevents them from experiencing sweetness. Nevertheless, cats learned to prefer sweet bananas over meat when the bananas were paired with electrical stimulation of their brain reward regions. Interfering with their brains altered food preferences in ways that didn’t even align with their feline needs or tastes. It gets weirder. The weanling kittens of the brain-stimulated cats imitated their mothers’ odd eating behavior. The tiny carnivores learned to go bananas for bananas. Maybe the exposure to fruit flavors through their mothers’ milk shaped their preference. Or perhaps the cats’ food choices were influenced by social cues—they learned to like the fruit by watching their mothers gobbling up bananas. So in addition to the complex web of internal signals, social and cultural learning also shapes our food choices. Even if Julia or Tony wanted to, at a working lunch, they wouldn’t hoard all the food on the table, or order six courses—that’s just not how it’s done. But at an American Thanksgiving, it would be strange not to partake in ritualized overindulgence. If Tony or Julia were strapped for cash, they’d order differently than if money were no object. Or if their lunch took place in Tokyo instead of London, they’d dine differently again. By the time the main courses arrive—fish and chicken curries, along with heaping dishes of rice—Tony and Julia aren’t feeling hungry anymore. Their brains and bodies aren’t focusing entirely on food. They keep talking. Still, the plates are shared, and Julia notes a lull in the conversation as they each figure out what they’re going to take. About twenty minutes into the main course, the stomach has distended, Goldstone guesses, another signal to the brain that we are filling up. Levels of ghrelin—secreted by the stomach, stimulating hunger—are by now probably running low, while levels of its recently discovered counterpart, an opposing hormone called LEAP2, are running high. The other gut hormones, like PYY, CCK, and GLP-1, are likely also nearing their peak. Again, they’re thought to suppress appetite, just a few of our many eating signals. But their effects can also be overridden by other attributes of a meal, at least in the short term. Texture, for example, affects how quickly we eat, which, along with food that packs more calories into each bite, tends to boost our energy intake. The point is, no gut hormone works like an on/off switch to control eating, despite how they’re often depicted in the popular conversation. Not even leptin works as researchers initially thought.
Leptin’s role during a single meal isn’t normally very prominent. Over time horizons of weeks or months, leptin nudges the body to preserve its fat stores, by influencing the activity of neurons that affect our short-term eating behavior, including AgRP neurons and those that produce dopamine. But leptin only rises and falls gradually, so it would not be the lead player in the orchestra at an individual meal.[*7] And yet, when some gut hormones reach extremely high levels, far beyond those that occur naturally, the effects on eating behavior can be dramatic—just as leptin’s effects are when a person lacks leptin or its receptors. Bariatric surgery is a great demonstration of this, Goldstone explains. “If you’d had obesity surgery, your PYY and GLP-1 would be huge right now—you wouldn’t have eaten that much, you would have stopped a while ago.” Retooling the guts of patients through surgery, it seems, rewires their responses to food, seemingly through the mysterious forces of gut hormones—just as we saw a century earlier retooling the brains of animals. And it’s not just that PYY and GLP-1 increase even more with eating, but the reward centers in the brains of people who’ve had the surgery don’t light up as much to pictures of food following their operation, Goldstone’s research found. This helps explain why the surgery is one of the few durable and effective treatments for obesity, though researchers are still trying to nail down precisely how all this works. The other set of effective obesity treatments, which we also don’t yet fully understand, are GLP-1-based drugs, such as Ozempic or Wegovy. These drugs are basically synthetic GLP-1 molecules, with chemical modifications that dramatically slow their breakdown inside the body. They cause average circulating levels of GLP-1 to accumulate to much higher levels—orders of magnitude greater than natural GLP-1—and even higher than after bariatric surgery. That signal also lasts for much longer than it would normally. But while natural GLP-1 is thought of as one of our satiety signals, and the drugs are often referred to as satiety enhancers, they also demonstrate how complicated hormones’ effects are. GLP-1 was first targeted for treating type 2 diabetes, a disease where the body doesn’t make enough insulin to control blood glucose levels. GLP-1 stimulated the pancreas to secrete insulin after meals. Other drugs on the market also stimulated insulin secretion but this happened regardless of blood glucose levels, and too much insulin could drive blood sugar to dangerous lows, a condition known as hypoglycemia. GLP-1 had a unique property. It enhanced insulin secretion when glucose levels ran high but stopped before glucose got too low. This meant that boosting GLP-1 levels wouldn’t lead to hypoglycemia. So drug companies worked on ways to increase GLP-1 to progressively higher levels to get more insulin to be released when glucose was high and thereby better control diabetes. At very high levels of GLP-1, that was when researchers noticed a surprising, and unintended, effect: weight loss. The weight loss effects of boosting GLP-1 to super high levels occur because the drugs act on the brain. We know this because when researchers knocked out the GLP-1 receptors in the brains of mice, the drugs stopped working for weight loss. Curiously, though, the medication doesn’t seem to easily cross the blood-brain barrier—a sort of protective wall that keeps damaging stuff away from the brain. 
But then how could high blood levels of GLP-1 affect the brain’s control of food intake if they hardly reach the brain? We still don’t know for sure. It seems to have to do with a brain region that has a permeable barrier so that it can detect toxins in the blood and cause nausea when there’s a food poisoning threat. Other neurons in this region, the hindbrain, detect nutrients circulating in the blood. Both types of hindbrain neurons have GLP-1 receptors that allow them to sniff out high levels of GLP-1 and send signals to other parts of the brain, affecting reward circuitry as well as satiety centers.[*8] Lunch is almost over—there’s no time for dessert. Goldstone points out that if a big chocolate cake arrived without prompting, Julia might take a few minutes to tuck in. Simply seeing the cake might cue her up to eat again, causing a dopamine surge that, for some, would be hard to ignore. There’s also something researchers call “sensory-specific satiety.” The pleasantness of a particular food we’ve eaten decreases the more we eat it— but the other foods we haven’t eaten still remain appealing. So while Julia may have tired of rice or curry, she hasn’t tasted cake yet, one reason why there’s always room for dessert. Not everyone would respond to the cake the same way, of course. We all have a friend who would seem to effortlessly say no, and another who would want to devour the whole thing. “Eating in absence of hunger scenario—there’s a lot of variability,” Goldstone remarks, opening up a new layer of complexity. The latest thinking is that this might have to do, once again, with the reward system. Instead of overeating being the result of too much hunger, or too much or too little pleasure from eating—a few of the previous guesses to explain obesity—it’s possible that some people have brains that make them experience more temptation or craving for food, which may be related to dopamine in brain reward regions. Dopamine responses to environmental food cues may be greater in some people, heightening their attention and setting in motion habits learned from previous encounters that successfully rewarded those behaviors. Again, these dopamine-mediated cravings aren’t the same as liking, which is now thought to be mediated mainly by the opioid system. Craving and liking usually go together but don’t have to. Think of people who are addicted to drugs who don’t like the drugs they’re taking anymore but feel they have to keep using them. (We’ll cover food addiction in more detail in the next chapter.) Even before we’ve fully untangled how the reward system works, Goldstone is finding that synthetic gut hormones are potential targets for treating addiction, not just obesity and diabetes. Researchers have shown that ghrelin spurs on alcohol consumption, while GLP-1 medicines may help curb addiction to drugs like alcohol. Goldstone says, “They’re working through the same reward pathways involved in addiction.” The way he sees it, if a person’s reward system is very responsive to food cues, it’s going to be really, really hard to say no to the chocolate cake. At the extreme, people who have food addiction—who continue to eat even when it’s painful or they want to stop—also “have abnormal brain circuitry for things that have nothing to do with food. It may be to do with impulsivity. It may be to do with abnormalities in reward processing. It may be to do with responses to stress,” a driver of overeating, including in binge-eating disorders. 
The discussion starts to get tricky, reaching back into the realm of philosophy and such pesky questions as: How much free will do we really have? To address this question, Goldstone shares a related thought experiment he likes to give his students. First, he asks them to raise a hand and bring it down when they think a person should be blamed for their obesity. He brings up PWS, “the most extreme appetite dysregulation and obesity that exists in any human beings.” He tells the students he hopes no one blames people with PWS for their obesity, since it’s due to the loss of a number of genes on chromosome 15. Everyone keeps their hands up. He then moves through other examples—from leptin-receptor mutations to less severe forms of monogenic, or single-gene, obesity. He ends with what’s known as common obesity, the kind most people with obesity are thought to have. It’s linked to the cumulative effects of more than a thousand gene variants. “When does it switch from being not their fault to they’re just not exerting enough control?” he asks them. Nobody puts down their hand.

GENETICS AND BEYOND

After lunch, Julia leaves the restaurant and heads for the London Underground. The tube is full of people with different body shapes and sizes. There’s an extremely tall and slim woman standing near the door, and next to her, a man with severe obesity. Height and weight are both heritable traits shaped by our environment, yet we blame people only for fatness and, equally stupidly, we tend to celebrate thinness. We also blame and shame or congratulate ourselves about our body size, despite nearly two centuries of evidence that we shouldn’t. One patient with hypothalamic obesity—the kind that led to the unraveling of the biological regulation of eating behavior—told us he couldn’t shake the feeling that his size was the result of a personal failing. His tumor seemed like “just another excuse” for being too large. He’d so internalized the idea that his body size was his “fault” that even the effects of a brain tumor seemed like something he should have been able to will his way out of. Julia’s mind wanders to her own history of obesity, the feelings of failure and shame she’d been carrying around since childhood. Those feelings lingered like an invisible scar, long after she lost the excess weight. Talking to Goldstone, she could see more clearly why she’d struggled with obesity. There are no brain scans available to diagnose people who are particularly motivated by food, but she didn’t need an fMRI to know that she’s the kind of person who reads cookbooks in the evening, relishes grocery shopping, and feels more excited for the restaurant after a movie than for the movie itself. This motivation went beyond just taking pleasure in eating and cooking. Julia struggled with binge eating when she was younger, and she still finds it hard to cut back on sweets and, too often, to stop eating them once she starts. These kinds of responses to food have a genetic basis—or more specifically, genes influence how our brains react to food. Almost all of the thousand-plus genetic variants now linked to common obesity—the kind that affects most people with the disease—act mainly in the brain. That means the neurobiology of people who have obesity variants is different from that of people without them. A consumer genetic test revealed that Julia is at a higher risk than most of the population for both obesity and type 2 diabetes.
The finding helped explain not only her own past obesity but why, even in impoverished postwar Italy, her ancestors tended to be plump. Several, including her grandfather and his mother, had been diagnosed with type 2 diabetes. Instead of discovering answers about our diet struggles in a metabolic chamber, we might find them in our DNA—a less popular idea, perhaps because we have no illusion of control over our genetic code, no “gene-boosting” supplements to sell. But pinning eating behavior—any behavior—to genetics would again be too simple, just as our hormones, neurons, and brain pathways are each only pieces of the puzzle. (Genes that fully explain behaviors, like the leptin receptor mutations we discussed earlier, are rare.) These biological signals, or even systems, don’t cause a behavior to happen. As Goldstone described over lunch, hormones and other signals from the body influence the activity in brain circuits that control eating behavior, as do our genetics and hundreds of other layers of context that shape our biology, like culture, early childhood experiences, exercise, sleep, stress, income, and education. Together, these factors make a behavior more or less likely. Then something in the environment—the ad for chips, the smell of the bakery, sitting on the couch watching TV—prompts us to act. To eat the chips or cookie, you need the motivation—hunger, or desire—but you also need the chips or cookies to be there, ready for the taking. To understand how all this might unfold in an individual, let’s go back to Julia—present penchant for sweets, past obesity. Her mom has a sweet tooth, perhaps cultivated by the fact that she was fed sweetened condensed milk with formula as an infant. Her diet probably leaned sweet while she was pregnant with Julia. Maybe this bathed Julia in high levels of glucose in the womb, perhaps increasing her odds of liking sweets and even of obesity. Julia’s sweet tooth was refined further by the food environment at home. Her mom lovingly cooked fresh food every day for Julia and her brothers. The kitchen cupboards, like many in North America, were also brimming with sugary, fatty junk food—Lucky Charms and Pop-Tarts for breakfast; candies, cookies, and chocolate granola bars for snacks; bottomless soda and juice. Just as the kittens in the banana study did, Julia watched her elders munching on these foods, and she learned to eat and drink them, too. Julia lived in North America, a risk factor for sweet eating and for obesity, at a time when food, and especially foods designed to be overeaten, was overwhelming the broader food environment—school, grocery store, coffee shop. When Julia’s weight gain accelerated in adolescence, she started commuting a long distance for school, which cut into her sleep—another underappreciated factor that affects what and how much we eat. Julia didn’t grow up poor or food insecure, but many of her ancestors had, further risk factors for obesity. Cohort studies, which follow people and their descendants over decades, found that the offspring of mothers who experienced famine during pregnancy were more likely to develop obesity and type 2 diabetes. The memory of a lack of food as a fetus was encoded in chemical modifications of their DNA that may have persistently affected how their genes worked decades later. It’s even possible that these epigenetic changes that increase disease risk can be passed on through multiple generations. These nuances help explain why not everybody who’s genetically susceptible develops obesity.
On the flip side, researchers are learning that those who carry obesity-associated gene variants seem to be even more sensitive to their environments—including healthy ones—another shift Julia lived through. As a teenager, she would have given anything to not be fat, and then suddenly, by her thirties, she wasn’t. Her weight loss didn’t follow a deliberate diet or exercise plan; it happened gradually, with changes to her environment. She married a thin, marathon-running European who shopped mostly in farmers’ markets. Together, they moved to some of the most walkable cities in Europe—Vienna and Paris. They cooked more, moved more, and mostly avoided ultra-processed foods. Julia’s weight dropped off without conscious effort, just as she got fat earlier in her life, without conscious effort. Going back to the breathing analogy, she was breathing easier now, in the food environment equivalent of sea level, rather than gasping for air at the top of a mountain. Even in cases as extreme as PWS, a healthy food environment reduces the chances that a person will develop obesity, which is why doctors like Goldstone spend so much time trying to get patients into group homes where professionals manage their diets. This kind of imposed control over a person’s food intake is possible only for people with genetic neurodevelopmental disorders like PWS. Their intellectual disability and extreme hunger usually mean that they do not have the capacity to make decisions about what they eat, Goldstone explained. This is a major risk to their health. “We are then able to put in place a deprivation of liberty legally, that enables a caregiver to say, ‘No, you can’t have that food,’ and this can be lifesaving.” Most of us don’t live in group homes or authoritarian regimes where every morsel of food is controlled. We live in environments where food is plentiful, highly palatable, and accessible all the time. If our biology makes us vulnerable and creates potentials that shape our behavior, our environment seals the deal. This cacophony of signals, from inside and outside, guides our lives with food and our body size. Not conscious decision-making. Not willpower. If we accept that, then we should never poke fun at another person for being too large (or too small). We should be less judgmental, and less cruel, to ourselves and each other.[*9] We should also be extremely skeptical the next time someone tries to sell the latest diet or lifestyle trend promising effortless long-term weight loss through a simple lifestyle hack. The peddlers perpetuate the illusion that we are in full control of what and how much we eat. They say our past efforts have failed because the truth had been hidden from us, and the truth they reveal is a compelling story that sounds scientific. Slow metabolism. Poisoned mitochondria. Trapped body fat. Glucose hacks. Just follow a new plan or buy this product. Testimonials of success abound. All of them downplay the critical role our environments play in shaping our behaviors and lead us to the conclusion that something in us needs fixing, not the food environment. As we’ve already seen in this book, people can adopt a range of different lifestyles and lose weight in the process. But these changes usually occur alongside major environmental shifts that promote success. All of the people we interviewed for this book who lost dramatic amounts of weight and kept it off told us as much.
They were also privileged enough to be able to insulate themselves from the worst effects of the broader food landscape. The influence of our environments on our behavior and biology is so much more dramatic than any single diet hack that it’s now the focus of Kevin’s research. He shifted away from looking at the relatively small effects of swapping out different nutrients on calorie expenditure or body fat to trying to understand the mega impact the food environment has on our eating behavior. This seems like a good time, then, to look at the food environment—how it changed, and how it’s changing us.

*1 The brain also influences the body’s organs through the nervous system, providing another way for the conductor to guide the orchestra.

*2 The name ob/ob comes from the fact that the mouse has obesity resulting from inheriting mutations in both copies of its leptin gene from its mother and father.

*3 A quick reminder: Hormones are the messengers of the endocrine system, one of two major communication and control systems in the body—the other being the nervous system. Produced in one area of the body, hormones travel around in the bloodstream, sending chemical signals that only cells with dedicated receptors can receive.

*4 Like all the hormones involved in eating that we’ll talk about, leptin does much more than just control food intake; it controls a range of hormonal responses that originate in the brain, including the reproductive hormonal axis. This is why people who don’t produce leptin experience delayed puberty.

*5 The weights of adopted children, for example, were closer to those of their biological parents than their adoptive parents. Twins raised in different households also had nearly identical weights.

*6 Now you might be wondering: If driving down dopamine causes apathy in a condition like Parkinson’s, how could the dopamine decline when your meal doesn’t arrive cause you to become more alert? Dopamine works on different time scales. The short bursts or dips you get with cues—known as phasic dopamine release—are anticipation and learning signals that help you predict rewards. If the reward doesn’t come, that quick dopamine decrease helps you learn that something unexpected happened. In Parkinson’s, the low dopamine state is tonic, or long-term, and more related to motivation.

*7 Leptin might set the overall amplification of the orchestra of short-term signals controlling food intake within a meal. So, if you arrived at a meal after days of starvation, leptin would already be low and would likely cause you to eat larger portions and more calories.

*8 In the future, it may be possible to target GLP-1 drugs to activate only the nutrient-sensing hindbrain neurons and not the nausea-inducing ones, paving the way for even better obesity drugs with fewer side effects.

*9 “If we deny free will when it comes to the worst of our behaviors, the same must also apply to the best,” the neurobiologist and free will skeptic Robert Sapolsky writes in his persuasive book Behave. For more great free-will-skeptical reading, see Sapolsky’s Determined or Free Will by Sam Harris.

CHAPTER 7
The Food Environment

We just covered the mess of signals that control our eating behavior—and made the case that overeating and rising rates of obesity did not stem from failures of willpower or a collapse in personal grit, but from our biology meeting the contemporary food environment. And let’s not put too much emphasis on our biology.
It’s the food environment that’s likely been the main driver of the increasing prevalence of many chronic diseases. Now we’re going to go deep into what’s changed about our food environment (yes, this is the chapter where we get into ultra-processed foods). We’ll also look at how food environments alter our biology and drive diet-related diseases, including obesity. What is it about ultra-processed foods, or UPFs, that compels us to eat even when we’re not hungry or we’ve had enough? Are these foods as addictive as illicit drugs? As always, the insights started coming thanks to rodents.

SUPERMARKET DIET

Tony Sclafani was a neurobiologist with a problem. He wanted to develop an animal model for human obesity, but he couldn’t get his rats to fatten up quickly enough. For years, Sclafani had studied hypothalamic obesity in rodents. He knew that slicing into the brains of the animals was one way to cause rapid weight gain, as we saw in the last chapter. But he had trouble finding a diet that would do the job in his animals. Other researchers had managed to get young rats to gain weight by simply offering them high-fat chow—a lard-enriched variation of the standard lab fare. The approach didn’t work well on adult rats. Decades earlier, another researcher had apparently explained why: Adult rats eat for calories and adjust how much they consume to resist weight gain and maintain a healthy body size. The only way to affect the so-called “set point”—the narrow range of weight that an individual seemed to regulate—was to mess directly with the hypothalamus. Surgery could do it. Diet couldn’t. One day, an accident in the lab made Sclafani wonder whether he’d gotten it all wrong. On the table where he was working with one of his rats, a colleague spilled some food—Froot Loops, the sugary boxed cereal of improbable colors and flavors that don’t exist in natural food. The cereal drew the rat like a magnet. “[It] immediately started eating Froot Loops,” Sclafani told us. Usually, whenever he worked with one of his rats in an open space, the animal would continually try to escape. “The last thing they want to do is to start eating,” Sclafani explained. Something about the Froot Loops was so alluring that the rat ignored its natural rat instincts and instantly dug in. Sclafani knew he had to study rats on human “junk food.” One summer day, he sent a graduate student, Deleri Springer, to the supermarket near his lab at Brooklyn College with simple instructions: Buy a variety of junk food, the kind that populated supermarket shelves even then, in the early 1970s. Springer returned with a smorgasbord that included chocolate chip cookies, milk chocolate, marshmallows, peanut butter, and sweetened condensed milk, as well as salami and cheese. Back in the lab, Sclafani and Springer launched what would become one of the seminal experiments of obesity science. They placed twenty rats in wire mesh cages and offered them food and water, then divided the animals into two groups. The ten rats in the control group continued to eat a chow-and-water diet. The other ten also got chow and water, plus high-fat chow and a rotating menu of at least four different supermarket foods. Within ten days, what they observed won’t surprise anyone who has seen the popular video of the New York City rat dragging a slice of pizza down the stairs of a subway station. The rats on the supermarket diet essentially ignored the chow and opted for the junk food. They began to rapidly gain weight.
By day sixty, the experimental group had increased their average body size by about 50 percent—almost three times the weight gain of the control group. Sclafani was the grandchild of Italian immigrants, reared on Italian food, supplemented by his favorite treat: supermarket chocolate chip cookies. He wasn’t surprised that the rats shared his penchant for the cookies, but he was taken aback by how much they were eating and how quickly they were gaining weight. The supermarket group achieved levels of obesity that rivaled those of the animals with the lesioned hypothalamuses. They gained more weight than adult rats on high-fat chow had in previous studies. And they did this without any effort. Something about the human food environment spontaneously caused them to eat more than they needed to and grow fatter than on any other diet Sclafani had ever tried. If there was a set-point body weight determined by the hypothalamus, supermarket food could reset it at a much higher level. “Obesity can easily be produced in the laboratory by giving normal adult rats…an assortment of highly palatable foods,” Sclafani and Springer reported in a 1976 paper. “The findings of this study question the view that rats eat for calories and are precise regulators of their body weight.” What became known as “the cafeteria diet” was an uncontrolled approach to feeding animals for experimental purposes. The rats could choose what they wanted to eat among the smorgasbord on offer, making it difficult to pinpoint the variables that may have driven their weight gain. After the paper was published, Sclafani was criticized for his methodology. In retrospect, the criticism seemed to miss the genius of the experiment. What Sclafani was doing with rats in the lab was a great parallel for how humans had begun to eat—and still do. We were living in the human-size equivalent of Sclafani’s junk food–laden rat cages. The types of food the scientists bought for the lab rats, later dubbed “ultra-processed” and “hyperpalatable,” had begun to infiltrate every part of American life. Like Sclafani’s rats, humans weren’t consciously choosing to eat more and fatten up. What if the human food environment played the same role in people as it did in rats? What if the food industry had been unwittingly conducting an experiment in humans—changing our food environment in ways that messed with our internal eating signals and shifting the balance point where our body weight is regulated? It was into that quagmire that Kevin, his mind on Sclafani’s work, waded. Kevin had already been thinking for a long time about how important environments are for shaping people’s bodies. It occurred to him: No one had ever run a clinical trial in people like Sclafani’s supermarket study in rodents. Putting study participants in his hospital ward to test different food environments might be just as artificial as Sclafani’s rat cages but perhaps equally enlightening. The resulting trial, published in 2019, has been cited thousands of times and continues to garner enormous press attention. It has been hailed as the most important nutrition study since the discovery of vitamins, and activists and doctors in countries as diverse as Brazil, the United States, and England have held it up as evidence that governments need to regulate ultra-processed foods. The study added heft to the discussion about the links between poor health and UPF consumption, helping to catapult public interest beyond the previous nutrition villains: salt, sugar, and fat. The impact of the research still surprises Kevin.
After all, he’d simply looked at twenty adults, for a period of a month, when they resided in a hospital. The study didn’t say anything about the real world—like how UPF environments affect kids, or what happens when people have to buy, prepare, and cook their own foods. Some critics wondered how the results might change outside a hospital and over the long term, while others argued that they were so glaringly obvious, it was as if Kevin designed a study to see if the sun would rise. As it turns out, there are lots of foundational questions about these foods that haven’t been answered, or even tested, as people rush to incorporate the concept of UPFs in diet advice, diet books, and policy. (Sound familiar?) The most pressing of all: How exactly do diets high in UPFs lead to overeating and poor health? Is it even the UPFs themselves—or is it their marketing, convenience, ubiquity, and affordability? HOW THE FOOD ENVIRONMENT CHANGED We have had hundreds of thousands of years to adapt to the invention of cooking, and about ten thousand to thirteen thousand years to adapt to agriculture for food production. That’s the blink of an eye in the context of evolutionary time. The pace and scale of the food environment shift we’ve been living through over the last several decades is not even a blink. In countries like the United States and England, our food has transformed so rapidly that much of it would be unrecognizable to even our recent ancestors. It’s not just that what we eat changed a lot since our grandparents; it’s that what we eat changed from our parents’ childhood to ours, and it’s morphed again from our childhood to our children’s. This new food environment—in which UPFs dominate the food supply—has emerged as the most likely driver of increasing rates of obesity, far more impactful than any decreases in physical activity. Some of the world’s longest-term data on food consumption trends, processed and otherwise, comes from Canada. Focused on household food purchases, a team of researchers from Canada and Brazil found that around 1940, a quarter of calories available at home came from ultra-processed foods. Back then, the category featured mainly breads, spreads, and sauces. By 2001, the purchasing of ingredients for cooking plummeted while 55 percent of household calories came from UPFs. These were now mostly sweetened products—pastries, cookies, sugary drinks, and ice cream— along with breads, cereals, and snacks. By 2018, UPFs comprised almost 60 percent of the calories consumed by American adults and two-thirds of the calories eaten by kids. And while the popularity of snacking, and eating sugary foods and drinks grew, so did portion sizes. Many of the foods we eat now come from the factories of a handful of global firms. Three corporations, for example, control 80 percent of America’s chocolate market; three firms control 73 percent of the breakfast cereal market; and four firms control 61 percent of the U.S. cookie market. Breakfasts from cardboard boxes in sugared cereal, tart, or waffle form. Microwave lunches and dinners in plastic. Gooey cheeses that can last for years in a cupboard. In between, bottomless snacks, juices, sodas, chips, and cookies. These changes not only altered what and how much we eat but how and where we eat. We eat out more than ever before, while the total share of income spent on food, and time dedicated to cooking at home, has dropped off spectacularly. 
As the London-based physician-researcher and journalist Chris van Tulleken writes in Ultra-Processed People, a book that Kevin’s UPF research helped inspire, this represents an entirely new food age, “in which most of our calories come from food products containing novel, synthetic molecules, never found in nature.” The least healthy UPFs are then heavily marketed to the public, including to children. When Sclafani ran his original experiments, only about 15 percent of adults in America were classified as having obesity. But Americans had already been growing larger for some time. At first, the increases in body size, which began around the turn of the last century, were a good news story. Prior to that time, malnutrition and deficiency diseases such as rickets and pellagra were common, stunting human growth—both height and body weight. As nutrition improved and people grew taller and larger, height and weight took on different trajectories. Over the course of the twentieth century, Americans were gradually getting fatter, and the normal distribution for BMI started skewing. This meant more and more people were developing not only obesity but severe forms of obesity. By 2018, the obesity prevalence in America was 42 percent—a near tripling from the 1970s. Almost 10 percent of the population was classified as having severe obesity—a BMI of 40 or greater. The increases in obesity rates that started in America and other high-income, Western countries eventually took off worldwide. For the last several decades, public health researchers have been practically shouting about how the “obesogenic” and “toxic” food environment is driving these trends, how obesity is clearly an “ecological” phenomenon. Our genes, they point out, couldn’t have changed quickly enough to explain the rapidly increasing prevalence of obesity and diet-related disease within only a few generations. Economic development, they add, tracks precisely with obesity trends. Barry Popkin, an American nutrition researcher, described the “nutrition transition”: As societies move out of famine and poverty, diets shift toward Western-style eating patterns—higher in sugar, fat, processed foods, and animal products. Around the world, more wealth has meant more calories available in the food supply, and more obesity—what the New Zealand obesity researcher Boyd Swinburn later called the “obesity transition.” The changes are easiest to see in immigrants, who leave countries that are earlier in the nutrition transition for destinations like America or the UK. When researchers track their health, and even compare them to the siblings they leave behind, the immigrants are more likely to gain weight and develop heart disease, high blood pressure, obesity, and type 2 diabetes—risks that compound with every passing year in their new countries. The reverse trend has also been documented: Curtail economic development and interrupt food supplies, and calorie availability drops off, along with diet-related chronic diseases. When Cuba experienced an economic crisis in the early 1990s, following the end of aid and trading privileges with the fall of the Soviet Union, food became expensive and scarce. Obesity prevalence and type 2 diabetes incidence declined precipitously. In the first decade of the twenty-first century, as the food supply recovered, both diseases rebounded. To say it again: It was certainly not a collapse of willpower that caused diet-related chronic diseases to surge.
In one paper, researchers tracked shifts in a range of behaviors that require personal responsibility. They wanted to get to the bottom of whether people had become less responsible in other areas of their lives, outside of eating. Going to college, smoking, drinking, using protection during sex, driving without a seat belt—on every single metric, the situation had either improved or stayed the same. Over the same period, the food environment drove people to increasingly eat in ways they didn’t want to and knew they shouldn’t. In another paper, researchers found a simultaneous uptick in obesity prevalence around 1980 across age ranges in men and women—another observation that also pointed to an environmental trigger. All along, the American nutrition researcher Marion Nestle has argued, the food industry covertly shaped government nutrition policy and even science in ways that only exacerbated the obesity epidemic. Despite the widespread and long-standing appreciation among scientists of the role of the food environment in causing our health to collectively deteriorate, the popular and political focus has remained on individuals. In an international survey of some three hundred policymakers, more than 90 percent said they thought personal motivation was the likely cause of rising obesity rates. Even science is skewed toward an individual bias. Biomedical research funding overwhelmingly goes to developing treatments for individuals rather than understanding how environments sicken us in the first place or how we can change them to prevent disease. James Tabery, a professor of philosophy at University of Utah, documents this tension extensively in his book Tyranny of the Gene. He contrasts the relatively paltry investment in research to explore the environmental determinants of health compared to the many false promises offered by the biomedical paradigms of human genomics and precision medicine. Diet-related chronic diseases are most responsible for driving up healthcare costs, yet less than 5 percent of the NIH’s budget is invested in nutrition research. All of this meant that the leading theory about how our new food environment drove up rates of human obesity—that UPFs were the key culprit—had never been experimentally tested in humans. And as we’ve also seen throughout the history of nutrition science, just because something seems painfully obvious doesn’t make it true. You have to design an experiment and test it. THE NOVA PARADIGM The first time Kevin heard about UPFs causing obesity, he was at a conference in 2015. After he presented his work on low-carb versus low-fat diets in Los Angeles, two dietitians walked up to him and asked why he kept studying nutrients when the real problem was clearly how UPFs took over our food environments. Kevin didn’t know what to make of the question at first. He asked them to explain what they meant. The women told him that they were from Brazil, where the NOVA food classification system had been developed and had recently formed the basis of widely praised national dietary guidelines. NOVA was a new way of thinking about food, established by a Brazilian physician-researcher, Carlos Monteiro. Monteiro had been analyzing nutrition trends in Brazil when he noticed something puzzling: People reported purchasing less salt, fat, and sugar—nutrients often thought to be driving the observed increasing rates of obesity. 
Brazilians, he soon realized, were cooking less, eating fewer unprocessed or minimally processed foods, but also relying more and more on industrial pre-prepared products. This change to the national diet, more so than the nutrient composition of the foods, was probably responsible for increasing obesity, he reasoned. Monteiro called for a paradigm shift. “Orthodox teaching and practice on nutrition and health almost always focuses on nutrients, or else on foods and drinks,” he wrote in his first paper on the subject, in 2009. Instead of analyzing diets through the old lenses of carbohydrates, fat, or vitamins, or even the consumption of individual foods themselves, the focus should be on how the foods are made—their processing and formulation. He eventually refined NOVA’s four categories. “Unprocessed or minimally processed” (think fruits and vegetables, legumes, milk, meat); “processed culinary ingredients” (sugar, butter, oils, salt); “processed foods” (canned beans or fish, fresh bread, artisanal cheeses); and “ultra-processed foods,” which often lack intact whole foods and contain ingredients not commonly found at home, like cosmetic additives to alter their flavor, texture, and color. If you (or even a chef) can’t make it in the kitchen, it’s probably ultra-processed. UPFs are often energy dense and hyperpalatable, making them easy to overeat, he pointed out. But with NOVA, Monteiro also wanted to spotlight the purpose of food ultra-processing. The food industry developed UPFs to increase profits, he argued, taking low-cost commodities that wouldn’t be appealing alone— corn, wheat, or soybeans—breaking them down, and reassembling them with additives, to make foods they could market to consumers as delicious, convenient, and time saving, displacing traditional foods from our diets. Kevin was intrigued. He asked the dietitians what they thought it was about ultra-processed foods that increased obesity prevalence. They speculated that these products are often high in salt, sugar, and fat. The trio of nutrients, they said, drive excess consumption—the narrative that had recently been popularized by journalist Michael Moss in the best-selling book Salt Sugar Fat. But of course, salt, sugar, and fat were just nutrients. The questioners were leaning on the old nutrient paradigm to explain the problem of ultra-processing. Kevin left the conference unconvinced. Nutrition science had focused on nutrients for good reason. Food was the vehicle for delivering energy and nutrients to our bodies. If the problem was processing and formulation, as Monteiro proposed, and NOVA was going to blow up the nutrient paradigm that had reigned since the time of Liebig, we needed to be sure that the idea had merit. At the same time, Kevin realized, there was no randomized trial comparing the effects of a food environment high in ultra-processed foods with a food environment based primarily on minimally processed whole foods. Thinking back to the cages containing Sclafani’s rats, Kevin remembered that the animals had left behind lots of calories in the form of rat chow while they were gorging themselves on supermarket foods. It clearly wasn’t only calories and nutrients they were after. There was something about the supermarket fare that drew them in. TESTING UPFS Instead of rats in a cage, Kevin recruited twenty people to live 24/7 in a hospital ward for four weeks. There, he would alternately expose them to two different food environments for two weeks each, in random order. 
Kevin's team would match the food environments for the calories available as well as several nutrients: salt, sugar, fat, carbs, and fiber. But the two diets would vary widely in the amount of UPFs they delivered. One food environment would feature more than 80 percent of its calories from ultra-processed foods—a medley of hamburgers, hot dogs, chicken nuggets, breakfast cereal, flavored yogurts, soft breads, potato chips, crackers, and cookies. Similar to the diets of most Americans, it also failed to meet the U.S. dietary guidelines and included very few whole grains and vegetables. The other environment would have no UPFs and more than 80 percent of its calories from minimally processed foods—a rainbow of vegetables and fruits, nuts, unsweetened yogurt, eggs, meat, and legumes.
The study participants would also get huge portions—double the number of daily calories required to maintain their weight—and instructions to simply eat as much or as little of the food as they wanted. What they didn't know was that all their leftovers would be meticulously measured to calculate what they had eaten over the monthlong experiment. In addition to their calorie intake, the researchers would track many other variables at several time points: blood hormones, markers of inflammation, insulin secretion, body weight, body fat, and more. The participants would spend one day each week living in metabolic chambers to measure their energy expenditure, and their glucose levels and physical activity would be continuously monitored.
If they ate more during their time in the ultra-processed food environment, then something other than nutrients like salt, sugar, and fat was responsible for the gap in calorie intake. On the other hand, if the available calories and matched nutrients drove consumption, then people would eat about the same number of calories in the two environments. That was Kevin's prediction, anyway: The nutrients would be the main drivers of intake. The extent of food processing wouldn't matter. But the beauty of the experiment was that whatever the outcome, its results would be interesting and potentially important for understanding the role of UPFs in the obesity epidemic.
When the study participants went home and his team analyzed the results, Kevin quickly realized his prediction was dead wrong. The same people spontaneously lost weight and body fat when they were in the minimally processed food environment. The ultra-processed food environment, on the other hand, caused them to eat some 500 more calories each day; they gained body weight and fat, just like Sclafani's supermarket rodents. Interestingly, the participants did not report significant differences in their appetite—claiming they experienced similar hunger, fullness, and satisfaction on both diets. They reported the meals to be similarly pleasant, yet another clue that "wanting" or "craving," not just conscious "liking," drove the UPF overeating. They also consumed equivalent numbers of calories from the constantly available snacks in both food environments, meaning the difference had to come from the three square meals.
Kevin's team had just produced the first experimental evidence in humans that lined up with the mountain of observational studies linking UPFs to obesity. Diets heavy in UPFs had already been linked to a higher risk of other chronic diseases like type 2 diabetes, cancer, fatty liver disease, cardiovascular disease, dementia, depression, and asthma, as well as all-cause mortality.
The junk food environment was unequivocally a powerful driver of eating behavior not only in rats but in people, too. It messed with that orchestra of signals that helped guide our food intake. The NOVA godfather, Carlos Monteiro, now argues that the observational evidence alone is enough to fulfill the Bradford Hill criteria, a group of principles that can be used to establish a causal relationship from epidemiological studies (named after the researcher who linked smoking and cancer). To name a few: across countries and contexts, the data are consistent; there's a dose response (the more UPFs, the worse the health outcomes); and health complications arise as people start eating more of these foods.
After a half century of skirmishes about nutrients—like the carbs and fat the diet tribes relentlessly pitted against each other—the problem with food seemed to lie elsewhere. But where exactly?
HYPERPALATABLE FOODS
There is no end to the theories. Is it the soft texture of UPFs that increases the speed of eating? Their lack of insoluble fiber? Is it that UPF manufacturing extracts water from food, thereby concentrating more calories per bite—their energy density? Is it the flavor additives that disrupt how the body senses nutrients? Do UPFs contain too little protein to adequately satiate us? Do they just push out healthier foods like whole fruits and vegetables? Are there combinations of ingredients that make UPFs addictive? Despite the bold pronouncements, once again, we're just starting to tease out the details. And the picture already looks far more nuanced than the headlines might suggest.
When Kevin finished his first UPF study, he began to look for clues in the data, examining which properties of the meals people were offered were most related to how many calories they consumed. Did meals with less protein lead to more calorie intake? Surprisingly, no. In contrast to the popular notion that higher-protein meals decrease calorie intake, Kevin's study found the opposite: The more protein in a meal, the more calories people consumed.[*1] There were also no differences between the diets in glucose levels measured using continuous glucose monitors (stand by for more on these devices in Chapter 9). So it wasn't blood sugar spikes that drove overeating. People even burned fewer calories during the minimally processed diet period, probably because they had cut calories and started losing weight without knowing it. So it wasn't about a slowing metabolism, either.
What about energy density? Bingo! The number of calories per gram of food on the plate appeared to be a major cause of overeating. This was true regardless of the level of processing. When the study participants were offered meals that were more energy dense, they wound up eating way more calories. Energy density has been a well-known driver of calorie intake for decades, thanks to pioneering research from Barbara Rolls at Pennsylvania State University. Energy density was also a mechanism Monteiro proposed in his first NOVA paper. Foods with greater energy density are often thought to be higher in fat relative to carbs and protein because, as we discussed in Chapter 4, fat packs more than double the number of calories per gram. But another way for foods to have higher energy density is if they contain less water. The manufacturing of UPFs disrupts the food matrix—the physical structure of foods—to extract water, inhibiting bacterial growth and extending shelf life.
In Kevin's study, this was the reason the UPFs were more energy dense—and these dry, calorie-laden products were accompanied by few water-containing whole foods (like fruit and vegetables). The result: energy-rich meals strongly linked with overeating.
What about the formulations of particular foods—could they also explain overeating? A psychologist at the University of Kansas, Tera Fazzino, reached out to Kevin to suggest another way of looking at the data. She pointed out that while the UPF study matched the overall diets in terms of salt, sugar, fat, and carbs, maybe study participants were presented with more individual foods during the UPF period that featured harder-to-resist combinations of these nutrients. Maybe these hyperpalatable foods led to overeating.
Fazzino has a PhD in experimental psychology, focused on addiction. During a brief stint working with patients with obesity, she noticed they'd often talk about food the way alcoholics talk about wine and beer. They'd detail disruptive cravings and eating even when they didn't want to, or knew they shouldn't—sometimes past the point of feeling good, even to the point of vomiting. But how could one quantify the addictive potential of particular foods? In addiction science, Fazzino knew, there are established ways to measure when a substance causes acute or toxic effects. But nothing similar existed for what we eat. So she reviewed the available scientific literature on potentially addictive foods and dumped that data into standardized nutrition software that allowed her to quantify their nutrients. Then she and her colleagues analyzed the data to figure out commonalities across food items. They distilled their findings into three groupings of hyperpalatable foods. These aren't necessarily ultra-processed. Rather, they are foods that pair nutrients at levels exceeding thresholds that don't usually occur together in nature. They're either high in both sugar and fat, high in both salt and fat, or high in both carbs and salt. This encompasses everything from a Twinkie or a bag of Doritos to homemade broccoli smothered in cheese and your grandma's apple pie.
When Fazzino started to look into why these foods became more prevalent, she found that the explosion of ultra-processed hyperpalatable products coincided with the period during which America's biggest tobacco companies owned some of America's largest food brands. Philip Morris purchased Kraft in 1988, and R. J. Reynolds bought Pacific Hawaiian in 1962 and Del Monte Foods in 1979, then merged with Nabisco in 1985. The food brands and products under tobacco control included familiar names like Lunchables meal kits, Hawaiian Punch, Kool-Aid, Philadelphia cream cheese, 7UP, Velveeta, Tang, and Capri-Sun. The tobacco-owned foods were more hyperpalatable than anything else in the industry, Fazzino discovered. Between 1988 and 2001, tobacco-owned foods were nearly 30 percent more likely to be classified as fat-and-salt hyperpalatable and 80 percent more likely to be carb-and-sodium hyperpalatable than foods that were not tobacco owned. Much of the rest of the industry followed tobacco's lead. By 2018, hyperpalatable foods made up almost 70 percent of what's available in the U.S. food supply, up from 49 percent in 1988.
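For readers who like to see the logic spelled out, here is a minimal sketch of how the two ideas above could be expressed in code. The numeric cutoffs and function names are our own illustrative placeholders, not Fazzino's published definitions; the point is only the shape of the rules: energy density is simply calories divided by grams, and a food counts as hyperpalatable when a pair of nutrients crosses two thresholds at once.

```python
# Illustrative sketch only. The thresholds below are hypothetical placeholders,
# not the published hyperpalatable-food cutoffs; they show the shape of the logic.

def energy_density(kcal: float, grams: float) -> float:
    """Calories per gram of the food as served."""
    return kcal / grams

def hyperpalatable_groups(pct_kcal_fat, pct_kcal_sugar, pct_kcal_carbs, pct_sodium_by_weight):
    """Return which of the three nutrient-pair groupings a food falls into.

    The pairings mirror the groupings described above (sugar + fat, salt + fat,
    carbs + salt); the numeric cutoffs are made up for illustration.
    """
    groups = []
    if pct_kcal_fat > 20 and pct_kcal_sugar > 20:
        groups.append("sugar + fat")
    if pct_kcal_fat > 25 and pct_sodium_by_weight > 0.30:
        groups.append("salt + fat")
    if pct_kcal_carbs > 40 and pct_sodium_by_weight > 0.20:
        groups.append("carbs + salt")
    return groups

# A dry, water-poor snack packs far more calories per gram than a water-rich plate.
print(energy_density(kcal=160, grams=30))    # a handful of chips: ~5.3 kcal/g
print(energy_density(kcal=300, grams=450))   # a vegetable-heavy plate: ~0.7 kcal/g

# A salty, fatty, starchy snack trips two of the three pairings under these made-up cutoffs.
print(hyperpalatable_groups(pct_kcal_fat=55, pct_kcal_sugar=2,
                            pct_kcal_carbs=42, pct_sodium_by_weight=0.5))
# -> ['salt + fat', 'carbs + salt']
```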
When Kevin and Fazzino analyzed the data from his UPF study for hyperpalatability, they found a relationship that suggested Fazzino was onto something: The more hyperpalatable foods offered at a meal, the more calories the study participants tended to eat. Along with energy density, hyperpalatability emerged as the strongest predictor of calorie intake. Both energy density and hyperpalatable foods had similar effects on the number of calories eaten during meals. Importantly, this happened regardless of whether those calories came from the minimally processed or the ultra-processed food environment.
ADDICTED TO FOOD
So how do hyperpalatable, energy-dense foods cause overeating and obesity? Media headlines confidently claim that "ultra-processed foods are as addictive as cigarettes or cocaine" based on the idea that they hijack the brain's reward system. Tony Goldstone talked about this in the last chapter, where we discussed how dopamine is a chemical that helps you learn and seek rewards related to survival and reproductive fitness. When we eat a food that has a high addictive potential, it's thought to result in a dopamine surge—larger than what we'd experience with most natural foods—in brain regions related to reward processing, similar to how the brain responds to addictive drugs.
We know much more about how drug addiction works than we do about food addiction. Take cocaine. Addictive drugs like cocaine are thought to have a common final pathway involving dopamine and the brain's reward system. Using cocaine results in a large, rapid, and prolonged surge of dopamine in various brain reward regions. With repeated exposure, these responses only grow—a process known as sensitization. Over time, dopamine begins to respond to cues in the environment that predict the availability of the drug and helps with learning about how to get more. Dopamine is also involved in eventually making these actions habitual, often below conscious awareness.
With prolonged and repeated use, the neurobiology of the brain adapts. The dopamine surge after consumption drops, as does the number of dopamine receptors on reward neurons. This dampens the dopamine signal people get when they use the drug, leading to a "tolerance" that manifests as a need to consume more and more, a futile attempt to get the same high as in the early days of drug use. Withdrawal of the drug results in an uncomfortable state of low dopamine signaling. Researchers now believe that compulsive drug use can become more about avoiding the negative consequences of withdrawal than about the pleasure of consumption.
Fazzino and many other researchers studying obesity and food addiction suspect that something similar goes on in the brains of people who become addicted to hyperpalatable, energy-dense, often ultra-processed foods—an idea with roots that go back to 2001. That's when the renowned addiction scientists Nora Volkow and Gene-Jack Wang published a landmark brain imaging study on patients with obesity. It appeared to show that their brains had reduced numbers of dopamine receptors, similar to the brains of people addicted to cocaine. This observation was interpreted as evidence of tolerance in response to large dopamine surges after repeated bouts of eating hyperpalatable foods. Maybe people with obesity had food addiction? Maybe some foods were particularly addictive because of their ability to induce outsized dopamine responses? Kevin's team recently tested these ideas by measuring dopamine responses to UPF milkshakes, high in fat and sugar.
Fifty people without clinical eating disorders participated in the study, making it the largest of its kind to date. Using the PET scans that addiction researchers deploy in human studies, Kevin found that, on average, the participants didn't exhibit a measurable dopamine response after consuming the milkshakes. The dopamine responses weren't even different between people with and without obesity. Kevin suspects that there was a dopamine response to the UPF milkshake; it was just too small in most people to be detected using a PET method designed to measure the much larger dopamine increases that follow the consumption of drugs like cocaine, or even nicotine.
The results challenged the simple model that UPF addiction happens through dramatic, cocaine-like dopamine surges, causing a tolerance that drives people with obesity to eat more and more. Among those who have food addiction, we may learn that it works differently than scientists originally thought—maybe differently from addiction to stimulant drugs like cocaine.[*2] Kevin also found that dopamine receptors were not lower in people with obesity, as that hallmark 2001 paper had suggested. Instead, people with obesity had higher levels of basal dopamine in their brain reward regions, creating an illusion of lower numbers of dopamine receptors. Higher levels of basal dopamine may play a role in motivating eating behavior or stamping in food habits, but we don't yet know for sure.
Interestingly, some of the study participants appeared to have a dopamine response that was related to how hungry they were before the milkshake and whether they wanted more after finishing it. There was also an association between the dopamine responses to the milkshake and how many UPF chocolate chip cookies people chose to eat at a buffet meal—the only high-fat, high-sugar UPF on offer. This all gave Kevin a new appreciation for the complexity of dopamine's apparently multifunctional ability to shape our behavior, and left him feeling less confident that anyone really understands dopamine's role in eating and obesity.
Despite the challenges to the dopamine-mediated UPF addiction theory of obesity, there's little doubt that food addiction is real, just as there's little doubt that the foods people get hooked on tend to be hyperpalatable, energy dense, and often UPF. Years before Fazzino's research, Ashley Gearhardt, a psychologist studying compulsive eating at the University of Michigan, developed the Yale Food Addiction Scale. Like Fazzino, Gearhardt wanted to systematically approach the addictive potential of food, but this time to find a way to diagnose food addiction, based on the known characteristics of people hooked on illicit drugs. The scale has since been used in hundreds of studies in thirty-six countries and has shaped the research on food addiction. Across patients, Gearhardt told us, "ultra-processed foods are overwhelmingly the most commonly consumed addictive foods—chocolate, ice cream, donuts, pizza, potato chips." Like some drugs—think of heroin's relationship to the poppy plant, for instance—UPFs are also enabled by manufacturing technologies that highly refine and concentrate rewarding substances for mass consumption, heightening their addictive potential. UPFs, Gearhardt argues, meet the same scientific criteria used to identify tobacco products as addictive.
Specifically, many elicit strong cravings, we eat them compulsively, they have mood-altering effects, and they hook us so we keep coming back for more.[*3] Roughly 14 percent of the population and about 30 percent of people with obesity are estimated to have food addiction. As Gearhardt put it, simply talking to people who struggle with hyperpalatable foods is all you need to do to know that some of us are addicted and can't stop. We don't need brain imaging to prove it.
But the threshold for clinical food addiction is much higher than the popular conversation suggests. Julia took the Yale survey, suspecting that her long-term and unflinching relationship with sugar, one she struggles to break off, might be addiction. When Gearhardt scored the results, she found that while Julia had some symptoms, she came nowhere near meeting the cutoffs for "clinically significant impairment or distress." Most people don't, Gearhardt explained; they have only one or two symptoms, while a subset will warrant a clinical diagnosis—the same way many of us drink, but only a few have alcohol-use disorder. Lack of a diagnosis doesn't mean there's nothing to see here, she added. "When a substance is highly accessible, marketed, and socially acceptable, the widespread subclinical problems contribute in a large way to the public health costs and suffering." And if UPFs are anything, they are highly accessible, heavily marketed, and socially acceptable.
BACK TO THE FOOD ENVIRONMENT
Fortunately, we don't need to fully untangle the neuroscience of food addiction, or know precisely how the food environment resets the regulated level of body weight, to tease out the properties of UPFs that are most likely to lead to overeating, even addiction. Based on the analysis of the meal data in the original UPF study, Kevin's team decided to test whether the energy density of the meals and the number of hyperpalatable foods they contained were really the drivers of excess energy intake. To do this, they designed four test diets that were once again matched for various nutrients while varying their energy density and the proportion of calories from hyperpalatable foods. The study is still ongoing, and Kevin's team is tracking what happens to calorie intake, body weight, and body fat in the same people as they cycle through all four weeklong diets in random order.
The first diet is minimally processed. That means no UPFs, and it's also low in both energy density and hyperpalatable foods. The second is a diet high in UPFs, with lots of hyperpalatable foods and high energy density. The third is high in UPFs and energy density, and low in hyperpalatable foods. The final diet is another high-UPF diet, but this time it is low in both energy density and hyperpalatable foods. All four test diets are matched for calories and various nutrients—salt, sugar, fat, carbs, fiber, and protein.
Kevin's prediction is that those two key parameters—energy density and hyperpalatability—will matter more for predicting energy intake than ultra-processing per se. He thinks that even the food environment with lots of UPFs will result in calorie consumption similar to the minimally processed environment, as long as both energy density and hyperpalatable foods are matched. In other words, it'll be energy density and hyperpalatability that matter more than the level of processing for determining what people eat and how much weight they gain. So far, the results from the first half of the participants suggest Kevin may be right this time—but with yet another twist.
When exposed to the UPF diet high in energy density and hyperpalatable foods, people ate the most calories and gained the most weight. This time, they consumed about 1,000 more calories than in the minimally processed environment and gained about 1 kilogram in a week. So once again, the kind of environment many of us live in unequivocally causes overeating and weight gain, replicating the findings of the first study.
When people were offered the high-UPF diet that was low in hyperpalatable foods but high in energy density, calorie intake seemed to be trending downward, but the study participants still gained weight. Does that mean hyperpalatable foods aren't important? Not necessarily. The experiment may not have reduced hyperpalatable foods enough—or maybe the participants who completed the study so far were not particularly susceptible to them.
Another intriguing finding: When the same people were exposed to the food environment rich in UPFs but low in both energy density and hyperpalatable foods, they consumed only slightly more calories than on the minimally processed diet. They also lost about the same amount of weight during both diet periods. So a diet high in UPFs may not lead to overeating if it's low in energy density and hyperpalatability. Does this mean energy density is a more important determinant of calorie intake than hyperpalatable foods? Perhaps. But another interpretation is that we have to cut back on both energy density and hyperpalatable foods to reduce the risk that UPFs drive up calorie intake and cause weight gain. Remember, Kevin did not test a high-UPF diet that was rich in hyperpalatable foods and low in energy density (mainly because adding test diets increases the number of participants required and the length of time it takes to finish the study). Maybe people would have gained weight on it, and it's only by reducing both hyperpalatable foods and energy density that the UPF effects are mitigated.
Surprisingly, Kevin once again found that the participants reported their meals to be similarly pleasant on all of the test diets. This suggests that reengineering the food environment to prevent excess calorie intake doesn't necessarily require us to eat meals we like any less than those that cause weight gain. It also doesn't require us to banish UPFs.
Now here's the twist (because there's always a twist): Only the minimally processed diet caused body fat loss. The differences in body fat have to do with how many calories people are absorbing, Kevin suspects. Those eating a minimally processed diet consumed more insoluble fiber, the kind that comes from intact plants. Insoluble fiber has been shown to reduce the digestibility of food, leading to fewer calories absorbed. In Kevin's first UPF study, he'd noticed that the difference in body fat between the two diets was greater than could be explained by the calorie differences calculated using nutrition software. He theorized that this was probably because the software calculations didn't account for the fact that high insoluble fiber intake decreases calorie absorption, which some speculate happens by altering the gut microbiome. That aphorism "a calorie is a calorie" from Chapter 4 best refers to absorbed calories. Maybe eating a minimally processed diet caused more body fat loss because people digested and absorbed fewer calories compared to the UPF diet with low energy density and few hyperpalatable foods? Why the latter diet leads to weight loss without body fat loss is still a mystery.
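To keep the four test-diet arms and the preliminary results straight, here is one way to lay them out, a sketch that simply restates the design and the approximate interim findings described above; the field names and the structure are ours, not the study's.

```python
# A compact restatement of the four test-diet arms described above.
# Outcomes reflect the approximate interim results reported in the text;
# field names and structure are ours, not the study's.
test_diets = [
    {"arm": 1, "upf": False, "energy_dense": False, "hyperpalatable": False,
     "interim_result": "reference diet; weight and body fat both lost"},
    {"arm": 2, "upf": True, "energy_dense": True, "hyperpalatable": True,
     "interim_result": "~1,000 extra calories vs. arm 1; ~1 kg gained in a week"},
    {"arm": 3, "upf": True, "energy_dense": True, "hyperpalatable": False,
     "interim_result": "intake trending lower than arm 2, but weight still gained"},
    {"arm": 4, "upf": True, "energy_dense": False, "hyperpalatable": False,
     "interim_result": "intake close to arm 1; similar weight loss, but no body fat loss"},
]
# Note the combination the study did not test: a high-UPF diet rich in
# hyperpalatable foods but low in energy density.
```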
The study was still ongoing while we were writing, but already it tells us yet again that food environments are hugely important for determining what and how much we eat, and that energy density and hyperpalatability are important drivers of overeating. It also tells us that diets featuring UPFs can be designed in ways that don't necessarily cause weight gain. That last bit squares with what we know from observational research about UPFs: Some subcategories are clearly harmful (like sodas, processed meat, and refined-grain products), while other UPFs are associated with reduced risk (like whole-grain products).
Even if Kevin and his colleagues unravel all the factors in our ultra-processed food environment that most encourage overeating, there will probably be other important negative effects of UPFs beyond weight gain, obesity, and its downstream consequences: how these foods might cause gut dysbiosis and inflammation, for example, altering our immune systems to possibly increase the risk of autoimmune diseases and other health conditions. Overeating isn't the only thing we need to be concerned about when it comes to UPFs, nor is it the only thing we've yet to understand about how these foods impact our health.
As for the question of whether it's the nutrients or the processing and formulation that matters most to health, here's how we think about it: It's not necessarily the processing or formulation of the UPFs in and of itself that's most problematic. It's that the industrial processing and formulation of these foods made it possible to pack together energy and ingredients in a wide variety of new combinations, and at a scale, ubiquity, price, and convenience your grandma's apple pie will never see. Delicious foods that were once expensive, labor intensive, and only available on special occasions—cakes, potato chips, pizza, cookies—have become cheap and easy to eat almost every day, all day. Not to mention, as Carlos Monteiro has pointed out since NOVA's beginnings, these UPFs—enabled by their huge profit margins—are heavily marketed to the public.
So what does all this mean on a personal level? Buy fewer foods that are energy dense and hyperpalatable, ultra-processed and otherwise, because they likely drive overeating and weight gain. You can try to cut back on UPFs generally, and reengineer your personal food environment by not keeping the stuff you can't stop eating at home or within easy reach. If you make meals using UPFs, lower their energy density and add nutrients by combining them with lots of vegetables. The vegetables don't have to be fresh or organic; frozen or canned will do just fine.
But telling people to change their behavior once again puts the onus on individuals when it's overwhelmingly clear: The food environment is the problem. If the risk of diet-related disease is so heavily influenced by our environment, then policy and regulation—not individual health advice—have to be the answers. As Kelly Brownell, the Duke University researcher who coined the phrase "toxic food environment," told us, this is how we approach every other sphere of life where it's difficult to be personally responsible. Instead of just telling people to drive carefully, we build guardrails and mandate speed limits, air bags, and seat belts; instead of suggesting that people boil their water, we treat and filter it at the community level; instead of asking people not to smoke, we heavily restrict the sale and marketing of cigarettes, tax tobacco, and outlaw smoking in public spaces.
"There are lots of places where we deputize our government to step in and protect us from negative influences, and in ways that make it easier for us to be responsible and to survive negative things that are out there in the environment."
There's overwhelming evidence of the vast cost and suffering caused by diet-related chronic diseases, and that the ultra-processed food environment has played a leading role. In America alone, diet-related conditions like obesity, cardiovascular disease, and diabetes affect more than one hundred million people and annually cost almost 9 percent of the gross domestic product. Doing nothing is certainly an option—but it doesn't seem like a particularly wise one. So where to start?
Skip Notes
*1 Kevin's study was not designed to test the effects of protein, and other studies suggest that protein may have a satiating effect, as we discussed in Chapter 3.
*2 Some neuroscientists have even begun to question dopamine's central role in addiction.
*3 The so-called reinforcing nature of some UPFs is best encapsulated by the classic Lay's potato chip "betcha can't eat just one" ads. While hyperpalatable, classic Lay's potato chips aren't actually UPF.
CHAPTER 8
Purer Food
We just took a close look at how environments featuring different kinds of food can cause overeating and maybe even addiction. Now let's zoom back out again—to all the other ways our food environment shapes our health, beyond the specifics of the foods themselves. In Sclafani's studies, just like Kevin's, the subjects got unfettered access to as much food as they wanted. But neither study tested how factors like price, marketing, the variety of foods offered, or convenience affect eating behavior and body weight. These nonnutritional features of the food environment may be the most important drivers of overeating in the real world. To have lots of diet-related chronic disease, we need the compelling, potentially addictive foods. But we also need those foods to be accessible, affordable, abundant, and heavily marketed. And right now, hyperpalatable and calorie-dense UPFs are almost freely available to many people, compared with the cost and inconvenience of buying and preparing healthy, whole foods. As Chris van Tulleken describes it, rising obesity rates are a "commerciogenic" phenomenon, caused by cheap, easy, tasty, ultra-processed foods.
Let's pretend we live in an alternate universe for a moment. This is a place where delicious, healthy meals with lots of veggies are the most accessible, convenient, and affordable—and UPF treats are rare and expensive. Imagine if the healthy foods were not only easy to access and eat but heavily advertised. We bet that changing the economics of food to make the real world look more like this alternate universe would cause a lot of our food-related health woes to go away. Now for some good news: Economics are what we can most readily influence through regulation and policy.
If the last chapter was the ultimate case that our toxic food environment—not individual failures of willpower—is the major driver of the current epidemic of diet-related chronic disease, in this chapter, we'll explain what we can do about it. Luckily, there's a road map from recent history for what works to change our food for the better. The early twentieth century faced a problem that parallels the one we have today.
While we imagine that our great-grandparents lived in a time of food purity, eating healthy, home-cooked meals from farm-fresh ingredients, in reality, processed foods were sickening and killing people, including children, at alarming rates. The chief chemist at the U.S. Department of Agriculture was determined to figure out why, while the food industry fought his efforts by exploiting its secret handshake with government.
AMERICA'S STOMACHACHE
The oak tables are draped in white linen and set with cloth napkins and plain china. Every day, for breakfast, lunch, and dinner, a dozen young men in suits arrive for their meals. At one table, half of the men eat seasonal fare—string beans, potatoes, roast beef, chicken, fresh fruit, bread, and butter. At a second table, the other diners eat the same foods and drink the same beverages—coffee, tea, milk, and water. They also take capsules containing the kinds of chemicals that are used to bleach laundry, burn off warts, and embalm dead bodies.
The cook and waiters in this restaurant, in the basement of the Department of Agriculture in Washington, D.C., are public servants. The diners—young men, mostly in their twenties, mostly clerks at the department—are volunteers standing in as "America's stomach." Described in the newspapers as "martyrs to science," they've come forward on behalf of the public to be poisoned. For a whole year, starting in 1902, the men rotate between the two tables for their meals, eating, drinking, and taking the poisons—while promising to consume nothing outside of this unusual restaurant. Everything they imbibe is carefully weighed. Their health status—including their body weight, temperature, heart rate, blood markers, and any symptoms—is monitored and every detail recorded. Even their feces and urine are analyzed. As the weeks progress, the researcher overseeing the study—a chemist named Harvey Washington Wiley—ratchets up the dose of the poison. Whenever the health of the two groups of men diverges, the poison in the capsules is to blame.
Wiley, the head of the agriculture department's bureau of chemistry, resorted to deliberately poisoning colleagues because he had grown desperate. At a time when America was rapidly urbanizing, industrial food processing was on the rise to meet the needs of new throngs of city dwellers. Food makers had begun experimenting with additives to preserve their products and boost their profits. Wiley wanted more transparency and oversight of this emerging, and entirely unregulated, industry because he suspected that some of its practices were as harmful as they were stomach churning.
Dairy producers routinely spiked milk with formaldehyde to stave off rotting, while also dribbling in plaster of paris and chalk to whiten the liquid and make it appear more "wholesome." "To mimic the expected layer of cream on top, they might also add a final squirt of something yellowish, occasionally puréed calf brains," the science journalist Deborah Blum details in her eye-opening Wiley biography, The Poison Squad. Candymakers laced their confections with toxic metals such as arsenic, lead, and copper. Jars labeled as honey were actually corn syrup, capped with a sliver of honeycomb to fool consumers. Ketchup contained only traces of tomato along with a slurry of cornstarch with apple and pumpkin scraps dyed red. Clay dust and red lead were deployed to color spices, candy, and cheese. Flour might feature gypsum—normally found in walls and ceilings—and crushed stones.
And consumers had no way to protect themselves from the deceptively marketed and adulterated foods. Manufacturers at the time weren't required to so much as disclose the weight of the food in the jars and bottles they were selling, let alone correctly list all the ingredients. By the time the poison squad studies launched, Wiley had already been lobbying for a decade for more transparent food labeling regulations. At the very least, he believed, consumers ought to know what they were buying so they could make more informed choices. But no one had systematically studied how small doses of the new chemicals in industrially processed food affected human health. As much as the link between processing methods and poisoning seemed obvious, the dearth of concrete proof was one reason officials could put off regulating the industry. Wiley designed the poison squad trials to fill the gap, testing the most common additives in graduated doses. From there, he would be able to deduce whether the chemicals were safe to eat, and at what amount.
FINDING THE POISONS
The first phase of Wiley's experiment focused on borax, then one of the most popular chemical preservatives in food. In turn-of-the-century America, borax was used to curtail the growth of fungi and bacteria in milk and meat products—a practice that animal studies of the time suggested should be safe in small doses. But at less than half a gram per day, Wiley discovered, the participants in his trial began reporting stomachaches, mental fog, and nausea. At 3 grams per day, they were vomiting or dropping out of the study. One by one, he continued to test popular additives—salicylic acid, formaldehyde, sulfurous acid. People, he established, were being poisoned by common ingredients in processed food.
Wiley grew increasingly concerned about not only individual additives but their cumulative effects. For Americans who were eating several processed dairy and meat products every day, it would not be difficult to exceed the doses that seemed safe. The research shifted his stance: Rather than just recommending transparency about poisonous additives, he began to advise their outright removal from the food supply.
The media frenzy around the trial grew—one Wiley encouraged. He dedicated almost as much time to sharing his findings with the public and advocacy groups as he did to the science itself: participating in media interviews, penning popular magazine articles, delivering talks to influential stakeholders, including women's groups. He knew that he needed the public on his side to make change. The stories of what had been happening in the basement of the Department of Agriculture spread, finally illuminating the cause of the stomachache Americans had endured for years: tainted processed food. "The experiment tends to explain away many of the poison 'mysteries' following the eating of canned goods and preserved foods generally," the New York Times reported in 1904.
But even with the clear evidence of harm, and even as Canada and Europe worked to rid their food of similar poisons, little changed in America. The food industry pushed behind the scenes to both undermine Wiley and his science and lobby Congress to resist regulation. President Theodore Roosevelt tried to pressure Wiley into being friendlier to industry, eventually appointing a board of scientific experts to discredit him.
As Blum, the Wiley biographer, told us of reading through some 1,400 linear feet of her subject's papers—including internal government memos: "You start seeing spread before you in a panorama the secret handshake that exists between government and industry."
A work of carefully researched fiction finally brought the change Wiley had campaigned for. Upton Sinclair's The Jungle was published in 1905 as a serialized novel exposing the gruesome conditions of the U.S. meatpacking industry—how workers were made to handle sick cows in stinking and filthy processing plants, awash with chemicals to disguise the rot and stench. Sinclair, a muckraking journalist and socialist, hoped to use the novel to raise awareness about the plight of U.S. factory workers. But the public was primed to read it another way: as further evidence of the impurity of the food supply. Fact-checking teams—including one dispatched by President Roosevelt, who did not trust Sinclair "the socialist"—traveled to Chicago to verify whether the journalist was depicting reality. They found conditions even worse than fiction. When the findings from the Roosevelt-commissioned report leaked to the press, other governments began barring imports of U.S. meat. Inaction and stalling were no longer options.
In 1906, two laws passed: the Meat Inspection Act and the Pure Food and Drug Act, which became known as "Wiley's Law." The Meat Inspection Act required the cleanup of slaughterhouses and barred meat processors from selling adulterated and misbranded products. Wiley's Law prevented the manufacture, sale, and transportation of adulterated and mislabeled food and medicines. Savvy businessmen like Henry Heinz got involved in supporting the standards. (Heinz had invented a way to keep his products from spoiling without adding poisonous adulterants.) Regulation helped businesses like the H. J. Heinz Company because, until Wiley's Law, their pure food products had been competing on an uneven playing field. The food safety mandates arguably spurred the innovation that eventually led to the rise of fast food and unhealthy UPFs. Now so successful, those foods form the foundation of the food environment that is killing us slowly rather than quickly.
THE CHRONIC FOOD POISONING ERA
Thanks to Wiley's era, we have regulatory guardrails in place to prevent acute food poisonings. This thing that seems so obvious that we now take it for granted—we probably won't leave lunch with a stomachache, or be killed by our dinner—was not at all obvious a century ago. Now, instead of mere processing, we use ultra-processing to make shelf-stable foods that make up most of the calories feeding populations in places like the United States and the UK, and growing proportions elsewhere in the world. We have more energy and protein at our disposal than at any other time in history—but our food environment is fattening and sickening us on time scales of years and decades. The public is once again losing faith in the healthfulness and safety of the food supply, and those who have the privilege and wherewithal are cutting UPFs from their diets. We need to raise regulatory guardrails and institute policies to prevent our food from chronically sickening us. As Kelly Brownell, the Duke "toxic food environment" researcher, told us, "If six people go to a dinner and get sick from the tainted meat, the health authorities are all over it.
But if thousands of people get obesity, diabetes, heart disease, and cancer, the government stands back." Almost all of the Food and Drug Administration's (FDA's) food safety budget goes toward acute poisoning, not chronic disease, even though more Americans die every day from chronic food illnesses than from acute food poisonings every year. To build those guardrails, we need aggressive policies and regulations that achieve the kind of inversion we mentioned at the top of this chapter—making the healthiest foods more convenient, affordable, attractive, and accessible, while the least healthy undergo the opposite transition, into relative obscurity.
PICK YOUR TARGET (AKA DEFINING UNHEALTHY)
Wiley never suggested doing away with food processing. Instead, he identified the chemicals that were poisoning people and recommended that those be removed from the food supply. We think a similar logic applies to the food environment that's making us chronically sick. We need to identify the attributes of food that contribute most to diet-related disease, and target regulations and policies to minimize them.[*1]
Science is our best guide here. In an ideal world, we'd have clear data ready before acting so we don't waste political capital and squander public trust on the wrong moves. And right now, this science is barely happening in America. In a recent review of research funding priorities, a U.S. senator issued a report noting that "only 1.3 percent of all NIH-funded research projects addressed the role of diet and nutrition in the prevention and treatment of disease," representing a "historic devaluing of the importance of nutrition science…[that] indicates a misalignment between NIH spending and public health impact."
Kevin experienced the devaluing firsthand. Shortly after his first UPF study was published, Kevin's bosses tried to close the research facilities that he and his colleagues use to study nutrition and metabolism. Kevin pushed back, asking high-ranking allies to lobby on behalf of him and his colleagues. In the end, the metabolic unit was not closed, but Kevin had to settle for substantial resource cuts that slowed his research to a crawl. That ongoing second UPF trial is progressing at less than half the pace of his prior studies.
So rather than waiting for the data, which foods could we target based on what we know now? Some public health campaigners contend that the answer is all UPFs. There's a lot of overlap between UPFs and foods high in the nutrients of concern, including sugar-sweetened beverages, cookies, candies, cakes, chips, and prepared meals. When people consume more UPFs, they eat more calories, sodium, saturated fat, and sugar, and take in less fiber, protein, and micronutrients—the exact opposite of how we ought to be eating for health. But many nutrition scientists (and food manufacturers, of course) are skeptical. The UPF category in NOVA spans a range of products "so broad that it borders on useless," as one New York Times opinion article put it. It includes foods we shouldn't have to eliminate—like some brands of hummus, canned beans, or whole-grain breads.
Instead of targeting all UPFs, we propose focusing on energy-dense and hyperpalatable UPFs that also fail to meet the new FDA definition of "healthy." This definition goes into effect in 2028 and provides manufacturers with the requirements their products must meet in order to voluntarily label them as healthy on the front of the package.
They must contain ingredients from food categories people need to eat more of—like vegetables, fruits, and whole grains—while also limiting sugar, sodium, and saturated fat.[*2] If a food doesn't meet the FDA's new standard, that doesn't necessarily make it unhealthy—it could have a neutral effect, neither promoting nor harming health. But by targeting UPFs that are also energy dense and hyperpalatable, we can zero in on the foods that, at least for now, seem to be the greatest diet-related disease culprits. Some UPFs are already considered healthy by the FDA's standard, whereas others are neither energy dense nor hyperpalatable (again, think whole-grain bread, yogurt). None of these foods would be the subject of policies or regulations. Neither would non-UPF cakes or cookies from your local bakery, nor takeout pizza from the mom-and-pop pizza shop around the corner. The rationale for excluding non-UPFs is not that they're good for us but that we want to avoid affecting small producers who don't have far-reaching impacts on population health. Meanwhile, other target products could include those that have been clearly linked to increased risks of diet-related chronic diseases—like sugar-sweetened beverages and processed meat products. (Sugary drinks are UPF but not energy dense or hyperpalatable; some processed meat may not be UPF.)
So now we know our target: UPFs that are energy dense and hyperpalatable and that don't meet the FDA definition of healthy, as well as foods that aren't necessarily all of the above but have been consistently linked to serious health harms. These foods should be minimized, treated as "recreational substances" to be indulged in rarely. But we must also apply a battery of other public health policies and regulations, similar to the ones that reduced tobacco consumption: marketing restrictions, mandatory labeling, and aggressive taxes. We also need to introduce policies that make healthy food more convenient, affordable, and widely available in our food environment. Let's start with the marketing.
END BAD MARKETING
The aggressive marketing of unhealthy food is a global problem that goes back decades. In the United States, public health campaigners have argued for half a century that we must curtail the marketing of junk food products, especially to children. In all those years, little has changed. Today in the United States alone, food and beverage companies spend close to $14 billion annually marketing unhealthy products—too much of it targeted to kids. The least nutritious fare—sugary breakfast cereals, candy, fast food—tends to be the most heavily peddled. Many of the ads and food packages look as appealing as cartoons: bright and colorful, covered in images children love. These ads meet kids everywhere they are: the breakfast table, the soccer field, school, toys and games, social media feeds. Maddeningly, they're supported by taxpayer dollars: Tax law in the United States incentivizes the food industry to develop and heavily advertise unhealthy foods.
Research clearly shows that marketing is linked to kids eating more of the marketed products (why else would companies invest?), as well as kids choosing advertised foods when given options, or asking the adults in their lives to buy them. The UK advocacy group Bite Back showed how this plays out—how marketing subliminally influences young people's food choices.
They exposed eight adolescents to ads for unhealthy "triple dipped chicken" using a variety of marketing strategies across multiple domains, especially on social media. Later, they asked the volunteers to choose from a fifty-item menu in a restaurant; all selected the triple dipped chicken. The teens were mystified by the predictability of their choices and dismayed at how they'd been manipulated.
Governments should do what health groups have been demanding for fifty years: outlaw advertising of unhealthy foods—our target foods—to kids. Better yet, they could go even further and mandate blander packaging and front-of-pack warnings. Latin America is leading the way here. Chile, Peru, and Mexico have all put warning labels on products with high levels of added sugar, salt, and saturated fat. Chile also banned all food marketing to kids on all media from six a.m. to ten p.m., and mandated the removal of cartoon characters like Tony the Tiger from junk food packaging. The UK government similarly decided that it had had enough. As of 2025, online junk food ads directed at kids were outlawed, and such promotions can only appear on TV after nine p.m.
The marketing that remains has to be held to a higher standard of accuracy and clarity. It makes no sense that companies can market jelly candies as sources of "real fruit juice," or "whole-grain" breads and cereals that have zero whole grain in them, or "breakfast bars" as healthy sources of fiber and vitamins when they're essentially thinly disguised candy bars.
In America, the First Amendment to the Constitution makes it legally tricky to pull off such advertising and marketing restrictions. But with public interest and political will, anything is possible. America could also immediately eliminate the tax deductions that food manufacturers use to offset the costs of developing and marketing these products. In addition to the voluntary healthy food labels, we could mandate unhealthy warnings on our target foods, following Latin America's lead. To that end, the government could enforce the FDA's proposed mandatory front-of-pack labels, which would inform consumers about the levels of saturated fat, sodium, and added sugar in packaged products. Despite being in the works for decades, the labels haven't yet gone into effect and, once again, food manufacturers are fighting them while also pushing back against the FDA's healthy food definition. Maybe, with lots of public outcry, as we saw in Wiley's time, the restrictions would finally materialize. Maybe they could even reflect the latest science, featuring information about the other properties of foods we know lead to overeating—like hyperpalatability and energy density.
MAKING UNHEALTHY FOODS HEALTHIER
Let's imagine all that happens: The unhealthy target foods are now less visible in the food environment. Consumers are also better informed about what they're eating, with more accurate marketing and even warning labels. What about making the target foods healthier? Before we get into how to go about doing that, we should acknowledge that many public health advocates disagree with the idea that UPFs can or should be reformulated to contribute to healthy diets. If the purpose of UPFs is to maximize profits for multinational industries by displacing traditional healthy eating patterns, there's no way they have a role in our future health, the logic goes. We agree that opposing the transition away from traditional foods in countries whose food supplies aren't already dominated by UPFs is a reasonable strategy.
Those advocates are working to prevent the food equivalent of the gas-guzzling automobile from taking over. That's a good idea. But for other countries, including the United States and England, the ultra-processed food transition has already taken place. Most people rely on UPFs to make life work, either replacing cooking or making it easier (think premade sauces, frozen meals, and salad dressings). Some live in conditions without access to a kitchen or equipment to cook. Others lack the bandwidth or motivation. Even before the act of cooking itself, there's the time spent planning and buying fresh ingredients, a maze to navigate that falls overwhelmingly on women, the meal preppers and grocery shoppers in most families. Like many working moms, Julia lives this tension, and it's an irony of this book that the more time she spent writing it, the less time she had for meal prep. And she's a privileged foodie who enjoys cooking and has help in the house.
Food companies also know this tension and exploit it. From the earliest days of food processing, they've heavily marketed the convenience of their products to women, who were entering the workforce in growing numbers. Today, it's arguably easier than ever to cash in on the family meal preppers. In the decades since UPFs took off, the social safety net has frayed and real wages have stagnated; more families have single parents or two working parents and less leisure time. We've experienced inflation that drove up the cost of food. UPFs are not only faster and more convenient than whole foods, they're also often cheaper. On a calorie-per-dollar basis, sugar, vegetable oils, and refined grains cost far less than fruits and vegetables, which has led to a situation where you can buy a dozen donuts for 99 cents and four apples for $9 in America. It's not surprising that only around 10 percent of Americans eat the recommended daily intake of fruits and vegetables—a number that drops to 7 percent among poor people. It's also no mystery why chicken fingers and frozen pizza win the fight for what goes on the dinner table at night. (Another key detail: We don't even grow enough fruits and vegetables to feed everyone the recommended amounts; stay tuned for more in Chapter 12.)
So while we'd love for everyone to sit down to freshly cooked, seasonal, healthy meals every day, the reality is that's unlikely to happen. Food manufacturers aren't going anywhere. But there are ways to put pressure on them to make healthy food options while also penalizing their unhealthy products. Some of this will involve reformulated UPFs. Reformulation today shouldn't look like the reformulation that happened in the past—removing specific nutrient bogeymen like fat or gluten while quietly ramping up sugar, sodium, and other additives. We're not talking about making "healthy" cookies and ice cream. We're talking about reimagining the foods people rely on to feed their families so they're less harmful. The newly reformulated products should have to meet the new FDA definition of healthy, while not being energy dense or hyperpalatable.
Take foods like pizza, which 11 percent of Americans eat on any given day and which is a seemingly ever-present item at birthday parties attended by Kevin's children. Most ready-made pizza would not qualify as healthy according to the FDA definition, but that doesn't have to be the case. Nutrition professor Mike Lean, at the University of Glasgow, teamed up with entrepreneur Donnie Maclean to create a reformulated, healthy pizza that is now being sold in the UK.
Low in sodium and saturated fat, the pizza also features a multigrain dough and real cheese and tomatoes on top. Other researchers have estimated that reformulated pizzas in the United States alone could have a substantial positive health impact. Some of these meals might still technically qualify as UPFs, as hummus and packaged whole-grain bread often do, but there's little doubt that eating them would improve the diets of many people. Kevin's study even suggested as much. Remember that his most recent randomized trial shows that UPFs can be part of a daily diet that doesn't cause excess calorie intake and weight gain—as long as the high-UPF meals don't contain too many hyperpalatable foods and their calorie density is lowered, often by adding fruits and vegetables. His study participants also rated the healthier meals that made them eat fewer calories as being as enjoyable as the unhealthy ones.
Companies have long resisted and lobbied against reformulation policies, arguing that they'll ruin the appeal of their products and hurt sales. As a cautionary example, General Mills voluntarily reformulated Trix breakfast cereal to remove some cosmetic additives. What followed were unhappy customers, presumably choosing products from competitors. This prompted General Mills to reverse course and return to the original formulation. We argue that if product reformulation were required of all companies, the playing field would be level, just as it was when acute poisons were removed from the food supply in Wiley's time. The most innovative reformulations would be rewarded.
Mandatory reformulation could also gradually reset an American palate that has adapted over recent history to overly sweet, fatty, and salty foods. As we've seen in this book, we have innate drives for these nutrients—but we've also learned to crave them at the megadoses the industry set. We can and should set new mandatory standards that guide the American palate back to health. In the United States, we don't even need new legislation to do this. The government has already unveiled voluntary sodium reduction targets. It could make such targets mandatory, and then do the same for sugar, refined grains, and saturated fat.[*3] Some legal scholars argue that existing food safety statutes—the legacy of the Wiley era—can be reinterpreted to protect against chronic as well as acute food illness. Again, all we need is the political will to start acting.
Having said all that—and this is important—reformulation alone is not nearly enough to prevent disease and reverse course. That's partly because successfully reformulated healthy foods will likely be more expensive than the current unhealthy alternatives. So let's move on to the other levers we need to pull to change not just our food but the food environment.
PROMOTE HEALTHY FOODS AND TAX THE REST
So we've already made healthier versions of prominent unhealthy foods, while also making it harder to promote junk through advertising and marketing regulations. But if consumers see healthy foods on the shelf next to unhealthy alternatives that are less expensive, the junk food may too often win out. That's why we also need the tool that was the most effective at reducing tobacco consumption: taxation.
We can start by taxing target foods that can easily be reduced or even eliminated from the diet with no downsides for health (soda, candy, cookies, chips) or replaced with healthy alternatives without much added cost (sugary drinks with water; sweetened yogurt or milk with plain; unhealthy breakfast cereal with healthy options like oatmeal). This is already happening with one UPF. After decades of public health advocacy, taxes on sugar-sweetened beverages have been implemented in nearly 120 countries. Colombia recently went further, introducing a new tax on all UPFs with poor nutritional profiles. To make sure the taxes don’t just increase food prices in general, we also need to use some of the tax revenue to promote healthy replacements so that they’re more affordable, more prominent, and more convenient. This is where we have to think of smart ways to move the burden of improving dietary patterns up the food supply chain…by, you guessed it, changing the food environment. What if we held supermarkets to high health standards regarding the overall range of products they offer? Right now, the quality of grocery purchases is way below dietary recommendations, according to U.S. Department of Agriculture (USDA) data. This is because manufacturers of the most profitable UPFs pay a premium for access to prime shelf space. Affordable, convenient, and healthy alternatives are relatively less available and less visible. Rather than blaming consumers for their poor choices, policies could incentivize the sale of a basket of products that reflect healthier overall dietary patterns. To do this, tax credits could be issued to supermarkets based on the general dietary quality of their sales. Given the immense purchasing power of supermarkets, such policies would exert powerful upstream demands on food manufacturers to produce healthier products. Moving even further upstream, the income of food manufacturers could be taxed to promote improvements in the overall quality of the products sold. Along with eliminating manufacturer income tax deductions for expenses related to the development and marketing of unhealthy products, excise taxes could be levied on those products. Food deserts are characterized by the lack of healthy food available in a large geographic area. They result from local markets being pushed out of business when distant supermarkets use their purchasing power to squeeze discounts from their vendors, allowing for lower supermarket prices. But research suggests that improved access to healthy fresh ingredients at supermarkets and health food stores doesn’t solve the problem. It doesn’t change consumption patterns, maybe because many people lack the money, skill, equipment, and time to prepare meals from fresh ingredients. To that end, we see a role for supporting businesses that make affordable, healthy, prepared food more available to people in a decentralized way. Nourish, a program based at the University of California San Francisco, is working on this, connecting small businesses and entrepreneurs in food deserts with technical and financial support. Healthy foods should be the main fare offered at schools, daycares, higher ed, hospitals, food banks and pantries, community kitchens, and other institutional settings. This would protect large and vulnerable segments of the population.
In Brazil, kids already get school lunches that include foods like pasta with fresh tomato sauce, tuna and vegetables, or a kale salad—because of a government requirement that most of what’s offered for school lunches be fresh and minimally processed. That’s not all. The national school food program recently lowered its cap on UPFs from 20 percent of calories to at most 15 percent, while at least two municipalities have already banned UPFs from school canteens entirely. Japan, a developed country with a very low rate of obesity, also views school lunch as an avenue for keeping kids healthy and preventing obesity. The government requires all public schools to offer free nutritious lunches, freshly prepared under the supervision of a school nutritionist. France is taking environmental sustainability into account, asking schools to serve meals in which at least half of the ingredients are sustainably sourced and high quality; by 2040, it will outlaw the use of single-use plastics in school catering. All of these institutional mandates change the food environment rather than targeting individual consumers. Unhealthy food shouldn’t be part of government programs that provide food subsidies to the public, such as the Supplemental Nutrition Assistance Program (SNAP) in America. Subsidies could also make healthy products more affordable. For example, in the United States prepared foods are currently subject to retail sales taxes and are not eligible for SNAP benefits, but these restrictions could be eliminated for healthy, high-quality prepared foods. The funds raised by taxing unhealthy products could go toward subsidizing whole-food ingredients, like fruits and vegetables, as well as healthy ready-made meals and snacks. Imagine if we subsidized take-out restaurants serving healthy prepared meals at competitive prices in underserved neighborhoods and made them eligible for SNAP. This might have more of an impact than just adding supermarkets to food deserts. There’s so much more we can do. What if, instead of chocolate bars, chips, and candies at checkout aisles in hardware, grocery, and clothing stores, people were offered nuts and ready-to-eat fruits and vegetables or no food at all? What if we also banned the common practice of making smaller portions relatively more expensive, which just encourages consumers to purchase larger portions that lead to overeating and food waste? What if restaurant chains and other venues that provide prepared meals were also subject to policies that encourage overall menus to meet minimum quality standards? Some savvy companies are already taking it upon themselves to do that. Kevin recently visited Google’s main campus in the San Francisco Bay Area, where well-appointed cafés offered employees free healthy, fresh, and delicious meals: spiced vegetable curries, pecan-encrusted fish, kale salads. Google values the sense of community and collaborations that spring up when employees gather in its cafés, a goal that could theoretically be accomplished over free pizza and beer. But Google’s Food Team goes to great lengths to encourage healthy choices, even hiring a Food Choice Architecture and Nutrition Manager. We wonder whether the free access to tasty, convenient, and healthy food increases productivity and decreases the healthcare burden at the company.
Given the true cost of our current food environment in terms of sickness and lost productivity, governments might find it cheaper to subsidize Google-style cafés or high-quality meal delivery services in regions that have little affordable access to healthy food. What if, like clean drinking water, access to delicious, convenient, healthy meals were considered an essential human right instead of a Silicon Valley employee perk? Getting there means changing our food culture—another critical part of the food environment. Here we can also borrow insights from anti-tobacco campaigns that shifted norms around smoking, making it undesirable and even illegal to puff away in public spaces. Some of this might involve policy and regulation, like ad campaigns that promote fruit and vegetable consumption, or education mandates related to food and nutrition. Japan has been doing this since 2005, part of its strategy to combat preventable chronic diseases. The Shokuiku curriculum for kids is required by law in all schools and covers not only basic knowledge about nutrients and eating for health but also food culture, enjoyment, and appreciation. The food industry shifted the culture of what and how we eat, normalizing eating on the go, constant snacking, and massive portion sizes, and replacing home cooking with UPFs, much as the tobacco industry promoted smoking as a desirable and cool pastime. We need to fight back to reverse the unhealthy norms. But just like reformulation and marketing restrictions, cultural and education programs alone won’t go far enough. (See the theme here?)
PRESUMING SAFETY VS. TAKING PRECAUTION
Let’s zoom back in from food environments to the ingredients in what we eat, and what to do about those. Early in the twentieth century, Wiley’s work identifying harmful additives in food and the creation of the Pure Food and Drug Act resulted in the removal of all harmful substances from the food system. Just kidding. The situation is more complicated than ever. In Wiley’s time, his team in the Agriculture Department estimated that there were 152 new preservative chemicals in the U.S. food supply. In America today, there are more than ten thousand chemicals that can be added to food or used in the processing equipment and packaging that touches the stuff we eat. This is, in part, because of a loophole that emerged in American food regulation in 1958. Congress introduced the Food Additives Amendment, which required food companies to seek premarket approval for any new substance that would be added to or come into contact with their products. The act also included a clause about substances that are “generally recognized as safe,” or GRAS. The idea was that common food ingredients—flour, olive oil—would not require premarket vetting by the regulator. In 1997, the FDA began allowing companies to self-determine whether a substance was GRAS. Fast-forward to today. Approximately 99 percent of the known chemicals that have come onto the U.S. market for food processing since 2000—so virtually all—entered through the self-determined GRAS pathway, instead of securing FDA’s premarket clearance. When food companies start using entirely new ingredients in their products, they can simply hire an expert group to review the substance or ask experts within the company to do so, without any outside oversight. They then notify the FDA that they’ve determined the chemical is good to go—but even that notice is voluntary.
Since 1997, the FDA has been notified of fewer than half of the chemicals being used as GRAS.[*4] If the agency has tough questions after receiving such a notification, a company can simply withdraw its notice and continue to use the chemical anyway. As researchers at Harvard Law School and the University of Connecticut summed up, “It is impossible for FDA to know the number of chemicals in use throughout the U.S. food supply because the agency is unaware of the GRAS substances being self-determined by companies.” A recent example of an unvetted ingredient making it into the food supply involved Tara flour, made from the seeds of a legume grown in Peru and featured in a product called French Lentil + Leek Crumbles from the vegan food company Daily Harvest. After eating the crumbles—a ground-meat substitute—people reported experiencing abdominal pain, fever, nausea, and joint pain. In the worst cases, they were diagnosed with liver dysfunction and had to have their gallbladders removed. The last time we checked, 393 people were known to have fallen sick after eating the crumbles, and 133 had been hospitalized, across 39 states. Daily Harvest’s supplier—which had obtained the Tara flour from Peru—suggested the ingredient as an alternative to a nut-based protein. What the company leadership may not have known: Tara flour was not widely used in food in America and had never been vetted for human safety. Their supplier had simply “shared with us documentation and positioned [Tara flour] as Generally Recognized as Safe,” the company’s chief supply chain officer told Fast Company. The case left Bill Marler, a longtime food safety lawyer who is representing people who had been harmed, wondering how many other companies are doing the same thing with novel food ingredients. “It’s not a system that’s created to actually find things,” he told us. Once in the food supply, ingredients that are known to be toxic can linger for much longer than they should. Two recent examples are brominated vegetable oil and artificial trans fats; each took decades to remove after clear evidence of harm. Meanwhile, foods can still contain hazardous chemicals that we have known since Wiley’s time shouldn’t make appearances on our dinner plates—think lead in our cinnamon, pathogenic bacteria in our salad. Enforcement fails because the FDA doesn’t have the resources and power to police the food supply the way it should. Part of this stems from industry lobbying dating back to Wiley’s time. Thanks to The Jungle, the Meat Inspection Act—which passed the same year as Wiley’s Law—empowered the USDA to do safety and health checks, requiring that government inspectors be present in meat processing facilities. That forced the meat industry to lobby for the USDA’s food safety budget. The result: It’s bigger than the FDA’s entire food inspection program, even though the FDA is responsible for 80 percent of the American food supply. Again because of industry lobbying, the FDA doesn’t have the resources to do premarket approvals for most foods, and there’s a perverse incentive to keep it that way. “The less resources FDA has, the less of a problem they’ll be for the industry,” said Jerry Mande, a former senior official at the FDA and the USDA. “Who decides the safety of food additives in America?
It’s the executives in food companies whose job it is to make money, rather than the scientists whose job it would be to ensure we’re safe.” So even though our knowledge of food safety may be much more sophisticated than in Wiley’s era, “sometimes we look just like the nineteenth damn century in the way that we apply this knowledge,” Deborah Blum, the Wiley biographer, told us. Trying to steer clear of toxic ingredients or problematic foods is as difficult today as trying to avoid borax or formaldehyde in Wiley’s era. Even if you cook all your meals and you can afford to do things like buy your milk and yogurt in glass bottles, the use of additives and food contact chemicals happens so far upstream—or so covertly—you can’t insulate yourself. Consider perchlorate, which is used to make rocket fuel; even small amounts seem to affect thyroid function and development. Yet it was approved for use in plastic packaging of dry food ingredients, like oats, flour, and spices. It’s also a degradation product of the bleach that’s used to disinfect food and manufacturing facilities. Jim Jones, the former Deputy Commissioner of the Human Foods Program at the FDA, who worked on food additives, told us perchlorate “certainly should be a priority for FDA review.” For now, you won’t find perchlorate listed on the labels of foods, just as you won’t find any detail about the additives that are included in foods as “natural” and “artificial” flavors. Despite the different names, both are human-made, and both can hide unhealthy ingredients. Similarly, you won’t see details about chemicals present in trace amounts or incidental additives. Unless they’re major allergens, they don’t need to be declared on food labels because they’re under certain thresholds, and don’t perform any function in the final product. But they make their way into food anyway, sometimes unintentionally during processing. Other times, manufacturers may exploit this loophole, including multiple ingredients with similar functions, each below the reporting threshold, to deliver “cleaner” ingredient labels to consumers. Researchers have raised concerns about certain additives, however, even in minuscule amounts, such as methylene chloride. Banned by the FDA in cosmetics because “it has been shown to cause cancer in animals and is likely harmful to human health,” methylene chloride remains in decaf coffee, beer, and spices because of the incidental additives exemption. So again, in your food, not on your food label. These chemicals, hidden or otherwise, can add up in the diet. Since Wiley’s time, regulators haven’t tackled a fine point even he worried about: the cumulative effects of additives. Maricel Maffini and Tom Neltner, experts on GRAS, decided to analyze how often companies considered the cumulative effects of the chemicals they use—something the 1958 amendment required. They downloaded and reviewed all 877 safety determinations that companies included in their GRAS notices to the FDA between 1997—when the FDA launched its voluntary GRAS notification program—and 2020. Only one of the 877 notices had accounted for the cumulative effect requirement. For the record, we don’t want to stoke chemophobia. You shouldn’t be afraid of “chemicals” in food because, as we’ve said since page one of this book, everything is a chemical! You’re made of chemicals! Not all chemicals are bad for you, and of the ones that are, it’s often the dose that makes the poison.
Small amounts of arsenic are not dangerous, while excessive water consumption can be deadly. The issue with food ingredients is that we don’t always know they’re there, they’re not always independently vetted, and regulators don’t typically consider important nuances, like cumulative effects. What happens if we eat the same foods every day and the chemicals persist in our bodies? Or what about when the dose doesn’t matter because the chemical behaves like a hormone? Or when tiny amounts are detrimental, as in the case of exposures during fetal development? In other words, the safe dose hasn’t been established or communicated. Instead of precaution, we presume safety. The problem with some of these chemicals might not even end up being that they directly cause diseases like cancer; rather, it’s that they make food easier to overeat and cause obesity, which raises the risk for other diseases, like cancer. We think governments should institute a high bar for the healthfulness of the ingredients allowed in and around our food and place the burden on manufacturers to demonstrate safety before their products hit the market. All new ingredients should be disclosed to regulators, and even previously added substances should be regularly reviewed, reflecting science as it evolves. Legal mandates and resources to help enforce existing food safety law, and better postmarket safety monitoring, would help spur the removal of problematic ingredients and additives. Governments could also immediately increase transparency by requiring companies to label all ingredients and additives contained in processed foods. But just informing regulators or the public won’t solve the problem, just as it wasn’t a solution in Wiley’s era.
CALLING MORE WILEYS
Of course, we know full well the food industry will vehemently fight any changes that threaten to take a bite out of their profits, as they’ve done since the Wiley era. And the industry is far more powerful than ever, one of the most influential in the world. Global revenues are now valued at more than those of the oil and gas industry—roughly $6 to $8 trillion. Between 1998 and 2020, the alcohol, gambling, UPF, and tobacco industries poured approximately $3.26 billion into lobbying in the United States. The UPF industry spent the most ($1.15 billion). It’s not that these companies set out to sicken people, or that the people who work at them are evil. It’s that resisting anything that’ll decrease profits is a fiduciary duty of food manufacturers to their shareholders. As the eminent public health and nutrition researcher Marion Nestle puts it, “Food companies are not social service agencies, and they’re not public health agencies. They’re businesses. The shareholder-value movement is predominant, and that means that corporations’ primary goal is to sell products and make profits for shareholders.” Kevin learned this the hard way. In an attempt to rapidly advance the science of UPFs, he spent more than six months talking to representatives from over a dozen food companies to gauge their interest in participating in a government-funded research program that would deeply investigate what’s wrong with many UPFs and how to reengineer them to promote health. After multiple rounds of meetings and private webinars, almost every single company declined to so much as express interest in participating.
“No thanks,” was essentially their response, “there’s no real problem here.” The power of the food lobby helps explain why, in America, we’ve had only occasional voluntary nutrition policies—like reducing sodium in foods—with negligible health impact. In the UK, governments proposed nearly seven hundred policies between 1992 and 2020—most of them purpose-built to be toothless and targeted at individuals—while obesity and type 2 diabetes only worsened. Even the countries with the most impressive and radical suite of policies aimed at improving the quality of the food supply—including Chile, Brazil, Colombia, and Mexico—don’t go far enough. Their laudable efforts to tax soda, recommend whole-foods consumption in their national dietary guidelines, and add front-of-package warning labels to UPFs high in salt, sugar, and saturated fat, among other policies, have yet to make an appreciable dent in population obesity. Changes in chronic disease rates tend to take time, of course, and this is just the first generation of healthy food laws. But the food environment has become so toxic that making a real impact on health will require pulling every lever we possibly can. Still, we think there’s reason for hope that the near future may not look like the recent past. Political will around diet-related chronic diseases has been building to a boiling point in many countries. More and more low- and middle-income countries in Africa and Asia are watching what Latin America is doing and organizing to follow suit. In the United States, public discussion about the health effects of diets high in UPFs is motivating political action arguably more than previous dietary bogeymen like fat, salt, and sugar ever did. The Make America Healthy Again movement, or MAHA, promises to drive down chronic disease, especially in children, and played an important role in determining the outcome of the 2024 U.S. presidential election. People are keenly aware that there’s something wrong with what we eat, and that it’s causing immense suffering and economic hardship. The newly appointed U.S. Secretary of Health and Human Services Robert F. Kennedy Jr. has the “MAHA moms,” women who are concerned about the health of their kids and each other, promoting his agenda through social media, just as Wiley had the suffragettes. Wiley’s story is a reminder that even one committed person can change the system to better protect everyone—but we can do even better with legions of committed people. You can start by voting with your dollars. Support food businesses whose values align with yours and with science. Call your political representatives. Write letters to the regulatory agencies. Volunteer with or donate to organizations that are working on food safety issues. Be vocal on social media. Call out disingenuous messaging from the food industry. Organize with others who care about the same things you do. Litigate. Make the changes we’ve discussed in this chapter on a small scale—at your school, workplace, restaurant, or local grocery store. Get the candy out of the communal kitchen, and the UPF dessert off the kids’ school menus. Better yet, work with your local representative and a culinary school to open a Google-style public kitchen in your neighborhood.
Just remember the point of this chapter: We need those guardrails to protect the public against chronic disease, making healthy foods the accessible, prominent, and affordable default for everyone. In one of our conversations with Maricel Maffini, the GRAS expert, something she said stuck with us. Knowing all she does about the hidden ingredients and chemicals in foods, we asked her how she eats—which products she tries to avoid. The answer was none, and not out of a sense of futility. “The worst thing that you can do, in my opinion, is to create guilt in families. Everybody’s rushing. Everybody has lots of things to do. There are pockets in the U.S. of immense poverty where they can barely find healthy foods to eat. And it would be very hypocritical to say, ‘What we have to do is to eat organic food.’ ” She went on: “I’m working to change the system, not to have different groups who can afford certain foods.” While we wait for population-level change, what should we eat now? And should you eat the same diet as your sibling or friend? Let’s go there next.
Skip Notes
*1 For example, we did this recently with artificial trans fat, which was removed from the food supply based on several studies showing that its consumption increased the risk of cardiovascular disease. Of course, the food industry cast doubt on the studies and impeded regulation for decades.
*2 Other countries could use their own nutritional profiling systems instead of the FDA’s healthy definition.
*3 Replacing saturated fat with refined grains and sugars is probably a wash when it comes to health. Replacing saturated fat with trans fat is worse for health. Replacing saturated fat with unsaturated fats is beneficial. Some quibble about overall guidelines and recommendations to reduce saturated fat because they tend to target natural animal-based foods like meat, cheese, and dairy.
*4 Europe generally takes a more precautionary approach on food additives than the United States does. For example, regulators are legally required to routinely look back at chemicals already on the market according to the latest science—something American regulators have no mandate to do.
CHAPTER 9
Imprecision Nutrition?
“Eat more vegetables” is familiar diet advice. National dietary guidelines around the world recommend that people ramp up their veggie intake—along with fiber, legumes, whole grains, and fruits. They also generally recommend limiting sodium, sugar, saturated fat, and junk foods. You know this boring diet advice intimately, we’re sure. You’ve probably heard it since you were a toddler eating in a high chair, as your parents admonished you to finish your broccoli. These standard dietary guidelines typically come from expert panels who’ve systematically reviewed large bodies of evidence from decades of nutrition science. Updates to the guidelines sometimes change around the edges, based on the latest trends and data—ultra-processed foods are a key focus now—but the foundations are remarkably stable over time and near universal. From the Mediterranean to the New Nordic to traditional Chinese and Brazilian, healthy diets look fundamentally alike. Journalist Michael Pollan summed up healthy eating nicely with his aphorism, “Eat food. Not too much. Mostly plants.” You wouldn’t go wrong adopting such untargeted advice, perhaps tailoring it to your individual preferences and lifestyle.
Diets ranging from low-carb keto to low-fat vegan can all be adapted to the generic approach, which can be adjusted further to fit with different food cultures. But what if the diet that’s best for you doesn’t look at all like generic dietary guidance? One size doesn’t fit all when it comes to our clothes because our bodies are different. So why shouldn’t diet advice be targeted to each unique individual? Maybe you happen to be the lucky one whose health is optimized by avoiding veggies and regularly indulging in candy and deep-fried foods. Just as the precision medicine paradigm promises to better treat our individual diseases, precision nutrition might better prevent diseases from occurring in the first place. The claim underpinning the field is that one-size-fits-all generic advice fails because everyone is different. But as Julia recently discovered, and studies by Kevin and others have revealed, the state of the art in precision nutrition[*1] doesn’t quite deliver on that promise.
THE BEST DIET FOR YOU, AND ONLY YOU
It’s just after six in the evening and two women are standing in a meeting room at a trendy private club in London. It’s time for Federica Amati, head nutritionist at the personalized nutrition[*2] company Zoe, to attach a continuous glucose monitor (CGM) to Julia’s arm. The device will spit out real-time feedback on what’s happening with Julia’s sugar levels as she eats and goes about her day—part of Zoe’s promise to help its hundreds of thousands of customers understand how to eat in a way that’s best for them, and only them. But wait a minute. “I’m going to wash my hands,” Amati says, popping out of the room to make sure that even in this less-than-official medical setting, the CGM’s tiny electrode is inserted hygienically. Julia has traveled to London to try the Zoe program because she wanted some help. Like many parents, her diet had begun to backslide after she had kids. Though she went to great lengths to make sure they were eating well, she didn’t always do the same for herself. Weight was not the issue anymore; the problem now was how she felt. Often exhausted, usually underslept, Julia noticed that she was too often turning to food for an energy boost. And sometimes the wrong foods at that. Coffee, cookie, slump, repeat. Precision nutrition seemed like a good place to go in the post-dieting age. We each have a unique genetic background and grew up in different environments. Our lifestyles, schedules, and diets shape our microbiome and our particular blood sugar and blood fat responses to the foods we eat. Direct-to-consumer companies, like Zoe, promise to take in a wealth of data on their users’ health and eating habits—then feed it into a proprietary algorithm to dole out tailored nutrition advice. As Zoe’s Facebook page puts it, “If you always thought your body was different, you’re right.” If you follow their personalized diets, their home page declares, “No more afternoon slump. You can get your energy back by making smarter food choices for your body.” The potential of the field had already enticed many of the biggest players in the nutrition and health space. Major food and drug companies like Nestlé and Bayer had, over the years, invested hundreds of millions of dollars into acquiring precision nutrition companies. The number of businesses grew from twelve in 2016 to nearly four hundred less than ten years later. By 2022, the precision nutrition market was valued at $12 billion—a figure that’s expected to increase to almost $50 billion by 2032. The U.S.
President’s Council of Advisors on Science and Technology even called out precision nutrition as key to addressing the epidemic of chronic diet-related diseases. According to Zoe, 70 percent of its members had reported having more energy—exactly what Julia was after. In the London meeting room, Amati asked Julia to relax as she positioned the CGM device—with a sensor that tracks glucose in the fluid between the cells and the blood vessels—at the back of the journalist’s upper arm. “You won’t feel anything,” Amati smiled. Pop! The device was in place. Amati handed over Zoe’s yellow test kit, replete with a “poop scoop” for stool collection, vials for blood sampling, and standardized high-fat, high-sugar “metabolic challenge” cookies, designed to gauge how well Julia’s body dealt with foods that are rich in fat and sugar compared to other women the same age. The encounter had the feel of a medical visit, albeit an unorthodox one. Before saying goodbye and heading into the cool London night, Amati shared a couple of bits of advice. Don’t knock the CGM off going through a doorframe, and don’t try to interpret its data by yourself. But do take the metabolic challenge cookies seriously. “The science is coming out from us and others that the postprandial [after eating] response is a more immediate marker that can give you a signal before your A1C goes up and before your triglycerides are already high.” This suggested to Julia that Zoe’s tests could see what standard diagnostic testing for diabetes and blood lipids could not. The next day, following a bad night’s sleep, Julia dutifully scooped her poop (not for the prudish) and spent the morning eating only Zoe’s two sugary white-chocolate cookies while walking around London. After a final fast and blood sample draw, she dropped the blood- and poop-filled vials into one of the city’s big red mailboxes. (Don’t worry, they went into a well-sealed envelope first.) Then she awaited her results. Kevin suggested Julia shouldn’t stop there. What would happen if she tried other precision nutrition programs at the same time? Would she get the same tailored diet advice or something radically different? Would they interpret her CGM and microbiome findings similarly? So while using Zoe, Julia subscribed to two other companies—the Nutrisense system, focused on improving metabolic health outcomes using CGM data and coaching from registered dietitians, as well as Viome, which promises to optimize eating patterns for “microbiome activity.” For Nutrisense, she affixed another glucose sensor to her arm, a procedure she would repeat after two weeks to collect a month of CGM data. For Viome, she squeezed blood out of her fingers, spit in a test vial, and split the poop she’d sent to Zoe, to ensure that both companies interpreted the same sample. The results, Kevin predicted, would be all over the map in terms of advice about specific foods. But their overall diet guidance would look totally standard—eat more plants and whole foods and less ultra-processed stuff. That boring, old generic nutrition advice. He had good reason to be skeptical.
MONITORING THE MONITORS
Kevin had grown interested in CGMs a few years prior. Normally, he could get information about blood glucose levels from his study participants only at particular time points. CGMs opened up a new window on what was happening for the entire day, every day volunteers spent in his lab.
CGMs were originally developed to help people with type 1 diabetes keep their blood sugar in check and dose insulin accordingly.[*3] CGMs also warn about low glucose levels, a life-threatening event that people with type 1 diabetes literally lost sleep over, fearing that they might never wake up. People with type 2 diabetes use the devices to know when their blood glucose levels are out of the desired range. But what about people without diabetes? They spend roughly 94 percent of the time in the normal glucose range, only stepping outside those bounds for short periods. Did CGMs have enough precision and accuracy to deliver meaningful diet recommendations to them or provide guidance about the implications of their blood sugar fluctuations? The CGM devices already formed the bedrock of many direct-to-consumer precision nutrition companies, thanks largely to an influential study from 2015. Researchers at the Weizmann Institute of Science in Israel used CGMs on eight hundred people without diabetes and collected millions of data points about rises and falls in glucose as they ate, drank, and went about their usual daily routines over the course of a week (the only exception being standardized breakfasts). With the help of a machine-learning algorithm, the researchers crunched data on the participants—their age, sex, lifestyle, carb intake, microbiome, CGM measures—looking for patterns in which foods most spiked glucose in different people. “It was a paradigm-shifting paper,” Leanne Redman, a professor of clinical medicine at the Pennington Biomedical Research Center and a precision nutrition researcher, told us. Not only did the CGM responses to the same foods appear different across individuals; the researchers showed that they could use their algorithm to cluster groups of people, based on characteristics like microbiome composition, then accurately predict diets that reduced a person’s risk of glucose spikes—or sharp rises in blood sugar, usually followed by a crash. The paper spawned a precision nutrition company, the now-defunct DayTwo, and caused a big pivot in the field. Previously, the focus had been diets by DNA. Nutrigenomics—eating according to one’s genotype—exploded in the 1990s, thanks to the Human Genome Project, and the quest in biomedicine to search for the genetic causes of disease. The Israeli paper, and others that followed, including a study from the founders of Zoe, suggested that nongenetic factors—like how much a person has exercised or slept, or their microbiome composition—were more potent determinants of an individual’s responses to diet than their DNA. Many precision nutrition companies went from analyzing genes to looking at novel biological risk factors, such as microbiome and CGM data, tailoring diet advice accordingly. By the time we were working on this book, eating to optimize one’s glucose response was arguably the viral eating trend, “a medical sensitivity turned nutritional obsession for the masses,” The Wall Street Journal proclaimed. The French social media influencer Jessie Inchauspé—aka the “Glucose Goddess”—stoked glucose zealousness by preaching to her millions of followers “hacks” aimed at slowing the arrival of glucose into the bloodstream and lowering one’s chances of a blood sugar spike. Pair a piece of cake with protein- and fat-rich yogurt, or bread with butter; start the day with a savory breakfast; eat your vegetables and protein before simple carbohydrates at a meal.
To find out how your blood sugar responds to what you eat, Inchauspé also advised trying a CGM. Using one changed her life, she swore, dramatically improved her health, and helped her pinpoint the effects of particular foods on her glucose levels and body. With her best-selling books and a $999 online glucose course, Inchauspé’s acolytes could experience similar transformations. (Inchauspé did not respond to our requests to be interviewed for this book.) When Kevin started looking into the data, once again he noticed there was hardly any foundational research on the ideas that were going viral. The notion that different people have different glucose responses to the same meal is nothing new. Physicians have been using standardized oral glucose tolerance tests—gauging a person’s unique response to the same 75 grams of glucose—to diagnose prediabetes, gestational diabetes, and type 2 diabetes for decades. What is new: the idea that different foods could cause discordant responses between people. When you eat a banana, your blood glucose spikes sky high, but after eating an apple it stays rock solid. Meanwhile, your friend might have the opposite response to bananas and apples. If that were true, CGMs seemed like the perfect tool to figure out which foods most boosted an individual’s glucose levels, driving down the risk of spikes and, potentially, disease. How much of the apparent variability between the banana-versus-apple spikers was just noise versus real biological signal, Kevin wondered. A lot of the existing research focused not on meals, which is usually how we eat, but on responses to simple foods and beverages, like a muffin, or a muffin and a milkshake. One study found that CGMs systematically overestimated blood glucose responses to such simple meals in people without diabetes. What’s more, combining foods within a meal alters how quickly nutrients like glucose are absorbed. This is the reason glucose-centric dieters are told to pay attention to meal order, starting with low-carb foods first. Other factors like physical activity, sleep quality, and stress could also affect glucose responses to meals. Kevin suspected the CGM findings could vary a lot within an individual for each meal or food they ate. Beginning in 2018, Kevin started asking all the study participants who entered his clinical trials to wear CGMs for weeks at a time while researchers knew exactly what they were eating. So far, two papers have come out of that effort. In the first, he monitored glucose levels using two CGMs from different manufacturers worn on each person at the same time. The size of the blood sugar rise after a meal was modestly correlated between the devices. If eating walnut quinoa cereal and berries caused a blood sugar spike in one device, the other would also tend to register an increase—but not to the degree Kevin expected. One device consistently read glucose values significantly lower than the other. In fact, one CGM would often show a glucose spike in response to a meal, while the other device would not. Choosing meals to minimize glucose spikes according to one CGM device didn’t minimize them when simultaneously measured by another. It gets worse. For the second paper, Kevin found that when people ate the same meal on two occasions, their individual glucose responses measured using the same CGM were just as variable as their glucose responses to different meals.
When a person ate berry and walnut quinoa cereal for two different breakfasts, their CGM measurements differed as much as comparing that response to a bagel with cream cheese and turkey bacon. The lack of reliable responses to the same repeated meals suggested a good deal of biological variability and technical imprecision. This surprised Kevin. His studies happened in such highly controlled conditions, the kind that should have minimized variability compared to real life. If a person’s glucose responses to the same meal are as variable as their responses to different meals, this raised a couple of questions: Was it even possible to recommend eating one meal over another to drive down glucose levels? And how much of the individual variation was due to diet, as opposed to all those other nonfood factors that affect blood sugar—stress, sleep, physical activity? Or did small differences in food intake between the meals explain differences in glucose responses? Even when Kevin adjusted for many of these factors, the results were still unreliable and differed by device. One place CGM results were consistent, Kevin discovered, was when the carbohydrate and fat content of meals differed a lot—bacon versus white bread, for example. This was something that’s been demonstrated in other studies that feature simple meals known to vary widely in their average effects on blood sugar. But yet another question arose: Was a CGM any better than simply assessing a meal’s glycemic load, the long-standing method scientists used to quantify differences between foods in their ability to consistently raise blood glucose on average across people? The glycemic load ranks foods, telling us, for example, that white bread or polished rice raises blood sugar more than whole-grain bread, apples, or bacon. High-glycemic-load foods tend to be the ones most of us know we should avoid—high in sugar, low in nutrition—foods that are linked, in observational research, to poor health outcomes, foods all precision nutrition companies tell their users to avoid. But the companies and influencers—whose goal is often to lower glucose levels—hardly mention glycemic load. Instead, they argue that CGMs increase engagement with glucose responses in a way that glycemic load can’t. It’s also a metric that was derived by averaging glucose responses across many of us—something we have in common. The gaps in the scientific evidence on CGMs haven’t stopped entrepreneurs from, yet again, rushing to market. Not only is the precision nutrition market segment growing, the FDA recently allowed device manufacturers in the United States to sell CGMs to anybody, including those without diabetes or prediabetes. We expect this will add rocket fuel to the start-up precision nutrition companies that are using the devices for individualized diets.
ONE PERSON, THREE RECOMMENDATIONS
A few weeks after watching her glucose levels rise and fall on three CGMs, Julia detected few reliable patterns other than that the baguettes she’d grown accustomed to in Paris reliably spiked her sugar levels—something, as Kevin had suggested, the glycemic load could have told her. But so did exercising, perhaps because physical activity is known to cause the liver to release glucose to fuel working muscles. When she went for a run or rushed to pick up her son from school, she saw a spike—and her blood sugar would remain elevated for hours after.
Meanwhile, foods that weren’t so great for health or the waistline—cake, ice cream—didn’t regularly spike her blood sugar, presumably because Julia ate them at the end of meals. Even pasta, maybe because she usually ate it with fat, fiber, and protein (nutty pesto, or broccoli and chickpeas). The test results and recommendations across the three companies were also difficult to square. Both Zoe and Viome determined that Julia had an excellent gut microbiome profile—again, they used the same poop sample—but then suggested different foods to optimize it. At Zoe, spinach and pecans were among her top “gut boosters,” while at Viome, they appeared on the no-no list. Viome later explained this was because they determined Julia’s “microbiome is not effectively degrading oxalates. As a result, consuming oxalate-rich foods such as spinach or pecans could be detrimental to [Julia’s] health at this time.” Zoe’s foods to steer clear of seemed more reasonable, but their ranking was equally curious. Chicken pie, a food Julia’s pretty sure she’s never eaten, was her top “gut suppressor,” followed by syrup, burgers, and cream crackers, foods she rarely eats. Of this, Zoe said, “The fact you don’t eat these aligns well with you having an excellent microbiome score—if you did eat them frequently, you would not have a good [microbiome] score.” Viome’s other results, based on a gene expression analysis of blood, saliva, and stool samples, were equally puzzling. Eighty-seven pages long, their report featured obscure tests that made apparently unvalidated claims about the function of Julia’s various body systems. The first six pages were a barrage of “not optimal” scores on such measures as “oral flagellar assembly production pathways” and “oral mucin degradation pathways.” Then along with the positive gut health score came nearly thirty pages of other “good” and “average” results. Viome determined that Julia’s “cavity promoting pathways” and “cognitive health” were in top shape (phew!). But her “immune system activation” and “salt stress pathways” were merely “average.” At least one score Julia would accept without question: She is apparently two years younger than her chronological age. These differences could be explained by the fact that the companies used different methods to measure the microbiota, and different algorithms to draw their conclusions. Viome, for example, told us it’s the only company using metatranscriptomics, “an advanced RNA-based technology that measures not just which microbes are present in your gut, but what they’re actually doing.” The company’s diet recommendations “are tailored precisely to your biological response, ensuring optimal metabolic health.” Also relevant: The microbiome is a new frontier in health science and, like CGMs for precision nutrition, not quite ready for prime time. There’s a lot of associational data suggesting that our microbiome may play a crucial role in health, but our ability to manipulate it in ways that improve health outcomes is massively overhyped—obesity treatments and probiotics being two examples. In humans, fecal microbiome transplantation studies haven’t led to any meaningful changes in weight or body fat. Probiotics have been similarly oversold. While probiotics are moderately useful for preventing the diarrhea that can come with antibiotics in kids, the thousands of other claims made about their benefits haven’t yet been proven—and may never be.
As for her glucose findings, based on the test cookies and CGM data, Zoe determined that Julia’s glucose responses were poor, and so was her diet—intriguing given the “excellent” microbiome results. Nutrisense came to the opposite conclusion: good glucose control, good diet, albeit based on more days of food tracking. Viome also graded Julia’s “metabolic fitness”—a score that “represents active microbial organisms and functions that are associated with your blood sugar, insulin resistance, or weight control”—as “good.” Was Zoe’s conclusion about poor glucose control linked to the fact that Julia did the cookie test after a travel day and bad sleep?[*4] Or was it an early signal of trouble to come? The cholesterol findings also varied among the companies. Zoe checked Julia’s blood fat response to eating, by measuring her triglycerides and estimating her fasting cholesterol. There, she got a “good” grade. Viome, meanwhile, took a different measure—of Julia’s “LDL cholesterol pathways,” the company’s assessment “of gene expression in the blood associated with cholesterol accumulation, oxidation, and clearance.” By Viome’s cholesterol measure, Julia was “average.”
BORING OLD NUTRITION ADVICE, REPACKAGED
For all their bells and whistles—and sometimes confusing and contrasting specifics—the three companies essentially doled out the standard healthy eating fare. At a high level, all recommended that Julia eat more plants and fiber, while minimizing or eliminating ultra-processed food and added sugar, just as Kevin predicted. They all—outright or implicitly—emphasized reducing refined carbohydrates, which would moderate Julia’s glucose responses. And it wasn’t just Julia. In studies of precision nutrition approaches, expensive and sophisticated proprietary algorithms to reduce glucose levels in an individualized way generally recommended that people eat diets with lower glycemic loads. Even Zoe’s app—the slickest and most user-friendly of the three, with the loveliest plant-based recipes—couldn’t solve the underlying reason Julia struggled to eat optimally on most days: the exhaustion of life with small children, compounded by work.[*5] By quantifying her eating with the apps, Julia began to feel some stress around food that she hadn’t experienced in years. To earn high meal scores, she was spending more time planning, shopping, cooking, and tracking—an effort that wasn’t sustainable with work deadlines and frequent bouts of single parenting while her husband traveled for his work. For a short time, the precision nutrition experiment even made her look askance at foods she previously enjoyed and was pretty confident were part of a healthy diet—bread, fruit, couscous. A longer-lasting effect: the companies’ contradictory findings left her feeling slightly paranoid about her health. To get a sense of how other patients experienced precision nutrition, Julia called Nicola Guess, a University of Oxford academic dietitian and researcher who studies the prevention and management of type 2 diabetes. In her clinical practice, Guess says she now frequently sees patients who are “genuinely freaked out” about diabetes they don’t have after using CGMs. By now, Guess felt angry. “[These patients] eat great, and they’re being sent down this rabbit hole of, ‘Oh my God, I’ve got pre-diabetes.’…So for me, it’s like, what’s the point? Apart from driving yourself crazy.” Meanwhile, she went on, there was a disconnect between the popular obsession with glucose and the biggest health threats globally.
“The things that kill people are heart attacks and strokes, and they’re primarily caused by the two major risk factors, which are blood pressure and high LDL cholesterol. That’s what we should be personalizing on.” A related effect she’s seen in her clinic: Patients, fixated on lowering their blood sugar, raise their LDL cholesterol. They’re eating fewer carb-rich foods (bread, fruit, and pasta) and replacing them with foods that contain more saturated fat (meat, butter). To reduce Julia’s stress about her blood sugar, Guess urged her to get an HbA1C—a well-established diagnostic test for diabetes that reflects chronic blood glucose levels. The idea was that if Julia had a normal readout, maybe she wouldn’t have to worry so much about her precision nutrition results, even if her blood glucose levels seemed high on the CGM. Following Guess’s advice, Julia took the test as part of a routine physical. Her insulin and blood glucose results came out normal, but for the first time in her life, she had elevated blood cholesterol levels, a worrying finding no company caught.[*6] Maybe Julia’s high cholesterol was the result of eating too much butter and cheese in Paris, or maybe it was a sign of trouble to come. Good blood sugar, bad blood fat: lab results diametrically opposed to Zoe’s and apparently at odds with Viome’s LDL score. Zoe’s Amati emphasized the blood sugar result was “never intended as a diagnostic or pre-diagnostic—simply an insight to how [Julia’s] body is responding to this sort of metabolic challenge, to help you make more informed choices over time.” While Julia’s blood fat score was good, it was not classified as “excellent.” On the blood sugar discrepancy, Amati also said that HbA1C measures chronic blood sugar, not a person’s response to a single meal, as Zoe does. Plus, Amati added, “it’s not all about blood glucose”—CGM use is optional with Zoe and the company deliberately does not “pathologize glucose spikes” but uses the devices as tools to teach people to build balanced meals. One of the goals of the Zoe program is also to reward “good” quality fats—generic diet advice known to reduce LDL cholesterol. Perhaps that’s why Zoe says they’ve found no evidence that LDL increases in their studies. Viome told us their LDL cholesterol result “does not replace the need for a lipid blood panel with your doctor.” They added, “We encourage our customers to work closely with their physicians as they work to improve their health and make changes to their diet and lifestyle.” The takeaway seemed to be that one’s health concerns could depend on which company tests them and how, and may be at odds with the findings of validated diagnostic tests that all companies defer to. If Julia hadn’t had the physical, she might have been doing what some of Guess’s patients had—changing her diet to improve her CGM responses by eating more of the foods that would worsen her LDL cholesterol—and increasing her risk of heart disease.
PRECISION REDUCTIONISM
We’ve entered a new kind of nutritional reductionism. We now have the tools to gather all kinds of granular data about ourselves and how our food impacts us. But every nutrition expert we spoke to—who wasn’t involved with a precision nutrition company—told us that we don’t yet know how to translate these findings to meaningful individualized diet advice. Knowing everything about food and what it does in our bodies, history keeps showing us, remains elusive.
Yet we continually fall into the same traps, trying to diminish food’s complexity to one or two parameters, in our quest for better health. Today, we eat to minimize glucose spikes by CGM and optimize our microbiome, just as in the past we ate to reduce fat and calories. On the usefulness of CGMs, there are so many basic questions that haven’t yet been answered. What are meaningful differences in meal glucose responses in a person without diabetes? How many times does a person have to measure their CGM responses to particular meals to get reliable readings? One estimate is that at least a dozen, perhaps many more, repeated measurements would be required for each meal. Maybe the devices aren’t useless in the general population, but research like Kevin’s suggests that their signal can easily be buried in a lot of noise. As for what our blood sugar responses to single meals reveal about our health and disease outcomes, in people without diabetes or prediabetes, “Your guess is as good as mine,” said Faidon Magkos, a professor at the University of Copenhagen, who co-led Preventomics, one of the largest and most rigorous trials of precision nutrition to date. Researchers have linked diets rich in simple sugars, and people with more pronounced blood sugar responses, to higher risks of diseases like type 2 diabetes and heart disease. But correlation is not causation: It isn’t yet clear whether those diets or responses cause disease, or whether other underlying factors drive the disease process. So, are elevated glucose responses just innocent bystanders? Maybe they’re merely markers of underlying disease progression, and not directly causing harm. And while researchers like Magkos suspect that glucose spikes have pathological relevance, there’s currently no consensus on what constitutes meaningful differences in variability in people without diabetes. Nor did we find convincing data suggesting that reducing glucose spikes helps with all the things some CGM boosters sketch out—weight loss, preventing diabetes, or any other illness. “There is a big distance going from this association to the [belief] that reducing glucose peaks will be protective,” Magkos added. For Preventomics, Magkos and his colleagues gathered reams of data on one hundred study participants—analyzing DNA and measuring over fifty biomarkers in each individual. Those data were fed into an algorithm that used machine learning to assign people to special diets matched to their unique biological profile. A control group, meanwhile, got a generic diet that followed the standard boring recommendations of healthy eating. The study was double-blinded, meaning the participants and the clinicians involved didn’t know who was getting what. Magkos even went to the trouble of offering the study participants two meals a day for two and a half months, free of charge, to make sure they ate as prescribed. The biologically tailored diets failed to outperform the boring diet approach. After two and a half months, both groups had improved on some biomarkers but they were virtually indistinguishable from each other. Magkos thinks this has to do, in part, with the fact that we can’t yet tailor diets to people in ways that are meaningfully better than standard healthy eating fare. But also, using a couple of metrics to guide eating is naïve given the complexity of food, our bodies, and our lifestyles (much more on this to come in the next two chapters). “Think of oranges, for instance,” Magkos said. 
Their fiber content can vary twofold and their vitamin C content fivefold depending on the orange varietal and post-harvesting conditions. Now think of how many foods we eat at the same time or throughout the day, he continued. “It isn’t difficult to realize that any diet prescription is inherently imprecise.” Other independent randomized precision nutrition trials have come to similarly disappointing conclusions. When researchers retrospectively analyzed existing data from studies of low-carbohydrate versus low-fat diets for weight loss, they noticed that blood insulin levels and some genes seemed to be predictive of who would be more successful at losing weight on which diet. But when the same researchers at Stanford University tried to test the idea in a subsequent randomized trial, on average, people lost similar amounts of weight on both diets even with their precision assignments.[*7] In the few studies where precision nutrition approaches win out for achieving lower glucose, the algorithms do this by reducing the glycemic load compared to control groups. Or sometimes successful trials offer the personalized dieters a huge amount of support relative to the control group, as was the case in Zoe’s clinical trial. The company uses the results of that study to say “Zoe works,” a move that’s attracted criticism. In the British Medical Journal, science writers questioned how fair the trial was and whether the evidence bore out the claims. The main purpose of the study was to measure differences in blood cholesterol and triglycerides between a group that followed Zoe, and a control group. The BMJ writers noted that the study found no significant differences in cholesterol, and minimal differences in triglycerides. In response, Zoe’s chief scientist Sarah Berry said in a statement to the BMJ, “We specifically undertook the [randomized control trial] to test how effective the Zoe personalised dietary program was compared with the current standard of care.” Zoe admitted that they were not trying to test their personalized algorithms in isolation, but rather “capture multiple elements of personalization that we believe will improve both the efficacy of the advice as well as adherence to the advice.” The study also acknowledges that the differences between the Zoe intervention and control group “should be considered when interpreting the results. Future studies would benefit from assessing the impact of a personalized program versus personalized food scores.” And while there are trials, including Zoe’s, that show personalized approaches can help people stick to generic healthy eating advice better than standard care, no one’s shown that the gains come through precision nutrition algorithms. Instead, they come from the equivalent of intensive sessions with a nutritionist evaluating your diet. For now, the available research in its totality is so uninspiring, Nicola Guess, in a Nature Metabolism commentary, summed up “no data suggest that we can design diets on the basis of these biological differences, or that such diets are superior to generic healthy diets in improving cardiometabolic health.” Maybe this will change in the future, but don’t count on it happening anytime soon. Redman, the precision nutrition researcher at Pennington, argued that we’re nowhere near being able to personalize diets to individuals. At best, she predicts, we’ll get precision nutrition advice for subgroups in the population in her lifetime. 
“I don’t think we’re going to realize what we’re being sold, which is that a single drop of blood from a prick of your finger or a saliva sample or a retina scan is going to be the window to our nutritional needs, and then have a computer prescribing the diet.” Another major limiting factor behind some current programs and products on the market: They use data that’s easily accessible, and not likely to require medical regulation, while not always accounting for data that’s important to health—like one’s personal medical history. [*8] Eric Topol, a cardiologist and the founder and director of the Scripps Research Translational Institute, experienced this firsthand. When he ran his personal data through the algorithm used by the Israeli company DayTwo to get nutrition recommendations, it suggested precisely the foods he avoids because of his tendency to develop kidney stones. “That’s a big miscue,” he wrote of the experience in The New York Times, “because my pre-existing medical conditions were not one of the test’s inputs.” The only role he sees for CGMs, beyond helping people who already have diabetes, is potentially identifying who among those with prediabetes is most likely to go on to develop full-blown disease, an insight he gained from a forthcoming study. For the rest of the population, Topol told us, “We’re not at a point where you can use [CGMs] as an individual to guide your nutrition choices.” The dearth of solid science for the general population of users signing up is especially concerning given there’s so little oversight of the precision nutrition industry. Researchers at the Center for Science in the Public Interest make the case, in a recent analysis paper, that the regulatory system is uniquely unqualified to address the precision nutrition space. As wellness products that use health testing, they sit in a regulatory gap. “This fragmented, and generally weak, regulatory environment,” they concluded, “has created a perfect storm of lack of regulation that may lead consumers to believe that the claimed health-promoting potential of various personalized nutrition approaches exceeds what the evidence supports.” An investigation at Businessweek uncovered the case of a patient who delayed a diagnosis of ulcerative colitis, an inflammatory bowel disease, after unsuccessfully seeking help for symptoms with Viome. A dozen former employees also told Businessweek that Viome “grossly overpromised what it could deliver.” In a statement, Viome told us, theirs “is a wellness product and does not claim to diagnose, treat, or cure any medical condition. 
We consistently inform our users that if they suspect or are experiencing symptoms of a chronic health condition, they should immediately consult their healthcare providers.” Zoe too said their program is not intended as a diagnostic, or to replace medical care, and that “the point of [personalized nutrition] is to also improve adherence to well understood healthy dietary recommendations.” They also emphasized that their approach is holistic, “which isn’t just about changing dietary recommendations based on biological features, we now also consider key features that impact adherence and efficacy, including how people eat, how they live their lives, their dietary preferences, et cetera.” Nutrisense’s press contact said the company “is helping members currently [who] are in a certain life/body transition—especially when it comes to prediabetes and menopause—along with other associated challenges that come with those (weight, energy, cravings, sleep, fatigue).” So the companies position themselves in the wellness space as opposed to providing medical diagnostics or prescriptions, which struck us as an interesting paradox. They purport to be cutting-edge, ahead of the mainstream medical curve— but they also defer to doctors and their traditional diagnostics when medical needs arise. They acknowledge that theirs is an emerging field, and emphasize that they’re in compliance with existing regulations. As Viome told us, “While it’s true that the regulatory framework for this space is still evolving, that doesn’t mean personalized nutrition lacks scientific grounding.” Many companies publish research in scientific journals and use it in their marketing materials. The companies say they are out to empower and educate, rather than punish or chastise. But what’s missing in all of this is the rigorous evidence demonstrating that their programs provide clinically meaningful health benefits. There are no randomized trials comparing the companies’ plans to each other or to appropriately matched control groups. Yet when we raised questions about Julia’s results, the companies confidently responded with very specific answers, and little to no acknowledgment of all of this uncertainty. AN EXPENSIVE DISTRACTION Perhaps our biggest concern of all is that the new precision nutrition gimmicks will draw us away from the insight that what we eat (and our health) is influenced far more by our environment than our genes or individual glucose responses. The one-size-fits-all generic advice doesn’t fail because everyone is different; as we’ve seen, it fails because the food environment is toxic and most people can’t afford to insulate themselves from its worst effects. This distraction is happening in research, which trickles into distraction in policy. Nutrition science funding in America is now mainly targeted to the precision nutrition paradigm. It’s the centerpiece and guiding principle of the strategic plan for nutrition research at the NIH, the world’s largest health research funder. To support that priority, the NIH invested roughly $170 million in a study that will recruit thousands of research subjects with the aim of developing algorithms to predict individual diet responses. The study is probably the largest single investment in nutrition science ever. Given everything we know about precision nutrition so far, Kevin is, of course, skeptical that the study will generate a decent return on investment. 
It’s also baffling that such a large slice of the relatively small funding pie goes to precision nutrition when there’s relatively little research on the food environment and barely any rigorous testing of the effects of the U.S. Dietary Guidelines on health.[*9] We do have lots of observational evidence supporting the consensus on healthy dietary patterns, but the field needs more well-conducted randomized controlled trials, including of the food environment. Fundamental studies like these should have priority over funding for precision nutrition. Even if the precision nutrition approach is eventually successful, its adoption may only increase health disparities. Each company charges its users hundreds of dollars to sign up and test, as well as fees for access to their apps or follow-up support. Only a handful of the privileged gain entry. [*10] If you’re one of them, and have a few hundred dollars to burn, go ahead—try a precision nutrition diet program. But remember that some scientists not involved in these companies think they’re appealing to nascent research to market and sell stuff that isn’t proven. And you’ll probably end up with a minor variation on the standard dietary advice that national guidelines around the world promote. If the precision nutrition program uses a CGM, there’s a good chance that the tailored diet will likely involve recommending eating fewer foods with added sugar and refined carbohydrates. If you’re concerned about carbohydrates and your glucose levels, a CGM might help you engage with your blood sugar responses. But rather than forking out money for a CGM or special diet, read up on glycemic index and glycemic load. And if you don’t have a family history of diabetes or prediabetes, focusing a lot on manipulating your glucose levels—beyond simply cutting back on refined carbohydrates—may be pointless and could even lead you astray. These ideas were on Kevin’s mind when, in the spring of 2024, he presented some of his CGM results at a precision nutrition conference in Copenhagen. A few attendees seemed concerned about the implications of his CGM research, but there was a sense that little could stop the momentum of the precision nutrition movement. At one point, someone posed a question to a panel: Would resources be better spent improving the food supply of the population according to what is already known about a healthy diet or advancing precision nutrition to prescribe the right diet for each individual? The answer was clearly the former. Even the scientists in the room, who had already invested heavily in the field, recognized this. One of the scientists at the Copenhagen meeting, Deirdre Tobias, a nutrition and obesity epidemiologist at Harvard Medical School, told us she’d been dubious about the field for a while. Not only are companies rushing ahead of the research while charging their users lots of money, but she believes that even if we get to the point of making more precise and accurate recommendations, most people “are not even able to achieve the basic health recommendations.” If people find out “blueberries are better for you than strawberries,” Tobias went on, “well, people aren’t eating the fruit to begin with.” Precision nutrition companies also recognize these challenges. To help their users, they all sell supplements. Viome suggested Julia take eight capsules daily, including probiotics and prebiotics, and, to improve her oral microbiome, tailored lozenges and toothpaste. 
Of the products, Viome says they “carefully select ingredients based on solid, peer-reviewed scientific research, and each personalized formula includes detailed information on why specific ingredients were recommended.” Zoe sells the Daily30+, a “deliciously science-y” prebiotic to sprinkle on everything from salads to yogurt, with the aim of boosting the range of plants customers eat each week. The company says they created the supplement to help the majority eat more fiber, and found in a trial that it favorably shifted the gut microbiome composition compared to those who got a probiotic or a control (bread croutons). The study ran for six weeks, and it’s not clear what that means for long-term health. At Nutrisense, users book follow-up calls with nutritionists. On those calls, nutritionists advised Julia to eat more protein and take supplements. For an extra $50, Nutrisense experts will also consult with customers about how to optimize supplements for blood glucose. The glucose influencers are also all-in on supplements. The Glucose Goddess sells the aptly named Anti-Spike Formula—a supplement to reduce blood sugar and “protect” against sugar cravings—for $65 per bottle. The lucrative and relatively unregulated supplement industry was founded on the back of the twentieth-century vitamin craze—yet another story in the annals of the history of nutrition science, where commercial interests arguably ran ahead of surprisingly unsettled science.

Notes

*1 This chapter is about the field of precision nutrition, not the Canada-based company of the same name.

*2 Zoe calls itself a “personalized” rather than a “precision” nutrition company since it uses more than just an individual’s biological data to tailor recommendations, including how people eat, their lifestyles, and dietary preferences.

*3 In these patients, the insulin-producing beta cells of the pancreas don’t function—so external insulin helps them metabolize their carbohydrates and control blood glucose levels.

*4 Zoe itself has published on how many factors—including sleep, meal timing, and eating rate—affect CGM results.

*5 Our sleep and other health behaviors—physical activity, screen time—are so strongly related that health guidelines in some countries now advise on all of them together in one place.

*6 There are several measurements quantifying different aspects of blood fat—for example, different kinds of cholesterol-containing particles, triglycerides, ApoB, and so on.

*7 The study, like many in the diet literature, found a wide variability in weight loss success on each diet. It’s possible that this was driven less by biological variability and more by the social and economic factors that affect adherence to the diet: things like having a supportive partner, a stable job, a good income, and the time and resources to make lasting lifestyle changes.

*8 Viome said their program is “designed to incorporate user-provided information—such as medications, allergies, diagnoses, and health goals—to refine the personalization of our food and supplement recommendations.” Zoe also said they “personalize on multiple features.”

*9 To date, there’s only one small, short-term randomized trial comparing a U.S. Dietary Guidelines diet to a typical American diet. It suggested some improvements in blood pressure over eight weeks but no other significant changes.
Another large, long-term Spanish study of the Mediterranean diet found important cardiovascular benefits in the groups assigned to Mediterranean diets, but those participants had much more support than the control group, and other researchers later discovered randomization problems that raised questions about the study’s results.

*10 Zoe says they’re launching a free-to-access app to address this problem, that they’ve reduced their membership pricing, and that they offer free nutrition education through their various platforms.

CHAPTER 10
Vitamania

So far, we’ve explained the bulk of what’s in food and how these different components work to keep us alive and make us who we are. Most of what we eat is macronutrients—the carbs, fat, and protein we’ve covered—along with the water contained in our food. This chapter is about how we got those categories in the first place—and how, for a long time, they missed essential elements in what we eat: the vitamins. Before working on this book, we both took vitamin supplements at various points in our lives, and didn’t really know much about what these chemicals were doing inside us. Even Kevin. As a physics student, he’d pop a multivitamin to balance out a pasta- and pizza-heavy college diet. Beyond that, he didn’t give vitamins much thought. The discovery of vitamins is the ultimate cautionary tale when it comes to food. It shows how difficult it is to pinpoint the health effects of particular food components, and how quickly profiteers capitalize on new findings, no matter how tenuous our knowledge. It shows how ignorant we can be about what’s really in front of us when it comes to what we eat, and how, again and again, we get duped by food’s true complexity. The story of the vitamins is a reminder that nutrition isn’t rocket science; it’s harder. The closer we look, the more wondrously complicated food becomes.

PROUT’S RUBRIC

William Prout was the brilliant son of tenant farmers in the village of Horton, in Gloucestershire, England, so eager to learn that at age twenty he put an ad in the local newspaper asking for help completing his education. He went on to earn a medical degree and to tinker in his spare time in subjects as varied as chemistry, physics, medicine, and nutrition. One of his favorite pastimes: analyzing anything he could get his hands on—boa constrictor feces, blood—in an oxygen combustion apparatus of funnels and glass and iron tubes that he designed himself. When he turned to food, he realized that everything we eat could be reduced to just three classes of chemicals. He called them “the saccharine, the oily, and the albuminous,” as we noted back in Chapter 3. Foreshadowing the modern diet know-it-all, he wrote in the 1820s, “A diet, to be complete, must contain more or less of all the three.” To this day, Prout’s categories remain a pretty good approximation of the way many of us think about what we eat: as interchangeable bundles of carbohydrates, fats, and proteins. His work marks both the invention of the nutrient and the birth of modern nutrition science. This was the moment we started to view food as the sum of its parts, an approach that’s reverberated in policy, agricultural practice, food manufacturing, science, and diet advice ever since. But there is, of course, more to food than just carbs, fats, and protein—other chemicals that, when missing from the diet, can cause severe illness and even death. The first set of these micronutrients—the essential minerals—was appreciated even before Prout’s time.
Their importance is written into human history. Ancient trade routes, for example, were built for salt extraction, and soldiers in the Roman army were said to be paid in salt— salarium in Latin—from which the word salary is derived. Minerals, including sodium, potassium, calcium, phosphate, and iron, come from the earth. (We’ve covered fire, air, water, and now earth: the final Greek element.) They perform a stunning array of functions inside us. Calcium, the most abundant mineral in the body, acts as a signal to contract our muscles, tells endocrine cells to secrete hormones and neurons to release neurotransmitters, and helps form our bones. Charged sodium and potassium ions create the electrical activity pacing our hearts and firing our neurons. Minerals are inorganic, meaning they don’t decay, which made them relatively easy to see. Evaporate marine water, you’ll find salt crystals. Burn wood, and much of the powdery ash that’s left over is calcium compounds. The second set of essential micronutrients is invisible to the naked eye. Clues to the existence of the vitamins arrived by way of circumstantial evidence, once again in brutal animal experiments in the late 1800s. Researchers tried feeding dogs and mice “purified diets”—mixtures of the macronutrients in as pure a form as possible, supplemented with water and minerals. They found something astonishing: Animals eating purified diets with enough calories, protein, and minerals stopped growing and eventually died. Yet adding small amounts of milk to the menus of the sickly animals restored their normal growth. The known essential components of food were not enough to keep animals alive. Some unknown, minuscule factor in milk seemed to play a vital role. The idea that had kicked around since Prout’s time—that all we needed for growth and survival were water, macronutrients, and minerals—was clearly wrong. An entirely new class of chemicals in food needed to be found, the absence of which could cause horrific diseases in humans. EARLY CLUES Before nutritional deficiency diseases were understood, people thought they could be explained by things like poor climate, infectious pathogens, or shortages of oxygen and protein. In other words, we viewed these ailments through the lens of the prevailing theories of disease at the time. Scientists feted in their fields had to be toppled before the paradigm could be challenged or changed. This was even true of the person who conducted what’s considered not only a seminal vitamin study, but one of the seminal studies in the history of medicine. In 1747, James Lind, a thirty-one-year-old Royal Navy surgeon, took twelve ailing sailors in a scurvy outbreak aboard the HMS Salisbury and divided them into pairs, subjecting each to a different treatment. The duo that got two oranges and a lemon every day for six days saw a remarkable reversal of disease, its hallmark symptoms—bleeding gums, bruising, sore limbs, lethargy, depression—rapidly easing. Lind concluded that “oranges and lemons were the most effectual remedies for this distemper at sea.” Finally—a breakthrough for a disorder that had already haunted explorers and crippled militaries and marines for centuries. Lind, methodical and science-minded, had the best of intentions. 
“Indeed, before the subject could be set in clear and proper light, it was necessary to remove a great deal of rubbish,” he wrote in his 1753 A Treatise on the Scurvy, a work he based “upon attested facts and observations, without suffering the illusions of theory to influence and pervert the judgement.” Despite Lind’s ambitions, his treatise and trial didn’t settle the question of what caused scurvy. Far from it. And even as citrus appeared to be an effective treatment, scurvy was not yet recognized as a disease caused by a dietary deficiency. In Lind’s time, the ancient humoral system—formalized by Greek physicians, with roots going back to ancient Egyptian civilization—still formed the bedrock of how people viewed health and disease. Illness arose from imbalances in the body’s four humors: the fluids blood, phlegm, and black and yellow bile. The humors were corruptible by factors such as the environment, too little exercise, and food. Treatments involved removing disease-provoking stimuli and restoring balance to the body. What this meant in terms of specific diets varied from individual to individual, depending on their unique humoral makeup. Instead of taking Lind’s finding as definitive, physicians, militaries, and the public continued to look to the humoral theory, layered with newer, often contradictory, ideas about the body, to explain the disease. Lind did, too, bending his observation about citrus to fit humoral thinking. In his treatise, he hypothesized that faulty digestion was the root cause of scurvy and that the moist, “putrid” air at sea helped spread the contagion—all reasonable assumptions based on his own observations and what was known at the time. It took nearly a half century after Lind’s trial, in the 1790s, for the British navy to issue regular citrus doses for sailors, a change advocated by other naval physicians, not Lind himself. As the health of sailors improved and citrus became a widely accepted scurvy prevention and salve, there was still no consensus on what caused scurvy or how lemon juice treated it. When Liebig’s protein theory gained traction in the mid-1800s, scurvy was thought to be the result of protein deficiency, and adding protein to the diet, a remedy. Before that, with the rise of pneumatic chemistry, the focus shifted to air: Scurvy was blamed on whatever sailors were breathing in. Everything from a shortage of carbon dioxide to a lack of oxygen was thought to be a driver. Later, scurvy was viewed by some as a “nervous debility,” caused by a diet not only indigestible but lacking in nutrition and compounded by factors such as indolence, fatigue, and melancholy. Citrus was thought of as a means of stimulating the body’s nervous system. By the end of the nineteenth century, cell theory described cells as the basic unit of not only life but also health and disease. Germ theory started to coalesce. Together, these new systems of thought finally—albeit slowly— supplanted humoral theory to explain why people get sick. These lifesaving breakthroughs also made it easy to overlook the possibility that deficiencies in the diet, of another world of yet-to-be-discovered microscopic substances, could bring on disease and death. Now scurvy was thought to be caused by either germs or toxins. Today Lind is remembered as a contemporary scientific hero for his clinical trial. He’s often credited with discovering the scurvy cure—and helping to defeat Napoleon at Waterloo. 
His study not only demonstrated that eating citrus worked to treat scurvy; it’s also thought to be the first-ever controlled clinical trial, a fundamental methodological advance that laid the foundations for much of what we know about how medicine—or any intervention—works. In his lifetime, though, Lind had no idea of the significance of what he’d found. He seemed to believe that citrus as a potential treatment did not necessarily imply anything about the root cause of the disease. Meanwhile, confusion and skepticism—about the true cause of scurvy and other similar disorders—dragged on into the early twentieth century. That scurvy could be cured with food did not mean that it was caused by a dietary deficiency, as we now understand it. The only way to figure that out was to find and chemically isolate the missing dietary ingredients. And once they were discovered, excitement over these chemicals spawned the global supplement industry, the public health benefits of which—a century later—remain dubious.

ISOLATING CAUSES

In a two-room lab attached to a military hospital in the Dutch East Indies, the colonial territory that comprised most of the modern state of Indonesia, a decade of research on another deficiency disease, beriberi, yielded the big insight—one that eventually led to a scurvy breakthrough. Beriberi had become an urgent threat when the Dutch military response to an uprising stalled because troops were suddenly losing the ability to walk—a hallmark symptom of the disease. In the late 1800s, the Dutch government established a research program in the colony to figure out what was causing the partial paralysis. Germs were the primary suspect. A Dutch military surgeon named Christiaan Eijkman, then stationed in what is now Jakarta, took over the effort to find the beriberi-causing pathogen. After a series of painstaking experiments, Eijkman pieced together that chickens got beriberi after they switched from eating brown to white “polished” rice. Something in the white rice, he thought, made them sick. Continuing the research, Eijkman’s successor, Gerrit Grijns, correctly reinterpreted Eijkman’s discovery: Something missing from the white rice sickened the birds. But neither Grijns nor anyone else knew what these easily disintegrated substances were. By the time Grijns made his voyage back to Holland, he was leaving behind an early diet war—one camp believing beriberi was the result of an infection or that white rice carried a toxin; the other, that the disease was driven by a nutrient deficiency. The studies that finally spelled the beginning of the end of the scientific dispute were inspired by the Dutch group and carried out in Norway. A medical professor who had visited Eijkman and Grijns’s lab decided to pursue beriberi and rice studies of his own at the University of Oslo. Rather than chickens, Axel Holst wanted to use a mammal that more closely resembled humans. He settled on guinea pigs and asked a pediatrician colleague, Theodor Frölich, to help carry out the research. The choice of both the animal model and the research partner turned out to be lucky. When Holst and Frölich fed guinea pigs a diet of white rice, the animals died, just like the chickens eating white rice in Indonesia. But Frölich, who happened to have an interest in scurvy, noticed something his peers had missed: The guinea pig autopsies showed they had actually died of scurvy, not beriberi.
Picking up where Lind had left off 160 years earlier, Frölich and Holst fed the surviving sick animals foods known to relieve human scurvy— lemon juice, fresh potatoes, apples, and cabbage mixed with oats, bread, or rice. Sure enough, the symptoms resolved. Holst and Frölich reasoned that the guinea pigs had suffered from a disease, just like human scurvy, and it had to be driven by a dietary deficiency—not climate, not a protein deficiency, not an infectious agent or toxic chemical. This was the first time anybody had knowingly produced an animal model for scurvy, finally opening the way to testing cures and isolating the active chemical from food. It took more time, and feeding more guinea pigs lemon juice, to figure out that a chemical that naturally occurred in lemons was in fact the substance that, when absent from the diet, caused scurvy. The chemical was later named vitamin C, part of a suite of newly discovered microscopic substances in food that are, as the name suggests, vital to health. THE VITAMINS Unlike minerals, the vitamins are organic molecules, which means they’re “easily disintegrated.” While they’re all found in food, they’re invisible without a microscope—odorless and mostly tasteless. They work their magic in mind-bogglingly tiny doses—think millionths or thousandths of a gram per day. So while we eat macronutrients in grams each day, the recommended daily intake for vitamins is measured in micrograms or milligrams. Hence, micronutrients. It’s no wonder the vitamins were hard to find. Once vitamin C was isolated, all the other scurvy theories and bogus remedies could finally be relegated to dumpster sites of medical history. Getting to this point required researchers to conceptualize deficiency diseases, then stumble upon the animal that, like people, can’t synthesize the missing substances. (Guinea pigs are one of only a few species that can’t build vitamin C from food molecules. Like us, they need to get it by eating plants and other animals that can.)[*1] Then they had to isolate the chemicals from foods known to prevent the diseases and try those, one at a time, on the test animals until their symptoms disappeared. The process more or less repeated again and again until 1948, when scientists found the last of the thirteen known human “essential vitamins”— which means we need them but our bodies can’t produce them so we have to eat them. Around this time, nutrition science as a discipline was becoming well established on both sides of the Atlantic, and its practitioners had to accept that their original focus—on just protein and energy—was far too limited. Nutrition was about more than meeting your energy needs and getting your fill of “the only true nutrient.” We needed more than just minerals and Prout’s trio to survive. What we eat is intimately tied in with health—but so is what we don’t eat. PROMISE AND OVERPROMISE So what do vitamins do? Many of us try to eat more of them or take supplements without knowing the answer to that question. The answer brings us back to Chapter 2, about metabolism. Vitamins are crucial to metabolism. They help catalyze the chemical reactions that unfold all the time inside our cells. These reactions require enzymes—the necessary sparks that help them run faster. Some vitamins act to build important coenzymes and co-factors, stepping in where enzymes need a boost. But they have other functions, too. 
Vitamins C and E, for example, act as antioxidants, meaning they can neutralize free radicals, reactive molecules in the body that can damage cells and promote aging and disease. Without vitamins, many of the biochemical reactions of metabolism we depend on for normal cell growth and function—building proteins to heal wounds and grow muscles, for instance—wouldn’t be possible. When even one is in short supply, we have symptoms of dietary deficiency. Vitamin C plays a crucial role in the synthesis of collagen, the most abundant protein inside us, and the glue of the body’s cells. When we run short, we literally come apart: Old wounds can crack open, and the skin grows sensitive and appears bruised—the result of internal hemorrhaging. Teeth loosen and fall out, and the gums bleed. Vitamin A, in milk and liver products, combines with a protein in the eye to become part of the visual pigment. When we don’t eat enough, our ability to see is diminished and we can eventually go blind. Vitamin K, found in leafy green vegetables, such as kale, spinach, and mustard greens, helps make proteins in the body that are important for blood clotting. In severe cases of deficiency, people can bleed to death. Seventeen of the scientists who worked on vitamins won Nobel Prizes, including Eijkman and Albert Szent-Györgyi, who isolated vitamin C. (Grijns was unfortunately and controversially overlooked.) Their discoveries fundamentally changed how we eat. The synthesis of vitamins on an industrial scale opened up new possibilities for vitamin-boosted products. National food fortification and enrichment programs—bolstering the food supply with micronutrients—launched in America in the 1920s. Food makers added iodine to salt, vitamin D to milk, folic acid to bread and flour, and thiamine to white rice. These programs happened in parallel with improvements in food availability and a decline in the cost of food relative to income. The result of these changes: Nutrient-deficiency diseases are now extremely rare in the United States. For hundreds of years prior, nutritional disorders such as rickets—the softening and weakening of bones in children caused by vitamin D deficiency—were widespread. In 1950, roughly 25 percent of children worldwide died before their fifth birthday. By 2020, that child mortality rate had fallen to about 4 percent. When we each asked our spouses what beriberi (thiamine deficiency) is, they gave us a blank stare. “Sounds like a smoothie,” one said. But like many other aspects of nutrition science, the enthusiasm rapidly gave way to overpromise and commercialization ahead of the evidence. Women’s magazines championed vitamins as the chemicals in food to obsess over. “ ‘Count the calories,’ long the maxim of food nutrition, has given way to what might well be called ‘Value the Vitamins,’ ” a primer in Ladies’ Home Journal declared in 1930. Journalists printed recipes and cooking tips that optimized vitamin intake. Famous scientists preached the vitamin cause, boasting about the benefits of vitamins for health. Chemist Linus Pauling, a two-time Nobel laureate, promoted vitamin C to prevent the common cold and cancer. Szent-Györgyi became a vitamin C evangelist, extolling the benefits of the chemical as a prevention and salve for a range of illnesses while on speaking tours across Europe.
When he realized that paprika, a popular spice in his native Hungary, contained even larger amounts of vitamin C than lemons, he spoke publicly of his hope that more people would eat paprika, and that “large factories will also use paprika as a raw material for the production of pure vitamin C.” He might’ve been pleased to see what happened next. THE SUPPLEMENT RACKET The vitamin market is now worth roughly $150 billion—on par with the global market for healthy snacks. Today, you can walk around just about any grocery store or pharmacy in North America and Western Europe, and you’ll probably find aisles filled with seemingly countless different vitamin formulations. At the checkout counters, you might see little packages and bottles filled with vitamin C for the common cold. These synthetic vitamins can certainly help people with diagnosed deficiencies or in moments of increased demand, such as pregnancy. But their use in the absence of specific medical need has produced, at best, mixed benefits to health and, at worst, outright harm. “Over and over, we’re just faced with these negative studies,” Peter Lurie, a physician and the president of the independent nonprofit Center for Science in the Public Interest, summed up. “You’d be hard-pressed to find too many supplements that accomplish anything other than treat a deficiency state.” Put another way, a century into the grand experiment of vitamin peddling to the masses, we lack a clear understanding of their benefits in synthetic supplement form. We found something amazing in food, got excited about the possibility of extracting it and topping up with megadoses, and hawked those products without really understanding them. Another very old nutrition promise still not realized. Vitamin C, the supplement many of us reflexively turn to during cold and flu season, hasn’t been proven to reduce the risk of catching the sniffles. Research on the effects of multivitamins, taken by an estimated third of American adults, has failed to demonstrate that they help with objective markers of health. A study of twenty-one thousand adults in the United States found that multivitamin users had no health edge over nonusers. When researchers reviewed multivitamin studies involving four hundred thousand participants, they too found no clear benefit for heart disease, cancer, cognitive decline, or overall mortality rates. There’s no proof that antioxidant supplements—beta-carotene, vitamins A and E— reduce the risk of disease or death. In fact, there’s some debate about whether these supplements increase mortality. Vitamin D supplementation seems to carry bone health benefits for people who are deficient—but that’s about it. And yes, you can overdose on some vitamins. Four of the essentials—A, D, E, and K—are fat-soluble.[*2] This means they get stored in the fat tissue, muscles, and liver, with the risk that they accumulate to toxic levels when you take too much. As we were writing this chapter, an elderly man in Britain died of an overdose of vitamin D. Such tragic and preventable deaths thankfully appear to be rare. The greater threat to the public lies in the broader supplement racket that the synthetic vitamins laid the foundation for. It features, as Catherine Price points out in her engaging book Vitamania, more than eighty-five thousand products, including weight loss aids, muscle builders, and brain boosters— as well as the protein and precision nutrition products we’ve already covered. 
For the American news outlet Vox, Julia and data journalist Soo Oh combed through government databases, court documents, and scientific studies, and uncovered more than 850 supplement products laced with illegal or hidden ingredients—such as banned drugs, pharmaceuticals, and synthetic chemicals that have never been tested on humans. This happens because supplements are treated like foods, not drugs, in many countries. In the United States, this was thanks to lobbying by the supplement industry—and the historical disempowerment of the FDA’s food program relative to the USDA’s meat program, as we discussed in Chapter 8. But it’s also thanks to congressional champions of supplements, particularly from states where manufacturers are concentrated. The Dietary Supplement Health and Education Act (DSHEA) of 1994, which amended the federal law regulating supplements, allowed manufacturers to bring products to market without any requirement for preapproval. The law placed the burden of proof on the FDA to show that a supplement is unsafe before it can be removed from the market. There’s no need to prove that products are safe or even effective before selling them to people. Manufacturers are supposed to simply ensure that their labels are accurate and ingredients listed—but sometimes even that doesn’t happen. Efforts to bring supplement regulation more in line with pharmaceuticals—where drugs have to be proven safe and effective before reaching the market—haven’t gained traction since. At key moments, including before the passage of DSHEA, the public, spurred on by supplement retailers and manufacturers, stepped in to protest regulations. Over the period of a year, ending in the spring of 1994, Americans logged more than one hundred thousand letters and phone calls to Congress in support of the Hatch-Richardson bill, aka the DSHEA, as historian Rima Apple details in her book, also called Vitamania. The effort is sometimes cited as one of the most successful activist campaigns in U.S. history. The result: Today, vitamin E can sit on store shelves next to “Wood-E” sexual enhancement supplements—and consumers are left to experiment on themselves. Researchers have estimated that supplements cause more than twenty-three thousand emergency department visits in the United States annually—leading to more than two thousand hospitalizations. Many of the cases involve young adults taking energy or weight loss supplements. That figure is probably an underestimate, the authors wrote. Patients often forget to tell their doctors that they’re using supplements. Yet pharmaceuticals can interact with them in dangerous ways. Despite this lackluster and even deadly record, supplements have become the go-to for wellness influencers who promise to lower our glucose levels, help us lose weight, and improve our skin, nails, and hair. TikTok, YouTube, Instagram, and Facebook are brimming with such claims. It’s not because the pills and powders are necessarily going to help anybody; it’s because there’s so little regulatory oversight that people can say essentially whatever they want about them. The influencers often market poorly regulated supplements at the same time they decry the evils of the well-regulated pharmaceutical industry. We gobble it all up. From 2017 to 2020, nearly 60 percent of American adults, as well as about a third of children and young adults, reported taking at least one dietary supplement in the last thirty days. The numbers are similarly high in the UK and Europe.
And in all these places, supplement use increased during the COVID-19 pandemic, at the same time that trust in institutions like the FDA and the medical establishment was faltering. The power of the supplement industry shapes not only how we use, access, and view these products, but also what research we support. As Jerold Mande, a former FDA and USDA senior official, summed up, “There are congressional champions for dietary supplements. There are no true champions of nutrition research.” This helps explain why, since 2015, the NIH’s overall research spending on nutrition[*3] has hovered around 5 percent of the agency’s total budget.

VITAMINS AND US

Putting the world of synthetic supplements aside, we still have a lot to learn about the fundamentals of naturally occurring vitamins—how they work inside us, how they interact with our overall diet, how much we really need. The U.S. federal government’s advice about vitamin requirements—known as the Dietary Reference Intake, or DRI—is an “ongoing effort,” Gerald Combs, a professor emeritus at Cornell University and co-author of the comprehensive book The Vitamins, told us. The DRI includes a safe upper limit on intake for each vitamin, as well as the Recommended Dietary Allowance (RDA). But some groups are left out. There are no vitamin RDAs for babies, for example. And for the rest of us, the RDAs are based on an expert committee’s understanding of the minimum requirements, adjusted upward to cover most of the population. Of course, individual needs vary depending on factors like a person’s life stage, their health status, and even their diet. Folic acid is difficult to get from food and crucial during pregnancy to prevent neural tube defects. People following a vegan diet probably need to supplement with vitamin B12, an essential vitamin that we get from animal foods. Iron needs increase during menstruation, while the ability to absorb vitamins and minerals can be hampered by illness (sometimes referred to as the “vicious cycle” of malnutrition and infectious disease—malnutrition exacerbates infections, while infections can hamper the absorption of nutrients). More broadly, how much of each nutrient or vitamin we need to consume may depend on our overall dietary pattern. For example, low-carbohydrate adherents seem to have lower vitamin C requirements, which is probably one reason the meat-eating Arctic explorers we met in Chapter 4 avoided scurvy. Ignoring all this complexity and flux, we tend to think of our vitamin needs as static. We take the exact same supplements every day for years. We eat the same bread or milk, laced with added vitamins and minerals. From nutrition guidelines we can find the toxic upper limit and the minimum requirements for a typical background dietary pattern, but not a lot about how much we need for optimal health, or how our needs are evolving. The vitamin concentrations in whole foods are also variable, depending on how the foods are grown and harvested, and how we’re preparing and cooking them. Thoroughly washing fruits and vegetables, and especially soaking them in water, before you cook can cause water-soluble vitamins to leach out. When you remove the skins on produce, you also lose any vitamins that were stored there. According to Combs, you reduce the vitamin content of fresh food by about half when you cook it—more so when you cook for a long time in dry air, as in the oven; less so when you cook food rapidly, as in a pressure cooker or a stir-fry.
Food combinations matter—a complexity we’re only beginning to unravel. Pairing foods rich in plant-based iron with foods rich in vitamin C increases the body’s ability to absorb the iron, while drinking alcohol with a meal hampers nutrient absorption. Too many glasses of wine, and the ability to absorb vitamins and minerals, such as thiamine, vitamin B12, folate, and zinc, drops off. It’s because of these nuances and uncertainties that public health bodies stick to that boring nutrition advice we covered in the last chapter: Eat a diverse diet. When we don’t, we may miss out not only on the naturally occurring vitamins and minerals we know we need, but also on other yet-to-be-determined compounds in foods that may help our health (more on this next). Yet many of us fail to do it, a situation that our approach to vitamins helped make possible. If it weren’t for the enrichment and fortification of products such as packaged bread, milk, cereal, or juice concentrates, virtually no Americans would meet their vitamin D needs, three-quarters wouldn’t get enough vitamin A, nearly half wouldn’t get enough vitamin C, and 93 percent would be vitamin E deficient, according to one 2011 report in the Journal of Nutrition. In 1969, 14 percent of ready-made cereals were fortified; by 1979, the number was 92 percent. Today, most of the U.S. milk supply contains added vitamin D. As Vitamania’s Price points out, “The so-called standard American diet—high in refined grains and sweets, and associated with ‘Western’ diseases like heart disease and cancer—could not have developed without the help of synthetic vitamins…. While they’re designed—and now often required—to keep us healthy, synthetic vitamins also enable the very products and dietary habits that are making us sick.” Manufacturers boast of the vitamins and minerals added to their foods—fruit candy and sugary cereals “rich in” vitamins and minerals—without any mention of all the micronutrients they stripped out during processing, or the whole foods consumers will replace these products with. The situation is now so perverse that the latest expert nutrition committee that advises the U.S. government on dietary guidelines refused to recommend reducing refined-grain consumption; doing so, they reasoned, might lead to a vitamin deficiency even though many of these fortified products are linked with chronic diseases.

EAT YOUR VITAMINS

Outside of a diagnosed deficiency or periods of increased need, there’s little evidence that the big doses of vitamins we get through supplements improve health. So instead of relying on pills, try to eat your vitamins in food—we know, easier said than done. When researchers have tracked people who get their vitamins from food and compared them to those who rely on supplements, they’ve found the real thing linked to a range of positive health outcomes; not so for supplements. Maybe the poor performance of supplements is because vitamins need to be eaten in their context—the foods we extract them from to make supplements. Unlike the pills, natural foods are also loaded with other beneficial elements, such as phytochemicals and fiber. Or perhaps there’s something even more basic at work. Natural foods are what our brains and guts evolved with, to make those all-important nutrient associations that subliminally guide our eating. Eat an orange or drink lemon juice, and the pleasing flavor of citrus is a reliable indicator of vitamin C. Things become confusing when that flavor-nutrient pairing is broken, as with artificially flavored gummies, cereal, or soda.
There’s emerging concern that synthetic vitamins and flavors mess with our body’s ability to make sense of our food environment and appropriately adjust our eating behaviors. If we’re born with an innate nutritional wisdom that helps us seek protein, sodium, calories, and micronutrients whenever we need them, these processes may be interrupted when the nutrient pairings don’t make sense anymore. “Modern food may be the most compelling lie humans have ever told,” journalist Mark Schatzker wrote in The Dorito Effect, one of his books on this subject. He and other researchers argue that our reliance on nutrient fortification and enrichment may even be driving obesity. The lure of the quick fix, the magic pill, drew us further away from the kinds of diets we should be eating, and further away from health. How we got to this point goes back to the time of Prout and his contemporaries, the scientists who were trying to understand foods by their macronutrient parts. That is the reductionism project that underpins much of the scientific enterprise: Break something down and study its parts as a way of grasping the complex whole. Nutrition was different from other sciences. We didn’t stop at just studying the parts. Ever since Prout, whenever nutrition researchers discovered something new in food, it was almost instantly declared essential or an elixir, elevating a “limited and often quite preliminary understanding of nutrients to the status of nutritional certainties or truths,” wrote Australian social scientist Gyorgy Scrinis, who articulated the harms of nutritional reductionism in his book Nutritionism. This is the one true way to eat, the truth about food. The nutrient was then promoted and packaged, and consumers were told that more is better and that they could replace the real thing with the packaged product. It’s what Prout did with the macronutrients, and what Liebig did with his meat extract and an infant formula, the latter of which he thought was a complete food but actually lacked vitamins. It’s what many modern vitamin, supplement, and functional food peddlers do today. We pivot from food discovery to commercialization, selling people stuff they may not need or benefit from. An alternate history of nutrition science could have been a history of humanity being humbled by food’s mysteries. A history of waiting to commercialize the magic pills and elixirs until after we understand them. We’ve rarely been humble in our application of nutrition science. We’ve rarely waited. But if humans for hundreds of years missed a tiny but essential component in food while thinking we had it all figured out, what else did we miss about nutrition? What else are we missing?

Notes

*1 Humans can’t perform the last step of vitamin C synthesis; if we could, we would be able to make vitamin C ourselves and wouldn’t need it in the diet. It wasn’t always this way: Humans and other species that can’t make vitamin C anymore used to be able to. At some point during evolution, a gene that supported vitamin C synthesis mutated to become inactive, a turn of events that still puzzles researchers given that this micronutrient is essential for survival. One explanation for how we developed this evolutionary disadvantage is that vitamin C was so plentiful in the diet that our bodies didn’t have to do the work of making it and simply stopped.

*2 The other nine vitamins are water soluble, so they have to be replenished in the diet more often.
Your body can excrete excess amounts when you take or eat more than you need.

*3 Nutrition research is very broadly defined by the NIH and includes experiments in mice, zebrafish, and fruit flies being fed different diets.

CHAPTER 11
Nutritional Dark Matter

Hopefully it’s clear by now: We understand a lot about the component parts of food that we need to survive. Without water, you die within a couple of days. In the absence of macronutrients—the protein and energy that build and sustain the body—it’s lights out within a matter of weeks or months, depending on how much body fat you’ve got stored. Miss out on the essential vitamins and minerals, and the mortality clock ticks at months to years. The time scales of survival have lengthened for each element in food we’ve discovered so far. But that’s still only a fraction of food’s true chemical complexity. And we don’t know as much as we should about how to eat not just for survival but for optimal health. Now we’ll turn to the scientists who are focused squarely on exploring the remaining unknowns. These researchers are trying to identify the components of food that gradually optimize or degrade health over lifetimes. They call this the “dark matter” of nutrition. They’re continuing with the reductionist project, but breaking food into even tinier, less concentrated chemical parts. Some of these chemicals may work their magic on time scales of decades, not just days or years. Many could do nothing at all. The technologies they’re deploying represent the closest humans have gotten to seeing the full chemical complexity on our dinner plates, an exquisitely nuanced portrait that William Prout couldn’t have imagined. It’s possible they’ll discover a new class of nutrients, like vitamins, albeit one that exists at even lower concentrations than the known micronutrients. Or maybe they’ll figure out which of those ten-thousand-plus GRAS additives we should be worried about. Their aim is nothing less than complete knowledge: to close the gap between what’s on our dinner plates and what’s in the public record about every constituent in our food. The dark matter mappers arrive at an interesting moment. There’s a growing call to abandon nutritional reductionism. The reductionist paradigm, after all, missed some of the most critical developments in the field, including the toxic food environment and ultra-processed foods that Kevin and his colleagues are now studying. Holists are skeptical about what breaking food down further can achieve, and argue that what we eat should be viewed as a complex matrix, and in its social and environmental context instead of as mere bundles of disparate chemicals. The debate might change how you look at your food. It reminds us of how much we still have to learn about what we eat—how elusive and complicated food remains—and the real tensions ahead as we hurtle into the next nutrition challenges: navigating between the poles of holism and reductionism, traditional ways and technological wizardry. We’ll argue that the future probably involves a bit of all of the above.

LESS THAN 1 PERCENT

If you’re a wine lover, you might not want to have a glass with David Wishart. He sees the ancient beverage, all food really, differently from most people—in what is sometimes an uncomfortable level of detail. Wine is not just a gastronomic experience, the perfect accompaniment to cheese, something to savor.
For Wishart, a Yale-trained biochemist and biophysicist based at the University of Alberta, wine is a “collection of chemicals,” many more than whatever you read on the label or might imagine when you think of aged grapes. Today, he’s walking Julia through exactly which chemicals he’s recently found in a popular brand of sauvignon blanc. It’s somewhat disheartening. “We’ve identified eighty-eight compounds,” Wishart—gentle voice, baseball cap over glasses, hand cupping a bearded chin—began. That’s far more detail than the grape type and alcohol content listed on wine labels, which are even sparser than food labels. Food packages typically display ten or fifteen nutritional compounds—Prout’s rubric along with some essential vitamins and minerals. The USDA tracks a total of 150 components in its food composition database. But by Wishart’s estimate, even this represents less than 1 percent of the universe of chemicals that could be identified in food. The other more than 99 percent are the nutritional “dark matter”—substances that are unidentified, unexamined by regulators, and unknown to the public. As the eighty-eight chemicals flash on the screen, Wishart walks through highlights and lowlights. These include glutamic acid, the amino acid in MSG, and isobutanol, an alcohol found in plastics and varnishes. “That’s not supposed to be there,” he laughed. Ditto the toxic alcohol methanol, and acetone, the solvent in nail polish remover. Shikimic acid—the starting compound used to synthesize the influenza drug Tamiflu—made the list, as did epicatechin, a chemical found in green tea that may help reduce high blood pressure, and beta-alanine, a compound linked to improved exercise and cognitive performance. “When you see this—are you drinking this sauvignon blanc again?” Julia asks him. Wishart is not at all concerned. “A lot of these [chemicals] are at low levels and not that harmful.” After you eat apple seeds or mustard, the body produces a small amount of cyanide. “Obviously everybody handles it just fine.” With the wine, some compounds are naturally occurring by-products of fermentation. Like the acetone he’s also found in this sauv blanc. “Nail polish remover?” Julia asks. “Exactly,” Wishart replies. Wishart’s relaxed posture is built on the data he’s amassed. He runs the world’s most comprehensive repository of the chemicals in foods, called FooDB. The database features some seventy-five thousand compounds across the 750 whole foods currently listed, from anchovies to millet to star fruit. Most aren’t nutrients, Wishart explains; they affect flavor and aroma instead. But a subset will be “new kinds of fats, new kinds of amino acids, new kinds of carbohydrates, and new kinds of polyphenols or phytoestrogens as we explore food with more and more detail.” “Eating food is like eating a colorful plateful of pills, with none of those pills being labeled,” he adds. “What we’re trying to do is to at least label the active ingredients in those pills.” Wishart is working to turn FooDB into an open-source encyclopedia of the chemical components of every food ingredient—the “foodome”—just as the global research team behind the Human Genome Project mapped the base pairs that make up human DNA. 
Decoding food’s chemical secrets would allow for not only more transparency on food packages, he and his colleagues argue, but more granular nutrition research and more precise diet advice, potentially supplanting the standard model of nutrition, which is mostly concerned with minimum requirements for survival rather than optimizing health. Simply labeling the ingredients—finding all the complexity in every bite—is so mind-bogglingly painstaking, it almost appears impossible. And yet like every earlier phase of nutrition science, it’s already bursting with the potential for rapid commercialization. FROM MAPPING TO MARKET To examine food in a mass spectrometer—the world’s most sensitive scale—you begin by blending, pulverizing, and dissolving a minuscule food sample. The sample is placed, via a syringe, into the machine’s chamber, where it’s transformed from a liquid to a vapor. Then the chemical analysis begins. The molecules are electrically charged, then dragged into a vacuum chamber, where they’re subjected to electromagnetic fields. The fields shepherd the charged molecules through the machine, while the vacuum means there’s no air to interfere with their movement. Now the researchers can separate out the dark matter constituents. The different masses of the molecules, along with their electric charges, determine how far and how fast they travel through the machine. Using a metric called the “mass-to-charge ratio,” each molecule is quantified and stored in a computer that’s attached to the mass spectrometer. The whole process only takes a few minutes, but then identifying the chemicals can take weeks. The chemists need to match the mass-to-charge ratios of the different molecules in the food sample to known chemicals in online databases to figure out their identity, like finding the fingerprint of a suspect in a database of fingerprints. If no match is made, a new compound has been discovered. The masses of these molecules, measured to four decimal places, are so minuscule they “cannot be pondered”—lighter than a single snowflake, a physics PhD student at the “foodomics laboratory” on the sprawling Autonomous University of Madrid campus told Julia. The lab is run by Alejandro Cifuentes, another nutritional dark matter hunter. Spain is considered a global epicenter of food science, and Cifuentes—with a seemingly ever-present, dimpled smile, whose first reaction to many questions is a laugh—is not only interested in finding new chemicals; the eventual goal of some of his projects is commercialization. When Julia visited, Cifuentes took her to a whitewashed room lined with computers and machines in grayscale that buzzed loudly. One of the mass spectrometers stood in the corner, a column jutting out of the top of a rectangular box at nearly twice Julia’s height. With this machine, he and about twenty other researchers had been analyzing, among other foods, tiny samples of olive oil and oranges. These are two of Spain’s biggest food industries, and each generates tons of waste that ends up in landfills. He and his team are trying to isolate dark matter chemicals in the food waste, such as orange peels or olive leaves, from which healthy components can be extracted and reused for supplements and medicinal products. The two food by-products hold a spectrum of terpenoids, chemicals that are purported to have antioxidant, anticancer, neuroprotective, and anti-inflammatory properties, Cifuentes explained. Colon cancer is a disease on the rise, especially in younger people, and influenced by diet. 
Cifuentes hopes he can hit on terpenoids in oranges and olives that can stop colon cancer from developing in cell and animal models of the disease. His research, if successfully tested on mice, could lead to clinical trials in humans, he said, and ultimately result in supplements or treatments for patients who have already been diagnosed with colon cancer. The approach already helped him extract and identify a group of promising chemicals in rosemary plants that seemed to reduce the speed of colon cancer growth in mice. He’ll have to prove their efficacy in humans, and “watch out,” he added. The plant also contains compounds that may be cancer-promoting, so this doesn’t mean one should gobble up rosemary. Back in Cifuentes’s office—wallpapered with postcards from the dozens of cities where he’s delivered lectures—Julia asked him whether his research changed what he eats at all. “No, not really,” he said. Food is “a complex matrix,” he went on, one we’re only “starting…really beginning” to understand. THE FOOD MATRIX Scientists started thinking about food as a matrix as early as 1977. That year, a group of researchers in Bristol, England, published a paper about how people respond to consuming the same amount of carbohydrates in whole and puréed apples compared to apple juice. They recruited five men and five women for the study and asked them to eat Golden Delicious apples harvested from the same orchard, but pressed, mashed, quartered, and cored for test meals that each contained precisely 60 grams of carbohydrates. The researchers were astonished to find that, when the study participants ate the apples in every form, they experienced roughly the same blood sugar rise—but glucose levels dropped fastest after the juice and slowest following the whole apples. They also reported feeling fullest eating apples, and least satiated drinking the juice. Three foods with the same nutrient composition all behaved differently in the body. There was more to food than calories and chemicals; the structure and combinations in which they were delivered to the body mattered. Today, this is known as the “food matrix effect,” an idea that’s recently gained traction in the discussion of ultra-processed foods. Combinations of nutrients are contained within three-dimensional structures that give them their form. These structures influence how accessible the nutrients in foods are, and what and how much our bodies use. They also shape how rapidly and where in the gastrointestinal tract nutrients are absorbed, and how satiated we feel afterward. Researchers have observed food matrix effects in just about every food they’ve looked at. If we process almonds or cook pasta to mushy softness, we get more fat and carbohydrates from them than if we’d eaten the almonds whole or the pasta al dente. Processing—grinding, cooking, aging—deconstructs the matrices, making the nutrients and energy inside them more readily accessible, explained Anthony Fardet, a French food matrix pioneer who has been studying food’s structural effects since the 1990s. Humans began processing and manipulating the food matrix long ago, when we started to control fire and use it to cook. This not only killed pathogens but chemically modified foods. Cooking in turn supported the evolution of our large and energetically expensive brains while allowing for reduced gut size since it was no longer required to do the work of digestion that cooking replaced. Our bodies, from our teeth to the enzymes in our stomachs, evolved to be matrix-busters. 
Whole foods traditionally made their way through our long gastrointestinal tracts, feeding different populations of microbes, getting broken down and absorbed at different stages of the journey, and any indigestible or unnecessary elements were excreted as stool. Flash-forward a million years, and Fardet likes to think of ultra-processed foods as essentially “pre-digested.” Since they’re made from whole foods broken down in factories, their nutrients are readily available for digestion and absorption earlier on in the process, possibly starving the lower regions of the gut and their associated microbiome, which could in turn affect hormone secretion patterns and the immune system. This matrix effect is one of the theories of how UPFs might harm our health. Preparing foods for the mass spectrometer destroys their structure, which brings us to the problem researchers like Fardet and other holists have with the dark matter project. Taking food apart and analyzing its chemicals takes them out of context—not only from their original structures but from their social, environmental, and health context, too. “It’s a classic repeat of what’s happened to nutrition science from the dawn of nutrition science in the nineteenth century,” said Gyorgy Scrinis, the Nutritionism author. “What do they do when they discover there’s proteins and vitamins? Immediately they assume they’ve discovered the truth and they go hard, first, interpreting those nutrients reductively. And then they assume it’s very precise knowledge and precise enough to override any other kind of considerations whatsoever.” This gave us a food system built for maximizing calories and protein at the expense of just about everything else, something we’ll explore in the next chapter. This sometimes misses the point, Scrinis argued: “Before you look at the nutrient profile of someone’s diet, just look at the size of their pay packet.” In other words, it’s probably easier to guess the healthfulness of what they’re eating by knowing their salary than by attempting to measure their diet’s chemical composition. Just because something we eat contains a particular chemical doesn’t mean that chemical will have the same effect once it’s isolated from a whole food and pulverized or extruded to make a supplement or beverage. The way we pair our foods shapes their metabolic fate, as we saw with the vitamins. Once chemicals are released from the matrix, they interact with other components of our diet, or get transformed by the microbiome before being absorbed. Consider fiber. It has a history of being thought of as merely an indigestible component of what we eat—bulk that helps alleviate constipation. We now know that bacteria in our gut use dietary fiber to fuel their metabolism while producing short-chain fatty acids as metabolic by-products that are absorbed by the body. Both the bacteria and the short-chain fatty acids alter our immune system and potentially other physiological functions of the body we don’t yet understand. As Kevin’s UPF research illustrated, insoluble fiber may also affect how much energy we absorb from our diet. In addition to the transformed fiber, Wishart estimates that there are some 250,000 different compounds in the human metabolome—the chemicals in our bodies derived from existing chemicals in the foods we eat—and that’s just the ones we can detect. At the same time, there’s an unresolved debate about whether some of the nutritional dark matter is mere artifact, chemicals created by experimental techniques; Wishart doesn’t think so. 
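Whether a given peak is a real food chemical or an experimental artifact comes down to the identification step described earlier: comparing each measured mass-to-charge ratio against catalogs of known compounds and flagging whatever doesn’t match. The sketch below is purely illustrative—the reference masses, the five-parts-per-million tolerance, and the three-compound list are placeholders, not values from FooDB or any real pipeline, which would also weigh retention times and fragmentation patterns.

```python
# Illustrative only: match observed mass-to-charge (m/z) values against a tiny
# reference table of known compounds, within a tolerance expressed in parts per
# million. Real reference databases are vastly larger and use additional evidence.

REFERENCE_MZ = {
    # placeholder values for singly charged ions; approximate, not authoritative
    "glucose": 181.0707,
    "glutamic acid": 148.0604,
    "epicatechin": 291.0863,
}

def match_peak(observed_mz, tolerance_ppm=5.0):
    """Return the best-matching compound name, or None if nothing is close enough."""
    best_name, best_error = None, None
    for name, ref_mz in REFERENCE_MZ.items():
        error_ppm = abs(observed_mz - ref_mz) / ref_mz * 1e6
        if error_ppm <= tolerance_ppm and (best_error is None or error_ppm < best_error):
            best_name, best_error = name, error_ppm
    return best_name

if __name__ == "__main__":
    for mz in (181.0710, 291.0859, 342.1162):
        hit = match_peak(mz)
        print(mz, "->", hit if hit else "no match: possibly a new compound")
```

The measurement itself takes minutes; the weeks go into deciding what every unmatched peak actually is—a genuinely new compound, a known one missing from the catalogs, or an artifact of the technique.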
Like Cifuentes, Wishart’s already exploring commercial applications of the dark matter findings. He co-founded a precision nutrition company, Molecular You, which—like the companies we tested in Chapter 9— promises to deliver tailored diet advice, supplements, and even exercise routines, based on a single blood test. As part of this company, Wishart envisions eventually taking blood or urine samples from users to look for nutritional dark matter and figure out what people are eating instead of relying on their self reports. When we asked Wishart about how he thinks about critics who say precision nutrition companies are rushing ahead of the research, he said in an email, “My own reason for doing this is that I am just a scientist who is forever curious. My hope is that this curiosity would rub off on other people so that they would be curious about their food…. So I agree that it is definitely worth learning more about food chemicals before rushing off to sell something or turning a profit on shaky science. On the other hand, there’s no harm sharing the knowledge early and getting people engaged in the discussion. If they feel that the information is compelling or convincing, then it is up to them (alone) to pursue their own path in deciding to consume or avoid certain foods.” Also, like Cifuentes, he hasn’t found a way to apply the dark matter findings to how he eats, not yet. His wife, writer Debby Waldman, later told us, “His idea of cooking is it takes too long. We should either go to a restaurant or…make Kraft Dinner.” ANOTHER LOOK AT FOOD Last spring, Julia asked Wishart if he could subject one of the foods her son sometimes snacked on as a toddler—squishable pouches of fruit and vegetable purée that are a ubiquitous feature in grocery stores around the world—to his mass spectrometer. The little pouches, which eject right into the mouth, no cutlery required, are up there with gummy bears and animal cookies on the hierarchy of most valuable snacks in the playground. She had often suspected, despite the images of fresh produce on the front of the pack, that these purées may deliver a quick hit of sugar, along with a smattering of microplastics. A few weeks later, Wishart delivered the results: three spreadsheets containing nearly two hundred chemicals, and their concentrations. There were no “dangerous” compounds identified, no “smoking guns.” Instead, Wishart explained, the purée contained a lot of sugar (fructose and glucose), along with a blend of nutrients you’d expect to find in a refined sack of fruits and oats. There was benzoic acid (“sometimes used as a preservative”), shikimic acid (“not useful to the body,” but again, used to make Tamiflu), and butyric acid (“a very protective compound…produced in the gut from fiber.”) The analysis didn’t detect contaminants (no lead, for example). The comprehensive test, however, didn’t check for emerging chemicals of concern in the foods we eat and drink—like microplastics. These leach into food during manufacturing processes or from packaging, entering our bodies. Decades of research demonstrating the harmful effects on animal organs have more recently been translated into correlational studies in humans. Microplastics in the blood vessels are strongly associated with a higher risk of cardiovascular disease. Ditto fecal microplastics and inflammatory bowel disease. 
The chemicals in these plastics—including bisphenol A (BPA), which is used to make polycarbonates, and phthalates—have been associated with a range of conditions and diseases, including reduced fertility, type 2 diabetes, and obesity. According to a recent estimate, scientists have found evidence of more than three thousand such “food contact chemicals” in human bodies, and that’s considered a lowball. Even if Wishart had looked for all these components, the mass spectrometer and other “omics” tools, such as nuclear magnetic resonance, would only give a readout of hundreds of chemicals in any single food, while the simple baby purée likely carries thousands. The full complexity in what we eat still eludes us. THE FUTURE OF REDUCTIONISM Learning about the results, Julia felt a little deflated. The world’s most intricate portrait of food couldn’t tell her anything particularly new or useful about how to eat. The baby food seemed fine from a nutrient perspective, but maybe there was another way to understand the results? We asked Fardet how he’d evaluate the pouches of purée. For him, the structure came first. “Children are used to eating more and more soft, semisolid, viscous, and liquid foods,” he said. This includes both ultra-processed foods, which tend to be soft, and also fruit juices and purées, which have been implicated in the explosion of dental cavities. There’s emerging evidence that this proliferation of soft foods may be behind rising rates of teeth misalignment and obstructive sleep apnea in kids; the hypothesis is that kids’ jaws and teeth aren’t challenged enough with chewing, leading to underdeveloped palates and airways. The structures of our bodies may be adapting to our unstructured food. A squishable bag of oats and fruits isn’t a nutrient problem, he went on. It’s a missed opportunity: to expose the digestive tract and microbiome to an intact matrix, to practice chewing, to develop the palate, to get all the other nutrients in whole food we don’t even yet know are beneficial to health. Eating the plants themselves, instead of the factory-produced purée, might also deliver fewer microplastics and food additives. Chemicals that don’t necessarily show up in the mass spectrometer. A growing number of critics and concerned citizens follow Fardet’s logic, arguing that we need to move beyond nutrient reductionism to thinking about foods and dietary patterns holistically. Reductionism served us well when the main diseases of nutrition were deficiency diseases, they argue. It helped governments feed workers and militaries. It also gave us the food systems we have today that both feed and sicken us: It enabled us to grow crops more efficiently; reengineer them for industrial production, appearance, and shelf stability; and create the ultra-processed foods. Mass spectrometers were the machines the food industry used to decode the chemical compounds in flavors and synthesize the artificial copycats—some of which are GRAS—that punch up things like soda and potato chips. These foods spread globally, and chronic, diet-related diseases, such as obesity and type 2 diabetes, spread with them. As we’ve seen, chronic, diet-related diseases are the central nutrition problems in most countries, and they’re linked to foods with matrices that have been excessively degraded and artificialized. Unstructured foods—and all the food contact chemicals, colors, artificial flavors, emulsifiers, and other additives they contain—may be altering our bodies and our health in ways we’re only starting to make sense of. 
To focus narrowly on individual food components is to miss this big picture, Fardet argued. Scrinis called for a future where, instead of rushing to supplements and precision nutrition companies, researchers focus on how much we still have to learn about food. Even Kevin, an arch reductionist, agrees with the holists about our limited understanding of the chemical complexity of food and, more important, how chemicals and foods interact with each other in the context of our overall dietary patterns. But he also thinks we’re not done with reductionism; we just need to be far humbler in how it’s deployed. Instead of using it to sell supplements and diets, or even to map all the components of food, we should use reductionism to narrow in on the subset of chemicals in what we eat—as well as characteristics of the broader food environment —that seem to be most important to health. Just as Wiley did, we can then investigate how they affect our bodies. After all, not everything in food affects our health. The plants and animals that we eat did not evolve merely to be foods. We co-evolved with them such that when we eat them our bodies extract what we need and get rid of what we don’t. Many of those chemicals, especially those that are water-soluble, last for only a few minutes inside us before we eliminate them.[*1] And as we’ve seen, sometimes it’s not even the foods themselves that cause problems but how we live with them in our food environments. Food is more than energy, protein, or vitamins. The chemicals in what we eat work together, in structures, interacting with our bodies and other foods. They also work on us in contexts—in food environments and food systems. We can’t ignore this complexity. But reductionism—without the nutritional gobbledygook imparted by the gurus—can get us focused and help us figure out the way forward. It can narrow the scope of what in our toxic environment is chronically poisoning us. The question of what next gets at a classic tension we see in many discussions about important questions facing humanity—between progress and traditionalism. Are we wizards who believe in science and technological solutions to the problems facing the food system—or traditionalists who embrace limits and call for a reversion to old ways?[*2] We’ll argue we need to be a bit of both for a sustainable food future. Skip Notes *1 Others, especially the fat-soluble, can accumulate in our bodies and may therefore be more likely to cause long-term health problems; maybe the reductionist approach should target these for study first. *2 To borrow the construction from Charles Mann’s brilliant book The Wizard and the Prophet. CHAPTER 12 The Calorie Glut We opened the book with the promise that we were going to show that diet-related diseases are not the result of individual failures of willpower, but often of food environments that cause us to overeat. In the last few chapters, we explained how the food around us can disrupt that orchestra of internal eating signals we share with animals, and how, as societies, we can use regulation and policy to change our food environments so that they promote health rather than disease. In this chapter, we are going to step back to explain how we got the modern food environment in the first place. It’s a direct result of a calorie glut that arose in response to the imperative to feed populations enough protein and energy, an obsession that dates all the way back to Liebig and the earliest years of nutrition science. 
In industrialized countries, agriculture and technology policy have been structured to maximize calories and protein—with the costs to biodiversity, soil health, climate change, environmental pollution, animal welfare, and human health all pushed to the margins. In short, we didn’t create a food system that overperforms for health. We didn’t create a food system that overperforms for the environment. We didn’t create a food system that overperforms for equity, sustainability, or animal welfare. We created a food system that could feed billions of people calories, to excess. The epidemic of obesity and its downstream metabolic diseases is a direct result of designing food systems to produce an oversupply of energy. In this chapter, we’re going to describe the calorie glut, how we got it, its real costs, and how we can change course. FOOD WASTE RISING Kevin’s first major discovery about food and obesity didn’t garner nearly as much press as his work on the Biggest Losers. It had nothing to do with ultra-processed foods or macro diets for weight loss. Instead, it was on the less flashy issue of food waste, how rapidly it was increasing in America, and what that implied about the past, present, and future of our food system. The study was published in 2009, more than fifteen years ago. At the time, researchers already knew that the number of calories in the U.S. food supply had increased in parallel with rising obesity rates since the 1970s. But it wasn’t clear how many of those extra calories were eaten, and whether they were enough to explain the population-level changes in body size. To figure that out, Kevin used the mathematical model that helped him debunk the old 3,500-calorie-per-pound rule. With it, he quantified not only the changes in the food supply, but also how much weight gain you’d expect from the boost in calorie intake while accounting for the energy costs of packing away fat: the rise in metabolic rate and overall energy expenditure that comes with bodies growing larger. The results were mind-boggling. There was an 800-to-1,000-calorie-per-capita surge in daily food energy available since the 1970s, more than enough to explain rising obesity rates without even factoring in changes in physical activity levels. (Yes, we do spend less time doing physical activity for work—but more on recreational activities.) In fact, if Americans had eaten all the additional food that farmers and food manufacturers put on grocery store shelves and dinner tables, Kevin’s model suggested that obesity prevalence would have soared much higher. The most likely explanation was that most of the additional calories were not eaten. Kevin calculated that two-thirds were feeding garbage bins and landfills. The food waste problem had actually increased during the obesity crisis. Americans were trashing 50 percent more of their food compared to the 1970s—some 1,400 calories per day for every person alive in the United States, enough to feed a child. All the waste amounted to roughly 40 percent of the total calories in the U.S. food supply. So yes, Americans were throwing away nearly half their food. The findings were so odd, Kevin searched for ways to stress-test them. Combing through food waste data from the U.S. Environmental Protection Agency (EPA), he found the smoking gun. The EPA’s measurements of food waste in landfills showed the same 50 percent progressive increase per person since the 1970s. 
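The bookkeeping behind Kevin’s estimate can be pictured with a toy energy-balance model—emphatically not his published model, just a minimal sketch of the key idea that expenditure rises as body weight rises, so a sustained calorie surplus produces weight gain that eventually plateaus instead of accumulating forever. The parameter values below (roughly 7,700 kilocalories stored per kilogram of tissue, and about 22 extra kilocalories burned per day per kilogram gained) are commonly cited round numbers used here only for illustration.

```python
# Toy energy-balance model (illustration only, not the authors' actual model).
# Assumptions: each kilogram of body tissue stores ~7,700 kcal, and daily energy
# expenditure rises by ~22 kcal for every kilogram of body weight gained.

ENERGY_PER_KG = 7700.0      # kcal stored per kg of tissue (rule-of-thumb value)
EXPENDITURE_SLOPE = 22.0    # extra kcal/day burned per kg gained (illustrative)

def simulate_weight_gain(extra_intake_kcal_per_day, days, start_weight_kg=80.0):
    """Simulate daily weight change under a sustained calorie surplus."""
    weight = start_weight_kg
    for _ in range(days):
        # Expenditure grows as weight grows, so the surplus shrinks over time.
        surplus = extra_intake_kcal_per_day - EXPENDITURE_SLOPE * (weight - start_weight_kg)
        weight += surplus / ENERGY_PER_KG
    return weight

if __name__ == "__main__":
    for years in (1, 5, 20):
        w = simulate_weight_gain(extra_intake_kcal_per_day=250, days=365 * years)
        print(f"{years:>2} years of +250 kcal/day -> about {w - 80.0:.1f} kg gained")
```

Under the old 3,500-calorie-per-pound arithmetic, the same surplus would predict weight gain that never stops; the feedback term is what produces a plateau, and it is what made it possible to show that the full calorie surge in the food supply could not all have been eaten.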
Putting it all together, Americans were making more calories, and eating more calories—but also trashing more and more food with every passing year. Producing just the food destined for garbage bins, Kevin estimated, accounted for a quarter of all freshwater use in America and required some three hundred million barrels of oil per year. The trash turned out to be the tip of a calorie glut iceberg. Americans had become so effective at growing and manufacturing calories, they had to invent new ways to offload them. In the United States, where the amount of land used for farming has been steadily shrinking since the 1950s, the calories American farmers produced skyrocketed. Over the last century, Americans had transformed water, land, sunlight, and oil into a mind-boggling explosion of calories. The combined calorie equivalent of the four main agricultural food commodity crops alone—corn, wheat, soy, and rice—amounted to about 15,000 calories per day per person.[*1] This allowed America to become the world’s largest exporter of food. But even after accounting for all the exports, the commodity crops amounted to about 12,000 calories per day per person—more than four times the energy needs of the population. Most of that food isn’t directly eaten by humans. Instead, it goes to biofuel production and feeding animals, most of which are subsequently eaten by humans. The rest of the crops become the cheap inputs to the industrialized food system that produces ultra-processed foods, which gave us our food environment—and an epidemic of obesity. HOW THE TRAP BECAME THE GLUT That America became a calorie production powerhouse was not at all inevitable. Prior to the expansion of the commodity crops, experts had long augured mass starvation. Most notably, eighteenth-century English economist Thomas Malthus painted a grim portrait of what would happen if population growth continued unchecked: It would outpace agriculture’s ability to feed humanity, creating a calorie deficit that would bring on famine. Dire warnings about mass starvation—known as the “Malthusian trap”—have echoed ever since. Thankfully, these predictions weren’t realized in much of the world. Instead of mass famine, the period from Malthus to the present has mostly been defined by soaring rates of available calories. The global food system now supplies nearly 2,800 daily calories for every single person alive. So how did we avoid Malthus’s predictions? If you want to design a food system that can overfeed populations to the point of epidemic levels of obesity—one that’s rich in ultra-processed products, and one that produces tons of food waste—step one is finding ways to make plants grow faster. Going back to the Liebig-era protein insights, the nitrogen content in soil sets a limit on how quickly plants can synthesize protein and grow. More nitrogen means more plant growth. In the early twentieth century, the German chemists Fritz Haber[*2] and Carl Bosch invented a method for artificially synthesizing ammonia—the form of nitrogen that can be easily absorbed by plants—on an industrial scale. What became known as the Haber-Bosch process removed the nitrogen cap for agriculture. This led to synthetic fertilizers, as well as far quicker plant growth, paving the way for industrial agriculture. Designing hybrid seeds for crops that can flourish with your artificially fixed nitrogen would be step two. 
You focus on crops that can already produce the most calories per hectare of land—corn, wheat, soybeans, and rice, as opposed to, say, spinach and oranges—and then selectively breed those crops to make them disease resistant, faster maturing, and more calorie and protein dense. With wheat, you do this by increasing the size of the edible kernels at the top of a wheat stalk, while shrinking the length of the inedible stalks. With rice, you design a sturdy stem packed with more grains of rice on top. With corn, you breed varieties that can grow closer together and produce larger kernels filled with starch. In addition, you make sure to breed crops so that they’re adapted to grow in cooler climates where they wouldn’t normally thrive. Finally, you invest heavily in developing agricultural equipment to efficiently sow and harvest all those crops. You build up your physical infrastructure, especially for irrigation and transportation, to water your harvest and move it to market. You sprinkle in an industrial chemical revolution, which produces new pesticides, fungicides, and herbicides, further boosting your agricultural productivity. You fund research to find ways to continue increasing crop yields and enact policies that direct farmers toward specialization, and larger farms growing commodity crops “fencerow to fencerow.” Some of these policies include government subsidies encouraging the production of more wheat, soy, rice, and corn. Humanity achieved all of these incredible technological and policy advances. Wheat was transformed from wild grass into a staple crop that now makes up a fifth of the diet around the world. Corn went from being a cultivated crop only in sunny Mexico to the most farmed cereal grain globally. “Miracle rice” matured faster than traditional varieties and produced up to ten times the yield. We shipped the harvests from these crops far and wide. We turned deserts where hardly anything grew into productive farms. We pumped fresh water out of the ground at a rate never seen in history. We made more food, more quickly, than ever before. The result was more than enough protein and calories to feed growing populations with far less labor. Modern industrial agriculture requires only about two hours of human labor to produce 100 bushels of wheat, enough to make forty-two loaves of commercial white bread, compared to about three hundred hours of backbreaking farmwork at the beginning of the nineteenth century. Similar efficiencies were realized for the production of corn and soy. Providing the daily energy requirements of a single person is now accomplished with only a few seconds of human work. Collectively referred to as the Green Revolution, these changes have been credited with averting the dark future Malthus portended. More food from these technologies not only meant more grains and less hunger. It meant more feed available for animal agriculture. It meant less poverty, reduced infant mortality, and raised incomes. It meant fewer people needed to work in agriculture even as they fed billions of additional people now on the planet. This freed us up to do things like produce art, advance science, and start businesses. But the Green Revolution did more than just nourish us. It’s how we got the glut of calories, calories that we had to find other uses for, fundamentally altering what we eat, our bodies, and how we use food and energy. 
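That “few seconds” figure is easy to sanity-check with rough arithmetic. The conversion factors below—a bushel of wheat at roughly 60 pounds, wheat at around 3,300 kilocalories per kilogram, and an average requirement of about 2,500 kilocalories per person per day—are approximate round numbers assumed for illustration, not figures from this chapter.

```python
# Rough sanity check of the labor arithmetic (all inputs are approximate,
# illustrative round numbers).
labor_hours = 2                 # hours of labor for 100 bushels of wheat (from the text)
bushels = 100
kg_per_bushel = 27              # a bushel of wheat weighs roughly 60 lb (~27 kg)
kcal_per_kg_wheat = 3300        # approximate energy content of wheat
kcal_per_person_day = 2500      # illustrative average daily requirement

total_kcal = bushels * kg_per_bushel * kcal_per_kg_wheat
person_days_fed = total_kcal / kcal_per_person_day
seconds_per_person_day = labor_hours * 3600 / person_days_fed
print(f"~{person_days_fed:,.0f} person-days of energy, "
      f"~{seconds_per_person_day:.1f} seconds of labor per person-day")
```

Run with two hours of labor, the arithmetic lands at roughly two seconds of work per person-day of calories; run with three hundred hours, it lands at around five minutes—the same hundredfold-plus gain in labor productivity the text describes.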
In Europe and North America, a lot of the corn and soy was diverted to the production of biofuels, such as ethanol and biodiesel, used to power transportation, heating, and electricity. A larger portion—the majority—of the calories went to making animal products by feeding cows, chickens, and pigs that are then eaten by humans. The U.S. Department of Agriculture helped fuel the demand for animal protein, subsidizing livestock and funding marketing and advertising programs to promote the consumption of meat, eggs, and milk. The per-person availability of poultry and eggs soared by more than 400 percent and 240 percent, respectively, between 1800 and 2000. By 2020, people around the world were consuming 574 million metric tons of animal protein in the form of meat, seafood, dairy, and eggs. That’s nearly 75 kilograms per person, and the number is increasing in developing countries. The rest of the calorie glut was transformed by ingenious food scientists who dreamed up ways to turn commodity crops into the inexpensive ingredients in ultra-processed foods. Experimenting with various processing techniques to combine salt, sugar, and fat, and chemicals for color and flavor, they altered the look, smell, and “mouth feel” of the wheat, rice, soy, and corn, so the inedible raw crops became foods people craved. Advances in food packaging and preservation helped these products last on store shelves for months, even years at a time, increasing the odds that they would be sold. Corn became not only sugary breakfast cereal but chips, cakes, and feed for animals that became processed meats, as well as high fructose corn syrup, a ubiquitous sweetener that costs much less than cane sugar. Wheat and rice became cookies, crackers, baby pablum, puddings, granola bars, and pizzas. Many of these products were emulsified with soy lecithin or thickened with guar gum. Soybeans and other industrial oilseed crops became cheap sources of fat for baking and frying food. When you walk around a supermarket and look at the vast range of products on display, you have the illusion of diversity but you’re mostly looking at shelves filled with commodity crops in various guises. The rows of soda and sweet drinks, the aisles stocked full of chips and candy, the freezers brimming with ready-to-heat meals: All are derived from corn, rice, wheat, soy, mixed up with artificial flavors and colors. As we’ve seen, many UPFs reduced the time spent cooking, while the cost of eating also dropped off, leaving money in the pocket for saving or investing in other things—or buying more food to eat (or throw in the trash). At the beginning of the twentieth century, people in America used to spend about 60 percent of their disposable income on food. They now spend less than 10 percent. Some people think government subsidies of the commodity crops are the reason ultra-processed foods are so cheap compared with healthier foods like fruits and vegetables. The real reason UPFs cost less than fruits and vegetables is that commodity crops—the main inputs to UPFs—are so cheap and efficient to grow. These calories are not spread evenly around the globe. Too many people in sub-Saharan Africa and South Asia still live with persistent hunger. Too many children continue to suffer from stunting and wasting because of malnutrition. Altogether, approximately 10 percent of the global population still doesn’t get enough to eat. So even as millions of people grow fatter, throngs are still starving to death. 
On balance, though, more people now have obesity than endure hunger. More die from the diseases of overnutrition than from too little food. To put it another way, in the exact period when starvation was predicted to become more widespread, even in America, we bore witness to the global reduction of hunger and staggering increases in food waste and body weight. TRUE COSTS To sum up so far, our historical obsession with protein and calories led to a calorie glut so big, it not only made food waste and obesity more common, but it had to be funneled elsewhere and went into growing more animal protein and biofuels. We externalized the consequences of these policies. The calorie glut is cheap—until you factor in the enormous costs of diet-related chronic diseases like obesity and diabetes, as well as the environmental damage it has wrought. We use almost half of the earth’s habitable land to grow food, and nearly 80 percent of that is devoted to producing livestock (both for grazing and producing feed). Along with all of that land, agriculture consumes mind-boggling amounts of water. Worldwide, it uses around 70 percent of our fresh water. This has already drained the groundwater in many regions, including across America. Conventional industrial agriculture not only saps water resources but depletes soil of nutrients and biomass, while the vast quantities of fertilizer, pesticides, and herbicides have poisoned environments, with nitrogen leaching into waterways, promoting algae growth and creating dead zones where no other form of life can survive. Ninety percent of land-related biodiversity loss and water stress stems from the extraction and processing of biomass—that is, agriculture and forestry. The food system as a whole is now responsible for a quarter of greenhouse gas emissions. The Haber-Bosch process alone is one of the world’s largest energy hogs and producers of greenhouse gases, contributing 1 to 2 percent of global carbon emissions. Animal agriculture is a spectacularly inefficient and energy-consuming way of producing calories. As we saw at the start of the book, the energy in our food is ultimately derived from the sun: plants convert solar energy into chemical energy, which then flows—along with air and water—through the metabolism of the plants and animals we eat. The further removed our food is from its solar energy source, the less energetically efficient it is. Most of the calories eaten by animals go toward their metabolism, keeping them alive, while only a small proportion goes toward the growth of meat. This is true even for modern chickens, selectively bred and housed in factory farms for meat production—and considered the most energetically efficient agricultural animals eaten in the United States. Yet every calorie in chicken meat requires more than four times that amount in feed.[*3] Because metabolic rates increase with the size of the animal, larger animals like cattle are even less efficient at turning commodity crops into human food. Beef production emits eight to ten times more greenhouse gas than chicken.[*4] (One encouraging trend is that beef consumption in the United States has been decreasing as people eat more chicken.) The CO2 expired by agricultural animals does not add net carbon to the atmosphere, because that CO2 was originally captured from the air by photosynthesis in the plants that were eaten. But ruminant animals like cows also produce substantial quantities of methane, with even greater global warming potential than CO2. 
This makes growing cows for beef and dairy particularly bad for the climate.[*5] When raising these animals destroys carbon-rich rainforest, they’re even more environmentally hazardous. That’s why Brazil’s cattle farming production became a focal point in the conversation about the destruction that comes with red meat. Beyond the resource exploitation and environmental toll inherent in animal agriculture, the food system is simply cruel to animals. Even those relatively efficient chickens. They tend to live in confined spaces, in their own poop and sanitation chemicals, with little natural daylight. Because consumers prefer the white meat of chicken breasts, the birds are bred to grow such enormous chests, their legs can’t hold them up. “How we treat farm animals today will be seen, I believe, as a defining moral failing of our age,” New York Times opinion journalist Ezra Klein has written. “Humans have always eaten animals. We’ve hunted them, bred them, raised them and consumed them. What’s changed over the past century is that we’ve developed the technology to produce meat in industrialized conditions, and that has opened vast new vistas for both production and suffering.” The system is similarly barbaric to the people who turn animals and crops into cheap calories and protein—including migrants and children. The labor exploitation has its roots in the earliest days of the global food trade. Refined sugar arguably went from a luxury product to mass market commodity on the labor of enslaved people. The exploitation continues to this day. Investigations in 2023 by The New York Times exposed how newly arrived child migrants to the United States have been working in factories in all corners of the food system, deboning chicken, processing milk, bagging Cheerios and Lucky Charms. These aren’t isolated cases; researchers recently estimated that forced or child labor remains pervasive in some parts of the U.S. food system. And it’s not just the United States. Another Times investigation of sugarcane cutters in India revealed rampant child labor and even forced hysterectomies to keep women from missing work because of menstrual problems. The global warming all this food production contributes to is, ironically, imperiling our ability to feed ourselves, spurring on new food shortages and a fresh generation of Malthusian worry. The COVID-19 pandemic and conflicts such as those in the Middle East, Ukraine, and South Sudan, compounded by the effects of extreme weather events, have pushed up the number of people experiencing hunger and malnutrition in recent years. If current trends continue, the risk of hunger and malnutrition could rise by as much as 20 percent by 2050. Feeding the world without destroying the planet has once again become a hot topic. In anticipation of running out of land and water for agriculture, billionaires are buying up farms and the control of critical resources. Goodbye family farms becoming agribusinesses; hello bespoke family farms becoming billionaire refuges. On Mark Zuckerberg’s Instagram account, the Facebook CEO boasted about raising “some of the highest quality beef in the world”—wagyu and Angus cattle. The cattle grow up munching macadamia nuts and drinking beer produced on his private $270 million compound in Kauai, Hawaii. “Of all my projects, this is the most delicious,” he added. The farm also features a bunker and private energy reserves. 
Rich countries are making similar moves, looking beyond their borders to secure food and water provisions because hungry citizens easily become unruly. As Vladimir Lenin is credited with saying, “Every society is only three meals from chaos.” A SUSTAINABLE PATH FORWARD Instead of looking forward with a sense of doom and scarcity, we are optimistic. We’re starting from a position of immense abundance: the calorie glut. What humanity managed to accomplish over the last century was nothing short of miraculous. We went from fears of mass starvation to rampant obesity with the result that we already make way more than enough food to feed everybody. The global food system currently produces the calories and protein for approximately eight billion people with a substantial amount of waste. The United States alone grows enough energy in corn, wheat, soy, and rice to feed more than two billion people and has been getting rid of the excess, as we’ve seen, by funneling it through animal agriculture, biofuel production, and ultra-processed foods. The problem is that we do not do the growing sustainably, healthfully, or equitably. Adapting Michael Pollan’s eternal eating advice to food systems, we need to: Grow enough healthy food we can all eat. Mostly sustainably produced plants or other organisms. Not so much that we have to trash it or turn it into biofuel and animal feed. For the growing part, we need to transition away from the immense amount of fossil fuel our food system relies on, to cleaner sources of energy for powering farm equipment, producing fertilizer, and processing and transporting food. As we’ve seen, agriculture already occupies about half of the world’s habitable land, so we need to get smarter about what we’re growing and where we’re growing it. We also need to better manage water resources for irrigation, and we need more targeted and selective uses of pesticides, herbicides, and synthetic fertilizers. Where possible, we should adopt more sustainable forms of agriculture that regenerate the soil and reduce the environmental and climate impact per unit of food produced. Transitioning toward sustainable farming practices will have to be carefully managed. Crop yields produced with organic agriculture are less stable and typically decrease, especially in the early years of a transition. This played out recently in Sri Lanka, when the government abruptly banned synthetic fertilizers and pesticides. An economic crisis and food shortages followed. To mitigate reduced yields, we could divert less corn and soy to biofuel production. We shouldn’t use more land to make up for the reduced yields; it would be counterproductive for our environmental goals. The land, water, fertilizer, and fossil fuels used to produce plants that we feed to animals could instead be used to make crops directly eaten by humans. To that end, we should transition some of the land currently growing commodity crops to growing more of the so-called specialty crops (such as fruit, vegetables, and legumes) that people need to eat to improve their health. Right now, we don’t make nearly enough to provide everyone with a healthy diet. We tell people to eat more vegetables, but we don’t have enough to feed them and we don’t subsidize their production at the scale required to make them affordable—one of the many disconnects between our food environment and what we know about optimal nutrition. Once those foods are grown, we shouldn’t trash them. 
Most food waste in America occurs in the late stages of the food supply chain: supermarkets, kitchens, and uneaten leftovers. Sometimes food isn’t even harvested from the farm because harvesting it isn’t cost-effective. This is especially true for produce that doesn’t meet the exacting standards of buyers. We should relax stringent size and cosmetic requirements for produce that disincentivize harvests, find new uses for food that’s bound for waste, and invest in technology that safely and efficiently preserves produce. In less developed countries, addressing food waste requires different tactics. Most food is lost upstream during the production, storage, and distribution-to-market stages. Here, investments should be in the same kinds of technologies that have already been demonstrated to preserve and protect food throughout the supply chain, such as refrigeration and packaging. In these regions, especially in sub-Saharan Africa, another focus should be closing the so-called yield gap—to produce more food without using more land. This probably requires the adoption of industrial agriculture practices along with the development of new high-yield and resilient crops that are suited for local growing environments, most likely using genetic engineering technologies. (To tackle consumer reluctance to eat foods made with genetically modified organisms [GMOs], we should invest in independent and transparent food safety research, monitoring, and evaluations.) As we discussed in Chapter 8 on food policy, the food industry could also be incentivized to design convenient and tasty products that incorporate more of the healthy specialty crops. Think vegetable-smothered pizza, legume-and-carrot-stuffed “meatballs,” or ready-made banana-and-spinach breakfast pancakes. Combined with asking manufacturers to hit certain benchmarks to reduce problematic ingredients, the new era of food science could involve engineering products for both health and sustainability. Some modern food engineering methods may look nothing like traditional food production, especially when it comes to protein. As developing countries have grown economically and more of their citizens have been pulled out of poverty, they have demanded more animal protein. This is part of the nutrition transition we discussed in Chapter 7. But as we mentioned already, feeding livestock uses about 80 percent of the agricultural land on earth, and we can’t continue to cut down more carbon-rich forests to meet the world’s increasing demand for meat. Rather, we need to return some agricultural land to natural ecosystems and invest in the development of more efficient and environmentally friendly ways of producing protein. Many technologies are already being developed to make traditional animal protein less harmful, like suppressing methane production in cows. These important advances won’t be enough to mitigate the environmental harms as developing nations demand the same meat and dairy that developed countries already do. Even if we find ways to do animal agriculture more humanely and sustainably, we will still have to replace much of it with something that’s nearly as desirable but much less environmentally hazardous. That’s where alternative proteins come in. FOOD 2.0 Julia visited one of the many companies trying to make alternative proteins a reality. In a glass and stainless-steel lab, in the forested outskirts of Munich, Germany, where Justus von Liebig spent the final years of his life, clear plastic bioreactors gurgle what looks like a boiling soup of tiny pasta balls. 
A young chemist in a lab coat tends to the spongy, white orbs. After they’re scooped out of the water, they’ll be compressed, packaged, seasoned, then fried or baked, and eaten as an alternative to fish, explains Guido Albanese, the founder and chief chemist at this vegan seafood startup, Koralo, and former head of “meat snacks” development at Unilever. When Albanese hit middle age a few years ago, he started thinking about how to use his skills to give back. The meat processing he used to work on increasingly appeared to be at odds with the health of humanity and the planet. That’s why he went from trying to use every part of the animal for meat products to becoming part of a new generation of food scientists trying to eliminate the animal altogether. The finished product—vegan fish—looks like a white fish fillet and shares a similar texture, but the taste…“I taste—and sorry, this is a terrible…something bleachy,” Julia admits haltingly, at a tasting. “I’m with you,” Albanese says, puzzled, his Brillo of salt-and-pepper hair moving as he chews. Perhaps, he speculates, the odd flavor came from pre-frying and then reheating the samples in a microwave. Arriving at a flavor profile that truly rivals meat or fish—one that can draw people away from eating animals—remains the major challenge for all alternative protein companies. Making inexpensive tasty meat alternatives turns out to be technically challenging—as well as labor and resource intensive. If the companies eventually master the taste and texture of animal protein, the benefits for real animals and the planet would be immense. The techno-optimist vision is that protein alternatives can replace some, perhaps all, of animal agriculture by carefully cultivating cells in bioreactors, to grow food we eat instead of fish—or beef or pork or chicken. Back at the bioreactors—five-liter bottles wired with tubes pushing in pressurized air—Albanese explains that he ferments mycelia, the edible roots of mushrooms, to make his seafood alternative. The mycelia are mixed into a solution of water, sugar, and microalgae. The algae and sugar become the mycelia’s lunch. After feeding, they convert part of their meal into the energy that powers their metabolism. The other part becomes part of them—protein that’s integrated into the mycelia’s cells, causing the dense balls to swell from the size of a pin to the size of a fingertip. What’s going on inside the lab, at the cutting edge of food science, is the same ancient biochemical protein building process that regenerates our own bodies. Other companies are skipping the need for agricultural inputs like sugar. Instead, they use a special kind of bacteria called knallgas that can make protein from CO2 and ammonia as inputs; their bacterial metabolism is fueled by oxidizing hydrogen gas produced through the electrolysis of water. Amazingly, modern solar panels now produce electricity so efficiently that powering electrolysis in a bioreactor system can theoretically produce protein with less land and water than growing soybeans. In other words, artificial photosynthesis is now in the immediate future of food. The Finnish company Solar Foods has used the system to create a protein powder product called Solein that they claim provides a complete amino acid profile. Solein was recently granted self-affirmed GRAS status at the FDA, paving the way for the commercial use of the ingredient. 
The product is also going down a more rigorous route for approval in the European Union, which means we may have more information about its specific safety and bioavailability soon. This brings us back to UPFs. Yes, that’s right. Vilified, unhealthy, evil UPFs. By definition, alternative protein products will be UPFs. Like the commodity crops that went into UPFs 1.0, proteins like Solein will be used in food processing to make fake meat, fish, and other products for Food 2.0. Done well, this new generation could be healthier and more environmentally friendly than the animal protein we’ve been eating for centuries. But getting there requires countries like the United States to invest in the technology and research that will help figure out how to make healthy, tasty, safe, and affordable meat replacement products at scale, the way they invested in the Green Revolution. If the project is successful, these products could be readily adopted by developing nations, too. Just like smartphones leapfrogged over landlines, the developing world could jump directly to Food 2.0 without the environmentally disastrous expansion of animal agriculture. We don’t really have a choice in the matter. Business as usual can’t continue. And the idea that a return to small farms producing animals and specialty crops can feed the world is another illusion: It is simply impossible to fill the stomachs of the eight billion people currently on the planet that way. The global production of the big four commodity crops— corn, soy, wheat, and rice—now amounts to about 3,500 daily calories for every person on the planet. That’s plenty, but remember that humans don’t eat those crops directly. They’re processed, either through animal agriculture or factories that make ultra-processed food. So while current industrial agricultural practices with their high yields provide enough calories for the global population, lower yields from idealized farms wouldn’t meet our needs now, much less the needs of more than ten billion people expected to arrive by the end of this century. Fortunately, that’s about when the world’s population is projected to plateau. With enough investment and planning, Food 2.0 can help humanity once again avoid the Malthusian trap, this time permanently. To get there, many of us are going to have to move past the generalized panic over UPFs and lab-grown meats. (Some U.S. states are already banning the latter.) Instead of knee-jerk alarmism and fear, let’s invest in the science to help us avoid the unintended consequences of another grand nutrition experiment. Instead of acting now and dealing with the consequences later, let’s subject Food 2.0 to research and safety monitoring, making sure new products are healthy, both microbiologically and from a chronic disease perspective—and let’s do this before they enter the market. Let’s avoid simple heuristics, like “natural is better” or “processed is bad.” These miss the big picture and may end up creating a food environment that addresses some problems while once again introducing a host of others. This is where reductionism and holism can work together. Reductionism to get focused, and holism to understand what we’re focused on in the context of the overall food system, our health, and our environment. History has shown us, again and again and again, just how easily food fools us. Food is far more than the sum of its known parts—far more than the macronutrients, micronutrients, even the dark matter. Moving into Food 2.0, let’s not repeat history. 
Let’s account for the real costs of what we make and eat. Let’s understand its impact on our bodies and the planet. Let’s not get distracted by the gurus and influencers with half-baked theories and untested remedies. Let’s remember that our food environment pulls the strings way more than we might realize, and it’s only in changing our environment that we protect everyone, including our children. Let’s eat each meal with an appreciation for the wonder on our dinner plates, what it took to get there, and all it does to build, fuel, sustain, and nourish us. It is, after all, the reason we’re here.

Skip Notes

*1 We don’t directly eat—or trash—these calories; commodity crops are grown for trading on commodities markets for subsequent processing.

*2 Haber is also known as the “father of chemical warfare.” He helped develop Germany’s poison gas program during World War I.

*3 If only chicken feed were as tasty as the meat it becomes…

*4 Beans look relatively fantastic: Their production emits about 40 times less greenhouse gas than beef. This is why new health guidelines, such as the EAT-Lancet Commission on Food, Planet, Health, encourage people to consider both the environmental and health costs of their diets—to eat more vegetables, grains, and legumes and far less red meat.

*5 Regenerative agricultural practices for grazing cattle have been argued to partially mitigate their carbon footprint, but the extent to which this can be achieved and whether such practices can affordably scale is a subject of vigorous debate.

CHAPTER 13
Now You Know

Much of what we do in life—logging online, getting on the subway, voting, maybe even reproducing—is carried out without knowing the precise details of the systems we’re participating in. Eating seems to us like the highest-contact activity we carry out daily in blissful ignorance, with the highest stakes (no food, we die; on a poor diet, we’re at an increased risk for chronic illness and an early death).

Too many people have little appreciation of what’s really in food or how it nourishes the body, building and fueling us. They don’t know that humans are amazing flex-fuel machines, made from food, with the ability to store enough energy to survive for weeks or months without eating. They don’t realize body fat is a tissue to be revered, and metabolism, one of the most awe-inspiring processes humans have ever discovered. They’re not privy to how the brain conducts the orchestra of eating to meet our nutrient needs below our conscious awareness, or that eating in humans is a biologically controlled phenomenon, as it is in other animals. They might not know the incredible extent to which their eating behavior is determined by the food environment, or how much greater the effects of food environments are relative to the stuff we tend to focus on when it comes to health and body size.

Many don’t know where their food comes from, apart from where they purchased it. They don’t know that we produce so many calories, we had to invent ways to offload them into things like biofuel production. They don’t know that the feed that went into the animals we eat, and the crops that went into ultra-processed foods, came from redirecting a calorie glut. They don’t know that our food system was built on fears of starvation and an obsession with maximizing calories and protein. They don’t know there’s an astonishing disconnect between the foods we need to be eating for health, and the foods we produce.

But now you know.
Now you know that nutrition isn’t rocket science; it’s much more difficult and it affects our everyday lives. Now you know that we’ve learned a lot about what we eat, and yet there’s still so much more to know about how food affects our bodies, our health, and the planet. Now you know that the simple sound bite about diet and nutrition is often wrong. It’s what confuses people. You know that we’ve been trapped in a cycle of trading one simple sound bite for another, which has not only distracted us from our real food issues but has also led to waste, harm, bad policy, and poor public health messaging.

Now you know that science, despite illuminating the importance of food environments in shaping our health, has disproportionately focused on individuals: biomedicine, precision nutrition, and genetics. We think this goes a long way to explaining why the individual choice and personal responsibility fallacy has flourished in discussions about nutrition for so long. The less we know about the toxic food environment, the more we can pretend the problem lies elsewhere.

In 2016, the U.S. federal government defunded clinical research centers nationwide where scientists studied human nutrition and metabolic disease in highly controlled clinical trials. As the previous FDA commissioner remarked, funding for nutrition research in America is “pathetic.” We can, and we must, do better to fund this science. Kevin believes that important research questions about the effects of ultra-processed foods in humans now take years to answer when they could be answered in months with the right support.

The pervasive belief that chronic diseases like obesity are about personal responsibility and individual failings also delayed progress on policy. It allowed those in power to evade blame for presiding over food systems built on incentives not aligned with human, animal, or environmental health. It enabled the rise of the trillion-dollar wellness industry, featuring all the fake fixes we’ve covered in this book—protein supplements, metabolism boosters, fad diets, and glucose hacks. It also allowed us to blame people with obesity, and to not take their problems seriously—to think about the disease as a mere cosmetic problem. It put the focus on treating cancer and heart disease, instead of preventing the dysregulated metabolism that drives so many cases. Worst of all, it allowed people to think the problem was them, even as they tried their hardest to fight back against the environment that was driving their poor health.

People like Tracey Yukich, the Biggest Loser contestant from Chapter 1, who was hospitalized for overexertion during the show. Her body is a microcosm of twenty-first-century diet culture in a broken food system. Over the years, she’s cycled through meal replacement shakes, supplements, step classes, keto, vegetarianism, and veganism, as well as other diets and weight loss gimmicks. “What haven’t I tried?” she summed up. She ate so many raw vegetables that she suspects they led to colon problems, and she exercised so much that she now suffers chronic back and joint pain. These days, she’s taking GLP-1 medications, but she’s hit roadblocks with access and affordability. She sometimes gets to the point where she thinks she doesn’t need the drugs anymore, and she skips a dose or two. “I kind of test myself a little bit,” she told us, “and I fail miserably.” By that she means the weight starts to come back.

Tracey Yukich did not fail. The food environment failed her.
Until we fix it and address the calorie glut that makes toxic food environments not only possible but inevitable, a lot of people will needlessly suffer pain, stigma, and disease, while governments will have to spend billions in taxpayer dollars covering treatments.

WHAT INDIVIDUALS CAN DO

Our focus in this book on the environmental causes of disease doesn’t mean we think individuals are off the hook or totally helpless. On the contrary. As we discussed in Chapter 9, there are better and worse ways to eat. The evidence on optimal nutrition has been clear and consistent over decades. It’s boring by this point. Eat more vegetables—along with fiber, legumes, whole grains, and fruits. Limit sodium, sugar, saturated fat, and junk foods.

In the process of researching this book, we each personally changed how we eat. We reengineered our home food environments and made our meals more environmentally friendly. Kevin cut back on meat. He also cooks more at home and buys less junk food. And when he does buy junk, he stashes it in the basement so it’s not easily obtainable. Julia does the same with junk food and meat—limiting them in the home and keeping junk out of sight, outside the kitchen. She also tries wherever possible to add more plant diversity to her meals—for both the nutrients we know we need and those we don’t yet know we need. We both encourage our kids to listen to the signals in their bodies as they eat, and to understand the connections between what they put in, how they feel, and what comes out.

We each do everything we can to avoid food waste: looking in the fridge and cupboards before grocery shopping, meal planning, eating leftovers. Having a repetitive diet makes this easier. Kevin’s kids eat virtually the same lunch every school day; Julia often makes the same types of food on the same weekdays (Sundays minestrone, Mondays lentils, Wednesdays fish, and so on). You can start by making similar changes at home, and then helping to make them in your workplaces, schools, communities, cities, and countries.

But we both know that too few people have the resources or wherewithal to do these things for themselves and their families, even if they want to. To check whether this feeling was correct, we commissioned the polling company Morning Consult to survey a representative sample of Americans at the end of 2024. Most respondents—64 percent—said their diets weren’t healthy enough. When asked what was stopping them from eating better, by far the most common answer was that it costs too much. This was followed closely by people reporting that it’s difficult to find healthy food options nearby, and that they don’t have the time and know-how to cook.

Protecting the most vulnerable means instituting the system-wide policies and regulations we discussed in Chapter 8. The goal should be inverting the current food environment, so healthy foods are the easy and affordable default, and unhealthy foods become rare treats. It’s great to have knowledge about the food system and nutrition, but if we still live in environments that are hostile to healthy choices, too many of us will struggle and get sick. Interestingly, even though many of our survey respondents (40 percent) said they feel personally responsible for the quality and health of the food on their dinner plates, 49 percent told us they agreed the federal government should implement a small tax on ultra-processed foods if the revenue collected would go toward subsidizing tasty, healthy, and convenient meals in the future.
Doing this will require enormous political will because the food industry will fight such efforts every step of the way.

THE MOMENTUM

The political will to address the toxic food environment seems to finally be gaining momentum around the world. In the United States, the appetite for tackling the harms of ultra-processed food appeared to be at an all-time high as we were finishing this book. During the 2024 presidential election and its immediate aftermath, the Make America Healthy Again movement—focused on driving down chronic disease rates, especially in kids—had seemingly found traction among both Republicans and Democrats. Whereas past efforts to regulate the food industry did not garner bipartisan support (for example, Michelle Obama’s Let’s Move! campaign as First Lady was decried as “nanny state” overreach), the focus on ultra-processed foods galvanized interest on both sides of the political aisle.

MAHA’s leader, Robert F. Kennedy Jr., has elevated the discussion of toxic food environments like no other national leader we can think of. Among his first initiatives while in office: He directed the FDA to look into eliminating the self-affirmed GRAS loophole and has promised to reorient NIH research to chronic disease prevention as opposed to its past emphasis on finding treatments. In the couple of years before Kennedy’s arrival, U.S. Senate hearings directly targeted the ultra-processed food industry, the primary culprit in America’s epidemic of diet-related chronic disease and decreasing lifespans. The previous FDA commissioner expressed his opinion that food manufacturers engineer ultra-processed foods to be addictive, and senators from both political parties expressed outrage about the situation.

U.S. states have also been rushing in to take action ahead of the federal government. After banning several food additives thought to pose a public health risk, the governor of California recently issued an executive order directing the state’s agencies to provide recommendations on how to mitigate the harms of ultra-processed foods. Many states are introducing bills to remove sugar-sweetened beverages and other ultra-processed foods from supplemental nutrition assistance programs.

Worldwide, over 130 jurisdictions in nearly 120 countries and territories have put in place taxes on sugary drinks. In the UK, the 2018 soda tax was followed by restrictions on food marketing to kids in 2025, and the House of Lords issued a report titled “Recipe for Health: A Plan to Fix Our Broken Food System” describing the vast scope of the problem with the food environment and policy proposals to fix it. Latin America, as we’ve seen, is leading the rest of the world with the most aggressive suite of healthy food laws, and other low- and middle-income countries are now considering ways to follow suit.

Meanwhile, individuals are taking it upon themselves to fight the toxic food environment. Recently, a U.S. teenager with type 2 diabetes sued eleven ultra-processed food manufacturers, accusing them of knowingly producing and marketing harmful and “addictive” products, and failing to warn consumers about their health effects. The lawsuit also alleges that several Big Food companies engaged in a conspiracy to manipulate the marketplace. Will this UPF lawsuit, or others like it, reveal smoking-gun documents from Big Food exposing what the companies knew about the harms of their products, the same way Big Tobacco lawsuits did years ago?
Will executives from major food companies be called to Congress to testify as Big Tobacco executives were in 1994? Back then, the executives claimed that nicotine was not addictive, despite decades of scientific evidence to the contrary. Internal industry documents obtained as part of the legal discovery process revealed that the companies long knew about the negative health consequences and addictive nature of smoking. Rather than own up to the problems with their products, Big Tobacco responded by calling on scientists and public relations experts to cast doubt on the emerging scientific consensus, as Naomi Oreskes and Erik M. Conway documented in the seminal book Merchants of Doubt. The food industry is deploying similar tactics right now, in its fight against increased regulation.

If the momentum builds and public health triumphs, maybe we’re almost finished with dessert for breakfast for our children, with super-sized, salt- and fat-filled lunches and dinners, with sugar-laden everything else. Maybe the era of sending billions of animals to slaughter for our protein while destroying natural habitats to grow them will soon be behind us. Maybe the distraction from the root causes of our nutrition crises is over. Maybe it’s the end of a calorie glut so unevenly spread around the planet that people starve while others develop obesity and diabetes.

We’re on the cusp of Food 2.0—and healthier, more equitable, environmentally sustainable eating. The technologists, agriculturists, and food scientists whose genius gave us the ultra-processed calorie glut and the Green Revolution can deliver the cleaner food future we all require and deserve. We’ll look back at the food system of today the way we do the Poison Squad era, when borax and formaldehyde were common food additives: That was crazy.

We need armies of Harvey Wileys, supported by large swaths of the public, to do the research as well as fight for a sustainable and healthy food system. At the same time, they have to stay true to the science. Wiley may be best remembered as the father of food safety, for establishing the dangers of common food additives in his day. But he also found that some widely used preservatives thought to be especially dangerous turned out to have undeserved bad reputations. He was honest that his data did not support extraordinary fears about these substances, but he got caught up in his fervor and still argued that their use in food was “reprehensible in every respect, and leads to injury to the consumer, which, though in many cases not easily measured, must finally be productive of great harm” [italics added].[*] That this conclusion was not well supported by Wiley’s data is a problem and presents a lesson we urgently need to learn, especially with the MAHA movement now in power in Washington, D.C. To improve public health, activist zeal for change must rest on solid scientific foundations, based on data from well-conducted studies, and not merely compelling narratives. When it comes to exactly which policies we put in place and how they are targeted, we need to follow the science with as little bias as humanly possible.

Kevin has been trying to do just that. Unfortunately, he hasn’t always found support. In addition to the anemic funding environment for human nutrition and metabolism research, Kevin’s attempts to engage directly with the public on the science of nutrition and metabolism have been actively thwarted. You might have noticed that we haven’t mentioned where Kevin works.
That’s because he almost got fired for writing this book. We can’t go into details, but one of the people we thank in the acknowledgments is Kevin’s lawyer, Mark Zaid, best known for representing high-profile authors from within the U.S. national security community. Mark helped Kevin negotiate with his employer so that he could finish this book.

We hope any interest this book inspires in nutrition science and policy will spur more public support for research that tests our understanding of food and its effects on our health. After all, rigorous science is what’s needed if we want solid evidence, rather than magical thinking, to lead the way in Food 2.0.

Skip Notes

* Wiley’s approach was precautionary: ban a substance when there is any evidence of potential harm.

ACKNOWLEDGMENTS

We completely misjudged what it would take to write this book. Going in, Julia had no idea she’d need crash courses in early organic chemistry and Ancient Greek medicine. Kevin didn’t realize how much of the history of nutrition and metabolism science he didn’t know. But anything worth doing takes time and many hands, and we are bursting with gratitude that we had both—and each other.

To begin with, this book was only possible because of the support of our families. Thank you for putting up with us over the too-often antisocial years it took to finish Food Intelligence. How can we even begin to thank our spouses, Axel and Kristen, for encouraging us, accepting a third thing in the marriage, and reading more drafts and offering more feedback and guidance than they should have had to? For spending evenings or weekends alone while we worked, or doing the morning routine when we were tired from writing into the night? We owe you so much, but thanks for not making us feel too bad about it. Our dear children, Theodor, Oriana, Brady, and Cameron: We were sometimes at the keyboard or on the phone when we would rather have been playing with you, but we sincerely hope this book will help you and your friends live healthier and happier lives than we have. We’d also like to thank our parents for their unwavering support, humor, and love, even when it took us far away to pursue our dreams.

A big thank-you to our brilliant agent, Will Francis of Janklow & Nesbit, who understood what we were writing before we did and patiently coaxed our words along through years and drafts. To our editors, Hannah Steigmeyer at Avery and Alex Clarke and Joe Thomas at Wildfire Books, thank you for such thoughtful and incisive questions and commentary at critical moments (and thank you to Caroline Sutton, formerly of Avery, for commissioning this book).

This was a collaborative effort not only between us and our editors and agent, but also among dozens of researchers, scientists, and journalists who so kindly shared their time reading drafts or parts of drafts, and offering insights and feedback. The researchers, clinicians, and scientists (many of whom we also interviewed) include: Charles Brenner, Susan Campisi, Daniel Drucker, David Flood, Paul Franks, Ashley Gearhardt, Tony Goldstone, Nicola Guess, Nick Knuth, Lex Kravitz, Ruth Loos, Faidon Magkos, Jerold Mande, Carlos Monteiro, Pat Munday, Marion Nestle, Stephen O’Rahilly, Stuart Phillips, Barry Popkin, David Raubenheimer, Eric Ravussin, Sonia Rehal, Tony Sclafani, Stephen Simpson, Thorkild Sørensen, John Speakman, Boyd Swinburn, Deirdre Tobias, Chris van Tulleken, and Ethan Weiss.
Thank you to food thinker Henry Dimbleby, who might as well be a scientist, for his sharp feedback that shaped this book’s message, and Steven Hoffman at York University, for providing Julia an institutional home and access to critical databases while writing. Thanks also to Hadassah Cypess for offering her Gen Z perspective and helpful comments. Julia would also like to thank Gordon Guyatt, Brian Haynes, John Lavis, and their colleagues at McMaster University for so patiently sharing what they know about research methods and health evidence over many years.

Thank you to the community of author and journalist friends for enduring conversations about the book’s content and writing process, among them: Christie Aschwanden, Deborah Blum, Helen Branswell, Timothy Caulfield, David Epstein, Yoni Freedhoff, Cynthia Graber, Daniel Gross, Tamar Haspel, Roxanne Khamsi, Jenny Leonard, Victor Montori, Alexander Panetta, Laura-Julie Perreault, Alexandra Sifferlin, Nicola Twilley, Ed Yong, and Tom Zeller Jr. A heartfelt thank-you to Eliza Barclay, Katie Engelhart, Mark Schatzker, and Richard Warnica, who also jumped in and offered feedback and edits on various parts of Food Intelligence. Julia would like to thank the friends and former colleagues whom she met at Vox—for inspiring simplicity and clarity above all, and whose encouragement of her reporting led to the genesis of this book—including Eleanor Barkhorn, Ezra Klein, Sarah Kliff, Irene Noguchi, Brad Plumer, Sean Rameswaram, Brian Resnick, and Matthew Yglesias.

Kevin thanks all of the talented folks who have been a part of his laboratory over the years. Without their efforts, his research would not have been possible. Kevin especially thanks Juen Guo, who has been the rock of his research team for many years. Thanks to Marvin Gershengorn and James Balow for their support and encouragement in starting a clinical research program. Thanks again to Nick Knuth, who took a huge risk joining Kevin’s lab as a postdoctoral fellow and helped launch Kevin’s first human studies. Valerie Darcey taught Kevin about neuroscience while she was his postdoc, and he is very proud of her as she embarks on a stellar independent career. There are too many collaborators to thank individually, but you have all inspired and taught Kevin an incalculable amount. Thanks to all of the volunteer study participants who were so gracious with their time and literally bled repeatedly for Kevin’s studies. Thanks also to all of the support staff, the dietitians, chefs, nurses, nurse practitioners, physicians, administrators, and funders who have made Kevin’s studies possible.

We want to profusely thank our many additional interview subjects who so generously shared their time, thoughts, and stories with us during our research, not all of whom we were able to name in the text of the book.
They include Guido Albanese, Sean Algaier, Amber Alhadeff, Gabe Almedia, Federica Amati, Jacob Anderson, Patrick Anthony, Grant Antoine, Phil Baker, Albert-Lázló Barabási, Louise Baur, Marco Beretta, Kent Berridge, Nick Betley, Deborah Blum, João Victor Basolli Borsatto, Sebastien Bouret, George Bray, Nicola Bridges, Dan Brierley, Kelly Brownell, Danny Cahill, Michelle Cardel, Slavea Chankova, Regan Chastain, Jodi and Jack Churney, Alejandro Cifuentes, Rupert Cole, Gerald Combs, Barbara Corkey, Ivan de Araujo, Bill Dietz, Christopher Duggan, Anthony Fardet, Tera Fazzino, Giorgio Fischer, Mary Fissell, Lasse Folkersen, Yoni Freedhoff, Thomas Galligan, Christopher Gardner, Mike Gibney, Jan Golinski, Mark Grant, Michael Grunwald, John Hayes, Eugenie Hsu, Phillippe Hujoel, Elena Ibañez, Phil James, Hiba Jebeile, Aditi Juneja, Julia and Lewis Kay, Sam Klein, Zack Knight, Hannah Landecker, Rachel Jackson Leach, Emily Broad Leib, Clare Llewellyn, Peter Lurie, Robert Lustig, Maricel Maffini, Bill Marler, Shana McCormack, Giulia Menichetti, Jean-Claude Moubarac, Christoph Müller, Elizabeth Neswald, Prince Obere, Kate Ohno, Joyce and Grace Ovenden, Susan Ozanne, Frank Phillips, Claus Priesner, Johann Peter Prinz, Johanna Ralston, Leanne Redman, Mark Rossi, Catherine St. James, David Savage, Philipp Scherer, Laura Schmidt, Wolfram Schultz, Gyorgy Scrinis, Marci Serota, Ken Shaw, Kathryn and Kerry Simpson, Robin Simsa, Dana Small, Andrea Smith, Scott Sternson, Andra Stratton, Gary Taubes, Andrew Taylor, Gregor Tegl, Diana Thiara, Eric Topol, Dorna Varshavi, Debby Waldman, Desiree Wanders, Alan Watts, Jonathan Wells, Eduard Winter, David Wishart, Dominic Withers, Amy Wood, Tracey Yukich, and David Zeevi. If we’ve left anyone out here, it wasn’t intentional. It’s because this book took so long to write. We are immensely grateful to Cheryl Alkon and Beatrice Hogan, the valiant fact-checkers on this project, who worked with such care, and at warp speed, to get the book into shape. A hearty thank-you to the organizations that supported Julia’s work on this project, including the Alfred P. Sloan Foundation and the Richard Lounsbery Foundation. Without Kevin’s lawyer, Mark Zaid, Food Intelligence would likely have never seen the light of day. Thank you to Eric Rayman, publishing lawyer extraordinaire, for his kindness, guidance, and Paris tips. Last but not least, we would be remiss if we didn’t mention the numerous cafés and libraries where we regularly worked on this book in Bethesda, Vienna, and Paris, including Joe & the Juice, Java Nation, Himmelblau, Bibliothèque Nationale de France at Richelieu, and Noir café. Without their caffeine and quiet, we would’ve taken even longer to finish. |