THCB Gang Episode 135, Thursday September 28

Joining Matthew Holt (@boltyboy) on #THCBGang on Thursday September 28 at 1pm PT/4pm ET are futurist Jeff Goldsmith; author and ponderer of odd juxtapositions Kim Bellard (@kimbbellard); and patient safety expert and all-around wit Michael Millenson (@mlmillenson).

You can see the video below & if you’d rather listen than watch, the audio is preserved as a weekly podcast available on our iTunes & Spotify channels.

from The Health Care Blog https://ift.tt/u4YbqT8

War — and Health Care — on the Cheap

By KIM BELLARD

Like many of you, I’m watching the war in Ukraine with great interest and much support. For all the fuss about expensive weapons — like F-16 fighters, Abrams tanks, Stryker and Bradley armored fighting vehicles, Patriot missile defense systems, Javelin anti-tank missiles, HIMARS long-range missiles, and various types of high-tech drones — what I’m most fascinated with is how Ukraine is using inexpensive, practically homemade drones as a key weapon.

It’s a new way of waging war. And when I say “waging war,” I can’t help but also think “providing health care.” It’s not so much that I think drones are going to revamp health care, but if very expensive weapons may, in fact, not be the future of warfare, maybe very expensive treatments aren’t necessarily the future of healthcare either.

Just within the last two weeks, for example, The New York Times headlined Budget Drones Prove Their Value in a Billion-Dollar War, AP said Using duct tape and bombs, Ukraine’s drone pilots wage war with low-cost, improvised weapons, ABC News reported Inside Ukraine’s efforts to bring an ‘army of drones’ to war against Russia, and Defense News described how Cardboard drone vendor retools software based on Ukraine war hacks.

This is not the U.S. military-industrial complex’s “shock-and-awe” kind of warfare; this is the guy-in-his-garage-building-his-own-weapons kind of warfare.

Ukraine’s minister for digital transformation, Mykhailo Fedorov, says the government is committed to building a state-of-the-art “army of drones.” He promises: “A new stage of the war will soon begin.”

NYT detailed:

Drones made of plastic foam or plastic are harder to find on radar, reconnaissance teams said. Ukraine buys them from commercial suppliers who also sell to aerial photographers or hobbyists around the world, along with parts such as radios, cameras, antennas and motors. The drone units mix and match parts until they find combinations that can fly past sophisticated Russian air defenses.

“The doctrine of war is changing,” one Ukrainian commander said. “Drones that cost hundreds of dollars are destroying machines costing millions of dollars.” The AP discusses how an elite drone unit – “a ragtag group of engineers, corporate managers and filmmakers” — “assembled with just $700,000, has destroyed $80 million worth of enemy equipment.”

Dmytro Kovalchuk, CEO of drone manufacturer Warbird, told ABC News: “In Ukraine, not a single state enterprise is producing drones. It’s all private enterprises, sometimes partnerships…It [the drone] costs $1,000 and can destroy a tank that costs $500,000.”

And it is not just attacking tanks, or just from the air; just last month, Ukraine used a sea drone to damage an expensive Russian warship.

One of the many reasons the war in Ukraine is important is because China is watching closely to see what might happen if it were to invade Taiwan, and I’m hoping Taiwan and its allies, including the U.S., are paying close attention to the importance of drones. NYT is skeptical, charging: “A new generation of cheaper and more flexible vessels could be vital in any conflict with China, but the Navy remains lashed to big shipbuilding programs driven by tradition, political influence and jobs.”

“The U.S. Navy is arrogant,” said retired admiral Lorin Selby, who used to head the Office of Naval Research. “We have an arrogance about, we’ve got these aircraft carriers, we’ve got these amazing submarines. We don’t know anything else. And that is just wrong.” Another former officer agreed: “Right now, they are still building a largely 20th-century Navy.”

“We are trying to improve Navy power, but we need to do more than that: We need to reimagine Navy power,” he also said. “We’re kind of at a pivotal point in history. It is vital that we throw off old conventions.”

It’s not that the Navy is unaware of the potential of drones; as NYT acknowledged, it has been testing integrating “drone boats, unmanned submersible vessels and aerial vehicles capable of monitoring and intercepting threats over hundreds of miles.” It’s more that it isn’t a priority; the budget devoted to it, one officer lamented, is “the dust particle on the pocket lint of the budget.”

The Wall Street Journal was more optimistic, reporting on details of a recent speech from Kathleen Hicks, the deputy secretary of defense. She vowed that DoD “plans to spend hundreds of millions of dollars to produce an array of thousands of air-, land- and sea-based artificial-intelligence systems that are intended to be ‘small, smart, cheap.’”

Of course, when fighter planes now can cost $135 million each, aircraft carriers cost $13 billion apiece, and the overall DoD budget is closing in on $1 trillion annually, spending “hundreds of millions” on alternative weapons does kind of sound like pocket lint. The Pentagon admits that China is “displaying growing numbers of autonomous and teaming systems,” including “a substantial amount of development displaying efforts to produce swarming capability for operational applications.” China, at least, is taking this seriously.

“The hundreds of millions of dollars range, while a great start, would only provide hundreds of the truly capable ocean drones we need to establish true deterrence to China and other adversaries,” Kevin Decker, chief executive of Ocean Aero, told WSJ. “They’ve got to start somewhere, and they’ve got to start now.”

“Quite frankly, industry is well ahead of us,” admitted Marine Lt. Gen. Karsten Heckl, deputy commandant for combat development and integration. “So we’re trying to catch up but [there is] a lot of promise.”

As the Ukrainian commander said, the doctrine of war is changing. Weapons systems started in the 1990s (the F-35 fighter) or early 2000s (the Gerald R. Ford aircraft carrier) are just going into service and are already outdated. Admiral Selby has it right: “It is vital that we throw off old conventions.”

———–

So it is with healthcare. Capital sinks like hospitals are healthcare’s aircraft carriers – once essential, but now vastly expensive and hugely vulnerable. Prescription drugs that can cost hundreds of thousands, if not millions, of dollars annually reflect 20th-century pricing in a world of AI drug development, CRISPR, and 3D printing, to name a few innovations. Adding facility fees to even telehealth visits is (stupid) 20th-century thinking. Health insurance premiums that are unaffordable even to middle-class customers reflect 20th-century approaches.

Similarly, I’m not worried that healthcare won’t find many uses for AI; rather, I’m worried that it will co-opt AI into making existing cost structures even higher, rather than using it to make healthcare become “small, smart, and cheap.”

The doctrine of healthcare must change.  Where is its ragtag team of engineers, computer scientists, physicians, and entrepreneurs making it faster, smaller, smarter, cheaper, more personal, and definitely more effective?

Kim is a former emarketing exec at a major Blues plan, editor of the late & lamented Tincture.io, and a regular THCB contributor.

from The Health Care Blog https://ift.tt/m0NBPsK

The Carenostics Interview

In my other day job of advising companies, I was introduced to Carenostics by my friends at Bayer G4A. This is a super interesting company that is using AI to diagnose serious chronic diseases – kidney disease, asthma, and others – much earlier. So far they are working with health systems like Hackensack in NJ and the VA, and have just raised a $5m round. Last week I spoke with the father-and-son team of Kanishka Rao, COO, and Bharat Rao, CEO. In particular, look out for Bharat’s explanation of what has to go on behind the curtain to make AI effective.

from The Health Care Blog https://ift.tt/7jJzwxB

“PictureWhat” ??? Super-Human Poison Ivy. What’s Going On?

By MIKE MAGEE

Connecticut loves its trees. And no town in Connecticut loves its trees more than West Hartford, CT. The town borders include an elaborate interconnected reservoir system that does double duty as a focal point for a wide range of nature paths for walkers, runners and cyclists.

While walking one path yesterday, I came upon a tree with the healthiest upward-advancing vine I had ever seen. My “PictureThis” app took no time to identify the plant. To my surprise, it was Toxicodendron radicans, known commonly as Poison Ivy.

The description didn’t pull punches. It read, “In pop culture, poison ivy is a symbol of an obnoxious weed because, despite its unthreatening looks, it gives a highly unpleasant contact rash to the unfortunate person who touches it.” And even that doesn’t quite capture the plant’s negative notoriety.

Its pain- and itch-inducing chemical oil covers every inch of the plant and is toxic to 80% of humans. It was discovered in the lacquer tree by Japanese chemist Rikō Majima and named urushiol (from the Japanese for lacquer) in 1922. It is a derivative of catechol, an organic compound with the molecular formula C₆H₆O₂.

But the giant vine this week was nothing like the creeping little three-leaf plant most children have been taught to avoid. This was a giant – a very different aggressor worth investigating. Its leaves were impossibly large, its vine straight and thick, and its vitality unhampered by a need to support elaborate roots or bark.

Others have noticed it too, including “Pesky Pete,” who has made a good living removing the invader from properties in Massachusetts and southern New Hampshire. And recently business has been booming. That is because the plant, which had never before appeared in the region earlier than May 10th, showed up this year on April 23rd.

This was no surprise to Bill Schlesinger, resident of Maine and Durham, NC.  Officially, he is “William H. Schlesinger … one of the nation’s leading ecologists and earth scientists …a member of the National Academy of Sciences, …has served as dean of the Nicholas School of the Environment at Duke…”

Turns out Bill was in the lead on a six-year project termed the “Duke University Free-Air CO2 Enrichment Experiment,” which ran between 2000 and 2006, when the results were published. They had been following tree declines in the Duke Forest, where predatory vines had played a major role. They decided to encircle and isolate six giant forest plots, pump them full of CO2, and then catalogue the effects.

Their 2006 publication revealed that CO2 enrichment:

  1. Increased T. radicans photosynthesis by 77%
  2. Increased the efficiency of plant water usage by 51%
  3. Stimulated the growth of poison ivy over the five growing seasons relative to ambient plants
  4. Produced an annual growth increase of 149% in elevated CO2 compared to ambient plants
  5. Yielded gains notably larger than the 31% average increase in biomass observed for woody plants

Poison Ivy was the fastest grower of them all in the experimental CO2 forests. Bill’s collaborator, Jacqueline E. Mohan, carried the work further as head of the Harvard Forest project in Massachusetts. There the team reported results not on CO2-enriched soil but on warmed soil, heating the upper layer by 9 degrees. Her response to the findings was surprisingly down-to-earth: “My heavens to Betsy, it’s taking off. Poison ivy takes off more than any tree species, more than any shrub species.”

Mohan and coworkers made it clear at the time that this was not great news for the 8 out of 10 Americans who are sensitive to poison ivy. Not only did global warming and carbon footprints accelerate the plant’s growth by 70% in leaf size and biomass, but additional experiments revealed that these environmental enablers increased the amount of urushiol in the plant.

As Duke was building those first towers to isolate their experimental forests in 2000, the International Geosphere-Biosphere Program was holding its annual meeting in Mexico.  A Working Group subsequently focused on defining planetary boundaries (PB) that would assure both planetary and human health.

Nine years later, the group published “A Safe Operating Space For Humanity” in Nature. In it they proposed nine “planetary boundaries” to gauge “the continued development of human societies and the maintenance of the Earth system (ES) in a resilient and accommodating state.” In their view, measuring and ongoing monitoring of these boundaries would provide “a science-based analysis of the risk of human perturbations” that might “destabilize the Earth’s Systems (ES) on a planetary scale.” The work was updated in 2015.

The first Planetary Boundary listed was Global Warming with two measures, atmospheric CO2 and air and water temperature. As for human perturbation, as the picture above well illustrates, you can add super-human poison ivy to the list of unintended consequences.

Mike Magee MD is a Medical Historian and regular THCB contributor. He is the author of CODE BLUE: Inside the Medical-Industrial Complex (Grove/2020)

from The Health Care Blog https://ift.tt/eV8b1XP

DNA is Better at Math than You Are

By KIM BELLARD

I was tempted to write about the work being done at Wharton that suggests that AI may already be better at being entrepreneurial than most of us, and of course I’m always interested to see how nanoparticles are starting to change health care (e.g., breast cancer or cancer more generally), but when I saw what researchers at China’s Shanghai Jiao Tong University have done with DNA-based computers, well, I couldn’t pass that up. 

If PCs helped change the image of computers from the big mainframes, and mobile phones further redefined what a computer is, then DNA computers may one day – in the lifetime of some of you – cause us to look back at our chip-based devices as being as primitive as we now view ENIAC.

It’s been almost 30 years since Leonard Adleman first suggested the idea of DNA computing, and there’s been a lot of excitement in the field since, but, really, not the kind of progress that would make a general purpose DNA computer seem feasible. That may have changed.

At the risk of introducing way too many acronyms, the Chinese researchers claim they have developed a general purpose DNA integrated circuit (DIC), using “multilayer DNA-based programmable gate arrays (DPGAs).” The DPGAs are the building blocks of the DIC and can be mixed and matched to create the desired circuits. They claim that each DPGA “can be programmed with wiring instructions to implement over 100 billion distinct circuits.”

They keep track of what is going on using fluorescence markers, which probably makes a computation fun to watch.

One experiment, involving 3 DPGAs and 500 DNA strands, made a circuit that could solve quadratic equations, and another could do square roots. Oh, and, by the way, another DPGA circuit could identify RNA molecules that are related to renal cancer. They believe their DPGAs offer the potential for “intelligent diagnostics of different kinds of diseases.”

DNA tracking DNA.

“Programmability and scalability constitute two critical factors in achieving general-purpose computing,” the researchers write. “Programmability enables specification of the device to perform various algorithms whereas scalability allows the handling of a growing amount of work by the addition of resources to the system.” The authors believe they’ve made significant progress on both fronts. 

Moreover, they say: “The ability to integrate large-scale DPGA networks without apparent signal attenuation marks a key step towards general-purpose DNA computing.” 

I don’t pretend to understand the chemistry, engineering, or computing logic involved in all that, and I’m not saying you’ll soon be carrying around a bunch of DPGAs instead of your phone. But I’m pretty sure at some point in the foreseeable future we’ll not be carrying around phones as our devices, and I suspect there’s a pretty good chance that DNA is going to be crucial to our computing future.

For one thing, the storage in DNA is unrivaled. As MIT professor Mark Bathe, Ph.D. told NPR: “All the data in the world could fit in your coffee cup that you’re drinking in the morning if it were stored in DNA.” It’s hard to get our heads around how much more efficient – and resilient — nature is with DNA data storage than anything we’ve come up with.
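That claim is easier to believe with a rough back-of-envelope check, sketched below in Python. The chemistry figures (about two bits per nucleotide, roughly 330 g/mol per nucleotide) are standard; the ~120 zettabytes for the world's data is an assumed round estimate, not a number from the article.

```python
# Rough back-of-envelope check of the "coffee cup" claim (assumed figures,
# not from the article): how much DNA would today's global data require?
AVOGADRO = 6.022e23               # molecules per mole
NUCLEOTIDE_MASS_G_PER_MOL = 330   # approximate average mass of one DNA nucleotide
BITS_PER_NUCLEOTIDE = 2           # A/C/G/T encodes 2 bits at the theoretical maximum
GLOBAL_DATA_BYTES = 120e21        # ~120 zettabytes, an assumed round estimate

nucleotides_per_gram = AVOGADRO / NUCLEOTIDE_MASS_G_PER_MOL
bytes_per_gram = nucleotides_per_gram * BITS_PER_NUCLEOTIDE / 8
grams_needed = GLOBAL_DATA_BYTES / bytes_per_gram

print(f"Theoretical capacity: ~{bytes_per_gram:.2e} bytes per gram of DNA")
print(f"Grams of DNA to hold ~120 ZB: ~{grams_needed:.0f} g")
# The answer comes out on the order of a few hundred grams -- something that
# could plausibly fit in a coffee cup, which is the point of the quote.
```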

For another, as long as we’re DNA-based creatures, it’s going to be relevant to us, whereas I already have computer storage disks I don’t have ports for and computers that are so out-of-date as to be useless. DNA isn’t going to go out of date.

For a third reason, our current approach to computing relies heavily on a wide range of materials, especially the so-called rare earth elements. It’s not so much that they’re rare as it is that they are incredibly hard to mine and process, and create a significant amount of pollution along the way. A computing future based on our silicon chip approach is not sustainable and probably won’t survive the 21st century. DNA is literally everywhere.

Fourth, biology – specifically, brain cells – may be the best path forward to AI, as suggested by a new field called Organoid Intelligence (OI). “Computing and artificial intelligence have been driving the technology revolution, but they are reaching a ceiling,” said Thomas Hartung, the leader of the initiative that established OI. “Biocomputing is an enormous effort of compacting computational power and increasing its efficiency to push past our current technological limits.”

Professor Hartung pointed out that only last year a supercomputer exceeded the computational capacity of a single human brain — “but using a million times more energy.”

Fifth and most specific to health care, we are biological, DNA-based beings, and there’s just something fitting about using biological computing as one of, and perhaps the primary, approaches to how we track and manage our health. As I wrote several years ago, what could be better than being your own medical record?   

Sixth and finally, we’ve had a great run with our current approach to computing, but it is overdue for the next big thing. That next big thing may be DNA/biological computing, or it may be quantum computing, or it may be a combination of both, but I would be willing to bet that 22nd-century computing doesn’t look much like 2023 computing. We need to be looking ahead.

So, yeah, I’m excited by DNA/biological computing, and I think you should be too.

Kim is a former emarketing exec at a major Blues plan, editor of the late & lamented Tincture.io, and a regular THCB contributor.

from The Health Care Blog https://ift.tt/hXuypDT

Poor Kids. Pitiful Us

By KIM BELLARD

Well, congratulations, America.  The child poverty rate more than doubled from 2021 to 2022, jumping from 5.2% to 12.4%, according to new figures from the Census Bureau.  Once again, we prove we sure have a funny way of showing that we love our kids.

The poverty rate cited is actually the Supplemental Poverty Measure (SPM), which takes into account government programs aimed at low-income families that are not counted in the official poverty rate. The official poverty rate stayed the same, at 11.5%, while the overall SPM increased by 4.6 percentage points (to 12.4%), the first time the SPM has increased since 2010. It’s bad enough that over 10% of our population lives in poverty, but that so many children live in poverty, and that their rate doubled from 2021 to 2022 — well, how does one think about that?

The increase was expected. In fact, the outlier number was the “low” 2021 rate.  Poverty dropped due to COVID relief programs; in particular, the child tax credit (CTC).  It had the remarkable (and intended) impact of lowering child poverty, but was allowed to expire at the end of 2021, which accounts for the large increase. We’re basically back to where we were pre-pandemic.

President Biden was quick to call out Congressional Republicans (although he might just as well have chided Senator Joe Manchin):

Today’s Census report shows the dire consequences of congressional Republicans’ refusal to extend the enhanced Child Tax Credit, even as they advance costly corporate tax cuts…The rise reported today in child poverty is no accident—it is the result of a deliberate policy choice congressional Republicans made to block help for families with children while advancing massive tax cuts for the wealthiest and largest corporations.

Many experts agree: child poverty, and poverty more generally, is a choice, a policy choice.

“This data once again highlights that poverty in our country isn’t a personal failing, but rather a policy choice,” said Melissa Boteach, vice president of income security at the National Women’s Law Center.

Economist Paul Krugman blasts the failure to continue the expansion of the CTC, calling it both stupid and cruel for two reasons:

First, avoiding much of this human catastrophe would have cost remarkably little money. Second, child poverty is, in the long run, very expensive for the nation as a whole: Americans who live in poverty as children grow up to become less healthy and productive adults than they should be.

Bruce Leslie, President of First Focus on Children, agrees, telling Time that poverty “really does affect every aspect of the lives of kids. It affects kids’ education, their health, their nutrition, and then has negative consequences on things like child abuse and homelessness.”

But, as Professor Krugman noted: “Unfortunately, children can’t vote and poor adults tend not to vote either. So politicians can get away with policies that harm poor children.”

We’re better than that…aren’t we? “Ensuring that children have their basic needs met is the bare minimum of what we can and should do,” Renee Ryberg, senior research scientist at Child Trends, a research organization, told CNN. “The payoff for the health and wellbeing of our nation’s children and for our society as a whole is immeasurable.”

It’s worth pointing out that, compared to our peer nations, we fare badly, in the bottom quartile, with child poverty rates comparable to Bulgaria and Chile. So, no, we’re not remotely even doing the bare minimum. 

Speaking of child statistics on which the U.S. falls far short, we have both maternal and infant mortality rates that rival those of third-world nations. It’s hard to argue that we love mothers and children when we allow them to die at these shockingly high levels.

A bare minimum we should be doing for moms and kids is to make sure they have health insurance, yet ten states still have not passed Medicaid expansions despite the federal incentives to do so. I’ll leave it as an exercise for the interested reader to compare the states without Medicaid expansion with the ones with the worst maternal/infant mortality.

To add insult to injury, COVID allowed millions more to qualify for Medicaid, but those special provisions are “unwinding” and – you guessed it – children are being disproportionately impacted, with millions losing their coverage (often due to procedural reasons rather than ineligibility).

I’ve written before about the value of programs that give direct assistance to low income individuals (e.g., cash transfers and SNAP), and there’s new evidence that such a program helps mothers and infants in particular. The Delaware Healthy Mother and Infant Consortium is testing giving a guaranteed income of $1,000/month to low income pregnant women, and is already claiming a 324% return on investment. Mothers are more likely to get prenatal care and less likely to have birth complications. 

“We’ve demonstrated not only that there’s a great return on investment, but there’s actually decreased cost on the healthcare side,” says DHMIC Chair Dr. Priscilla Mpasi.

Similarly, despite SNAP and various school lunch programs, the Children’s Defense Fund estimates that 1 in 7 kids – some 10.5 million – are still food insecure, living in households where not everyone gets enough to eat. Massachusetts is trying to put a dent in that for its school-aged children, by making school breakfast and lunch free for all K-12 students. No more red tape, no more stigma for poor kids getting subsidized meals. 

California, Colorado, Maine, Michigan, Minnesota, New Mexico, and Vermont have similar programs. For Pete’s sake, why don’t all states?

————

It’s embarrassing that our overall poverty rate is so high, among the highest in the wealthy world. We’re the richest nation in the world but have among the highest percentages of poor people. It is literally killing us. Somehow, we’ve allowed poverty to become a political debate, a policy decision we persist in.

But child poverty? Allowing it to double? When asked about it, Joe Manchin shrugged: “We all have to do our part. The federal government can’t run everything.” I agree, the federal government can’t do everything, but if it is going to do one thing, helping poor kids should be pretty high on the list.

We shouldn’t just be embarrassed; we should be ashamed.

Kim is a former emarketing exec at a major Blues plan, editor of the late & lamented Tincture.io, and a regular THCB contributor.

from The Health Care Blog https://ift.tt/lJzfS6E

The Times They Are A-Changing….Fast

By KIM BELLARD

If you have been following my Twitter – oops, I mean “X” – feed lately, you may have noticed that I’ve been emphasizing The Coming Wave, the new book from Mustafa Suleyman (with Michael Bhaskar). If you have not yet read it, or at least ordered it, I urge you to do so, because, frankly, our lives are not going to be the same, at all.  And we’re woefully unprepared.

One thing I especially appreciated is that, although he made his reputation in artificial intelligence, Mr. Suleyman doesn’t only focus on AI. He also discusses synthetic biology, quantum computing, robotics, and new energy technologies as ones that stand to radically change our lives.  What they have in common is that they have hugely asymmetric impacts, they display hyper-evolution, they are often omni-use, and they increasingly demonstrate autonomy. 

In other words, these technologies can do things we didn’t know they could do, have impacts we didn’t expect (and may not want), and may decide what to do on their own.  

For the near future, building an AI requires a significant amount of computing power, specialized chips, and a large amount of data; with synthetic biology, though, the technology is getting to the point where someone can set up a lab in their garage and experiment away. AI can spread rapidly, but it needs a connected device; engineered organisms can get anywhere there is air or water.

“A pandemic virus synthesized anywhere will spread everywhere,” MIT’s Kevin Esvelt told Axios.

I’ve been fascinated with synthetic biology for some time now, and yet I still think we’re not paying enough attention. “For me, the most exciting thing about synthetic biology is finding or seeing unique ways that living organisms can solve a problem,” David Riglar, Sir Henry Dale research fellow at Imperial College London, told The Scientist. “This offers us opportunities to do things that would otherwise be impossible with non-living alternatives.”

Jim Collins, Termeer professor of medical engineering and science at Massachusetts Institute of Technology (MIT), added: “By approaching biology as an engineering discipline, we are now beginning to create programmable medicines and diagnostic tools with the ability to sense and dynamically respond to information in our bodies.”

For example, researchers just reported on a smart pill — the size of a blueberry! — that can be used to automatically detect key biological molecules in the gut that suggest problems, and wirelessly transmit the information in real time. 

MIT News reports:

Current techniques for diagnosing diseases inside the gut can be invasive (think of a colonoscopy or other endoscopic procedure), and can’t detect molecular biomarkers of disease in real-time. The latter is a problem because several important biomarkers are very short-lived, so they disappear before current techniques can detect them.

The pill involves engineered bacteria, electronics, and a battery (all very small, of course). When the bacteria detect the molecules they are looking for, they produce light (I kid you not), which the electronics detect and convert into a wireless signal.

“The inner workings of the human gut are still one of the final frontiers of science. Our new pill could unlock a wealth of information about the body’s function, its relationship with the environment, and the impact of disease and therapeutic interventions,” says senior author Timothy Lu, an MIT associate professor of biological engineering and of electrical engineering and computer science.

Alessio Fasano, a professor at the Harvard T.H. Chan School of Public Health who was not involved in the research, praised the findings: “This system may represent a game changer in the management of IBDs [inflammatory bowel diseases] in terms of early diagnosis, interception of disease flareups, and optimization of a therapeutic plan.” Co-first author Maria Eugenia Inda explains: “We still don’t fully understand it [the gut] because it’s difficult to access and study. We lack the tools to explore it. Knowing more about the gut chemical environment could help us prevent disease by identifying factors that cause inflammation before the inflammation takes over.”

The authors believe the results suggest application beyond those molecules or even just the gut. Co-first author Miguel Jimenez says: “We played to the strengths of the biology and the electronics — our tiny pill shows what is possible when we can bridge bacterial sensing with wireless communication.”

We’re just getting started. Dr. Collins told The Scientist:

There are two big challenges – the first is that we still don’t have a broad set of design principles for biology – and that means that its complexity can still get in the way of our best design plans.  Secondly, we still have a pretty anemic library of biological parts – to the order of a few dozen that have been reused and repurposed in the last two decades. We need to dramatically expand this toolkit through synthesis and biomining efforts.

As an example, his team engineered a bacterium that helps break down antibiotics in the gut. “By applying synthetic biology, we have designed a living therapeutic that has the potential to help counter the potential negative effects of antibiotic use,” he said.

Dr. Collins is a big believer not only in how synthetic biology can help improve our health but also in what it can do elsewhere: “I think the idea of applying engineering principles to living systems that have evolved over billions of years can provide humanity with a real edge to counter some of the existential challenges we’re facing.”

But, of course, these blessings come with a curse. Add the option of engineering our own bodies, and the implications grow. Mr. Suleyman writes: “As people increasingly take power into their hands, I expect inequality’s newest frontier to lie in biology.” Some will try to alter their DNA, others will augment themselves — and some will try to harm others.

Since many of the synthetic biology techniques have become “democratized,” some experts fear, creating pathogens becomes too easy – especially if aided by AI. “Even relatively mild pandemic viruses can kill more people than any nuclear device,” writes Dr. Esvelt.

The possibilities of synthetic biology – and AI, quantum computing, and others – are endless.  So are the dangers.

————

I’ll leave you with two of Mr. Suleyman’s cautions:

  • “But we are entering a new era where the previously unthinkable is now a distinct possibility.”
  • “When it comes to technology that could radically extend human life span or capabilities, there clearly has to be a big debate from the get-go about its distribution.”

We need to be thinking that unthinkable, and having that debate.

Kim is a former emarketing exec at a major Blues plan, editor of the late & lamented Tincture.io, and a regular THCB contributor.

from The Health Care Blog https://ift.tt/ND1S0In

Shiv Rao, CEO demos Abridge

Abridge has been trying to document the clinical encounter automatically since 2018. There’s been quite a lot of fuss about them in recent weeks. They announced becoming the first “Pal” in the Epic “Partners & Pals” program, and also that their AI-based encounter capture technology was now being used at several hospitals. And they showed up in a NY Times article about tech being used for clinical documentation. But of course they’re not the only company trying to turn the messy speech in a clinician/patient encounter into a buttoned-up clinical note. Suki, Augmedix & Robin all come to mind, while the elephant is Nuance, which has itself been swallowed by the whale that is Microsoft.

But having used their consumer version a few years back and been a little disappointed, I wanted to see what all the fuss was about. CEO Shiv Rao was a real sport and took me through a clinical example with him as the doc and me as a (slightly) fictionalized patient. He also patiently explained where the company was coming from and what their road map was. But they are all in on AI – no offshore typists trying to correct the note in close to real time here.

And you’ll for sure want to see the demo (if you want to skip the chat, it runs from about 8:00 to 16:50). And I think you’ll be very impressed indeed. I know I was. I can’t imagine a doctor not wanting this, and I suspect those armies of scribes will soon be able to go back to real work! — Matthew Holt

from The Health Care Blog https://ift.tt/eVhSQAp

“The Greatest Scientist of All Time” says Scientific American. Who is it?

BY MIKE MAGEE

When it comes to our earthly survival as a human species, words are often under-powered and off-the-mark. Clearer concepts, definitions, and terms are required. Here are five terms that are useful and worth remembering:

  1. Planetary Boundaries
  2. Earth Systems
  3. Human Perturbations
  4. Planetary Scale Destabilization
  5. Holocene Epoch vs. Anthropocene Epoch

These terms all tie back to a single source – a child of World War II, only seven when his home in Amsterdam was overrun by Nazis. His father was a waiter, his mother a cook in a local hospital. He’d later recall with a shudder the Fall of 1944, the beginning of “hongerwinter” (winter of  famine) which he blamed for stunting his growth and contributing to his short stature. The event also exposed him to death for the first time, losing several classmates to starvation and frozen temperatures that winter.

There were no early signs of brilliance. He attended a technical school and prepared for a life in construction. He met a Finnish girl, Terttu, when he was 25, and they settled in a small town 200 km north of Stockholm. It was his wife who recognized his potential first, pointing him toward a newspaper ad for a job as a programmer at Stockholm University’s Meteorological Institute (MISU). No matter that he had no experience in computer programming. They moved to Stockholm. He worked and they both took college courses. By age 30, with sponsorship from the world’s expert on acid rain and first chairman of the Intergovernmental Panel on Climate Change (Bert Bolin), he received a master’s degree in meteorology. Five years later, after focusing on stratospheric chemistry, he earned his doctorate.

When Paul Crutzen died at 87, with Terttu, two daughters and three grandchildren at his side, a tribute in Scientific American stated that “Paul Crutzen [may have been] the greatest scientist of all time.” This was not because he had been granted the 1995 Nobel Prize in Chemistry (without ever having taken a chemistry course) for discovering how ozone was formed in the stratosphere; or for coining the term “Nuclear Winter” in 1984 to describe the planetary devastation that would follow a nuclear attack; or for being the major adviser on global warming to Pope Francis in preparing his encyclical Laudato Si (“On Care For Our Common Home”) prior to the Paris Climate Accords in 2015.

No, what Crutzen is especially remembered for is a momentary lapse in his usually pleasant and calm demeanor while serving on a panel of the International Geosphere–Biosphere Programme (IGBP) for the International Council for Science (ICSU) at its May 2000 meeting in Cuernavaca, Mexico. The speaker at the mic, in outlining the current challenge of maintaining the Earth’s life support systems, referred once too often to the Holocene Epoch, that is, the period of roughly the last 11,700 years of our planet’s existence when humans were able to survive, thrive and develop in general harmony with their host planet. In a moment of spontaneous scientific combustion, Dr. Crutzen muttered in a muted but still audible whisper, “Let’s stop it. We are no longer in the Holocene. We are in the Anthropocene.”

A hushed silence fell over the crowd as the world’s top Earth scientists were forced to acknowledge an uncomfortable truth – man was destroying the planet. Summarizing the moment in a report a few days later, Crutzen wrote, “Considering these and many other major and still growing impacts of human activities on earth and atmosphere…it seems to us more than appropriate to emphasize the central role of mankind in geology and ecology by proposing to use the term ‘anthropocene’ for the current geological epoch.”

For scientists in the field, this was a call to action. As the American Meteorological Society later recounted, “From the perspective of Earth system science, many well-respected scientists in that field are convinced that the transformation from the Holocene to the Anthropocene, a term clearly defined by Crutzen in a moment of exasperation, is truly a once-in-a-lifetime event.”

Nine years later, Crutzen’s colleagues from Stockholm University, Will Steffen and Johan Rockström, published a paper on “the environmental limits within which humanity can safely operate.” In that paper, “A Safe Operating Space For Humanity,” published in Nature, they proposed nine “planetary boundaries” to gauge “the continued development of human societies and the maintenance of the Earth system (ES) in a resilient and accommodating state.” In their view, measuring and ongoing monitoring of these boundaries would provide “a science-based analysis of the risk of human perturbations” that might “destabilize the ES on a planetary scale.”

But in the world of international geoscience, laying out human responsibility for planetary stress is one thing; declaring an end to the 11,700-year Holocene Epoch is quite another. In effect, Crutzen was provoking a geological revolution, and that is why a hush fell over the crowd that day.

But coming out of the original 2000 Mexico meeting of the International Geosphere-Biosphere Programme, participants were energized and decided to form a 40-member global Anthropocene Working Group (AWG). One arm focused on defining planetary boundaries (PB), and specific data measures for each, while another would explore sites that might provide geologic proof, in soil samples, of the irreversible impact of humans on their planet, and support the now widely held belief that a new geologic epoch had indeed been launched.

After extensive analysis, a new paradigm with nine planetary boundaries was published in 2009 and updated in 2015; it included “a science-based analysis of the risk that human perturbations will destabilize Earth state at a planetary scale.” The list, with associated measures, included:

  1. Climate Change (CO2 concentration in the atmosphere < 350 ppm)
  2. Ocean Acidification (seawater aragonite levels – crystalline calcium carbonate – ≥ 80% of pre-industrial levels)
  3. Stratospheric Ozone (less than a 5% reduction in total atmospheric O3 from pre-industrial levels)
  4. Nitrogen/Phosphorus Cycle (artificial eutrophication of air, soil, and water)
  5. Global Freshwater Supply (< 4,000 km3/yr of consumptive use of runoff resources)
  6. Land-System Use (< 15% of the ice-free land surface under cropland)
  7. Biosphere Integrity (an annual rate of loss of biological diversity of < 10 extinctions per million species)
  8. Novel Chemicals (emissions of toxic compounds such as synthetic organic pollutants and radioactive materials, but also genetically modified organisms, nanomaterials, and micro-plastics)
  9. Atmospheric Aerosols (natural and manmade dust deposited in the lower atmosphere)

The control measures track human-mediated shifts from Holocene conditions. For example, CO2 concentrations during the Holocene fluctuated around 280 ppm. Since 1950 they have risen past 350 ppm, a level that geologic studies demonstrate last existed on our planet 300,000 years ago. In 2022, the level hit a new record of 417 ppm.

The Planetary Boundary (PB) framework was designed to promote maintenance of a “desired Holocene state” that has served human development well. A “safe operating space” for human society development on Earth is not a luxury. By 2015, it was determined that four of the planetary boundaries had already been breached, including climate change, biosphere integrity (diversity), biogeochemical flows (the nitrogen and phosphorus cycles), and land-system change. Seven years later, in 2022, a 5th boundary (introduction of novel entities – formerly “chemical pollution”) was crossed.
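For readers who think in code, here is a minimal illustrative sketch of the framework as a data structure: a boundary paired with its control variable, its threshold, and a current reading. Only the CO2 threshold (350 ppm) and the 2022 reading (417 ppm) come from the text above; the other two current values are placeholders, not measurements.

```python
# Illustrative sketch only: a planetary boundary as data, plus a breach check.
from dataclasses import dataclass

@dataclass
class Boundary:
    name: str
    control_variable: str
    safe_limit: float              # boundary value from the 2009/2015 framework
    current_value: float           # latest reading (placeholder where noted)
    higher_is_safer: bool = False  # e.g., aragonite saturation: falling BELOW the limit is the breach

    def breached(self) -> bool:
        if self.higher_is_safer:
            return self.current_value < self.safe_limit
        return self.current_value > self.safe_limit

boundaries = [
    Boundary("Climate change", "atmospheric CO2 (ppm)", 350, 417),                    # 2022 record cited above
    Boundary("Ocean acidification", "aragonite, % of pre-industrial", 80, 84, True),  # placeholder current value
    Boundary("Freshwater use", "consumptive runoff use (km3/yr)", 4000, 2600),        # placeholder current value
]

for b in boundaries:
    status = "BREACHED" if b.breached() else "within the safe operating space"
    print(f"{b.name}: {b.current_value} vs. limit {b.safe_limit} ({b.control_variable}) -> {status}")
```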

At the same time, the 40-member Anthropocene Working Group labored on in search of a single site that might yield a core geologic bore sample proving that man, in real time, had shifted Earth’s basic geology. In 2023, Colin Waters, the AWG’s chair, reported, “We see a clear, abrupt, and global transition from the previous Earth epoch to something new,” and announced the finalists: “a peat bog in Poland’s Sudeten Mountains; Searsville Lake, in California; Crawford Lake, in Ontario; a seafloor in the Baltic Sea; a bay in Japan; a water-filled volcanic crater in China; an ice core drilled from the Antarctic Peninsula; and two coral reefs, in Australia and the Gulf of Mexico.”

On July 11, 2023, Canadian Geographic proudly broadcast, “The Anthropocene is here — and tiny Crawford Lake has been chosen as the global ground zero.” As the article stated “Its nomination still needs to be voted on by three higher bodies of geologists over the coming year, but if they, too, approve the candidacy, Crawford Lake will be endowed with the ‘golden spike,’ a literal brass marker that signifies that the planet shifted, in about 1950, from one unit of geological time to the next.”

Why Crawford Lake? It turns out this “humble little lake” has a very rare geochemical profile, including a depth-to-surface-area ratio that prevents its top and bottom layers from mixing, and prominent oxygen levels within its bottom layer. The fact that it is a “meromictic” lake (meaning its top and bottom layers never mix) makes it unique in all of North America. Over the years, as material settled to the lake bottom, it was sealed by distinct couplets of calcite deposits that mark summer and winter. This allowed core samples to be accurately dated. For example, in 1970, corn pollen found in one of the layers was accurately dated by stratigraphers to the Middle Ages.

What they are looking for in the soil and stone are concrete markers. According to published reports, dry ice frozen core samples were able to be dated back over 1000 years. More relevant to the Anthropocene, “By 1950 or so, a rapid, dramatic increase of carbon-based particles shows up from industrial processes, including coal-fired steel-making in a nearby Hamilton foundry, as well as a rapid rise in plutonium from nuclear testing, a change in nitrogen isotopes from fertilizer use, and the chemical fallout from acid rain.” 

These, and other findings, allowed 75 local scientists to champion Crawford Lake’s candidacy, with Francine McCarthy, a geologist at Brock University, in the lead. She stated, “If people see that stratigraphers, a conservative bunch of geologists, are willing to put a line on the timescale and call it by the name that recognizes — that admits — the role of humans as a causal agency, then that’s mammoth.”

Were Paul Crutzen alive, he would surely agree that the announcement of the 39th epoch in our 4.6-billion-year planetary history is not a call for celebration, but rather a call to action. The challenges of human and planetary survival are now scientifically linked, and no less urgent than was Paul’s own childhood survival during the “hongerwinter” of 1944.

Mike Magee MD is a Medical Historian and regular contributor to THCB. He is the author of CODE BLUE: Inside the Medical-Industrial Complex (Grove/2020).

from The Health Care Blog https://ift.tt/FtV9DJZ

Beyond the Scale: How organizations should evaluate the success of obesity management solutions

By CAITLYN EDWARDS

Obesity treatment is often framed as a race to the bottom — how much weight can someone lose? Five percent? Ten percent? And with recent scientific advancements in anti-obesity medications such as GLP-1s, what about even 15-20%?

Obesity treatment, though, isn’t just about the number on the scale. It’s about moving the needle on biomarkers that really matter to overall health. Seven of the top ten leading causes of death and disability in the United States today are chronic diseases that have links to overweight and obesity. The metabolic benefits of just 5% weight loss can be life-changing for many people with obesity-related comorbidities. This means that for organizations looking to address chronic conditions in their populations, obesity care shouldn’t be all about striving for the lowest possible weight.

Indeed, consensus and practice statements from groups including the American Heart Association, the American College of Cardiology, the American Diabetes Association, and The Obesity Society support weight loss programs that achieve clinically significant weight loss outcomes, defined as greater than or equal to 5% of an individual’s baseline body weight. This number is derived from decades of research demonstrating that even modest weight loss improves physiological health across conditions including type 2 diabetes, dyslipidemia, hypertension, and many kinds of cancer.

People who attain just 5% weight loss see the following health improvements:

  • Reductions in systolic and diastolic blood pressure
  • A nearly 60% reduction in the risk of developing type 2 diabetes
  • Reductions in HbA1c and fasting blood glucose levels
  • Greater insulin sensitivity
  • Decreased need for newly prescribed diabetes, hypertension, and lipid-lowering medications

Understanding that obesity outcomes include more than just the number on the scale, how can benefit managers and health plan leaders measure success? Here are some things organizations should look for when evaluating an obesity management solution:

N-size of outcomes

While a high weight loss average may sound impressive, it doesn’t tell the whole story. A better measure might be the number of people in a program able to achieve greater than 5% weight loss. The fact is that weight loss averages are easily skewed by outliers.  An exceptionally high average may not be representative of what is actually taking place at the individual level. What matters is that a large percentage of people in the program are able to see clinically significant results.
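A toy example makes the point. In the sketch below (hypothetical numbers only, not program data), one cohort posts the higher average purely because of a single outlier, while the other delivers clinically significant results to every member.

```python
# Illustrative sketch (hypothetical numbers): why the share of members reaching
# >=5% weight loss tells you more than the program's average.
from statistics import mean

# Percent of body weight lost by ten hypothetical members in each program.
cohort_a = [2, 1, 3, 2, 1, 2, 3, 2, 1, 33]   # one dramatic outlier inflates the average
cohort_b = [6, 5, 7, 5, 6, 8, 5, 6, 7, 5]    # modest but broad, clinically significant results

def share_clinically_significant(losses, threshold=5.0):
    """Fraction of members at or above the 5% clinically significant threshold."""
    return sum(loss >= threshold for loss in losses) / len(losses)

for name, cohort in [("A", cohort_a), ("B", cohort_b)]:
    print(f"Cohort {name}: average loss {mean(cohort):.1f}%, "
          f"share at >=5% = {share_clinically_significant(cohort):.0%}")
# Cohort A averages 5.0% yet only 10% of its members hit the threshold;
# Cohort B averages 6.0% and 100% of its members do.
```

That is why the share of members at or above the 5% threshold is the more honest yardstick.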

Emphasis on behavior change

Another way to measure the success of an obesity management solution is by the sustainability of its outcomes — primarily through adopting healthier behaviors. Intensive behavioral therapy is crucial to obesity treatment and can reduce the risk of type 2 diabetes. Support from expert dietitians and coaches can help promote a healthy relationship with food for optimal weight loss.

Through medical nutrition therapy, dietitians create personalized calorie and macronutrient goals to foster weight loss in a healthy, sustainable way. Also, self-directed cognitive behavioral therapy can help people become more aware of underlying thoughts and behavior patterns around food.

Step therapy approach to treatment

Some obesity management solutions avoid medications entirely while others rely solely on expensive GLP-1s. But both of those methods fall short of providing the best care to the most people at the lowest cost possible.

The best obesity management solutions take a clinically rigorous step-therapy approach to treatment. This way, they carefully manage access to expensive anti-obesity medications while achieving meaningful outcomes. Many of their members will achieve clinically significant weight loss through behavior change alone. Some may need a boost from lower-intensity, lower-cost anti-obesity medications to reach their goals. Others, with severe obesity or multiple cardio-metabolic conditions, may require higher-intensity anti-obesity medications like GLP-1s. Treatment levels can be safely tried in succession with needs and costs in mind.

It’s likely only 5-10% of a given population would end up using GLP-1s with this step-therapy approach, while the majority of people would still get clinically meaningful results without such intensive treatment.

Address SDOH to personalize care

One-size-fits-all solutions — like those that insist on a highly restrictive diet — miss the mark on health equity. Not everyone can afford expensive meat-heavy diets and they don’t always line up with people’s cultural preferences. Similarly, a program that simply doles out GLP-1s without helping people manage side effects doesn’t work and will only drain budgets.

The key to unlocking improved outcomes is helping people address SDOH challenges like food insecurity, language barriers, cultural factors, physical environment, and more. A good obesity solution should expand access to bilingual registered dietitians who are trained in dietary considerations and eating patterns for many different cultures and ethnic groups. They can help folks plan meals around limited budgets and specific dietary needs.

Conclusion

Organizations have much to consider when evaluating obesity solutions for their population. It’s easy to be swayed by simple metrics that seem indisputable. But, in the end, outcomes like 5% weight loss and reductions in HbA1c for the majority of an eligible population are what counts. Sustainable outcomes rely on real behavior change, a careful step-therapy approach to medication, and personalized care when it comes to social determinants of health.

Caitlyn Edwards, PhD, RDN, is a Senior Clinical Research Specialist at Vida Health.

from The Health Care Blog https://ift.tt/vZLRDz6