Feed aggregator

Postal Service EV fleet back on Congress’ hit list

ClimateWire News - Tue, 06/24/2025 - 6:58am
Republicans have proposed selling the Postal Service's electric vehicles. The issue may come up during a hearing Tuesday.

‘Getting especially ugly’: Industry analyst sees uncertain future for US carmakers

ClimateWire News - Tue, 06/24/2025 - 6:58am
Edmunds' Ivan Drury is trying to make sense of an American auto market in constant flux.

Japan boosts effort to curb methane leaks from LNG supply chains

ClimateWire News - Tue, 06/24/2025 - 6:57am
The announcement was made after a three-day energy summit in Tokyo where government officials urged energy importers to secure gas past 2050.

Scientists stumble upon way to cut cow dung methane emissions

ClimateWire News - Tue, 06/24/2025 - 6:57am
Two local scientists began testing the addition of polyferric sulfate in an attempt to recycle the water in cow dung lagoons and made a startling observation.

EU climate boss fought Commission plan to nix greenwashing rules

ClimateWire News - Tue, 06/24/2025 - 6:56am
The vice president of the EU executive pressured the environment commissioner over several days to preserve the law.

Greenpeace joins anti-Bezos protest in Venice about wedding, tax breaks

ClimateWire News - Tue, 06/24/2025 - 6:55am
Activists argue Jeff Bezos' wedding exemplifies broader failures in municipal governance, particularly the prioritization of tourism over resident needs.

Protect young secondary forests for optimum carbon removal

Nature Climate Change - Tue, 06/24/2025 - 12:00am

Nature Climate Change, Published online: 24 June 2025; doi:10.1038/s41558-025-02355-5

The authors generate ~1-km² growth curves for aboveground live carbon in regrowing forests, globally. They show that maximum carbon removal rates can vary by 200-fold spatially and with age, with the greatest rates estimated at about 30 ± 12 years, highlighting the role of secondary forests in carbon cycling.

Copyright Cases Should Not Threaten Chatbot Users’ Privacy

EFF: Updates - Mon, 06/23/2025 - 10:07pm

Like users of all technologies, ChatGPT users deserve the right to delete their personal data. Nineteen U.S. States, the European Union, and a host of other countries already protect users’ right to delete. For years, OpenAI gave users the option to delete their conversations with ChatGPT, rather than let their personal queries linger on corporate servers. Now, they can’t. A badly misguided court order in a copyright lawsuit requires OpenAI to store all consumer ChatGPT conversations indefinitely—even if a user tries to delete them. This sweeping order far outstrips the needs of the case and sets a dangerous precedent by disregarding millions of users’ privacy rights.

The privacy harms here are significant. ChatGPT’s 300+ million users submit over 1 billion messages to its chatbots per day, often for personal purposes. Virtually any personal use of a chatbot—anything from planning family vacations and daily habits to creating social media posts and fantasy worlds for Dungeons and Dragons games—reveals personal details that, in aggregate, create a comprehensive portrait of a person’s entire life. Other uses risk revealing people’s most sensitive information. For example, tens of millions of Americans use ChatGPT to obtain medical and financial information. Notwithstanding the other risks of these uses, people still deserve privacy rights like the right to delete their data. Eliminating protections for user-deleted data risks chilling beneficial uses by individuals who want to protect their privacy.

This isn’t a new concept. Putting users in control of their data is a fundamental piece of privacy protection. Nineteen states, the European Union, and numerous other countries already protect the right to delete under their privacy laws. These rules exist for good reasons: retained data can be sold or given away, breached by hackers, disclosed to law enforcement, or even used to manipulate a user’s choices through online behavioral advertising.

While appropriately tailored orders to preserve evidence are common in litigation, that’s not what happened here. The court disregarded the privacy rights of millions of ChatGPT users without any reasonable basis to believe it would yield evidence. The court granted the order based on unsupported assertions that users who delete their data are probably copyright infringers looking to “cover their tracks.” This is simply false, and it sets a dangerous precedent for cases against generative AI developers and other companies that have vast stores of user information. Unless courts limit orders to information that is actually relevant and useful, they will needlessly violate the privacy rights of millions of users.

OpenAI is challenging this order. EFF urges the court to lift the order and correct its mistakes.  

The NO FAKES Act Has Changed – and It’s So Much Worse

EFF: Updates - Mon, 06/23/2025 - 3:39pm

A bill purporting to target the issue of misinformation and defamation caused by generative AI has mutated into something that could change the internet forever, harming speech and innovation from here on out.

The Nurture Originals, Foster Art and Keep Entertainment Safe (NO FAKES) Act aims to address understandable concerns about generative AI-created “replicas” by creating a broad new intellectual property right. That approach was the first mistake: rather than giving people targeted tools to protect against harmful misrepresentations—balanced against the need to protect legitimate speech such as parodies and satires—the original NO FAKES just federalized an image-licensing system.

Take Action

Tell Congress to Say No to NO FAKES

The updated bill doubles down on that initial mistaken approach by mandating a whole new censorship infrastructure for that system, encompassing not just images but the products and services used to create them, with few safeguards against abuse.

The new version of NO FAKES requires almost every internet gatekeeper to create a system that will a) take down speech upon receipt of a notice; b) keep down any recurring instance—meaning, adopt inevitably overbroad replica filters on top of the already deeply flawed copyright filters; c) take down and filter tools that might have been used to make the image; and d) unmask the user who uploaded the material based on nothing more than the say-so of the person who was allegedly “replicated.”

This bill would be a disaster for internet speech and innovation.

Targeting Tools

The first version of NO FAKES focused on digital replicas. The new version goes further, targeting tools that can be used to produce images that aren’t authorized by the individual, anyone who owns the rights in that individual’s image, or the law. Anyone who makes, markets, or hosts such tools is on the hook. There are some limits—the tools must be primarily designed for making unauthorized images, or have only limited commercial uses other than doing so—but those limits will offer cold comfort to developers, given that they can be targeted based on nothing more than a bare allegation. These provisions effectively give rights-holders the veto power over innovation that they’ve long sought in the copyright wars, based on the same tech panics.

Takedown Notices and Filter Mandate

The first version of NO FAKES set up a notice and takedown system patterned on the DMCA, with even fewer safeguards. NO FAKES expands it to cover more service providers and require those providers to not only take down targeted materials (or tools) but keep them from being uploaded in the future.  In other words, adopt broad filters or lose the safe harbor.

Filters are already a huge problem when it comes to copyright, and at least in that context all a filter should be doing is flagging an upload for human review if it appears to be a whole copy of a work. The reality is that these systems often flag things that are similar but not the same (like two different people playing the same piece of public domain music). They also flag things as infringement based on mere seconds of a match, and they frequently fail to take into account context that would make the use authorized by law.

But copyright filters are not yet required by law. NO FAKES would create a legal mandate that will inevitably lead to hecklers’ vetoes and other forms of over-censorship.

The bill does contain carve outs for parody, satire, and commentary, but those will also be cold comfort for those who cannot afford to litigate the question.

Threats to Anonymous Speech

As currently written, NO FAKES also allows anyone to get a subpoena from a court clerk—not a judge, and without any form of proof—forcing a service to hand over identifying information about a user.

We've already seen abuse of a similar system in action. In copyright cases, those unhappy with the criticisms being made against them get such subpoenas to silence critics. Often the criticism includes the complainant's own words as proof, an ur-example of fair use. But the subpoena is issued anyway and, unless the service is incredibly on the ball, the user can be unmasked.

Not only does this chill further speech, but the unmasking itself can harm users, whether reputationally or in their personal lives.

Threats to Innovation

Most of us are very unhappy with the state of Big Tech. It seems like not only are we increasingly forced to use the tech giants, but that the quality of their services is actively degrading. By increasing the sheer amount of infrastructure a new service would need to comply with the law, NO FAKES makes it harder for any new service to challenge Big Tech. It is probably not a coincidence that some of these very giants are okay with this new version of NO FAKES.

Requiring removal of tools, apps, and services could likewise stymie innovation. For one, it would harm people using such services for otherwise lawful creativity.  For another, it would discourage innovators from developing new tools. Who wants to invest in a tool or service that can be forced offline by nothing more than an allegation?

This bill is a solution in search of a problem. Just a few months ago, Congress passed Take It Down, which targeted images involving intimate or sexual content. That deeply flawed bill pressures platforms to actively monitor online speech, including speech that is presently encrypted. But if Congress is really worried about privacy harms, it should at least wait to see the effects of the last piece of internet regulation before enacting a new one. Its failure to do so makes clear that this is not about protecting victims of harmful digital replicas.

NO FAKES is designed to consolidate control over the commercial exploitation of digital images, not prevent it. Along the way, it will cause collateral damage to all of us.

Take Action

Tell Congress to Say No to NO FAKES

New Journalism Curriculum Module Teaches Digital Security for Border Journalists

EFF: Updates - Mon, 06/23/2025 - 12:00pm
Module Developed by EFF, Freedom of the Press Foundation, and University of Texas, El Paso Guides Students Through Threat Modeling and Preparation

SAN FRANCISCO – A new college journalism curriculum module teaches students how to protect themselves and their digital devices when working near and across the U.S.-Mexico border. 

“Digital Security 101: Crossing the US-Mexico Border” was developed by Electronic Frontier Foundation (EFF) Director of Investigations Dave Maass and Dr. Martin Shelton, deputy director of digital security at Freedom of the Press Foundation (FPF), in collaboration with the University of Texas at El Paso (UTEP) Multimedia Journalism Program and Borderzine.

The module offers a step-by-step process for improving the digital security of journalists passing through U.S. Land Ports of Entry, focusing on threat modeling: thinking through what you want to protect, and what actions you can take to secure it. 

This involves assessing risk according to the kind of work the journalist is doing, the journalist’s own immigration status, potential adversaries, and much more, as well as planning in advance for protecting oneself and one’s devices should the journalist face delay, detention, search, or device seizure. Such planning might include use of encrypted communications, disabling or enabling certain device settings, minimizing the data on devices, and mentally preparing oneself to interact with border authorities.  

The module, in development since early 2023, is particularly timely given increasingly invasive questioning and searches at U.S. borders under the Trump Administration and the documented history of border authorities targeting journalists covering migrant caravans during the first Trump presidency. 

"Today's journalism students are leaving school only to face complicated, new digital threats to press freedom that did not exist for previous generations. This is especially true for young reporters serving border communities," Shelton said. "Our curriculum is designed to equip emerging journalists with the skills to protect themselves and sources, while this new module is specifically tailored to empower students who must regularly traverse ports of entry at the U.S.-Mexico border while carrying their phones, laptops, and multimedia equipment." 

The guidance was developed through field visits to six ports of entry across three border states, interviews with scores of journalists and students on both sides of the border, and a comprehensive review of CBP policies, while also drawing on EFF’s and FPF’s combined decades of experience researching constitutional rights and security techniques for digital devices.

“While this training should be helpful to investigative journalists from anywhere in the country who are visiting the borderlands, we put journalism students based in and serving border communities at the center of our work,” Maass said. “Whether you’re reviewing the food scene in San Diego and Tijuana, covering El Paso and Ciudad Juarez’s soccer teams, reporting on family separation in the Rio Grande Valley, or uncovering cross-border corruption, you will need the tools to protect your work and sources." 

The module includes a comprehensive slide deck that journalism lecturers can use and remix for their classes, as well as an interactive worksheet. With undergraduate students in mind, the module includes activities such as roleplaying a primary inspection interview and analyzing pop singer Olivia Rodrigo’s harrowing experience of mistaken identity while reentering the country. The module has already been delivered successfully in trainings with journalism students at UTEP and San Diego State University. 

“UTEP’s Multimedia Journalism program is well-situated to help develop this digital security training module,” said UTEP Communication Department Chair Dr. Richard Pineda. “Our proximity to the U.S.-Mexico border has influenced our teaching models, and our student population – often daily border crossers – give us a unique perspective from which to train journalists on issues related to reporting safely on both sides of the border.” 

For the “Digital Security 101: Crossing the US-Mexico Border” module: https://freedom.press/digisec/blog/border-security-module/ 

For more about the module: https://www.eff.org/deeplinks/2025/06/journalist-security-checklist-preparing-devices-travel-through-us-border

For EFF’s guide to digital security at the U.S. border: https://www.eff.org/press/releases/digital-privacy-us-border-new-how-guide-eff 

For EFF’s student journalist Surveillance Self Defense guide: https://ssd.eff.org/playlist/journalism-student 

Contact: Dave Maass, Director of Investigations, dm@eff.org

A Journalist Security Checklist: Preparing Devices for Travel Through a US Border

EFF: Updates - Mon, 06/23/2025 - 11:31am

This post was originally published by the Freedom of the Press Foundation (FPF). This checklist complements the recent training module for journalism students in border communities that EFF and FPF developed in partnership with the University of Texas at El Paso Multimedia Journalism Program and Borderzine. We are cross-posting it under FPF's Creative Commons Attribution 4.0 International license. It has been slightly edited for style and consistency.

Before diving in: This space is changing quickly! Check FPF's website for updates and contact them with questions or suggestions. This is a joint project of Freedom of the Press Foundation (FPF) and the Electronic Frontier Foundation.

Those within the U.S. have Fourth Amendment protections against unreasonable searches and seizures — but there is an exception at the border. Customs and Border Protection (CBP) asserts broad authority to search travelers’ devices when crossing U.S. borders, whether traveling by land, sea, or air. And unfortunately, except for a dip at the start of the COVID-19 pandemic when international travel substantially decreased, CBP has generally searched more devices year over year since the George W. Bush administration. While the percentage of travelers affected by device searches remains small, in recent months we’ve heard growing concerns about apparent increased immigration scrutiny and enforcement at U.S. ports of entry, including seemingly unjustified device searches.

Regardless, it’s hard to say with certainty the likelihood that you will experience a search of your items, including your digital devices. But there’s a lot you can do to lower your risk in case you are detained in transit, or if your devices are searched. We wrote this checklist to help journalists prepare for transit through a U.S. port of entry while preserving the confidentiality of your most sensitive information, such as unpublished reporting materials or source contact information. It’s important to think about your strategy in advance, and begin planning which options in this checklist make sense for you.

First things first: What might CBP do?

U.S. CBP’s policy is that they may conduct a “basic” search (manually looking through information on a device) for any reason or no reason at all. If they feel they have reasonable suspicion “of activity in violation of the laws enforced or administered by CBP” or if there is a “national security concern,” they may conduct what they call an “advanced” search, which may include connecting external equipment to your device, such as a forensic analysis tool designed to make a copy of your data.

Your citizenship status matters as to whether you can refuse to comply with a request to unlock your device or provide the passcode. If you are a U.S. citizen entering the U.S., you have the most legal leverage to refuse to comply because U.S. citizens cannot be denied entry — they must be let back into the country. But note that if you are a U.S. citizen, you may be subject to escalated harassment and further delay at the port of entry, and your device may be seized for days, weeks, or months.

If CBP officers seek to search your locked device using forensic tools, there is a chance that some (if not all) of the information on the device will be compromised. But this depends on what tools are available to government agents at the port of entry, whether they are motivated to seize your device and send it elsewhere for analysis, and what type of device, operating system, and security features you have. Thus, it is also possible that strong encryption may substantially slow down or even thwart a government device search.

Lawful permanent residents (green-card holders) must generally also be let back into the country. However, the current administration seems more willing to question LPR status, so refusing to comply with a request to unlock a device or provide a passcode may be risky for LPRs. Finally, CBP has broad discretion to deny entry to foreign nationals arriving on a visa or via the visa waiver program.

At present, traveling domestically within the United States, particularly if you are a U.S. citizen, is lower risk than traveling internationally. Your luggage and the physical aspects of digital devices may be searched — e.g., manual inspection or x-rays to ensure a device is not a bomb. CBP is often present at airports, but for domestic travel within the U.S. you should only be interacting with the Transportation Security Administration. TSA does not assert authority to search the data on your device — that is CBP’s role.

At an international airport or other port of entry, you have to decide whether you will comply with a request to access your device, but this might not feel like much of a choice if you are a non-U.S. citizen entering the country! Plan accordingly.

Your border digital security checklist Preparing for travel

☐ Make a backup of each of your devices before traveling.
☐ Use long, unpredictable, alphanumeric passcodes for your devices and commit those passcodes to memory.
☐ If bringing a laptop, ensure it is encrypted using BitLocker for Windows, or FileVault for macOS. Chromebooks are encrypted by default. A password-protected laptop screen lock is usually insufficient. When going through security, devices should be turned all the way off.
☐ Fully update your device and apps.
☐ Optional: Use a password manager to help create and store randomized passcodes. 1Password users can create temporary travel vaults.
☐ Bring as few sensitive devices as possible — only what you need.
☐ Regardless of which country you are visiting, think carefully about what you are willing to post publicly on social media about that country, to avoid scrutiny.
☐ For land ports of entry in the U.S., check CBP’s border wait times and plan accordingly.
☐ If possible, print out any travel documents in advance so you don’t need to unlock your phone during boarding, including boarding passes for your departure and return, rental car information, and any information about your itinerary that you would like to have on hand if questioned (e.g., hotel bookings, visa paperwork, employment information if applicable, conference information). Use a printer you trust at home or at the office, just in case.
☐ Avoid bringing sensitive physical documents you wouldn’t want searched. If you need them, consider digitizing them (e.g., by taking a photo) and storing them remotely on a cloud service or backup device.
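The passcode advice above can be sketched in a few lines of Python using the standard library's `secrets` module. The 20-character length and the alphanumeric alphabet are illustrative choices, not a recommendation from the checklist; in practice, a password manager will generate and store such passcodes for you.

```python
import secrets
import string

def make_passcode(length: int = 20) -> str:
    """Generate a long, unpredictable alphanumeric passcode.

    Uses secrets.choice (a cryptographically secure source of
    randomness) rather than the random module, which is predictable.
    """
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

passcode = make_passcode()
print(len(passcode))   # 20 characters by default
```

A passcode drawn uniformly from 62 characters carries roughly 5.95 bits of entropy per character, so 20 characters gives on the order of 119 bits, far beyond what a border search tool could brute-force against a properly encrypted device.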

Decide in advance whether you will unlock your device or provide the passcode for a search. Your overall likelihood of experiencing a device search is low (e.g., less than .01% of international travelers are selected), but depending on what information you carry, the impact of a search may be quite high. If you plan to unlock your device for a search or provide the passcode, ensure your devices are prepared:

☐ Upload in advance any information you would like stored remotely in a cloud service (e.g., iCloud) instead of locally on your device.
☐ Remove any apps, files, chat histories, browsing histories, and sensitive contacts you would not want exposed during a search.
☐ If you delete photos or files, delete them a second time in the “Recently Deleted” or “Trash” sections of your Files and Photos apps.
☐ Remove messages from the device that you believe would draw unwanted scrutiny. Remove yourself — even if temporarily — from chat groups on platforms like Signal.
☐ If you use Signal and plan to keep it on your device, use disappearing messages to minimize how much information you keep within the app.
☐ Optional: Bring a travel device instead of your usual device. Ensure it is populated with the apps you need while traveling, as well as login credentials (e.g., stored in a password manager), and necessary files. If you do this, ensure your trusted contacts know how to reach you on this device.
☐ Optional: Rather than manually removing all sensitive files from your computer, if you are primarily accessing web services during your travels, a Chromebook may be an affordable alternative to your regular computer.
☐ Optional: After backing up your everyday device, factory reset it and add back only the information you need.
☐ Optional: If you intend to work during your travel, plan in advance with a colleague who can remotely assist you in accessing and/or rotating necessary credentials.
☐ If you don’t plan to work, consider discussing with your IT department whether temporarily suspending your work accounts could mitigate risks at border crossings.

On the day of travel

☐ Log out of accounts you do not want accessible to border officials. Note that border officers do not have authority to access live cloud content — they must put devices in airplane mode or otherwise disconnect them from the internet.
☐ Power down your phone and laptop entirely before going through security. This will enable disk encryption, and make it harder for someone to analyze your device.
☐ Immediately before travel, if you have a practicing attorney with expertise in immigration and border issues, particularly as they relate to members of the media, make sure you have their contact information written down.
☐ Immediately before travel, ensure that a friend, relative, or colleague is aware of your whereabouts when passing through a port of entry, and provide them with an update as soon as possible afterward.

If you are pulled into secondary screening

☐ Be polite and try not to emotionally escalate the situation.
☐ Do not lie to border officials, but don’t offer any information they do not explicitly request.
☐ Politely request officers’ names and badge numbers.
☐ If you choose to unlock your device, rather than telling border officials your passcode, ask to type it in yourself.
☐ Ask to be present for a search of your device. But note officers are likely to take your device out of your line of sight.
☐ You may decline the request to search your device, but this may result in your device being seized and held for days, weeks, or months. If you are not a U.S. citizen, refusal to comply with a search request may lead to denial of entry, or scrutiny of lawful permanent resident status.
☐ If your device is seized, ask for a custody receipt (Form 6051D). This should also list the name and contact information for a supervising officer.
☐ If an officer has plugged your unlocked phone or computer into another electronic device, they may have obtained a forensic copy of your device. You will want to remember anything you can about this event if it happens.
☐ Immediately afterward, write down as many details as you can about the encounter: e.g., names, badge numbers, descriptions of equipment that may have been used to analyze the device, changes to the device or corrupted data, etc.

Reporting is not a crime. Be confident knowing you haven’t done anything wrong.


EFF to European Commission: Don’t Resurrect Illegal Data Retention Mandates

EFF: Updates - Mon, 06/23/2025 - 11:26am

The mandatory retention of metadata is an evergreen of European digital policy. Despite a number of rulings by Europe’s highest court, confirming again and again the incompatibility of general and indiscriminate data retention mandates with European fundamental rights, the European Commission is taking major steps towards the re-introduction of EU-wide data retention mandates. Recently, the Commission launched a Call for Evidence on data retention for criminal investigations—the first formal step towards a legislative proposal.

The European Commission and EU Member States have been attempting to revive data retention for years. For this purpose, a secretive “High-Level Group on Access to Data for Effective Law Enforcement” has been formed, usually referred to as the High-Level Group (HLG) on “Going Dark.” Going dark refers to the false narrative that law enforcement authorities are left “in the dark” by a lack of accessible data, despite the ever-increasing collection of, and access to, data through companies, data brokers, and governments. Going dark also aptly describes the opaque workings of the HLG, which operates behind closed doors and without input from civil society.

The group’s recommendations to the European Commission, published in 2024, read like a government surveillance wishlist. They include suggestions for backdoors in various technologies (reframed as “lawful access by design”), obligations on service providers to collect and retain more user data than they need to provide their services, and interception and provision of decrypted data to law enforcement in real time, all while supposedly avoiding any compromise of the security of their systems. And of course, the HLG calls for a harmonized data retention regime that covers not only the retention of data but also access to it, and that extends data retention to any service provider that could provide access to data.

EFF joined other civil society organizations in addressing the HLG’s dangerous proposals, calling on the European Commission to safeguard fundamental rights and to ensure the security and confidentiality of communications.

In our response to the Commission's Call for Evidence, we reiterated the same principles. 

  • Any future legislative measures must prioritize the protection of fundamental rights and must be aligned with the extensive jurisprudence of the Court of Justice of the European Union. 
  • General and indiscriminate data retention mandates undermine anonymity and privacy, which are essential for democratic societies, and pose significant cybersecurity risks by creating centralized troves of sensitive metadata that are attractive targets for malicious actors. 
  • We highlight the lack of empirical evidence to justify blanket data retention and warn against extending retention duties to number-independent interpersonal communication services as it would violate CJEU doctrine, conflict with European data protection law, and compromise security.

The European Commission must once and for all abandon the ghost of data retention that’s been haunting EU policy discussions for decades, and shift its focus to rights respecting alternatives.

Read EFF’s full submission here.

Largest DDoS Attack to Date

Schneier on Security - Mon, 06/23/2025 - 7:04am

It was a recently unimaginable 7.3 Tbps:

The vast majority of the attack was delivered in the form of User Datagram Protocol packets. Legitimate UDP-based transmissions are used in especially time-sensitive communications, such as those for video playback, gaming applications, and DNS lookups. It speeds up communications by not formally establishing a connection before data is transferred. Unlike the more common Transmission Control Protocol, UDP doesn’t wait for a connection between two computers to be established through a handshake and doesn’t check whether data is properly received by the other party. Instead, it immediately sends data from one machine to another...
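The connectionless behavior the excerpt describes is easy to see in a short Python sketch (the loopback address and the payload below are arbitrary illustrative choices, not details from the attack): the UDP sender transmits immediately, with no handshake and no acknowledgment that the data arrived.

```python
import socket

# A receiver bound to an OS-chosen free port on loopback.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
receiver.settimeout(2)                       # don't block forever if lost
port = receiver.getsockname()[1]

# Unlike TCP's connect() (which performs a three-way handshake),
# a UDP sender just fires the datagram and moves on. Nothing
# confirms delivery, which is what makes UDP fast -- and easy
# to abuse at scale in floods like the one described above.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"datagram payload", ("127.0.0.1", port))

data, addr = receiver.recvfrom(1024)
print(data)                                  # b'datagram payload'

sender.close()
receiver.close()
```

Because the sender never learns whether the receiver exists or is keeping up, an attacker can emit datagrams as fast as the network will carry them, which is why the largest volumetric floods are UDP-based.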

Cities lose hope for restarting disaster projects killed by Trump

ClimateWire News - Mon, 06/23/2025 - 6:09am
The president canceled $4.5 billion in FEMA grants that helped communities prepare for rising disaster damage.

EPA leaves social cost of carbon on the cutting-room floor

ClimateWire News - Mon, 06/23/2025 - 6:08am
The agency declined to consider the economic cost of increasing planet-warming pollution in its proposed repeal of power plant rules.

AI could cut more emissions than it creates

ClimateWire News - Mon, 06/23/2025 - 6:07am
A new study estimated that the power-hungry technology could make the grid cleaner.

US hybrid car sales accelerate while EVs sputter

ClimateWire News - Mon, 06/23/2025 - 6:07am
EVs have lower emissions than hybrids but have range limitations. In three years, 20 percent of new U.S. car sales will be hybrids.

Clean energy project cancellations surged to $1.4B in May

ClimateWire News - Mon, 06/23/2025 - 6:06am
The industry has lost $15.5 billion in investments since January — seven times greater than losses at this point in 2024, according to a new analysis.

Michigan urges federal court to dump Trump climate lawsuit

ClimateWire News - Mon, 06/23/2025 - 6:05am
The administration's efforts to prevent future state lawsuits against the fossil fuel industry are a "freewheeling exercise in speculation," the state argues.

Study finds offsetting fossil fuels with trees is nearly impossible

ClimateWire News - Mon, 06/23/2025 - 6:04am
Researchers found that the trees’ collective ability to remove carbon through photosynthesis can’t stand up to the potential emissions from the fossil fuel reserves of the 200 largest oil, gas and coal companies.
