Feed aggregator

3 Questions: Applying lessons in data, economics, and policy design to the real world

MIT Latest News - Thu, 07/24/2025 - 3:45pm

Gevorg Minasyan MAP ’23 first discovered the MITx MicroMasters Program in Data, Economics, and Design of Policy (DEDP) — jointly led by the Abdul Latif Jameel Poverty Action Lab (J-PAL) and MIT Open Learning — when he was looking to better understand the process of building effective, evidence-based policies while working at the Central Bank of Armenia. After completing the MicroMasters program, Minasyan was inspired to pursue MIT’s Master’s in Data, Economics, and Design of Policy program.

Today, Minasyan is the director of the Research and Training Center at the Central Bank of Armenia. He has not only been able to apply what he has learned at MIT to his work, but he has also sought to institutionalize a culture of evidence-based policymaking at the bank and more broadly in Armenia. He spoke with MIT Open Learning about his journey through the DEDP programs, key takeaways, and how what he learned at MIT continues to guide his work.

Q: What initially drew you to the DEDP MicroMasters, and what were some highlights of the program?

A: Working at the Central Bank of Armenia, I was constantly asking myself: Can we build a system in which public policy decisions are grounded in rigorous evidence? Too often, I observed public programs that were well-intentioned and seemed to address pressing challenges, but ultimately failed to bring tangible change. Sometimes it was due to flawed design; other times, the goals simply didn’t align with what the public actually needed or expected. These experiences left a deep impression on me and sparked a strong desire to better understand what works, what doesn’t, and why.

That search led me to the DEDP MicroMasters program, which turned out to be a pivotal step in my professional journey. From the very first course, I realized that this was not just another academic program — it was a completely new way of thinking about development policy. The courses combined rigorous training in economics, data analysis, and impact evaluation with a strong emphasis on practical application. We weren’t just learning formulas or running regressions — we were being trained to ask the right questions, to think critically about causality, and to understand the trade-offs of policy choices.

Another aspect that set the MicroMasters apart was its blended structure. I was able to pursue a globally top-tier education while continuing my full-time responsibilities at the Central Bank. This made the learning deeply relevant and immediately applicable. Even as I was studying, I found myself incorporating insights from class into my day-to-day policy work, whether it was refining how we evaluated financial inclusion programs or rethinking the way we analyzed administrative data.

At the same time, the global nature of the program created a vibrant, diverse community. I engaged with students and professionals from dozens of countries, each bringing different perspectives. These interactions enriched the coursework and helped me to realize that despite the differences in context, the challenges of effective policy design — and the power of evidence to improve lives — were remarkably universal. It was a rare combination: intellectually rigorous, practically grounded, globally connected, and personally transformative.

Q: Can you describe your experiences in the Master’s in Data, Economics, and Design of Policy residential program?

A: The MicroMasters experience inspired me to go further, and I decided to apply for the full-time, residential master’s at MIT. That year was nothing short of transformative. It not only sharpened my technical and analytical skills, but also fundamentally changed the way I think about policymaking.

One of the most influential courses I took during the master’s program was 14.760 (Firms, Markets, Trade, and Growth). The analytical tools it provided mapped directly onto the systemic challenges I saw among Armenian firms. Motivated by this connection, I developed a similar course, which I now teach at the American University of Armenia. Each year, I work with students to investigate the everyday constraints that hinder firm performance, with the ultimate goal of producing data-driven research that could inform business strategy in Armenia.

The residential master’s program taught me that evidence-based decision-making starts with a mindset shift. It’s not just about applying tools; it’s about being open to questioning assumptions, being transparent about uncertainty, and being humble enough to let data challenge intuition. I also came to appreciate that truly effective policy design isn’t about finding one-off solutions, but about creating dynamic feedback loops that allow us to continuously learn from implementation.

This is essential to refining programs in real time, adapting to new information, and avoiding the trap of static, one-size-fits-all approaches. Equally valuable was becoming part of MIT and J-PAL’s global network. The relationships I built with researchers, practitioners, and fellow students from around the world gave me lasting insights into how institutions can systematically embed analysis in their core operations. This exposure helped me to see the possibilities not just for my own work, but for how public institutions like central banks can lead the way in advancing an evidence-based culture.

Q: How are you applying what you’ve learned in the DEDP programs to the Central Bank of Armenia?

A: As director of the Research and Training Center at the Central Bank of Armenia, I have taken on a new kind of responsibility: leading the effort to scale evidence-based decision-making not only within the Central Bank, but across a broader ecosystem of public institutions in Armenia. This means building internal capacity, rethinking how research informs policy, and fostering partnerships that promote a culture of data-driven decision-making.

Beyond the classroom, the skills I developed through the DEDP program have been critical to my role in shaping real-world policy in Armenia. A particularly timely example is our national push toward a cashless economy — one of the most prominent and complex reform agendas today. In recent years, the government has rolled out a suite of bold policies aimed at boosting the adoption of non-cash payments, all part of a larger vision to modernize the financial system, reduce the shadow economy, and increase transparency. Key initiatives include a cashback program designed to encourage pensioners to use digital payments and the mandatory installation of non-cash payment terminals across businesses nationwide. In my role on an inter-agency policy team, I rely heavily on the analytical tools from DEDP to evaluate these policies and propose regulatory adjustments to ensure the transition is not only effective, but also inclusive and sustainable.

The Central Bank of Armenia recently collaborated with J-PAL Europe to co-design and host a policy design and evaluation workshop. The workshop brought together policymakers, central bankers, and analysts from various sectors and focused on integrating evidence throughout the policy cycle, from defining the problem to designing interventions and conducting rigorous evaluations. It’s just the beginning, but it already reflects how the ideas, tools, and values I absorbed at MIT are now taking institutional form back home.

Our ultimate goal is to institutionalize the use of policy evaluation as a standard practice — not as an occasional activity, but as a core part of how we govern. We’re working to embed a stronger feedback culture in policymaking, one that prioritizes learning before scaling. More experimentation, piloting, and iteration are essential before committing to large-scale rollouts of public programs. This shift requires patience and persistence, but it is critical if we want policies that are not only well-designed, but also effective, inclusive, and responsive to people’s needs.

Looking ahead, I remain committed to advancing this transformation, by building the systems, skills, and partnerships that can sustain evidence-based policymaking in Armenia for the long term. 

Robot, know thyself: New vision-based system teaches machines to understand their bodies

MIT Latest News - Thu, 07/24/2025 - 3:30pm

In an office at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), a soft robotic hand carefully curls its fingers to grasp a small object. The intriguing part isn’t the mechanical design or embedded sensors — in fact, the hand contains none. Instead, the entire system relies on a single camera that watches the robot’s movements and uses that visual data to control it.

This capability comes from a new system CSAIL scientists developed, offering a different perspective on robotic control. Rather than using hand-designed models or complex sensor arrays, it allows robots to learn how their bodies respond to control commands, solely through vision. The approach, called Neural Jacobian Fields (NJF), gives robots a kind of bodily self-awareness. An open-access paper about the work was published in Nature on June 25.

“This work points to a shift from programming robots to teaching robots,” says Sizhe Lester Li, MIT PhD student in electrical engineering and computer science, CSAIL affiliate, and lead researcher on the work. “Today, many robotics tasks require extensive engineering and coding. In the future, we envision showing a robot what to do, and letting it learn how to achieve the goal autonomously.”

The motivation stems from a simple but powerful reframing: The main barrier to affordable, flexible robotics isn’t hardware — it’s control capability, which could be achieved in multiple ways. Traditional robots are built to be rigid and sensor-rich, making it easier to construct a digital twin, a precise mathematical replica used for control. But when a robot is soft, deformable, or irregularly shaped, those assumptions fall apart. Rather than forcing robots to match our models, NJF flips the script — giving robots the ability to learn their own internal model from observation.

Look and learn

This decoupling of modeling and hardware design could significantly expand the design space for robotics. In soft and bio-inspired robots, designers often embed sensors or reinforce parts of the structure just to make modeling feasible. NJF lifts that constraint. The system doesn’t need onboard sensors or design tweaks to make control possible. Designers are freer to explore unconventional, unconstrained morphologies without worrying about whether they’ll be able to model or control them later.

“Think about how you learn to control your fingers: you wiggle, you observe, you adapt,” says Li. “That’s what our system does. It experiments with random actions and figures out which controls move which parts of the robot.”

The system has proven robust across a range of robot types. The team tested NJF on a pneumatic soft robotic hand capable of pinching and grasping, a rigid Allegro hand, a 3D-printed robotic arm, and even a rotating platform with no embedded sensors. In every case, the system learned both the robot’s shape and how it responded to control signals, just from vision and random motion.

The researchers see potential far beyond the lab. Robots equipped with NJF could one day perform agricultural tasks with centimeter-level localization accuracy, operate on construction sites without elaborate sensor arrays, or navigate dynamic environments where traditional methods break down.

At the core of NJF is a neural network that captures two intertwined aspects of a robot’s embodiment: its three-dimensional geometry and its sensitivity to control inputs. The system builds on neural radiance fields (NeRF), a technique that reconstructs 3D scenes from images by mapping spatial coordinates to color and density values. NJF extends this approach by learning not only the robot’s shape, but also a Jacobian field, a function that predicts how any point on the robot’s body moves in response to motor commands.
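As a rough sketch of that idea (our own minimal illustration, not the authors’ implementation; the network size and number of command channels are assumptions), a Jacobian field can be written as a small network that maps a 3D point to a matrix, and multiplying that matrix by the motor command gives the point’s predicted motion:

```python
# Minimal Jacobian-field sketch (illustrative; not the paper's code).
# An MLP maps a 3D point x to a 3xK matrix J(x); the predicted motion of
# that point under a K-dimensional motor command u is dx = J(x) @ u.
import torch
import torch.nn as nn

class JacobianField(nn.Module):
    def __init__(self, num_commands: int, hidden: int = 256):
        super().__init__()
        self.num_commands = num_commands
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3 * num_commands),  # one 3-vector per command channel
        )

    def forward(self, points: torch.Tensor, command: torch.Tensor) -> torch.Tensor:
        # points: (N, 3) query locations on the robot; command: (K,) motor command.
        jac = self.mlp(points).view(-1, 3, self.num_commands)  # (N, 3, K)
        return torch.einsum("nik,k->ni", jac, command)         # per-point motion (N, 3)

field = JacobianField(num_commands=8)               # 8 actuation channels (assumed)
motion = field(torch.rand(100, 3), torch.rand(8))   # predicted displacements (100, 3)
```

Training such a field then amounts to regressing these predicted motions against the point motion actually observed in the camera views.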

To train the model, the robot performs random motions while multiple cameras record the outcomes. No human supervision or prior knowledge of the robot’s structure is required — the system simply infers the relationship between control signals and motion by watching.

Once training is complete, the robot only needs a single monocular camera for real-time closed-loop control, running at about 12 hertz. This allows it to continuously observe itself, plan, and act responsively. That speed makes NJF more viable than many physics-based simulators for soft robots, which are often too computationally intensive for real-time use.
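Because the predicted motion is linear in the command under a model like the sketch above, each control step can be framed as a least-squares solve: pick the command whose predicted point motions best move tracked points toward their targets. This is a hypothetical illustration under that linearity assumption, not the paper’s controller:

```python
# Hedged control-step sketch: solve for the command u minimizing
# || J u - (targets - points) || over all tracked points.
import torch

def plan_command(jacobians: torch.Tensor, points: torch.Tensor,
                 targets: torch.Tensor) -> torch.Tensor:
    # jacobians: (N, 3, K) queried from the learned field at the tracked points;
    # points, targets: (N, 3) current and desired point positions.
    K = jacobians.shape[-1]
    A = jacobians.reshape(-1, K)           # stack all point rows: (3N, K)
    b = (targets - points).reshape(-1, 1)  # desired per-point motion: (3N, 1)
    return torch.linalg.lstsq(A, b).solution.squeeze(-1)  # command u: (K,)

u = plan_command(torch.rand(100, 3, 8), torch.rand(100, 3), torch.rand(100, 3))
```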

In early simulations, even simple 2D fingers and sliders were able to learn this mapping using just a few examples. By modeling how specific points deform or shift in response to action, NJF builds a dense map of controllability. That internal model allows it to generalize motion across the robot’s body, even when the data are noisy or incomplete.

“What’s really interesting is that the system figures out on its own which motors control which parts of the robot,” says Li. “This isn’t programmed — it emerges naturally through learning, much like a person discovering the buttons on a new device.”

The future is soft

For decades, robotics has favored rigid, easily modeled machines — like the industrial arms found in factories — because their properties simplify control. But the field has been moving toward soft, bio-inspired robots that can adapt to the real world more fluidly. The trade-off? These robots are harder to model.

“Robotics today often feels out of reach because of costly sensors and complex programming. Our goal with Neural Jacobian Fields is to lower the barrier, making robotics affordable, adaptable, and accessible to more people. Vision is a resilient, reliable sensor,” says senior author and MIT Assistant Professor Vincent Sitzmann, who leads the Scene Representation group. “It opens the door to robots that can operate in messy, unstructured environments, from farms to construction sites, without expensive infrastructure.”

“Vision alone can provide the cues needed for localization and control — eliminating the need for GPS, external tracking systems, or complex onboard sensors. This opens the door to robust, adaptive behavior in unstructured environments, from drones navigating indoors or underground without maps to mobile manipulators working in cluttered homes or warehouses, and even legged robots traversing uneven terrain,” says co-author Daniela Rus, MIT professor of electrical engineering and computer science and director of CSAIL. “By learning from visual feedback, these systems develop internal models of their own motion and dynamics, enabling flexible, self-supervised operation where traditional localization methods would fail.”

While training NJF currently requires multiple cameras and must be redone for each robot, the researchers are already imagining a more accessible version. In the future, hobbyists could record a robot’s random movements with their phone, much like you’d take a video of a rental car before driving off, and use that footage to create a control model, with no prior knowledge or special equipment required.

The system doesn’t yet generalize across different robots, and it lacks force or tactile sensing, limiting its effectiveness on contact-rich tasks. But the team is exploring new ways to address these limitations: improving generalization, handling occlusions, and extending the model’s ability to reason over longer spatial and temporal horizons.

“Just as humans develop an intuitive understanding of how their bodies move and respond to commands, NJF gives robots that kind of embodied self-awareness through vision alone,” says Li. “This understanding is a foundation for flexible manipulation and control in real-world environments. Our work, essentially, reflects a broader trend in robotics: moving away from manually programming detailed models toward teaching robots through observation and interaction.”

This paper brought together the computer vision and self-supervised learning work from the Sitzmann lab and the expertise in soft robots from the Rus lab. Li, Sitzmann, and Rus co-authored the paper with CSAIL affiliates Annan Zhang SM ’22, a PhD student in electrical engineering and computer science (EECS); Boyuan Chen, a PhD student in EECS; Hanna Matusik, an undergraduate researcher in mechanical engineering; and Chao Liu, a postdoc in the Senseable City Lab at MIT. 

The research was supported by the Solomon Buchsbaum Research Fund through MIT’s Research Support Committee, an MIT Presidential Fellowship, the National Science Foundation, and the Gwangju Institute of Science and Technology.

Pedestrians now walk faster and linger less, researchers find

MIT Latest News - Thu, 07/24/2025 - 1:45pm

City life is often described as “fast-paced.” A new study suggests that’s more true than ever.

The research, co-authored by MIT scholars, shows that the average walking speed of pedestrians in three northeastern U.S. cities increased 15 percent from 1980 to 2010. The number of people lingering in public spaces declined by 14 percent in that time as well.

The researchers used machine-learning tools to assess 1980s-era video footage of Boston, New York, and Philadelphia captured by renowned urbanist William Whyte. They compared the old material with newer videos from the same locations.

“Something has changed over the past 40 years,” says MIT professor of the practice Carlo Ratti, a co-author of the new study. “How fast we walk, how people meet in public space — what we’re seeing here is that public spaces are working in somewhat different ways, more as a thoroughfare and less a space of encounter.”

The paper, “Exploring the social life of urban spaces through AI,” is published this week in the Proceedings of the National Academy of Sciences. The co-authors are Arianna Salazar-Miranda MCP ’16, PhD ’23, an assistant professor at Yale University’s School of the Environment; Zhuanguan Fan of the University of Hong Kong; Michael Baick; Keith N. Hampton, a professor at Michigan State University; Fabio Duarte, associate director of the Senseable City Lab; Becky P.Y. Loo of the University of Hong Kong; Edward Glaeser, the Fred and Eleanor Glimp Professor of Economics at Harvard University; and Ratti, who is also director of MIT’s Senseable City Lab.

The results could help inform urban planning, as designers seek to create new public areas or modify existing ones.

“Public space is such an important element of civic life, especially today, partly because it counteracts the polarization of digital space,” says Salazar-Miranda. “The more we can keep improving public space, the more we can make our cities suited for convening.”

Meet you at the Met

Whyte was a prominent social thinker whose famous 1956 book, “The Organization Man,” probing the apparent culture of corporate conformity in the U.S., became a touchstone of its decade.

However, Whyte spent the latter decades of his career focused on urbanism. The footage he filmed, from 1978 through 1980, was archived by a Brooklyn-based nonprofit organization called the Project for Public Spaces and later digitized by Hampton and his students.

Whyte made his recordings at four spots across the three cities: Boston’s Downtown Crossing area; New York City’s Bryant Park; the steps of the Metropolitan Museum of Art in New York, a famous gathering point and people-watching spot; and Philadelphia’s Chestnut Street.

In 2010, a group led by Hampton shot new footage at those locations, at the same times of day Whyte had filmed, to compare and contrast current-day dynamics with those of Whyte’s time. To conduct the study, the co-authors used computer vision and AI models to summarize and quantify the activity in the videos.
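As a hedged illustration of one quantity such a pipeline can extract (the study’s actual tooling is not shown here; the frame rate, pixel-to-meter scale, and track data below are placeholders), average walking speed can be computed from a per-pedestrian position track:

```python
# Estimate a pedestrian's mean walking speed from a tracked position sequence.
import numpy as np

def mean_speed(track_xy: np.ndarray, fps: float, meters_per_pixel: float) -> float:
    # track_xy: (T, 2) pixel positions of one pedestrian over T consecutive frames.
    step_px = np.linalg.norm(np.diff(track_xy, axis=0), axis=1)  # pixels per frame
    return step_px.mean() * meters_per_pixel * fps               # meters per second

# Fake 3-second track at 30 fps, drifting ~2 px per frame (placeholder data).
track = np.cumsum(np.random.normal(2.0, 0.2, size=(90, 2)), axis=0)
print(f"{mean_speed(track, fps=30.0, meters_per_pixel=0.02):.2f} m/s")
```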

The researchers found that some things have not changed greatly. The percentage of people walking alone barely moved, from 67 percent in 1980 to 68 percent in 2010. On the other hand, the percentage of individuals entering these public spaces who became part of a group declined a bit. In 1980, 5.5 percent of the people approaching these spots met up with a group; in 2010, that was down to 2 percent.

“Perhaps there’s a more transactional nature to public space today,” Ratti says.

Fewer outdoor groups: Anomie or Starbucks?

If people’s behavioral patterns have altered since 1980, it’s natural to ask why. Certainly some of the visible changes seem consistent with the pervasive use of cellphones; people organize their social lives by phone now, and perhaps zip around more quickly from place to place as a result.

“When you look at the footage from William Whyte, the people in public spaces were looking at each other more,” Ratti says. “It was a place you could start a conversation or run into a friend. You couldn’t do things online then. Today, behavior is more predicated on texting first to meet in public space.”

As the scholars note, if groups of people hang out together slightly less often in public spaces, there could be still another reason for that: Starbucks and its competitors. As the paper states, outdoor group socializing may be less common due to “the proliferation of coffee shops and other indoor venues. Instead of lingering on sidewalks, people may have moved their social interactions into air-conditioned, more comfortable private spaces.”

Certainly coffee shops were far less common in big cities in 1980, and the big chain coffee shops did not exist.

On the other hand, public-space behavior might have been evolving all this time regardless of Starbucks and the like. The researchers say the new study offers a proof of concept of its method and has encouraged them to conduct additional work. Ratti, Duarte, and other researchers from MIT’s Senseable City Lab have turned their attention to an extensive survey of European public spaces in an attempt to shed more light on the interaction between people and public space.

“We are collecting footage from 40 squares in Europe,” Duarte says. “The question is: How can we learn at a larger scale? This is in part what we’re doing.” 

New machine-learning application to help researchers predict chemical properties

MIT Latest News - Thu, 07/24/2025 - 1:00pm

A fundamental goal shared by most chemistry researchers is predicting a molecule’s properties, such as its boiling or melting point. With that prediction in hand, researchers can move forward with work that yields discoveries leading to medicines, materials, and more. Historically, however, the traditional methods of obtaining these predictions have come at significant cost, expending time, funds, and wear and tear on equipment.

Enter a branch of artificial intelligence known as machine learning (ML). ML has lessened the burden of molecule property prediction to a degree, but the advanced tools that most effectively expedite the process — by learning from existing data to make rapid predictions for new molecules — require the user to have a significant level of programming expertise. This creates an accessibility barrier for many chemists, who may not have the computational proficiency required to navigate the prediction pipeline.

To alleviate this challenge, researchers in the McGuire Research Group at MIT have created ChemXploreML, a user-friendly desktop app that helps chemists make these critical predictions without requiring advanced programming skills. Freely available, easy to download, and functional on mainstream platforms, this app is also built to operate entirely offline, which helps keep research data proprietary. The exciting new technology is outlined in an article published recently in the Journal of Chemical Information and Modeling.

One specific hurdle in chemical machine learning is translating molecular structures into a numerical language that computers can understand. ChemXploreML automates this complex process with powerful, built-in "molecular embedders" that transform chemical structures into informative numerical vectors. Next, the software implements state-of-the-art algorithms to identify patterns and accurately predict molecular properties like boiling and melting points, all through an intuitive, interactive graphical interface. 
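For a flavor of what such a pipeline involves (an illustrative sketch only, not ChemXploreML’s internals: it substitutes RDKit Morgan fingerprints for the app’s built-in embedders, and the molecules and boiling points form a toy dataset):

```python
# Toy property-prediction pipeline: embed molecules as numerical vectors,
# then fit a regressor to predict boiling points from those vectors.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import GradientBoostingRegressor

def embed(smiles: str, n_bits: int = 2048) -> np.ndarray:
    # Translate a molecular structure into a fixed-length numerical vector.
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=n_bits)
    return np.array(fp)

smiles = ["CCO", "CCCCO", "c1ccccc1", "CC(=O)C"]   # ethanol, 1-butanol, benzene, acetone
boiling_k = [351.4, 390.9, 353.2, 329.2]           # approximate boiling points (kelvin)

X = np.stack([embed(s) for s in smiles])
model = GradientBoostingRegressor().fit(X, boiling_k)
print(model.predict(embed("CCCO").reshape(1, -1)))  # predict for 1-propanol
```

ChemXploreML wraps steps like these behind a graphical interface, swapping in embedders such as Mol2Vec or VICGAE in place of the fingerprints used here.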

"The goal of ChemXploreML is to democratize the use of machine learning in the chemical sciences,” says Aravindh Nivas Marimuthu, a postdoc in the McGuire Group and lead author of the article. “By creating an intuitive, powerful, and offline-capable desktop application, we are putting state-of-the-art predictive modeling directly into the hands of chemists, regardless of their programming background. This work not only accelerates the search for new drugs and materials by making the screening process faster and cheaper, but its flexible design also opens doors for future innovations.” 

ChemXploreML is designed to evolve over time, so as future techniques and algorithms are developed, they can be seamlessly integrated into the app, ensuring that researchers are always able to access and implement the most up-to-date methods. The application was tested on five key molecular properties of organic compounds — melting point, boiling point, vapor pressure, critical temperature, and critical pressure — and achieved high accuracy scores of up to 93 percent for the critical temperature. The researchers also demonstrated that a new, more compact method of representing molecules (VICGAE) was nearly as accurate as standard methods, such as Mol2Vec, but was up to 10 times faster.

“We envision a future where any researcher can easily customize and apply machine learning to solve unique challenges, from developing sustainable materials to exploring the complex chemistry of interstellar space,” says Marimuthu. Joining him on the paper is senior author and Class of 1943 Career Development Assistant Professor of Chemistry Brett McGuire.

How Solid Protocol Restores Digital Agency

Schneier on Security - Thu, 07/24/2025 - 7:04am

The current state of digital identity is a mess. Your personal information is scattered across hundreds of locations: social media companies, IoT companies, government agencies, websites you have accounts on, and data brokers you’ve never heard of. These entities collect, store, and trade your data, often without your knowledge or consent. It’s both redundant and inconsistent. You have hundreds, maybe thousands, of fragmented digital profiles that often contain contradictory or logically impossible information. Each serves its own purpose, yet there is no central override and control to serve you—as the identity owner...

What clean energy bosses say about Trump’s attacks on renewables

ClimateWire News - Thu, 07/24/2025 - 6:54am
Earnings calls Wednesday revealed how the biggest wind and solar companies are confronting the president’s hostility toward their industry.

EPA’s endangerment gambit could cause rules to spring back

ClimateWire News - Thu, 07/24/2025 - 6:52am
The agency appears poised to tie its deregulatory agenda to undoing the 2009 scientific finding behind most climate rules.

‘Attaboys’ dominate Texas flood hearing as lawmakers shy from assigning blame

ClimateWire News - Thu, 07/24/2025 - 6:51am
Top Republican legislators said pointing fingers would undermine efforts to improve the state's future responses.

FEMA chief lauds Texas response, stays mum about agency’s future

ClimateWire News - Thu, 07/24/2025 - 6:51am
House lawmakers are planning to introduce bipartisan legislation to overhaul the agency.

UN court declares countries must tackle climate change

ClimateWire News - Thu, 07/24/2025 - 6:50am
Though the decision is nonbinding, the ruling from the International Court of Justice could open the door for more litigation against corporate polluters.

Former Kamala Harris aide goes to climate group

ClimateWire News - Thu, 07/24/2025 - 6:49am
Ernesto Apreza will work on getting corporate commitments for net zero.

House appropriators look to cancel funding for IEA

ClimateWire News - Thu, 07/24/2025 - 6:48am
The International Energy Agency has come under fire from Republicans for its work on climate change.

With US out of picture, EU tries to fill the climate void with China

ClimateWire News - Thu, 07/24/2025 - 6:47am
When leaders meet Thursday in Beijing, they might strike a climate deal, but there’s no guarantee it will be meaningful.

A look at megafires as Oregon blaze nears 100,000-acre mark

ClimateWire News - Thu, 07/24/2025 - 6:45am
At least 14 wildfires each burned more than 100,000 acres in the U.S. in 2024, according to the National Interagency Coordination Center.

Storms in Vietnam leave 1 dead as Wipha weakens

ClimateWire News - Thu, 07/24/2025 - 6:45am
Flooding damaged hundreds of homes, destroyed crops and cut off remote communities, officials said.

Forest fire in Greece forces several villages to evacuate

ClimateWire News - Thu, 07/24/2025 - 6:44am
More than 180 firefighters, 15 planes and 12 helicopters were tackling the wildfire near Corinth, the fire department said.

Scientists apply optical pooled CRISPR screening to identify potential new Ebola drug targets

MIT Latest News - Thu, 07/24/2025 - 5:00am

The following press release was issued today by the Broad Institute of MIT and Harvard.

Although outbreaks of Ebola virus are rare, the disease is severe and often fatal, with few treatment options. Rather than targeting the virus itself, one promising therapeutic approach would be to interrupt proteins in the human host cell that the virus relies upon. However, finding those regulators of viral infection using existing methods has been difficult and is especially challenging for the most dangerous viruses like Ebola that require stringent high-containment biosafety protocols.

Now, researchers at the Broad Institute and the National Emerging Infectious Diseases Laboratories (NEIDL) at Boston University have used an image-based screening method developed at the Broad to identify human genes that, when silenced, impair the Ebola virus’s ability to infect. The method, known as optical pooled screening (OPS), enabled the scientists to test, in about 40 million CRISPR-perturbed human cells, how silencing each gene in the human genome affects virus replication.

Using machine-learning-based analyses of images of perturbed cells, they identified multiple host proteins involved in various stages of Ebola infection that, when suppressed, crippled the ability of the virus to replicate. Those viral regulators could represent avenues to one day intervene therapeutically and reduce the severity of disease in people already infected with the virus. The approach could be used to explore the role of various proteins during infection with other pathogens, as a way to find new drugs for hard-to-treat infections.

The study appears in Nature Microbiology.

“This study demonstrates the power of OPS to probe the dependency of dangerous viruses like Ebola on host factors at all stages of the viral life cycle and explore new routes to improve human health,” said co-senior author Paul Blainey, a Broad core faculty member and professor in the Department of Biological Engineering at MIT.

Previously, members of the Blainey lab developed the optical pooled screening method as a way to combine the benefits of high-content imaging, which can show a range of detailed changes in large numbers of cells at once, with those of pooled perturbational screens, which show how genetic elements influence these changes. In this study, they partnered with the laboratory of Robert Davey at BU to apply optical pooled screening to Ebola virus.

The team used CRISPR to knock out each gene in the human genome, one at a time, in nearly 40 million human cells, and then infected each cell with Ebola virus. They next fixed those cells in place in laboratory dishes and inactivated them, so that the remaining processing could occur outside of the high-containment lab.

After taking images of the cells, they measured overall viral protein and RNA in each cell using the CellProfiler image analysis software, and to get even more information from the images, they turned to AI. With help from team members in the Eric and Wendy Schmidt Center at the Broad, led by study co-author and Broad core faculty member Caroline Uhler, they used a deep learning model to automatically determine the stage of Ebola infection for each single cell. The model was able to make subtle distinctions between stages of infection in a high-throughput way that wasn’t possible using prior methods.
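In spirit, the staging task is per-cell image classification. The sketch below is purely illustrative; the architecture, crop size, channel layout (viral protein and RNA stains), and number of stages are our assumptions, not the study’s model:

```python
# Illustrative per-cell infection-stage classifier (assumptions throughout).
import torch
import torch.nn as nn

NUM_STAGES = 5  # assumed number of infection stages

stage_classifier = nn.Sequential(
    nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 2 channels: protein, RNA
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 16 * 16, NUM_STAGES),   # assumes 64x64-pixel single-cell crops
)

cells = torch.rand(8, 2, 64, 64)           # a batch of single-cell image crops
stage_logits = stage_classifier(cells)     # (8, NUM_STAGES) scores per stage
```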

“The work represents the deepest dive yet into how Ebola virus rewires the cell to cause disease, and the first real glimpse into the timing of that reprogramming,” said co-senior author Robert Davey, director of the National Emerging Infectious Diseases Laboratories at Boston University, and professor of microbiology at BU Chobanian and Avedisian School of Medicine. “AI gave us an unprecedented ability to do this at scale.”

By sequencing parts of the CRISPR guide RNA in all 40 million cells individually, the researchers determined which human gene had been silenced in each cell, indicating which host proteins (and potential viral regulators) were targeted. The analysis revealed hundreds of host proteins that, when silenced, altered overall infection level, including many required for viral entry into the cell.

Knocking out other genes enhanced the amount of virus within inclusion bodies, structures that form in the human cell to act as viral factories, and prevented the infection from progressing further. Some of these human genes, such as UQCRB, pointed to a previously unrecognized role for mitochondria in the Ebola virus infection process that could possibly be exploited therapeutically. Indeed, treating cells with a small molecule inhibitor of UQCRB reduced Ebola infection with no impact on the cell’s own health.

Other genes, when silenced, altered the balance between viral RNA and protein. For example, perturbing a gene called STRAP resulted in increased viral RNA relative to protein. The researchers are currently doing further studies in the lab to better understand the role of STRAP and other proteins in Ebola infection and whether they could be targeted therapeutically.

In a series of secondary screens, the scientists examined some of the highlighted genes’ roles in infection with related filoviruses. Silencing some of these genes interrupted replication of Sudan and Marburg viruses, which have high fatality rates and no approved treatments, so it’s possible a single treatment could be effective against multiple related viruses.

The study’s approach could also be used to examine other pathogens and emerging infectious diseases and look for new ways to treat them.

“With our method, we can measure many features at once and uncover new clues about the interplay between virus and host, in a way that’s not possible through other screening approaches,” said co-first author Rebecca Carlson, a former graduate researcher in the labs of Blainey and Nir Hacohen at the Broad, who co-led the work with co-first author J.J. Patten at Boston University.

This work was funded in part by the Broad Institute, the National Human Genome Research Institute, the Burroughs Wellcome Fund, the Fannie and John Hertz Foundation, the National Science Foundation, the George F. Carrier Postdoctoral Fellowship, the Eric and Wendy Schmidt Center at the Broad Institute, the National Institutes of Health, and the Office of Naval Research.

Astronomers discover star-shredding black holes hiding in dusty galaxies

MIT Latest News - Thu, 07/24/2025 - 12:00am

Astronomers at MIT, Columbia University, and elsewhere have used NASA’s James Webb Space Telescope (JWST) to peer through the dust of nearby galaxies and into the aftermath of a black hole’s stellar feast.

In a study appearing today in Astrophysical Journal Letters, the researchers report that for the first time, JWST has observed several tidal disruption events — instances when a galaxy’s central black hole draws in a nearby star and whips up tidal forces that tear the star to shreds, giving off an enormous burst of energy in the process.

Scientists have observed about 100 tidal disruption events (TDEs) since the 1990s, mostly as X-ray or optical light that flashes across relatively dust-free galaxies. But as MIT researchers recently reported, there may be many more star-shredding events in the universe that are “hiding” in dustier, gas-veiled galaxies.

In their previous work, the team found that most of the X-ray and optical light that a TDE gives off can be obscured by a galaxy’s dust, and therefore can go unseen by traditional X-ray and optical telescopes. But that same burst of light can heat up the surrounding dust and generate a new signal, in the form of infrared light.

Now, the same researchers have used JWST — the world’s most powerful infrared detector — to study signals from four dusty galaxies where they suspect tidal disruption events have occurred. Within the dust, JWST detected clear fingerprints of black hole accretion, a process by which material, such as stellar debris, circles and eventually falls into a black hole. The telescope also detected patterns that are strikingly different from the dust that surrounds active galaxies, where the central black hole is constantly pulling in surrounding material.

Together, the observations confirm that a tidal disruption event did indeed occur in each of the four galaxies. What’s more, the researchers conclude that the four events were products not of active black holes but of dormant ones, which experienced little to no activity until a star happened to pass by.

The new results highlight JWST’s potential to study in detail otherwise hidden tidal disruption events. They are also helping scientists to reveal key differences in the environments around active versus dormant black holes.

“These are the first JWST observations of tidal disruption events, and they look nothing like what we’ve ever seen before,” says lead author Megan Masterson, a graduate student in MIT’s Kavli Institute for Astrophysics and Space Research. “We’ve learned these are indeed powered by black hole accretion, and they don’t look like environments around normal active black holes. The fact that we’re now able to study what that dormant black hole environment actually looks like is an exciting aspect.”

The study’s MIT authors include Christos Panagiotou, Erin Kara, and Anna-Christina Eilers, along with Kishalay De of Columbia University and collaborators from multiple other institutions.

Seeing the light

The new study expands on the team’s previous work using another infrared detector — NASA’s Near-Earth Object Wide-field Infrared Survey Explorer (NEOWISE) mission. Using an algorithm developed by co-author Kishalay De of Columbia University, the team searched through a decade’s worth of data from the telescope, looking for infrared “transients,” or short peaks of infrared activity from otherwise quiet galaxies that could be signals of a black hole briefly waking up and feasting on a passing star. That search unearthed about a dozen signals that the group determined were likely produced by a tidal disruption event.

“With that study, we found these 12 sources that look just like TDEs,” Masterson says. “We made a lot of arguments about how the signals were very energetic, and the galaxies didn’t look like they were active before, so the signals must have been from a sudden TDE. But except for these little pieces, there was no direct evidence.”

With the much more sensitive capabilities of JWST, the researchers hoped to discern key “spectral lines,” or infrared light at specific wavelengths, that would be clear fingerprints of conditions associated with a tidal disruption event.

“With NEOWISE, it’s as if our eyes could only see red light or blue light, whereas with JWST, we’re seeing the full rainbow,” Masterson says.

A bona fide signal

In their new work, the group looked specifically for a peak in the infrared that could only be produced by black hole accretion — a process by which material is drawn toward a black hole in a circulating disk of gas. This disk produces an enormous amount of radiation that is so intense that it can kick out electrons from individual atoms. In particular, such accretion processes can blast several electrons out from atoms of neon, and the resulting ion can transition, releasing infrared radiation at a very specific wavelength that JWST can detect.
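In practice, detecting such a line amounts to fitting a narrow emission feature on top of a smooth continuum in the measured spectrum. The snippet below illustrates the idea on synthetic data; the wavelength grid, line position, and line width are placeholders, not values from the study:

```python
# Fit a Gaussian emission line plus a linear continuum to a (synthetic) spectrum.
import numpy as np
from scipy.optimize import curve_fit

def line_model(wl, amp, center, sigma, c0, c1):
    return amp * np.exp(-0.5 * ((wl - center) / sigma) ** 2) + c0 + c1 * wl

wl = np.linspace(7.4, 7.9, 400)           # wavelength grid in microns (placeholder)
true_line = 0.5 * np.exp(-0.5 * ((wl - 7.65) / 0.01) ** 2)
flux = 1.0 + 0.02 * wl + true_line + np.random.normal(0.0, 0.01, wl.size)

p0 = [0.3, 7.65, 0.02, 1.0, 0.0]          # initial guesses: amp, center, sigma, c0, c1
params, _ = curve_fit(line_model, wl, flux, p0=p0)
print(f"line amplitude {params[0]:.2f} at {params[1]:.3f} microns")
```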

“There’s nothing else in the universe that can excite this gas to these energies, except for black hole accretion,” Masterson says.

The researchers searched for this smoking-gun signal in four of the 12 TDE candidates they previously identified. The four signals include the closest tidal disruption event detected to date, located in a galaxy some 130 million light years away; a TDE that also exhibits a burst of X-ray light; a signal that may have been produced by gas circulating at incredibly high speeds around a central black hole; and a signal that also included an optical flash, which scientists had previously suspected to be a supernova, or the collapse of a dying star, rather than a tidal disruption event.

“These four signals were as close as we could get to a sure thing,” Masterson says. “But the JWST data helped us say definitively these are bona fide TDEs.”

When the team pointed JWST toward the galaxies of each of the four signals, in a program designed by De, they observed that the telltale spectral lines showed up in all four sources. These measurements confirmed that black hole accretion occurred in all four galaxies. But the question remained: Was this accretion a temporary feature, triggered by a tidal disruption and a black hole that briefly woke up to feast on a passing star? Or was this accretion a more permanent trait of “active” black holes that are always on? In the case of the latter, it would be less likely that a tidal disruption event had occurred.

To differentiate between the two possibilities, the team used the JWST data to detect another wavelength of infrared light, which indicates the presence of silicates, or dust in the galaxy. They then mapped this dust in each of the four galaxies and compared the patterns to those of active galaxies, which are known to harbor clumpy, donut-shaped dust clouds around the central black hole. Masterson observed that all four sources showed very different patterns compared to typical active galaxies, suggesting that the black hole at the center of each of the galaxies is not normally active, but dormant. If an accretion disk formed around such a black hole, the researchers conclude that it must have been a result of a tidal disruption event.

“Together, these observations say the only thing these flares could be are TDEs,” Masterson says.

She and her collaborators plan to uncover many more previously hidden tidal disruption events, with NEOWISE, JWST, and other infrared telescopes. With enough detections, they say TDEs can serve as effective probes of black hole properties. For instance, how much of a star is shredded, and how fast its debris is accreted and consumed, can reveal fundamental properties of a black hole, such as how massive it is and how fast it spins.

“The actual process of a black hole gobbling down all that stellar material takes a long time,” Masterson says. “It’s not an instantaneous process. And hopefully we can start to probe how long that process takes and what that environment looks like. No one knows because we just started discovering and studying these events.”

This research was supported, in part, by NASA.

Theory-guided strategy expands the scope of measurable quantum interactions

MIT Latest News - Thu, 07/24/2025 - 12:00am

A new theory-guided framework could help scientists probe the properties of new semiconductors for next-generation microelectronic devices, or discover materials that boost the performance of quantum computers.

Research to develop new or better materials typically involves investigating properties that can be reliably measured with existing lab equipment, but this represents just a fraction of the properties that scientists could potentially probe in principle. Some properties remain effectively “invisible” because they are too difficult to capture directly with existing methods.

Take electron-phonon interaction — this property plays a critical role in a material’s electrical, thermal, optical, and superconducting properties, but directly capturing it using existing techniques is notoriously challenging.

Now, MIT researchers have proposed a theoretically justified approach that could turn this challenge into an opportunity. Their method reinterprets an often-overlooked interference effect in neutron scattering as a potential direct probe of electron-phonon coupling strength.

Neutron scattering creates two interaction effects in the material. The researchers show that, by deliberately designing their experiment to leverage the interference between these two interactions, they can capture the strength of a material’s electron-phonon interaction.

The researchers’ theory-informed methodology could be used to shape the design of future experiments, opening the door to measuring new quantities that were previously out of reach.

“Rather than discovering new spectroscopy techniques by pure accident, we can use theory to justify and inform the design of our experiments and our physical equipment,” says Mingda Li, the Class of 1947 Career Development Professor and an associate professor of nuclear science and engineering, and senior author of a paper on this experimental method.

Li is joined on the paper by co-lead authors Chuliang Fu, an MIT postdoc; Phum Siriviboon and Artittaya Boonkird, both MIT graduate students; as well as others at MIT, the National Institute of Standards and Technology, the University of California at Riverside, Michigan State University, and Oak Ridge National Laboratory. The research appears this week in Materials Today Physics.

Investigating interference

Neutron scattering is a powerful measurement technique that involves aiming a beam of neutrons at a material and studying how the neutrons are scattered after they strike it. The method is ideal for measuring a material’s atomic structure and magnetic properties.

When neutrons collide with the material sample, they interact with it through two different mechanisms, creating a nuclear interaction and a magnetic interaction. These interactions can interfere with each other.

“The scientific community has known about this interference effect for a long time, but researchers tend to view it as a complication that can obscure measurement signals. So it hasn’t received much focused attention,” Fu says.

The team and their collaborators took a conceptual “leap of faith” and decided to explore this oft-overlooked interference effect more deeply.

They flipped the traditional materials research approach on its head by starting with a multifaceted theoretical analysis. They explored what happens inside a material when the nuclear interaction and magnetic interaction interfere with each other.

Their analysis revealed that this interference pattern is directly proportional to the strength of the material’s electron-phonon interaction.
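Schematically (in our own notation, not the paper’s formalism), the measured intensity contains nuclear, magnetic, and cross terms, and it is the cross term that the analysis ties to the coupling strength, written here as g:

```latex
% Schematic only; the proportionality of the cross term to g summarizes
% the paper's claim rather than deriving it.
\[
I(\mathbf{Q},\omega) \;\propto\;
  \underbrace{|F_N(\mathbf{Q},\omega)|^2}_{\text{nuclear}}
  + \underbrace{|F_M(\mathbf{Q},\omega)|^2}_{\text{magnetic}}
  + \underbrace{2\,\mathrm{Re}\!\left[F_N(\mathbf{Q},\omega)\,F_M^{*}(\mathbf{Q},\omega)\right]}_{\text{interference}\;\propto\;g}
\]
```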

“This makes the interference effect a probe we can use to detect this interaction,” explains Siriviboon.

Electron-phonon interactions play a role in a wide range of material properties. They affect how heat flows through a material, impact a material’s ability to absorb and emit light, and can even lead to superconductivity.

But the complexity of these interactions makes them hard to directly measure using existing experimental techniques. Instead, researchers often rely on less precise, indirect methods to capture electron-phonon interactions.

However, leveraging this interference effect enables direct measurement of the electron-phonon interaction, a major advantage over other approaches.

“Being able to directly measure the electron-phonon interaction opens the door to many new possibilities,” says Boonkird.

Rethinking materials research

Based on their theoretical insights, the researchers designed an experimental setup to demonstrate their approach.

Since the available equipment wasn’t powerful enough for this type of neutron scattering experiment, they were only able to capture a weak electron-phonon interaction signal — but the results were clear enough to support their theory.

“These results justify the need for a new facility where the equipment might be 100 to 1,000 times more powerful, enabling scientists to clearly resolve the signal and measure the interaction,” adds Landry.

With improved neutron scattering facilities, like those proposed for the upcoming Second Target Station at Oak Ridge National Laboratory, this experimental method could be an effective technique for measuring many crucial material properties.

For instance, by helping scientists identify and harness better semiconductors, this approach could enable more energy-efficient appliances, faster wireless communication devices, and more reliable medical equipment like pacemakers and MRI scanners.   

Ultimately, the team sees this work as a broader message about the need to rethink the materials research process.

“Using theoretical insights to design experimental setups in advance can help us redefine the properties we can measure,” Fu says.

To that end, the team and their collaborators are currently exploring other types of interactions they could leverage to investigate additional material properties.

“This is a very interesting paper,” says Jon Taylor, director of the neutron scattering division at Oak Ridge National Laboratory, who was not involved with this research. “It would be interesting to have a neutron scattering method that is directly sensitive to charge lattice interactions or more generally electronic effects that were not just magnetic moments. It seems that such an effect is expectedly rather small, so facilities like STS could really help develop that fundamental understanding of the interaction and also leverage such effects routinely for research.”

This work is funded, in part, by the U.S. Department of Energy and the National Science Foundation.

You Shouldn’t Have to Make Your Social Media Public to Get a Visa

EFF: Updates - Wed, 07/23/2025 - 6:33pm

The Trump administration is continuing its dangerous push to surveil and suppress foreign students’ social media activity. The State Department recently announced an unprecedented new requirement that applicants for student and exchange visas must set all social media accounts to “public” for government review. The State Department also indicated that if applicants refuse to unlock their accounts or otherwise don’t maintain a social media presence, the government may interpret it as an attempt to evade the requirement or deliberately hide online activity.

The administration is penalizing prospective students and visitors for shielding their social media accounts from the general public or for choosing to not be active on social media. This is an outrageous violation of privacy, one that completely disregards the legitimate and often critical reasons why millions of people choose to lock down their social media profiles, share only limited information about themselves online, or not engage in social media at all. By making students abandon basic privacy hygiene as the price of admission to American universities, the administration is forcing applicants to expose a wealth of personal information to not only the U.S. government, but to anyone with an internet connection.

Why Social Media Privacy Matters

The administration’s new policy is a dangerous expansion of existing social media collection efforts. While the State Department has required since 2019 that visa applicants disclose their social media handles—a policy EFF has consistently opposed—forcing applicants to make their accounts public crosses a new line.

Individuals have significant privacy interests in their social media accounts. Social media profiles contain some of the most intimate details of our lives, such as our political views, religious beliefs, health information, likes and dislikes, and the people with whom we associate. Such personal details can be gleaned from vast volumes of data given the unlimited storage capacity of cloud-based social media platforms. As the Supreme Court has recognized, “[t]he sum of an individual’s private life can be reconstructed through a thousand photographs labeled with dates, locations, and descriptions”—all of which and more are available on social media platforms.

By requiring visa applicants to share these details, the government can obtain information that would otherwise be inaccessible or difficult to piece together across disparate locations. For example, while visa applicants are not required to disclose their political views in their applications, applicants might choose to post their beliefs on their social media profiles.

This information, once disclosed, doesn’t just disappear. Existing policy allows the government to continue surveilling applicants’ social media profiles even once the application process is over. And personal information obtained from applicants’ profiles can be collected and stored in government databases for decades.

What’s more, by requiring visa applicants to make their private social media accounts public, the administration is forcing them to expose troves of personal, sensitive information to the entire internet, not just the U.S. government. This could include various bad actors like identity thieves and fraudsters, foreign governments, current and prospective employers, and other third parties.

Those in applicants’ social media networks—including U.S. citizen family or friends—can also become surveillance targets by association. Visa applicants’ online activity is likely to reveal information about the users with whom they’re connected. For example, a visa applicant could tag another user in a political rant or post photos of themselves and the other user at a political rally. Anyone who sees those posts might reasonably infer that the other user shares the applicant’s political beliefs. The administration’s new requirement will therefore publicly expose the personal information of millions of additional people, beyond just visa applicants.

There are Very Good Reasons to Keep Social Media Accounts Private

An overwhelming number of social media users maintain private accounts for the same reason we put curtains on our windows: a desire for basic privacy. There are numerous legitimate reasons people choose to share their social media only with trusted family and friends, whether that’s ensuring personal safety, maintaining professional boundaries, or simply not wanting to share personal profiles with the entire world.

Safety from Online Harassment and Physical Violence

Many people keep their accounts private to protect themselves from stalkers, harassers, and those who wish them harm. Domestic violence survivors, for example, use privacy settings to hide from their abusers, and organizations supporting survivors often encourage them to maintain a limited online presence.

Women also face a variety of gender-based online harms made worse by public profiles, including stalking, sexual harassment, and violent threats. A 2021 study reported that at least 38% of women globally had personally experienced online abuse, and at least 85% of women had witnessed it. Women are, in turn, more likely to activate privacy settings than men.

LGBTQ+ individuals similarly have good reasons to lock down their accounts. Individuals from countries where their identity puts them in danger rely on privacy protections to stay safe from state action. People may also reasonably choose to lock their accounts to avoid the barrage of anti-LGBTQ+ hate and harassment that is common on social media platforms, which can lead to real-world violence. Others, including LGBTQ+ youth, may simply not be ready to share their identity outside of their chosen personal network.

Political Dissidents, Activists, and Journalists

Activists working on sensitive human rights issues, political dissidents, and journalists use privacy settings to protect themselves from doxxing, harassment, and potential political persecution by their governments.

Rather than protecting these vulnerable groups, the administration’s policy instead explicitly targets political speech. The State Department has given embassies and consulates a vague directive to vet applicants’ social media for “hostile attitudes towards our citizens, culture, government, institutions, or founding principles,” according to an internal State Department cable obtained by multiple news outlets. This includes looking for “applicants who demonstrate a history of political activism.” The cable did not specify what, exactly, constitutes “hostile attitudes.”

Professional and Personal Boundaries

People use privacy settings to maintain boundaries between their personal and professional lives. They share family photos, sensitive updates, and personal moments with close friends—not with their employers, teachers, professional connections, or the general public.

The Growing Menace of Social Media Surveillance

This new policy is an escalation of the Trump administration’s ongoing immigration-related social media surveillance. EFF has written about the administration’s new “Catch and Revoke” effort, which deploys artificial intelligence and other data analytic tools to review the public social media accounts of student visa holders in an effort to revoke their visas. And EFF recently submitted comments opposing a USCIS proposal to collect social media identifiers from visa and green card holders already living in the U.S., including when they submit applications for permanent residency and naturalization.

The administration has also started screening many non-citizens' social media accounts for ambiguously-defined “antisemitic activity,” and previously announced expanded social media vetting for any visa applicant seeking to travel specifically to Harvard University for any purpose.

The administration claims this mass surveillance will make America safer, but there’s little evidence to support this. By the government’s own previous assessments, social media surveillance has not proven effective at identifying security threats.

At the same time, these policies gravely undermine freedom of speech, as we recently argued in our USCIS comments. The government is using social media monitoring to directly target and punish foreign students and others for their digital speech, through visa denials or revocations. And the social media surveillance itself broadly chills free expression online—for citizens and non-citizens alike.

In defending the new requirement, the State Department argued that a U.S. visa is a “privilege, not a right.” But privacy and free expression should not be privileges. These are fundamental human rights, and they are rights we abandon at our peril.
