If you’ve ever had a PET scan, you know it’s an ordeal. The scans help doctors detect cancer and track its spread, but the procedure itself is a nightmare for patients.
It starts with fasting for four to six hours before coming to the hospital — and good luck if you live in a rural area and your local hospital doesn’t have a PET scanner. When you get to the hospital, they inject you with radioactive material, after which you have to wait an hour for it to distribute through your body. You then go into the PET scanner and have to try to stay still for 30 minutes while the radiologists take the image. After that, you must physically stay away from the elderly, young children, and pregnant women for up to 12 hours because you are still mildly radioactive.
Why the bottleneck? PET scanners are concentrated in large cities because their radioactive tracers must be produced in nearby cyclotrons — compact particle accelerators — and used within hours, limiting access for rural and regional hospitals.
But what if you could use artificial intelligence to convert CT scans, which are far more accessible and affordable, into PET scans? That’s the pitch of RADiCAIT, an Oxford spinout that emerged from stealth this month with $1.7 million in pre-seed funding. The Boston-based startup, which is a Top 20 finalist in the Startup Battlefield at TechCrunch Disrupt 2025, just opened a $5 million raise to advance its clinical trials.
“What we’re really doing is we’ve taken the most limited, complex, and expensive medical imaging solution in radiology and replaced it with the most affordable, simple, and accessible one, which is CT,” Sean Walsh, CEO of RADiCAIT, told TechCrunch.
RADiCAIT’s secret sauce is its foundation model, a generative deep neural network invented in 2021 at the University of Oxford by a team led by the startup’s co-founder and chief medical information officer, Regent Lee.
The model learns by comparing paired CT and PET scans, mapping them onto each other and picking out patterns in how they relate. Sina Shahandeh, RADiCAIT’s chief technologist, describes it as connecting “distinct physical phenomena” by translating anatomical structure into physiological function. The model is then directed to pay particular attention to specific features of the scans, such as certain tissue types or abnormalities, and this focused learning is repeated across many examples so the model learns which patterns are clinically relevant.
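The paired-training idea can be sketched in a few lines. The toy below fits a simple linear map by gradient descent on synthetic “paired” data — a stand-in for the deep network, not RADiCAIT’s actual architecture or data; all names and numbers here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "paired" data: each CT patch (flattened to 16 values) comes with
# a corresponding PET value produced by a hidden relationship plus noise.
true_w = rng.normal(size=16)
ct_patches = rng.normal(size=(200, 16))                          # stand-in CT patches
pet_values = ct_patches @ true_w + 0.01 * rng.normal(size=200)   # paired PET signal

# A linear map trained by gradient descent on mean-squared error --
# a toy version of a network learning CT -> PET correspondences.
w = np.zeros(16)
lr = 0.01
for _ in range(500):
    pred = ct_patches @ w
    grad = 2 * ct_patches.T @ (pred - pet_values) / len(pet_values)
    w -= lr * grad

mse = float(np.mean((ct_patches @ w - pet_values) ** 2))
print(f"training MSE: {mse:.5f}")  # approaches the noise floor as the map is learned
```

The point of the sketch is only the training signal: the model never sees PET physics directly, it sees pairs, and it minimizes the mismatch between its prediction and the real paired measurement.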
The final image that goes to doctors for review is created by combining several models working together. Shahandeh compares the approach to Google DeepMind’s AlphaFold, the AI that revolutionized protein structure prediction: Both systems learn to translate one type of biological information into another.
Walsh claims that the RADiCAIT team can mathematically demonstrate that synthetic or generated PET images are statistically similar to real chemical PET scans.
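A similarity claim like this can be illustrated with a standard image-comparison metric. PSNR (peak signal-to-noise ratio) below is a generic example of such a statistic, not necessarily the one RADiCAIT uses; the images are random stand-ins.

```python
import numpy as np

def psnr(reference: np.ndarray, generated: np.ndarray, max_value: float = 1.0) -> float:
    """Peak signal-to-noise ratio: higher means the generated image is
    numerically closer to the reference (identical images -> infinity)."""
    mse = np.mean((reference - generated) ** 2)
    if mse == 0:
        return float("inf")
    return float(10 * np.log10(max_value ** 2 / mse))

rng = np.random.default_rng(1)
real_pet = rng.random((64, 64))                            # stand-in "chemical" PET image
close_fake = real_pet + 0.01 * rng.normal(size=(64, 64))   # a faithful synthetic image
far_fake = rng.random((64, 64))                            # an unrelated image

print(psnr(real_pet, close_fake))  # high score: images agree closely
print(psnr(real_pet, far_fake))    # low score: images are unrelated
```

In practice a validation study would also compare downstream clinical decisions, as the quote below describes, rather than pixel statistics alone.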
“This is what our tests show,” he said, “that the same quality of decision is made when the physician, radiologist, or oncologist receives chemical PET or [our AI-generated PET].”
RADiCAIT does not claim to replace PET scans entirely: in certain therapeutic settings, such as radioligand therapy, which uses radioactive drugs to kill cancer cells, real PET is still required. For diagnostic, staging, and follow-up purposes, however, RADiCAIT’s technology may render PET scans obsolete.
“Because it’s such a limited system, there’s not enough supply to meet the demand for diagnostics and therapeutics,” Walsh said, referring to a medical approach that combines diagnostic imaging (i.e., PET scans) with targeted therapy to treat disease (i.e., cancer). “So what we want to do is absorb that demand on the diagnostic side. The PET scanners themselves should pick up the slack on the therapeutic side.”
RADiCAIT has already begun clinical pilots for lung cancer testing with large health systems such as Mass General Brigham and UCSF Health. The startup is now pursuing an FDA clinical trial, a more expensive and thorough process that is driving RADiCAIT’s $5 million seed round. Once approved, the next step will be commercial pilots to demonstrate the product’s commercial viability. RADiCAIT also plans to run the same progression — clinical pilots, clinical trials, commercial pilots — for colon cancer and lymphoma.
Shahandeh said RADiCAIT’s approach of using artificial intelligence to obtain valid information without the burden of difficult and expensive testing is “broadly applicable.”
“We are exploring extensions across radiology,” Shahandeh added. “Expect to see similar innovations connecting fields from materials science to biology, chemistry and physics where nature’s hidden relationships can be learned.”
If you want to know more about RADiCAIT, join us at TechCrunch Disrupt, October 27-29 in San Francisco.