Would it be easier to deploy devices like this to developing countries without the infrastructure to support liquid helium distribution? I imagine a much simpler device WRT exotic cooling, with fewer material-distribution requirements, is a plus. Couple that with the scarcity and non-renewable nature of helium, and maybe using devices like this at scale for gross MRI imagery makes sense?
The AI used here, as I read it, is a generative approach trying specifically to compensate for EMI artifacts rather than a physics model, and it likely wouldn't be doing macro changes like sneezes to knees, no?
Zero-boil-off "dry" magnets have been widely used for the last decade -- we engineered away the thousands of litres of liquid helium in exchange for bigger electricity bills and some added complexity (and arguably cost). They basically mount the cryocooler cold head on a large heat-sinked plate and use helium gas as a working fluid to cool it and, through conduction, the rest of the magnet. The superconducting wire has a critical T/B/Ic surface, and (to my knowledge) they essentially accept a worse Ic in exchange for a higher operating temperature.
The cold head vibration can introduce a bit more B0 drift per day, but it's not a problem in practice.
Regarding artefacts, one of the other reasons that MRI rooms are expensive is the Faraday cage. They do help, not just in terms of noise floors but because there tends to be a lot of intermittent RF transmission from people like paramedics. Did you know a) that the international mayday frequency is 121.5 MHz, b) that overhead helicopter flights may transmit with kW of RF on that frequency, c) that the Larmor frequency of protons at ~2.9T is 121.5 MHz, d) that Siemens "3T" magnets are routinely around 2.9T, and e) that the voltage of the signal you detect in MR is microvolts to millivolts at best? I've seen spurious peaks in spectra from this.
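The coincidence in (c) and (d) is easy to verify back-of-envelope with the proton gyromagnetic ratio (gamma-bar, about 42.577 MHz/T); a quick sketch:

```python
# Back-of-envelope check: proton Larmor frequency vs. static field strength.
GAMMA_BAR_MHZ_PER_T = 42.577  # proton gyromagnetic ratio / 2*pi, in MHz/T

def larmor_mhz(b0_tesla):
    """Proton Larmor frequency (MHz) at a given B0 (tesla)."""
    return GAMMA_BAR_MHZ_PER_T * b0_tesla

def field_for_freq(freq_mhz):
    """B0 (tesla) at which protons resonate at the given frequency (MHz)."""
    return freq_mhz / GAMMA_BAR_MHZ_PER_T

print(round(larmor_mhz(2.9), 1))        # ~123.5 MHz at exactly 2.9 T
print(round(field_for_freq(121.5), 3))  # ~2.854 T puts protons right on the mayday band
```

So a "3T" magnet that actually sits around 2.85-2.9 T is receiving within spitting distance of 121.5 MHz, which is exactly where a kW aviation transmitter can land.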
The DL method the paper talks about "may work", but as the OP says this is deeply unsatisfactory for a whole host of reasons and is, in my overly sarky opinion, a bit like fixing a wall with rising damp by putting a television in front of it showing a beautiful, high resolution picture of a brick wall in the same colour.
Perhaps a standard bit of kit for an imaging room ought to be a receiver at the operating frequency, sited outside the room, that can pause the sequence when a potential jammer is active and log the event, so that you could make a report to the relevant authorities (and perhaps encourage them to keep their transmitters off near your facility).
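The control loop for such a watchdog would be simple. A minimal sketch, with the caveat that `receiver.read_dbm()`, `scanner.pause()` and `scanner.resume()` are invented placeholder interfaces rather than any real scanner API, and the trip level would need per-site calibration:

```python
import time
from datetime import datetime, timezone

THRESHOLD_DBM = -80.0  # assumed trip level; a real deployment would calibrate this

def watchdog(receiver, scanner, log, poll_s=0.1, max_polls=None):
    """Pause the scanner while the external receiver sees RF above threshold.

    Logs each trip with a UTC timestamp so a report can be made later.
    `max_polls` bounds the loop for testing; a real watchdog runs forever.
    """
    paused = False
    polls = 0
    while max_polls is None or polls < max_polls:
        level = receiver.read_dbm()
        if level > THRESHOLD_DBM and not paused:
            scanner.pause()
            paused = True
            # Timestamped record of the interference event for the regulator.
            log.append((datetime.now(timezone.utc).isoformat(), level))
        elif level <= THRESHOLD_DBM and paused:
            scanner.resume()
            paused = False
        polls += 1
        time.sleep(poll_s)
```

Gating on a debounce window (several consecutive hot polls) would avoid pausing on single glitches, but the skeleton is the same.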
Pausing the sequence is also not much of an option when contrast has just been administered, I guess.
(I suppose if the interference weren't so hot that it saturated the ADCs, there might be some opportunity to subtract it off... but that's starting to sound like another ten thousand PhD-educated hours of labour, as mentioned upthread.)
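The subtraction idea is roughly what some low-field systems do with a reference "sniffer" antenna: fit the interference seen on the reference channel to the imaging channel and subtract the fitted component. A toy scalar version (real implementations are frequency-dependent and multi-channel; the signal shapes here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2048
t = np.arange(n)

# Faint "MR" signal buried under a strong narrowband interferer.
mr_signal = 0.001 * np.sin(2 * np.pi * 0.01 * t)
emi = np.sin(2 * np.pi * 0.037 * t + 0.3)
imaging_ch = mr_signal + 0.8 * emi + 1e-5 * rng.standard_normal(n)
reference_ch = emi + 1e-5 * rng.standard_normal(n)  # sniffer antenna: EMI only

# Least-squares estimate of the coupling between reference and imaging channel.
alpha = np.dot(reference_ch, imaging_ch) / np.dot(reference_ch, reference_ch)
cleaned = imaging_ch - alpha * reference_ch

print(np.std(imaging_ch - mr_signal))  # residual EMI before subtraction
print(np.std(cleaned - mr_signal))     # residual EMI after subtraction
```

This only works while the front end stays linear; once the ADC clips, the reference no longer predicts the corrupted samples and the regression falls apart, which is the "so hot it saturates" caveat above.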