Does increasing temperature make circuits more vulnerable to EMI?

Separate area for niche lag issues, including unexpected causes and/or electromagnetic interference (ECC = retransmits = lag). Interference of all kinds (EMI/EMF; wired, wireless, external, internal, environmental, or a bad component) can cause error-correction latencies, like a bad modem connection. Troubleshooting may require a university degree. Your lag issue is likely not EMI. Please read this before entering the sub-forum.
Forum rules
IMPORTANT:
This subforum is for advanced users only. The description above applies: interference of all kinds can cause error-correction (ECC) latencies, like a bad modem connection but internal to a circuit. ECC = retransmits = lag. Your lag issue is likely not EMI.
🠚 You must read this first before submitting a post or reply
Jonnyc55
Posts: 10
Joined: 15 Jan 2024, 08:09

Does increasing temperature make circuits more vulnerable to EMI?

Post by Jonnyc55 » 13 Feb 2024, 10:38

" When a material is heated, the kinetic energy of that material increases and its atoms and molecules move about more. This means that each atom will take up more space due to its movement so the material will expand."
- https://www.physlink.com/education/askexperts/ae40.cfm (halfway in first paragraph)

If the lattice expands, could that leave components in the circuit more prone to EMI? Then again, you could argue that the more energetic, vibrating atoms would still hinder EMF, maybe at certain frequencies, compared with the more static arrangement at colder temperatures, where gaps persist longer as openings for EMF to penetrate.
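For scale, the lattice-expansion idea can be sanity-checked with the standard linear thermal expansion formula. The 10 mm trace length and 40 K temperature rise below are assumed example numbers; the expansion coefficient is the commonly quoted figure for copper:

```python
# Sanity check: how much does a copper trace actually grow when heated?
# ALPHA_CU is the textbook linear expansion coefficient for copper;
# the trace length and temperature rise are assumed example values.
ALPHA_CU = 17e-6     # 1/K, linear thermal expansion coefficient of copper
length_mm = 10.0     # a 10 mm trace (assumed)
delta_t_k = 40.0     # warming from roughly 25 C to 65 C (assumed)

delta_l_mm = ALPHA_CU * length_mm * delta_t_k
print(f"A {length_mm:.0f} mm trace grows by {delta_l_mm * 1000:.1f} um over {delta_t_k:.0f} K")
# about 6.8 micrometres of growth
```

That is under a hundredth of a millimetre, while even a 10 GHz signal has a wavelength of about 3 cm, so whatever heat does to EMI susceptibility, it is not opening mechanical "gaps" at any scale radio-frequency EMF would notice.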

Also I read this:
"Electrical Conductivity: Heating a metal can increase its electrical conductivity, especially for metals like copper and aluminum. Higher electrical conductivity can result in less attenuation of EMF passing through the metal, allowing more of the electromagnetic energy to propagate."
- chatgpt, on the query "does heated metals let more emf through?"

Whether that is essentially the same as above, I don't know.
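Side note on that quoted answer: the standard handbook model for metals actually runs the other way, with resistivity rising roughly linearly as temperature rises (so heated copper conducts slightly worse, not better). A minimal sketch of that linear model, using commonly quoted handbook figures for copper:

```python
# Textbook linear model for metal resistivity vs temperature:
#   rho(T) = rho_20 * (1 + alpha * (T - 20))
# The copper figures below are commonly quoted handbook values.
RHO_20_CU = 1.68e-8   # ohm-metre, copper resistivity at 20 C
ALPHA_CU = 0.0039     # 1/C, temperature coefficient of resistance for copper

def copper_resistivity(temp_c: float) -> float:
    """Approximate copper resistivity at temp_c (linear model)."""
    return RHO_20_CU * (1 + ALPHA_CU * (temp_c - 20.0))

for t in (20, 40, 90):
    ratio = copper_resistivity(t) / copper_resistivity(20)
    print(f"{t:3d} C: {copper_resistivity(t):.3e} ohm-m ({ratio:.2f}x the 20 C value)")
```

By this model copper at 90 °C is roughly 27% more resistive than at 20 °C, which is the opposite of what the quoted answer claims.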

I'm basically wondering if, as we play our games longer and longer, rising temperatures start to allow all sorts of random EMI behaviour to occur. It might be why our tweaks seem to work, and why things feel taut and fresh for the first few matches and then go all muddy, as the noisy interplay of EMI grows louder and buries the order of our tweaks in a sea of noise.

I'm not saying things need to reach 90 °C; just that an increase in temperature increases the odds of funny business with EMI.
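On the "increase in temperature increases the odds of funny business" point, one mechanism that genuinely is well established is thermal (Johnson-Nyquist) noise: the RMS noise voltage across any resistance grows with the square root of absolute temperature. A quick sketch; the 50-ohm resistance and 1 GHz bandwidth are assumed example values:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def johnson_noise_vrms(temp_k: float, resistance_ohm: float, bandwidth_hz: float) -> float:
    """RMS thermal (Johnson-Nyquist) noise voltage across a resistance."""
    return math.sqrt(4 * K_B * temp_k * resistance_ohm * bandwidth_hz)

# Assumed example values: a 50-ohm path observed over 1 GHz of bandwidth.
cool = johnson_noise_vrms(300.0, 50.0, 1e9)   # roughly 27 C
hot = johnson_noise_vrms(360.0, 50.0, 1e9)    # roughly 87 C
print(f"300 K: {cool * 1e6:.1f} uV RMS")
print(f"360 K: {hot * 1e6:.1f} uV RMS ({(hot / cool - 1) * 100:.0f}% more)")
```

So a 60 K rise buys roughly 10% more thermal noise voltage: real, but a gentle square-root effect rather than a cliff.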

But like I said earlier, you would think a chaotic scene of vibrating molecules would hinder EMF just as much as the compact lattice of cold metal does. So maybe there are certain naughty frequencies in play, in both the cold and the hot scenarios, that can cause issues.

You could plainly say that a healthy temperature fits the bill, then, and that seeing software report your CPU at a cool 40 °C is good. But that's where small hot spots become an issue: even if you've got cooling mastered, there are always hot spots, creating variance.

This is why disabling useless features diminishes the circuitry in use, allowing for better cooling and fewer hot spots. Or you might want certain circuitry in play, to level out the heat across a wider area.

So I don't think the topic of heat and EMI is black and white. For example, liquid-nitrogen-cooled CPUs may allow for greater GHz, but those cold temperatures could allow oddities of their own regarding EMI. If a critical circuit is inherently operating with current and volts, and the material the current passes through is super cold, then the outer regions of the metal entomb that current's inherent EMI within the cold, compact lattice, so its own EMI becomes a hindrance to itself.

It's not far off a car: things get a bit more slick in the engine, tyres and components once they warm up, and yet cooling is obviously still wanted in a car.

I think when it comes to gaming, your PC wants a healthy temperature range, and maximising that includes defeating hot spots in the circuits, where it can be helped.

I was going to make the thread about heat being bad for EMI, but it occurred to me that can't quite be the case, based on simple common sense and basic physics. I think there's a sweet spot.

Back to the cold metal issue: the enclosed inherent EMF of the circuits causing havoc to itself, due to the compact cold lattice. I'd say it's like putting an EMI shield over a NIC chip on a motherboard: sure, you've stopped external EMI impacting it, but now you've let the NIC's own EMF cause havoc to itself, with nowhere to go but to bounce around until eventual 'slow' diminishment. This is why I find EMF prevention such a song and dance.

I feel like temperature therefore needs a balancing act of its own. Many different situations, I guess. But the relationship between temperature, EMI and circuits is an interesting one.
