Generation IV Internal Engine 2005-2014 LS2 | LS3 | LS7 | L92 | LS9

Engine assembly - in the cold?

Old 12-04-2014, 01:38 PM
  #21  
TECH Enthusiast
Thread Starter
iTrader: (3)
 
BigBadWhitey's Avatar
 
Join Date: Jul 2008
Posts: 652
Likes: 0
Received 7 Likes on 5 Posts
LS1Tech 10 Year
Default

Originally Posted by iliveonnitro
I figured I should chime in, since I'm a mechanical engineer and my daily job is to understand heat transfer, thermal stresses, tolerances, etc in automotive applications...and this thread is a bit depressing.

Spark notes: there is no difference between building your engine at 10°F or at 100°F.

Aluminum and iron/steel have CTEs (coefficient of thermal expansion) between 5-13 microns/in-°F, and mostly in the 6-9 microns/in-°F range. In other words, even a 100°F change in temperature will only affect tolerances in the 0.0003"/in range. On any surface you can measure, it may only be 25 microns, which is orders of magnitude smaller than anything you can accurately measure at home. Eg, your calipers are, on a good day, only accurate to .001-.002".

Remember, these engines cycle hundreds of degrees during operation, which does far more to create tolerance mismatch than a 90°F difference while building your engine ever could.

Thank you kindly!
I only deal in organics, so I have zero idea about metallurgy. Thank you very much, good sir.
Old 12-04-2014, 09:39 PM
  #22  
On The Tree
iTrader: (3)
 
BattleSausage's Avatar
 
Join Date: May 2010
Location: Ga
Posts: 149
Likes: 0
Received 0 Likes on 0 Posts
Default

Originally Posted by iliveonnitro
I figured I should chime in, since I'm a mechanical engineer and my daily job is to understand heat transfer, thermal stresses, tolerances, etc in automotive applications...and this thread is a bit depressing.

Spark notes: there is no difference between building your engine at 10°F or at 100°F.

Aluminum and iron/steel have CTEs (coefficient of thermal expansion) between 5-13 microns/in-°F, and mostly in the 6-9 microns/in-°F range. In other words, even a 100°F change in temperature will only affect tolerances in the 0.0003"/in range. On any surface you can measure, it may only be 25 microns, which is orders of magnitude smaller than anything you can accurately measure at home. Eg, your calipers are, on a good day, only accurate to .001-.002".

Remember, these engines cycle hundreds of degrees during operation, which does far more to create tolerance mismatch than a 90°F difference while building your engine ever could.
So you're saying that per 1 degree change, you're looking at a change of generally 6-9 microns? Given that 25 microns is roughly .001" (.00098), that is a huge difference. If what you're saying is true, there would be hardly any need to maintain a tightly controlled, consistent temp when machining blocks like what RED/ERL use (or any other competent shop).

While I don't have a chart of my own from personal experience, I do remember reading in depth about this exact problem in college (switched majors from Mechanical to Electrical) when building an engine for class. One of the articles from a published journal conducted measurements in 10-degree increments. What I remember most was that at just a ~70 degree change, there was almost a .0015" change in main diameter, iirc. I also think Engine Masters or Hot Rod had an article illustrating the same point with very similar results.

Just from quickly searching the AERA website for some data about this, I found this article and quoted the pertinent portion; the full piece is here:
http://www.aera.org/engine-professio...re-distortion/

Not only are there thermal dimensional changes, but these changes are far from uniform throughout the bore. The major effect is the gross dimensional change in bore diameter, which can change .0005–.001", per inch of bore diameter, at temperatures between 65°F and 210°F. For a common 4" bore, this equals .002–.004" of growth.
That said, 25 microns (.001") should be easily measured with a halfway decent dial bore gauge or caliper, given that any competent machinist should be using one accurate to .0005" or better.
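If anyone wants to run the article's numbers for themselves, here's a quick sketch (my own back-of-the-envelope, not from the AERA piece), assuming plain linear expansion, a uniform part, and round handbook CTE values:

Code:
# Ideal linear growth of a bore diameter for a given temperature swing.
# CTE values are generic handbook numbers (in/in-F), not measured on any block.
def bore_growth(bore_in, delta_f, cte):
    return bore_in * delta_f * cte

delta_f = 210 - 65  # the 145 F swing cited in the AERA article
for name, cte in (("cast iron", 5.9e-6), ("aluminum", 12.3e-6)):
    print(name, round(bore_growth(4.0, delta_f, cte), 4), "in on a 4.000 in bore")
# cast iron -> ~0.0034 in, aluminum -> ~0.0071 in

Cast iron lands inside the article's .002–.004" range; the aluminum figure comes out larger mainly because its coefficient is roughly double.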
Old 12-04-2014, 10:03 PM
  #23  
TECH Senior Member
iTrader: (7)
 
Project GatTagO's Avatar
 
Join Date: Mar 2003
Location: The City of Fountains
Posts: 10,216
Received 1,499 Likes on 935 Posts
LS1Tech 20 Year Member
Default

Originally Posted by BattleSausage
So you're saying that per 1 degree change, you're looking at a change of generally 6-9 microns? Given that 25 microns is roughly .001" (.00098), that is a huge difference. If what you're saying is true, there would be hardly any need to maintain a tightly controlled, consistent temp when machining blocks like what RED/ERL use (or any other competent shop).

While I don't have a chart of my own from personal experience, I do remember reading in depth about this exact problem in college (switched majors from Mechanical to Electrical) when building an engine for class. One of the articles from a published journal conducted measurements in 10-degree increments. What I remember most was that at just a ~70 degree change, there was almost a .0015" change in main diameter, iirc. I also think Engine Masters or Hot Rod had an article illustrating the same point with very similar results.

Just from quickly searching the AERA website for some data about this, I found this article and quoted the pertinent portion; the full piece is here:
http://www.aera.org/engine-professio...re-distortion/



That said, 25 microns (.001") should be easily measured with a halfway decent dial bore gauge or caliper, given that any competent machinist should be using one accurate to .0005" or better.
So you're saying that a delta of 145 degrees results in a change in a 4" bore of roughly .003"...right?

Fair enough...but the OP is trying to build an engine in a 10 degree environment, which is a 60 degree delta from STP...so how is your citation relevant?

Andrew
Old 12-04-2014, 11:07 PM
  #24  
TECH Enthusiast
Thread Starter
iTrader: (3)
 
BigBadWhitey's Avatar
 
Join Date: Jul 2008
Posts: 652
Likes: 0
Received 7 Likes on 5 Posts
LS1Tech 10 Year
Default

Originally Posted by Project GatTagO
So you're saying that a delta of 145 degrees results in a change in a 4" bore of roughly .003"...right?

Fair enough...but the OP is trying to build an engine in a 10 degree environment, which is a 60 degree delta from STP...so how is your citation relevant?

Andrew
Also, in his reply he stated that at a 70 degree delta there was a 0.0015" change.

That still isn't exactly make-or-break, but that is good to know.

Thank you for the article; that'll be my nightly study break read.
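(For a rough sense of scale, here's the article's 145°F figure scaled linearly down to the smaller deltas being discussed; this assumes the growth really is proportional to the temperature change, which is the simplification everyone is using here.)

Code:
# Linear scaling of the ~0.003 in growth quoted for a 145 F swing on a 4 in bore.
base_growth, base_delta = 0.003, 145.0
for delta_f in (60, 70):
    print(delta_f, "F delta ->", round(base_growth * delta_f / base_delta, 4), "in")
# 60 F -> ~0.0012 in, 70 F -> ~0.0014 in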
Old 12-04-2014, 11:53 PM
  #25  
On The Tree
iTrader: (3)
 
BattleSausage's Avatar
 
Join Date: May 2010
Location: Ga
Posts: 149
Likes: 0
Received 0 Likes on 0 Posts
Default

I'm not trying to prove you wrong, just asking you to explain more in depth.

What coefficient of thermal expansion would one use here? If using the STP that NIST uses, that would be 68 degrees, which makes sense given that's what our instruments are calibrated for. The values that I looked up for steel were roughly .0000067/degree, or .0004 for that ~60 degree delta down to 10°F, and for iron .0000059/degree, or .000354, which coincides with your numbers. Aluminum was .0000123, or .000738, which is still within a threshold most would tolerate. Using the aluminum coefficient at the 145 degree delta gives only .0018 rounding up, which is well below what they were showing, putting iron and steel far below those stated numbers.

So, is there a different set of coefficients, other than linear, that could explain their results, or does that coefficient change somewhat below STP? Just trying to figure it out, because the two don't jibe with one another, though the numbers above all assume uniformity through the metal, which we know is definitely not the case with cast blocks.

I will say that I completely agree those numbers are small when compared to the tolerances most folks are accustomed to, but shouldn't be taken for granted.
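Here's the same arithmetic in one place, using the linear coefficients quoted above (again just a sketch: uniform material, purely linear expansion, no allowance for casting variation):

Code:
# Per-inch growth for the coefficients quoted above (in/in-F), at the
# ~60 F delta down to 10 F and at the article's 145 F delta.
ctes = {"steel": 6.7e-6, "cast iron": 5.9e-6, "aluminum": 12.3e-6}
for delta_f in (60, 145):
    for metal, cte in ctes.items():
        print(f"{metal:9s} {delta_f:3d} F delta: {cte * delta_f:.6f} in per inch")
# At 60 F:  steel ~0.0004, iron ~0.00035, aluminum ~0.00074 in/in
# At 145 F: steel ~0.00097, iron ~0.00086, aluminum ~0.0018 in/in

Multiply by the bore diameter to compare against the article's per-bore numbers.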
Old 12-04-2014, 11:58 PM
  #26  
On The Tree
iTrader: (3)
 
BattleSausage's Avatar
 
Join Date: May 2010
Location: Ga
Posts: 149
Likes: 0
Received 0 Likes on 0 Posts
Default

Originally Posted by BigBadWhitey
Also, in his reply he stated that at a 70 degree delta there was a 0.0015" change.

That still isn't exactly make-or-break, but that is good to know.

Thank you for the article; that'll be my nightly study break read.
The reason I mentioned the 70 degrees is because it coincided with the AERA article and made sense to me. All the coefficients assume the metal is uniform, which a casting can deviate from greatly; I would think aluminum more so.

I'll look and see if I can find that article with the incremental breakdown and perhaps it explains how they arrived at those numbers.
Old 12-08-2014, 07:45 AM
  #27  
Staging Lane
 
iliveonnitro's Avatar
 
Join Date: Nov 2010
Posts: 93
Likes: 0
Received 1 Like on 1 Post

Default

I need to correct my previous post. When I was looking up CTEs, I was reading it as "microns" when it was labeled "microin," meaning microinches. Even I make mistakes, which is also why you shouldn't take advice from the internet.

What this means is that when I said there can be a 5-13 micron/in-F change in expansion, what I really meant was 5-13 microin/in-F (microinches). This means it's NOT a .0002"/in-F (5 microns); it's actually 0.000005"/in-F. This also makes more intuitive sense, and it actually helps my stance on there being no difference in building an engine in cold or hot weather.

A 100F change in ambient temperature will only make for a mismatch of ~0.0006"/in.


Originally Posted by BattleSausage
So you're saying that per 1 degree change, you're looking at a change of generally 6-9 microns? Given that 25 microns is roughly .001" (.00098), that is a huge difference. If what you're saying is true, there would be hardly any need to maintain a tightly controlled, consistent temp when machining blocks like what RED/ERL use (or any other competent shop).
No, not quite. I'll use my corrected numbers. Per 1 degree change, there is closer to 0.000006" (about .152 microns) change, per inch of material.

Originally Posted by BattleSausage
That said, 25 microns (.001") should be easily measured with a halfway decent dial bore gauge or caliper, given that any competent machinist should be using one accurate to .0005" or better.
Sure, a machinist can accurately measure 0.001", maybe even with calipers, but we are not machinists. They have a ton of practice in repeatable measurements. They use calibrated measurement tools, and when they start getting into that <0.001" range, they stop using calipers/micrometers and switch to non-handheld instruments to eliminate human error.
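To make the unit mix-up concrete, a quick conversion check (using an assumed mid-range CTE of 6 microinches/in-°F):

Code:
# 1 microinch = 1e-6 in; 1 micron = 1e-6 m, i.e. 1 uin = 0.0254 um.
cte_uin = 6.0     # assumed mid-range CTE, microinches per in-F
delta_f = 100     # ambient swing discussed above

print(round(cte_uin * 1e-6 * delta_f, 6), "in of growth per inch over 100 F")  # 0.0006
print(round(cte_uin * 0.0254, 3), "um per in-F")                               # 0.152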

Last edited by iliveonnitro; 12-08-2014 at 07:58 AM.
Old 12-08-2014, 07:55 AM
  #28  
Staging Lane
 
iliveonnitro's Avatar
 
Join Date: Nov 2010
Posts: 93
Likes: 0
Received 1 Like on 1 Post

Default

Originally Posted by BattleSausage
What coefficient of thermal expansion would one use here? If using the STP that NIST uses, that would be 68 degrees, which makes sense given that's what our instruments are calibrated for. The values that I looked up for steel were roughly .0000067/degree, or .0004 for that ~60 degree delta down to 10°F, and for iron .0000059/degree, or .000354, which coincides with your numbers. Aluminum was .0000123, or .000738, which is still within a threshold most would tolerate. Using the aluminum coefficient at the 145 degree delta gives only .0018 rounding up, which is well below what they were showing, putting iron and steel far below those stated numbers.

So, is there a different set of coefficients, other than linear, that could explain their results, or does that coefficient change somewhat below STP? Just trying to figure it out, because the two don't jibe with one another, though the numbers above all assume uniformity through the metal, which we know is definitely not the case with cast blocks.

I will say that I completely agree those numbers are small when compared to the tolerances most folks are accustomed to, but shouldn't be taken for granted.
Your referenced numbers are correct (see my above post). I feel like this has happened before (link)...

The difference in CTE between steel and iron is practically negligible at the temperatures and tolerances we are talking about.
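For a number on that, here's the growth difference implied by the two coefficients quoted earlier, over the deltas in this thread (same linear-expansion caveats as before):

Code:
# Growth *difference* between the quoted steel and cast iron CTEs on a 4 in bore.
cte_steel, cte_iron = 6.7e-6, 5.9e-6   # in/in-F, handbook values quoted above
bore = 4.0
for delta_f in (60, 145):
    diff = (cte_steel - cte_iron) * delta_f * bore
    print(f"{delta_f} F delta: {diff:.5f} in difference on a 4 in bore")
# ~0.0002 in at 60 F, ~0.0005 in at 145 F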
Old 12-08-2014, 09:56 AM
  #29  
TECH Enthusiast
iTrader: (1)
 
Briou's Avatar
 
Join Date: May 2012
Posts: 570
Likes: 0
Received 0 Likes on 0 Posts
Default

I'd like to see an uncertainty analysis on this sort of thing, just out of curiosity. Though it is probably more negligible than anything else, I wonder how much the change in temp would affect the measurement tools themselves. I'm sure it is within the uncertainty bounds given by the manufacturer, but it would be interesting to see the total uncertainty of this situation.
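Not a proper GUM-style analysis, but as a toy example of how the pieces could be combined (root-sum-square of an assumed caliper accuracy and the thermal growth of a 4" steel dimension over a 60°F swing; both inputs are illustrative, not manufacturer specs):

Code:
import math

caliper_accuracy = 0.001            # in, assumed rated caliper accuracy
thermal_term = 6.5e-6 * 60 * 4.0    # in, linear growth of a 4 in steel part over 60 F

combined = math.sqrt(caliper_accuracy**2 + thermal_term**2)
print(round(thermal_term, 5), "in thermal;", round(combined, 5), "in combined")
# thermal ~0.00156 in, combined ~0.00185 in

If the gauge and the part are the same material and soaked to the same temperature, most of that thermal term cancels, which is why the temperature of the tools matters as much as the temperature of the part.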
Old 12-08-2014, 01:31 PM
  #30  
Staging Lane
 
iliveonnitro's Avatar
 
Join Date: Nov 2010
Posts: 93
Likes: 0
Received 1 Like on 1 Post

Default

Originally Posted by Briou
I'd like to see an uncertainty analysis on this sort of thing, just out of curiosity. Though it is probably more negligible than anything else, I wonder how much the change in temp would affect the measurement tools themselves. I'm sure it is within the uncertainty bounds given by the manufacturer, but it would be interesting to see the total uncertainty of this situation.
Which measurement tools? People make a living from understanding these differences, so I'm no expert. It can be significant, though. E.g., your hand heating up the frame of a micrometer can affect the actual measurement. It's common for micrometers or larger gage blocks to change by 10µin just from holding them in your hand for a few minutes. This is all beyond the scope of this thread, though.

http://www2.mitutoyo.de/fileadmin/us...U_4mmspine.pdf
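To put that 10µin figure in context, a rough estimate (mine, not from the Mitutoyo paper) of the temperature rise needed to move a 1" steel gage block by that much:

Code:
# Temperature rise needed to grow a 1 in steel block by ~10 microinches,
# assuming a steel CTE of about 6.5 uin/in-F.
cte = 6.5e-6      # in/in-F
length = 1.0      # in
drift = 10e-6     # in, the 10 uin drift mentioned above

print(round(drift / (cte * length), 1), "F rise gives ~10 uin of growth")
# -> about 1.5 F, which a few minutes of hand contact can easily provide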


