nwong103
I'm an entry-level engineer and hopefully a future designer. When I tolerance things, I just use the default (everything is +-.05 inches), and for more critical features I tighten it to +-.005 inches. I do some basic worst-case stack-up calculations (like the sketch below) to make sure everything will still fit. This is definitely the WRONG way, since doing anything in engineering without understanding why is wrong in my book. There has to be a better way to do this, and I'd like to know it.
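By "basic calculations" I mean something like this worst-case stack-up check (a minimal sketch with made-up numbers, not from any real part):

```python
# A minimal worst-case stack-up sketch -- all numbers here are made up.
# Three parts stacked end-to-end must fit inside a housing slot.

nominal = [1.000, 0.500, 0.250]  # nominal part lengths, inches
tol     = [0.005, 0.005, 0.005]  # symmetric tolerances, +/- inches

slot_nominal = 1.780             # slot width, inches
slot_tol     = 0.005             # +/- inches

# Worst case for fit: every part at its maximum, slot at its minimum.
stack_max = sum(n + t for n, t in zip(nominal, tol))
slot_min  = slot_nominal - slot_tol

clearance = slot_min - stack_max
print(f"worst-case clearance: {clearance:+.3f} in")
# Negative clearance means the parts can interfere at the limits, so the
# tolerances (or the nominals) have to change.
```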
I often wonder what defines the limits of tolerances and how those limits are determined.
For example, if I set a tolerance of +-.0001 inch on a circular cut, is that achievable? I'm guessing it all depends on the type of CNC machine and how accurately it can make the cut.
I have very limited machining experience, but when I do machine something in the shop on a lathe, mill, or drill press, I can't even imagine holding +-.05 inch by eyeballing measurements. I'm sure machinists almost never eyeball things, since eyeballed measurements are hardly accurate enough. So back to my question: how do you know what tolerance is actually achievable?
If possible, can you point me to some resources so I can educate myself? Does the ASME Y14.5 GD&T standard go over these details? I always thought that standard just defines how to properly dimension your work so it isn't a huge mess, follows conventions, and is easy to communicate from drawing to machining. I'm guessing it doesn't teach how an engineer properly tolerances things or explain why an engineer chose one specific numerical value over another.
Do you think taking a machine shop class would teach me this stuff? Or would I just be wasting my time learning how to use the mill, lathe, drill press, etc., which don't have the best accuracy? I've used all the machines in the shop on campus, and something tells me only automated machines like CNCs can produce really accurate results.