I was recently appointed QA manager at my company and have dived headfirst into the realm of Schmidt Hammer (roll hardness) testing. A significant issue we have is with warp. I have noticed that most of the time when I see a difference of greater than 10 on the Schmidt Hammer test, we suffer unacceptable warp issues. Since I started hardness testing inbound rolls, we have adopted a pass/fail system in which any roll that tests at a difference greater than 10 is rejected. Compounding the problem, our supplier follows a standard that says anything under 15 is acceptable by industry norms.
What is your take on this? Do you feel it is unreasonable to ask the mill to commit to a difference of less than 10?
The mill is also questioning my testing methods. TAPPI T834 suggests sampling a roll every 6 inches across its width (the frequency the mill uses). I have always been taught that in any scientific process a larger sample size yields a more accurate result. With that in mind, I chose to test my rolls every 3 inches, which doubles my sample size.
Do you feel that my testing method is somehow skewing the results? That is, is my larger sample size causing rolls to “fail” more frequently?
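To make the question concrete, here is a minimal sketch of the pass/fail logic described above. The hardness readings are invented for illustration; the threshold of 10 and the 3-inch vs 6-inch grids follow the protocols discussed. It shows how a finer sampling grid can catch a soft streak that the coarser grid steps right over, failing a roll the coarser grid would pass.

```python
def hardness_range(readings):
    """Spread between the hardest and softest spot sampled on the roll."""
    return max(readings) - min(readings)

def passes(readings, max_spread=10):
    """Roll passes if the hardness spread is within the limit."""
    return hardness_range(readings) <= max_spread

# Invented Schmidt Hammer readings taken every 3 inches across one roll.
every_3in = [55, 58, 54, 46, 57, 56, 53, 55]

# The every-6-inch protocol sees only every other reading.
every_6in = every_3in[::2]  # [55, 54, 57, 53]

print(passes(every_6in))  # True  - coarser scan misses the soft spot (46)
print(passes(every_3in))  # False - finer scan catches it; spread is 12
```

Nothing here is specific to T834; it only illustrates why doubling the sample density makes a reject more likely on the same roll, which is the crux of the disagreement.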
I love the three-inch method. However, you would not be able to correlate your results with any of the roughly 140 containerboard machines out there. I agree you are more likely to find a wet streak with your protocol, but mill specs usually limit moisture deviation to six inches or less.
Do you have linerboard specification sheets from other companies? You may want to investigate roll hardness from other manufacturers. If your supplier is so entrenched in their thinking, you might consider saving their rolls and having one of their corrugator supervisors show you how to run the board without warp. It’s a bold move!