Researchers eye papermaking improvements through high-performance computing

phys.org | 10/5/2017 | Staff

To the naked eye, a roll of paper towels doesn't seem too complicated. But look closely enough, and you'll see it's made up of layers of fibers with thousands of intricate structures and contact points. These fluffy fibers are squeezed together before they are printed in patterns, and the resulting texture is key to the paper's performance.

For a large paper product manufacturer like Procter & Gamble, which regularly uses high-performance computing to develop its products, simulating this behavior, the way in which those paper fibers contact each other, is complicated and expensive. The preprocessing stage of generating the necessary computational geometry and simulation mesh can be a major bottleneck in product design, wasting time, money and energy.


To help the company speed up the development process, Lawrence Livermore National Laboratory (LLNL) researcher Will Elmer and his team of programmers focused their efforts on developing a parallel program called p-fiber. Written in Python, the program prepares the fiber geometry and meshing input needed for simulating thousands of fibers, relying on a meshing tool called Cubit, created at Sandia National Laboratories, to generate the mesh for each individual fiber. The p-fiber code has been tested on parallel machines developed at Livermore for mission-critical applications. P-fiber prepares the input for ParaDyn, the parallel-computing version of DYNA3D, a code for modeling and predicting thermomechanical behavior.
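
To make the preprocessing step concrete, here is a minimal Python sketch of the parallel per-fiber meshing pattern the article describes: each worker writes a small journal file for one fiber and runs Cubit on it in batch mode. The file layout, geometry commands, and Cubit command-line flags below are illustrative assumptions, not the actual p-fiber interface.

    # Hypothetical sketch of parallel per-fiber meshing; every name,
    # journal command, and CLI flag here is an assumption, not the
    # real p-fiber code.
    import subprocess
    from multiprocessing import Pool
    from pathlib import Path

    CUBIT = "cubit"            # assumed name/path of the Cubit executable
    OUT_DIR = Path("meshes")   # assumed output directory for per-fiber meshes

    def mesh_fiber(fiber_id: int) -> Path:
        """Write a one-fiber Cubit journal file and run Cubit in batch mode."""
        journal = OUT_DIR / f"fiber_{fiber_id:05d}.jou"
        mesh = OUT_DIR / f"fiber_{fiber_id:05d}.e"
        # Placeholder geometry: a real fiber would get its own curved,
        # layered geometry commands here. Journal syntax is assumed.
        journal.write_text(
            "create cylinder height 2.0 radius 0.01\n"
            "volume 1 size 0.002\n"
            "mesh volume 1\n"
            f'export mesh "{mesh}" overwrite\n'
        )
        # Batch flags are assumptions; check your Cubit version's CLI.
        subprocess.run([CUBIT, "-batch", "-nographics", str(journal)], check=True)
        return mesh

    if __name__ == "__main__":
        OUT_DIR.mkdir(exist_ok=True)
        # One independent Cubit run per fiber, spread across local cores;
        # p-fiber itself targets much larger parallel machines.
        with Pool() as pool:
            meshes = pool.map(mesh_fiber, range(1000))
        # The per-fiber meshes would then be assembled into a single
        # ParaDyn input deck downstream.

Because each fiber meshes independently, this preprocessing step is embarrassingly parallel, which is what lets a tool like p-fiber spread the bottleneck across thousands of cores instead of meshing fibers one after another.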

The ensuing research, performed for an HPC4Manufacturing (HPC4Mfg) project with the papermaking giant, resulted in the largest multi-scale model of paper products to date, simulating thousands of fibers in ParaDyn with resolution down to the micron scale.


"The problem is larger than the industry is comfortable with, but we have machines with 300,000 cores, so it's small in comparison to some of the things we run," Elmer said. "We found that you can save on design cycle time. Instead of having to...
(Excerpt) Read more at: phys.org