Journal article details
Engineering Reports
Virtual microstructure design for steels using generative adversarial networks
Nam Hoon Goo [1], Myungho Pyo [2], Woon Bae Park [2], Kee‐Sun Sohn [3], Jin‐Woong Lee [3]
[1] Advanced Research Team, Hyundai Steel DangJin Works, DangJin, Republic of Korea; [2] Department of Printed Electronics Engineering, Sunchon National University, Chonnam, Republic of Korea; [3] Faculty of Nanotechnology and Advanced Materials Engineering, Sejong University, Seoul, Republic of Korea
Keywords: cycle GAN; DCGAN; metallography; micrograph; microstructure; Pix2Pix
DOI: 10.1002/eng2.12274
Source: DOAJ
【 Abstract 】

Abstract The prediction of macro‐scale materials properties from microstructures, and vice versa, should be a key part in modeling quantitative microstructure‐physical property relationships. It would be helpful if the microstructural input and output were in the form of visual images rather than parameterized descriptors. However, a typical supervised learning technique alone would be insufficient to build a model with real‐image output. A generative adversarial network (GAN) is required to treat visual images as output for a promising PMPR model. Recently developed deep‐learning‐based GAN techniques such as the deep convolutional GAN (DCGAN), the cycle‐consistent GAN (Cycle GAN), and conditional‐GAN‐based image‐to‐image translation (Pix2Pix) could be of great help through the creation of realistic microstructures. In this regard, we generated virtual micrographs for various types of steels using a DCGAN, a Cycle GAN, and Pix2Pix, and confirmed that the generated micrographs are qualitatively indistinguishable from the ground truth.
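To illustrate the kind of model the abstract refers to, the sketch below shows a minimal DCGAN in PyTorch that maps a random noise vector to a synthetic grayscale "micrograph" and trains it adversarially against a discriminator. This is not the authors' implementation: the 64 x 64 image size, layer widths, latent dimension, and training step are illustrative assumptions only.

```python
# Minimal DCGAN sketch (assumed architecture, not the paper's code):
# generator upsamples noise to a 1x64x64 image; discriminator scores realism.
import torch
import torch.nn as nn

LATENT_DIM = 100  # length of the noise vector z (assumed)

class Generator(nn.Module):
    """Upsamples a noise vector into a 1x64x64 synthetic micrograph."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(LATENT_DIM, 256, 4, 1, 0, bias=False),  # -> 4x4
            nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),         # -> 8x8
            nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),          # -> 16x16
            nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 32, 4, 2, 1, bias=False),           # -> 32x32
            nn.BatchNorm2d(32), nn.ReLU(True),
            nn.ConvTranspose2d(32, 1, 4, 2, 1, bias=False),            # -> 64x64
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Scores whether a 1x64x64 image looks like a real micrograph."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, 4, 2, 1, bias=False), nn.LeakyReLU(0.2, True),  # -> 32x32
            nn.Conv2d(64, 128, 4, 2, 1, bias=False),
            nn.BatchNorm2d(128), nn.LeakyReLU(0.2, True),                    # -> 16x16
            nn.Conv2d(128, 256, 4, 2, 1, bias=False),
            nn.BatchNorm2d(256), nn.LeakyReLU(0.2, True),                    # -> 8x8
            nn.Conv2d(256, 1, 8, 1, 0, bias=False), nn.Sigmoid(),            # -> scalar score
        )

    def forward(self, x):
        return self.net(x).view(-1)

def train_step(G, D, real, opt_G, opt_D, loss=nn.BCELoss()):
    """One adversarial update on a batch `real` of shape [N, 1, 64, 64] in [-1, 1]."""
    n = real.size(0)
    z = torch.randn(n, LATENT_DIM, 1, 1)
    fake = G(z)

    # Discriminator: push real images toward 1, generated images toward 0.
    opt_D.zero_grad()
    d_loss = loss(D(real), torch.ones(n)) + loss(D(fake.detach()), torch.zeros(n))
    d_loss.backward()
    opt_D.step()

    # Generator: try to make the discriminator output 1 for generated images.
    opt_G.zero_grad()
    g_loss = loss(D(fake), torch.ones(n))
    g_loss.backward()
    opt_G.step()
    return d_loss.item(), g_loss.item()
```

The Cycle GAN and Pix2Pix models mentioned in the abstract follow the same adversarial scheme but condition the generator on an input image (for example, translating between micrograph domains) rather than on pure noise.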

【 License 】

Unknown   

  Article metrics
  Downloads: 0  Views: 0