Abstract
To clarify the differences in the results of hydrogen compatibility testing obtained with different techniques, slow strain rate tensile (SSRT) tests were conducted with three techniques at temperatures ranging from 77 K to room temperature (RT): external hydrogen testing of solid specimens (S-E), internal hydrogen testing of solid specimens (S-I), and external hydrogen testing of hollow specimens (H-E). At all test temperatures, the relative reduction in area (RRA) was highest in H-E, followed by S-E and S-I. However, the absolute value of RRA varied with temperature, indicating that the RRA depended on test temperature regardless of testing technique. In the absence of hydrogen, the fracture surfaces of the specimens exhibited ductile features characterized by dimples, independent of both test temperature and technique. In the presence of hydrogen, by contrast, intergranular (IG) facets appeared on the fracture surfaces, and the fraction of these facets correlated with the RRA value: the lower the RRA, the higher the observed facet fraction. The variation in RRA was attributed to differences in the hydrogen supply mechanism. Specifically, in the S-E condition, hydrogen diffusing along grain boundaries during testing contributed to the IG fracture, whereas in the S-I condition, the IG fracture was caused primarily by pre-charged hydrogen trapped at the grain boundaries before testing. In the S-E condition, additional time and strain were therefore required to cause the IG fracture, resulting in higher RRA values than in the S-I condition. Furthermore, the highest RRA values, observed in the H-E condition, were likely influenced not only by the hydrogen supply mechanism but also by the specimen geometry.
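(For reference, RRA in SSRT hydrogen-embrittlement studies is commonly defined as the ratio of the reduction in area measured in the hydrogen condition to that measured in a hydrogen-free reference condition; the exact reference environment used in this work is not stated in the abstract and is an assumption here:)

\[
\mathrm{RRA} = \frac{RA_{\mathrm{H}}}{RA_{\mathrm{ref}}} \times 100\,\%
\]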