1 Introduction
In recent years, the application of hexahedral (hex) meshes in finite element and isogeometric analysis has become increasingly widespread because of their good numerical performance, small storage requirements, and natural advantage for constructing tensor-product splines. However, hex-mesh generation is not yet mature, and it cannot be guaranteed that a good-quality initial mesh can be generated in all cases. For complex shapes and structural models, the octree-based mesh generation method was proposed
zhang2006adaptive ; Ito2010Octree . This method is efficient and robust, and it can ensure a topologically valid and well-formed meshing result. However, it generates a large number of cells and too many singularities. In some scenarios, we do not need a dense mesh and complicated interior structures. Meshes with a simple structure and fewer singularities are more conducive to accelerating computation and convergence bourdin2007comparison . Therefore, it is very important to propose an effective singularity structure simplification method for hex-meshes.

Some research work has contributed to this topic in the past 10 years. In woodbury2011localized , an adaptive localized hex-mesh coarsening method was proposed. Topological operations such as collapsing and pillowing are applied locally, and localized coarsening is achieved while maintaining the topological connectivity and shape of the input mesh, which provides a basic idea for hex-mesh coarsening. In gao2015hexahedral , the mesh structure is simplified according to reparameterization requirements, and singularities are effectively reduced while maintaining the number of mesh elements. Template matching is used to split patches and eliminate the leading blocks. However, its implementation is very limited and not robust: it cannot simplify self-interleaved and closed loops, resulting in poor results on input meshes obtained from octree-based methods. In gao2017robust , a robust hex-mesh structure simplification method was proposed. Even when a feasible solution with a simpler and coarser structure exists, however, the algorithm might fail to find it. In particular, the ranking method for selecting to-be-collapsed base-complex sheets/chords is based only on their thickness, which cannot guarantee the removal of most singular structures. It also introduces a few closed loops and terminates the simplification process prematurely.
For an initial hex-mesh with many singular vertices, a proper priority ranking algorithm is needed to guide the simplification of the singularity structure. Moreover, a local parameterization is also needed to improve the mesh quality and repair the topology structure after simplification. In this paper, we propose an improved singularity structure simplification method for hex-meshes. The main contributions can be summarized as follows:

A new weighted ranking approach for singularity structure simplification is proposed by combining the valence prediction function of the local singularity structure, the shape quality metric of elements, and the width of base-complex sheets/chords.

A local optimization for SLIM rabinovich2017scalable is proposed to improve the uniformity of hex elements while maintaining the element quality.

An adaptive sheet refinement method is proposed to preserve surface features while maintaining a similar number of hex elements.
Based on these improvements, the proposed weighted ranking method can achieve a smaller number of singularities with a comparable Hausdorff distance ratio, effectively remove kinks in the hex-mesh, and yield better mesh quality than the thickness ranking method gao2017robust .
The remainder of the paper is structured as follows. A review of related hex-mesh generation and mesh simplification work is presented in Section 2. Basic concepts and a framework overview are described in Section 3. Section 4 presents the sheet and chord collapsing operations on the base-complex. The proposed weighted ranking approach is described in Section 5. Adaptive sheet refinement is presented in Section 6. In Section 7, the experimental results are illustrated. Finally, the paper is concluded and future work is outlined in Section 8.
2 Related Work
Hexahedral mesh generation. Hex meshing has been widely studied for decades. However, an automatic method that can generate high-quality hex-meshes for any complex geometry is still unavailable because of the strong topological constraints shepherd2007topologic , i.e., the dual chord and the dual sheet. Unlike in tetrahedral meshes, any local change in the mesh would propagate to the whole mesh along dual chords or dual sheets shepherd2007topologic , which makes hex-mesh generation a very challenging task. Some methods were devised for specific types of geometries. For example, the mapping method is well suited for mappable geometries, while the sweeping method roca2009paving is often used for swept volumes. Combined with domain partitioning, they can be applied to complex geometries wu2018fuzzy ; roca2009paving . Based on the idea of paving, several geometric and topological approaches have been proposed for all-hex meshing. Plastering blacker1993seams and H-Morph STEVEN2015H generate layers of hex elements in geometric ways, whereas the whisker weaving method tautges1996whisker ; ledoux2008extension uses the spatial twist continuum and generates the topological dual of the hex-mesh. Unconstrained plastering staten2010unconstrained extends plastering; different from other paving methods, it propagates the original geometry boundary instead of a pre-meshed boundary into the interior domain, and hex elements are generated when three propagating fronts intersect. The octree-based approach schneiders1997algorithm is very robust and can be executed in a highly automatic way; however, it yields poor-quality elements near the boundary, and the final mesh heavily relies on the orientation of the coordinate system. The polycube-based meshing approach uses a low-distortion mapping between the input model and a polycube, and computes the corresponding volumetric mappings.
Deformation methods have been introduced for polycube construction Gregson2011All ; Liu2015Feature ; Hu2016Centroidal ; Hu2017Surface , and frame fields have been proposed to guide the polycube construction Fang2016All ; Yu2014Optimizing . In Matthias2011CubeCover , Nieser et al. compute a global parameterization of the volume on the basis of a frame field to construct hex-meshes. Theoretical conditions on the singularities and the gradient frame field are derived to avoid degenerate parameterizations, and badly placed singularities can lead to distortion. Based on a spherical harmonics representation, Huang et al. Jin2011Boundary generate a boundary-aligned smooth frame field by minimizing an energy function. Though impressive results have been obtained with frame-field-based approaches, further efforts are still needed for practical use.
Mesh simplification. Mesh simplification generally reduces the number of elements while preserving the appearance of the original mesh by performing local coarsening operations. For triangular meshes, edge flipping operations can be combined with local minimization of an energy function, and this approach has also been applied to hierarchical mesh generation by step-by-step simplification. For quadrilateral and hex-mesh simplification, similar local operations were proposed Tarini2010Practical ; shepherd2010adaptive . Sheets and chords are extracted from the inherent dual structure, and local simplification operations are applied to them gao2015hexahedral ; gao2017robust . Recent progress in structure simplification has achieved great success in polycube simplification Gianmarco2016Polycube and hex-mesh optimization WANG2018103 . In Gianmarco2016Polycube , the singularity misalignment problem is solved directly in the polycube space, and a corner optimization strategy is introduced to produce coarser block-structured surface and volumetric meshes; moreover, the induced meshes are well suited for spline fitting. Topology control operations in hex-mesh simplification can also be applied to adjust low-quality mesh elements. In WANG2018103 , an adjustment strategy for repairing inverted elements was proposed by combining basic mesh editing operations with frame field optimization. Based on the singularity structure of the mesh, a base-complex block structure is extracted in gao2017robust ; a simplification operation then collapses base-complex sheets and chords while redistributing the distortion based on a volumetric parametrization. However, the selection of base-complex sheets/chords to be collapsed is based only on their thickness, which introduces a few closed loops, causes an early termination of simplification, and slows convergence.
In this paper, a new weighted ranking function is proposed by combining the valence prediction function of the local singularity structure, the shape quality metric of elements, and the width of base-complex sheets/chords.
3 Basic concepts and framework overview
The proposed hex-mesh simplification can effectively reduce the singularity structure while maintaining a specified number of elements. In this section, we briefly introduce the definitions of the singularity structure, the base-complex, and two types of structure called the base-complex sheet and the base-complex chord.
Base-complex. The valence of a vertex, edge, or face is defined as the number of its neighboring hex elements. A vertex is said to be regular if its valence is four on the boundary or eight in the interior. Similarly, an edge is regular when its valence is two on the boundary or four in the interior. A series of connected irregular edges with the same valence composes a singular edge, and its two ending vertices are called singular vertices, except in the case of closed singular edges. The singularity structure is composed of these singular edges and singular vertices, and according to the above definitions it can be extracted from any hex-mesh. Each singular edge can be extended to segmentation surfaces, and a valid manifold hex-mesh can be divided into cube-like components by these surfaces (refer to gao2015hexahedral for more details). The segmented structure obtained in this way is called the base-complex. The base-complex of a hex-mesh is denoted as B = {C, V, E, F}, where C is the set of cube-like components (each composed of hex elements), V and E are the set of eight corners of each cube-like component and the set of base-complex edges (a series of connected edges between two base-complex vertices), respectively, and F contains the base-complex faces of each component.
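As a small illustration of these definitions, the regularity tests for vertices and edges can be sketched as follows; the flat data layout (parallel lists of valences and boundary flags) is an assumption for illustration, not the paper's data structure:

```python
# Classify hex-mesh entities as regular or singular from their valence,
# i.e. the number of neighboring hex elements (see definitions above).
def is_regular_vertex(valence, on_boundary):
    """A vertex is regular with valence 4 on the boundary or 8 in the interior."""
    return valence == (4 if on_boundary else 8)

def is_regular_edge(valence, on_boundary):
    """An edge is regular with valence 2 on the boundary or 4 in the interior."""
    return valence == (2 if on_boundary else 4)

def singular_edges(edge_valences, edge_on_boundary):
    """Return indices of irregular (singular) edges."""
    return [i for i, (v, b) in enumerate(zip(edge_valences, edge_on_boundary))
            if not is_regular_edge(v, b)]
```

Chains of such irregular edges with equal valence then form the singular edges from which the base-complex is built.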
Base-complex sheets and base-complex chords can be extracted from the base-complex structure. Each of these components aligns with its adjacent components with continuity, and the singularities are located at its eight corners and its three groups of four topologically parallel base-complex edges. Removing components by collapsing base-complex sheets and chords can therefore effectively simplify the singularity structure. A base-complex sheet consists of three parts: the left surface S_l (or the right surface S_r) contains all base-complex vertices, edges and faces on the boundary of the left (or right) part, and the middle volume M contains the base-complex edges with one end node on S_l and the other on S_r. Topology elements in S_l and S_r can form element groups. A base-complex chord has a similar definition, in which the two sides follow the main diagonal direction. Fig. 1 shows the structure of a base-complex sheet and a base-complex chord.
Framework overview. As shown in Algorithm 1, we propose an improved singularity structure simplification method for hex-meshes that maintains the shape boundary and a target number of elements. Comparison with experimental data shows that the collapsing order of base-complex sheets and chords has a significant effect on the final simplification results. Hence, we propose an optimized weighted ranking approach for component removal based on an analysis of edge valences. All the base-complex sheets/chords are ranked by the valence error obtained by minimizing an objective function of the singularity structure. With the proposed method, the singularity structure complexity of a hex-mesh decreases rapidly; furthermore, a few closed loops and entangled sheets can be effectively eliminated, leading to a high simplification rate. In addition, two extra ranking terms are adopted to maintain the element quality and the shape boundary. During simplification, sheet refinement is performed to keep the number of elements close to the target number. We propose an adaptive sheet refinement method based on the point-sampled Hausdorff distance on the surface, which improves the hex-element uniformity and reduces the error between the input and output hex-mesh geometry. To locally improve the uniformity and aspect ratio, we also propose a local regularization optimization in the parametrization for sheet/chord collapsing.
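The overall pipeline can be sketched as a priority-queue loop. All names here (`rank`, `passes_filters`, `collapse`, `refine`, and the mesh interface) are hypothetical placeholders for the operations detailed in Sections 4-6, not the paper's actual API:

```python
import heapq

def simplify(mesh, candidates, rank, passes_filters, collapse, refine,
             target_elements, max_iters=1000):
    """Greedy simplification loop: repeatedly collapse the best-ranked
    base-complex sheet/chord, refining when the element count drops
    below the target (high-level sketch of Algorithm 1)."""
    heap = [(rank(c), i, c) for i, c in enumerate(candidates)]
    heapq.heapify(heap)
    for _ in range(max_iters):
        if not heap:
            break
        _, _, cand = heapq.heappop(heap)    # best-ranked candidate first
        if not passes_filters(mesh, cand):  # valence / quality / shape checks
            continue
        collapse(mesh, cand)                # remove the sheet or chord
        if mesh.num_elements() < target_elements:
            refine(mesh)                    # adaptive sheet refinement
    return mesh
```

In the actual method the priority queue is re-ranked as collapses change the singularity structure; this sketch only shows the control flow.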
4 Coarsening operators on hex-meshes
In this section, we introduce two local coarsening operations on hex-meshes: the base-complex sheet collapsing operation and the base-complex chord collapsing operation, two generalized operations that reduce the singularity structure complexity of hex-meshes. The base-complex sheet collapsing operation is mainly applied to change singularities globally, and it has a bigger impact on the boundary shape. The base-complex chord collapsing operation is used to optimize the local singularity structure, especially for removing edge pairs with valences of 3/5. These two operations may introduce non-manifold and doublet configurations, as shown in Fig. 2. Moreover, the collapsing operations may locally increase the complexity, which should be prevented. Hence, several filtering criteria will be proposed to avoid these problematic cases.
4.1 Base-complex sheet collapsing operation
A base-complex sheet collapsing operation similar to gao2017robust is adopted here. Both sides of a sheet can be identified from its components; we then remove the middle part M of the base-complex sheet and preserve one side, S_l or S_r. Finally, a parametrization is employed to relocate the vertices within the k-ring neighborhood region (k is set to 4, as in gao2017robust ). Before sheet collapsing, several filtering criteria are used to decide whether the sheet should be put into the priority queue.
Valence prediction. Edge pairs in S_l and S_r are collapsed into single edges, and the corresponding edge valences may change. Generally, the valence of an inner edge must be greater than two; otherwise, the adjacent elements will degenerate or form a doublet configuration (two hexahedra sharing two or more faces, as in Fig. 2), which is forbidden in our framework. For an edge pair e_l and e_r in a non-self-intersecting sheet, if the new edge is denoted as e_m, its valence can be computed as follows:
(1) 
where N(e) is the valence of a base-complex edge, and the correction term depends on whether the base-complex face directly connecting e_l and e_r lies on the boundary or in the interior of the hex-mesh.
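A hedged reconstruction of this prediction: assuming the connecting face removes one adjacent hex from each side when it lies on the boundary and two when it is interior, the merged valence and the doublet filter can be sketched as below. This is our reading of Eq. (1) from the surrounding description, not a verbatim quote of it:

```python
def predicted_valence(n_left, n_right, face_on_boundary):
    # Each merged edge loses the hexes of the sheet's middle volume
    # incident through the connecting face: 1 per side on the boundary,
    # 2 per side in the interior (assumed reconstruction of Eq. (1)).
    k = 1 if face_on_boundary else 2
    return n_left + n_right - 2 * k

def collapse_allowed(n_left, n_right, face_on_boundary, edge_on_boundary):
    # Doublet/degeneracy filter: an inner edge must keep valence > 2.
    n_m = predicted_valence(n_left, n_right, face_on_boundary)
    return n_m > (1 if edge_on_boundary else 2)
```

For a regular interior pair (valence 4 each, interior face) the prediction gives a regular valence-4 edge, while a pair of valence-3 edges would yield valence 2 and is filtered out.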
Boundary shape. The feature vertices/lines are extracted in the initialization stage, and in order to preserve sharp features, sheets and chords containing sharp feature vertices are not allowed to be removed. Moreover, a base-complex sheet is not collapsed when feature edges lie on its base-complex edges. In the collapsing operation, we proceed in a way similar to hex-mesh sheet collapsing. First, we find all elements on both sides, then choose temporary positions for the vertex pairs. The topology element pairs in S_l or S_r are preserved only on one side, and all hexahedra between the two sides are removed. In the optimization step, the local parameterization of gao2017robust is adopted: the boundary shape error and interior distortion are distributed to the k-ring neighboring elements by solving with the SLIM approach rabinovich2017scalable .
4.2 Base-complex chord collapsing operation
The base-complex chord collapsing operation is mainly used to locally optimize bad singularity structure. It only affects one column of base-complex components. Different from chord collapsing in a hex-mesh, which merges the four vertices of each group into a new position, we extract two pairs of opposite base-complex edges and merge them along the diagonal direction; Fig. 2 shows the 2D case of chord collapsing. Here the collapsing direction is denoted as the main diagonal direction, and the orthogonal direction along the boundary is referred to as the sub-diagonal direction. If the numbers of elements in opposite edges differ, we collapse several sub-sheets before applying the chord collapsing.
Collapsing direction. The collapse can be performed in either of two directions, and the two choices can have quite different influences on the singularity structure. The valences of the base-complex edges on the two sides along the main diagonal direction may change. Here, we only consider the four groups of topology-parallel base-complex edges on the surface of the chord following the direction of the dual string. We compute the predicted valences of the created base-complex edges, and obtain the valence difference between each created edge and a regular edge. Our objective is to remove pairs with valences of 3/5 without introducing high-valence singularities. In this paper, we measure the difference between the predicted valence and the regular valence using
(2) 
where the first pair of base-complex edges lies in the sub-diagonal direction and the second pair in the main diagonal direction, as shown in Fig. 2, and the normalization depends on the number of components contained in the base-complex chord. We choose the optimal collapsing direction by minimizing this difference; in our experiments, the chord collapsing operation is not allowed when the difference exceeds a threshold. In addition, we apply an easy-to-detect criterion in advance to improve efficiency: when fewer than two of the four groups of parallel edges consist entirely of singular edges, collapsing will not remove singular edges locally, and such chords are not pushed into the priority queues.
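This direction choice can be sketched as follows, assuming the cost is the summed deviation of the predicted valences from the regular valence (2 on the boundary, 4 in the interior); the threshold value and data layout are hypothetical placeholders for the elided constants of Eq. (2):

```python
def direction_cost(predicted_valences, on_boundary_flags):
    """Sum of deviations of the predicted created-edge valences from the
    regular valence (2 on the boundary, 4 in the interior)."""
    cost = 0
    for v, b in zip(predicted_valences, on_boundary_flags):
        regular = 2 if b else 4
        cost += abs(v - regular)
    return cost

def choose_direction(pred_main, flags_main, pred_sub, flags_sub, threshold=2):
    """Pick the diagonal direction with the smaller valence cost; reject
    the collapse entirely when even the best cost is too large."""
    c_main = direction_cost(pred_main, flags_main)
    c_sub = direction_cost(pred_sub, flags_sub)
    best, cost = ("main", c_main) if c_main <= c_sub else ("sub", c_sub)
    if cost > threshold:    # hypothetical cut-off: do not enqueue this chord
        return None
    return best
```

A chord whose collapse would create only regular edges has cost zero and is the ideal candidate.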
The above operations are performed iteratively during simplification. Base-complex sheet collapsing can have a significant global impact on the mesh, but it is extremely difficult to remove self-intersecting sheets with complex tangles and closed-loop configurations without creating vertices of high valence. Base-complex chord collapsing is used to eliminate the entangled regions, and it contributes to improving the simplification ratio of sheets. Experimental results show that a higher simplification rate can be achieved by alternately performing these two operations.
4.3 Local parameterization for uniformity improvement
After collapsing a sheet/chord, we apply a local parametrization gao2017robust based on SLIM rabinovich2017scalable to relocate points within the collapsing region. The SLIM framework uses the local/global algorithm Gotsman2010A , solving the distortion term globally while fixing the rotation computed in the local step. In the 3D case, the mapping from an original tetrahedral element to its deformed shape in a local orthogonal frame can be described by a Jacobian, and the deformation can be expressed indirectly through transformations from a reference tetrahedron with three orthogonal edges to both shapes, as shown in Fig. 3. The mapping from the reference element to the original element is defined as
(3) 
where
(4) 
is a constant matrix. Similarly, the mapping from the reference element to the deformed element is:
(5) 
Since both mappings are affine, the Jacobian of the deformation can be denoted as:
(6) 
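Concretely, with a tetrahedron's edge vectors from its first vertex forming the columns of a 3x3 matrix, the deformation Jacobian is the composition of the two affine maps of Eqs. (3)-(6). A pure-Python sketch (the function names are ours):

```python
def edge_matrix(tet):
    """Columns are the three edge vectors from vertex 0 of a tetrahedron
    given as four 3D points."""
    p0 = tet[0]
    return [[tet[j + 1][i] - p0[i] for j in range(3)] for i in range(3)]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_inv(A):
    """Inverse of a 3x3 matrix via the adjugate (cyclic cofactor) formula."""
    c = [[A[(i + 1) % 3][(j + 1) % 3] * A[(i + 2) % 3][(j + 2) % 3]
          - A[(i + 1) % 3][(j + 2) % 3] * A[(i + 2) % 3][(j + 1) % 3]
          for j in range(3)] for i in range(3)]
    det = sum(A[0][j] * c[0][j] for j in range(3))
    return [[c[j][i] / det for j in range(3)] for i in range(3)]

def deformation_jacobian(orig_tet, def_tet):
    """J maps the original tet onto the deformed tet: J = A_d * A_o^{-1}."""
    return mat_mul(edge_matrix(def_tet), mat_inv(edge_matrix(orig_tet)))
```

For a tetrahedron uniformly scaled by a factor s, this yields J = s * I, as expected.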
Our experiments show that adjusting the Jacobian of the transformation to a target shape in a local operation leads to an ideal mesh after global simplification. In this paper, we also propose a local optimization strategy that moves vertices within the collapsing region during parameterization: the lengths of edges in the collapsing region are rescaled while maintaining the element quality.
Let G = (V, E) denote the mesh of the parameterized region, where V is the set of nodes and E is the set of connectivity information (edges between nodes). The discrete operator on G is defined as
(7) 
and the iterative form can be defined as
(8) 
and the iteration is terminated when the variance falls below a threshold,
(9) 
where i and j are vertex indices, n_i is the number of neighboring vertices of the i-th vertex, V_I is the set of inner vertices (not including vertices in S_l and S_r), and V_B is the set of vertices on the boundary.
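A minimal sketch of this iteration, assuming a plain Laplacian update (each inner vertex moves to the centroid of its neighbors while boundary vertices stay fixed) and using the variance of the per-vertex squared displacements as the Eq. (9)-style stopping criterion; both choices are illustrative assumptions rather than the exact operator of Eqs. (7)-(9):

```python
def smooth(points, neighbors, interior, tol=1e-8, max_iters=100):
    """points: {vertex_id: [coords]}; neighbors: {vertex_id: [ids]};
    interior: iterable of vertex ids allowed to move."""
    pts = {i: list(p) for i, p in points.items()}
    for _ in range(max_iters):
        moves, new_pts = [], {}
        for i in interior:
            nbrs = neighbors[i]
            avg = [sum(pts[j][d] for j in nbrs) / len(nbrs)
                   for d in range(len(pts[i]))]
            moves.append(sum((a - b) ** 2 for a, b in zip(avg, pts[i])))
            new_pts[i] = avg
        pts.update(new_pts)            # boundary vertices stay untouched
        mean = sum(moves) / len(moves)
        var = sum((m - mean) ** 2 for m in moves) / len(moves)
        if var < tol:                  # variance-based termination
            break
    return pts
```

Rescaling edge lengths toward uniformity, as described above, amounts to biasing this update by target edge lengths instead of the plain centroid.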
5 Weighted ranking for structure simplification
Many hex-mesh generation methods, such as octree-based and frame-field methods, often yield unnecessary interior singularities. The resulting hex-mesh has a large number of small components in its base-complex, since the singular edges are distributed along the twelve edges of each cube-like component. The number of singularities can be progressively decreased by performing collapsing operations on components, and the simplified singularity structure is obviously different for different collapsing sequences. In this paper, we introduce a weighted ranking sequence that iteratively chooses the optimal candidate to remove. The ranking sequence aims to remove singularities within fewer iterative steps. We formulate this problem in an energy minimization framework, and introduce a valence term related to the valence difference caused by collapsing to achieve a rapid removal of singularities. In addition, optimization is performed after each simplification step, and the distortion error caused by collapsing is distributed to neighboring elements and sheets. On the other hand, the collapsing operation is also subject to the constraints that the resulting elements should not be inverted and that the maximum Hausdorff distance ratio should be respected. Hence, a sheet/chord removal leading to less mesh distortion is given higher collapsing priority. Motivated by this, we also introduce two extra ranking terms, called the distortion term and the width term. In our framework, the ranking function is a combination of the valence term, the distortion term, and the width term, which is more robust than the previous ranking method gao2017robust based only on the thickness of base-complex sheets/chords.
5.1 Ranking method of base-complex sheet
In the base-complex sheet ranking sequence, we combine the valence term, the distortion term, and the width term in a normalized form daniels2008quadrilateral . The ranking function, which can greatly improve the simplification rate of base-complex components, is defined as
(10) 
where the three coefficients are the weights of the different ranking terms. In our implementation, the valence term has the biggest weight. We also restrict the value of each term to a normalized range to reduce the impact of its actual numerical scale.
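One possible instantiation of the weighted ranking of Eq. (10), with hypothetical weights (the paper's actual values are elided above) and a min-max normalization of each term:

```python
def normalize(values):
    """Min-max normalize a list of term values into [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0          # avoid division by zero for flat terms
    return [(v - lo) / span for v in values]

def rank_sheets(valence_terms, distortion_terms, width_terms,
                w=(0.6, 0.2, 0.2)):  # hypothetical weights; valence dominates
    tv, td, tw = (normalize(t) for t in
                  (valence_terms, distortion_terms, width_terms))
    scores = [w[0] * a + w[1] * b + w[2] * c
              for a, b, c in zip(tv, td, tw)]
    # Lower score = higher collapsing priority (assumed convention).
    return sorted(range(len(scores)), key=scores.__getitem__)
```

The returned index order would feed the priority queue of collapsing candidates.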
Valence term. The proposed weighted ranking algorithm for base-complex sheet collapsing mainly focuses on the valence difference of singular edges during the simplification. It has been proved in gao2017robust that the singularities of a hex-mesh can be progressively simplified within a finite number of iterations, and that the number of components decreases as the valences of singular edges are reduced. In this paper, we propose an indirect energy function of the valence difference between the current mesh and a mesh without singularities. For a mesh with the set of singular base-complex edges E_s, the energy function is defined as
(11) 
Since the simplification process is based on the two collapsing operations, and the singular edges are located only in S_l, S_r and M, the energy function has a local representation on the base-complex sheet when it is collapsed,
(12) 
where e denotes a base-complex edge in S_l, S_r or M of the sheet, with e_l and e_r the base-complex edges to be collapsed and e_m the created base-complex edge.
According to this energy function, some analysis of the structure of base-complex sheets can be performed. A base-complex sheet has an interesting property: the interior edges that are topologically parallel to the dual face of the sheet are all regular, and singular edges only exist in S_l or S_r and in M; collapsing then introduces edges with different valences. Hence, we can accurately predict the influence of a collapse.
During a collapsing operation, the edges in the middle part M are eliminated. For a singular edge s, if the whole edge is contained in M, then the energy value will decrease; this type of elimination is equivalent to creating new regular edges while collapsing. Moreover, the singularity structure will not change when the eliminated edge is only a part of a singular edge s, since collapsing does not affect the other part of the singular edge or the base-complex faces extended from it. Such base-complex edges are not considered in our valence calculation. The two types of s are shown in Fig. 4(a).
When a singular edge is completely contained in S_l or S_r of one or more base-complex sheets, collapsing may remove the singular edges on both sides directly. Concerning the valence variation of an edge pair in S_l and S_r, we have the following three cases, corresponding to c1, c2 and c3 in Fig. 4(b): (c1) all the edges in S_l and S_r are regular; (c2) the edges on only one side, S_l or S_r, are irregular; (c3) the edges in both S_l and S_r are irregular. In case (c1), the created edge will be regular; in case (c2), the created edge will have the same valence as the irregular edge, and it does not affect the surrounding singularity configurations; in case (c3), the valence of the created edges will change, which means that the singularities of the rest of the hex-mesh will be changed, and the flow directions of neighboring base-complex sheets might be altered. Moreover, there are several configurations in case (c3): the created edges might have valences different from those of the base-complex edge pairs in S_l and S_r. The singular structure is simplified when the valence difference between irregular and regular edges decreases. In contrast, removals that increase the valence of singular edges should be avoided: the created edges might not unknot self-intersected sheets, which are hard to remove, and this would greatly reduce the final component reduction ratio and cause an early termination of the simplification.
To improve the convergence rate of the energy, we greedily select the base-complex sheet that can effectively reduce the energy locally without introducing edges of higher valence. The valence term is defined as
(13) 
in which
where e_l and e_r form an edge pair belonging to S_l and S_r respectively, and s is a whole singular edge in M. A large value is used to control the scale of this term, set according to the maximum valence in the hex-mesh; in our experiments, the remaining constant is set to 1.67. For the purpose of minimizing the energy function, convergence is faster when the value of the valence term is larger, so this ranking term encourages collapsing candidates that eliminate more singularities.
Distortion term. The distortion term is an optional term for hex-meshes with complex structure, where sheets passing through regions with dense singularities often contain patches with serious distortion. Removing these sheets can greatly improve the average value of the scaled Jacobians and lead to a significant complexity reduction in simplification. Here we use the algebraic shape metric of a hexahedron knupp2003algebraic to measure the sheet distortion: the metric equals 1 if the hexahedron is a cube with parallel faces, equals 0 if the hexahedron is degenerate, and is scale-invariant. We compute the central difference of the metric in each element for the three parametric directions, and select the maximum difference as the differential value of the hexahedron. From our experiments, we find that serious distortion occurs when the differential value becomes large, and the twist is more serious the bigger the differential value is. Since the local parameterization can improve the element quality, removing regions with serious distortion in advance increases the average value of the Jacobians locally. The distortion term is defined as
(14) 
where the central difference is taken over the metric values of the neighboring elements of the i-th element in the j-th parametric direction.
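Distorted regions can be flagged from the per-element shape metric as sketched below; the neighbor layout (one predecessor/successor pair per parametric direction) and the threshold value are assumptions for illustration:

```python
def max_metric_difference(gamma, neighbor_pairs):
    """Largest central difference of the shape metric gamma (1 for a
    perfect cube, 0 when degenerate) over the neighbor pair of each of
    the three parametric directions."""
    return max(abs(gamma[a] - gamma[b]) / 2.0 for a, b in neighbor_pairs)

def distorted_elements(gamma, neighbors_per_dir, threshold=0.2):
    """neighbors_per_dir: {elem: [(prev, next) per parametric direction]};
    returns elements whose differential value exceeds the threshold."""
    return [e for e, pairs in neighbors_per_dir.items()
            if max_metric_difference(gamma, pairs) > threshold]
```

Sheets passing through the flagged elements would receive a larger distortion term and hence higher collapsing priority.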
Width term. The width term in the weighted ranking function measures the width of a sheet, which prevents improper collapsing: if the sheet is too wide, collapsing will cause large distortion of the boundary geometry and seriously affect the adjacent sheets. Hence it is reasonable to remove sheets with thin shapes first. For this term, we use the widths of the base-complex edges in M, which is more accurate than the length between the vertex pairs on the surface. In our framework, the width term is defined by combining the average width and the minimum length as follows,
(15) 
in which the first length is the average length of element edges, the second is the length of the base-complex edge connecting S_l and S_r, and the two weights balance the average and minimum widths.
5.2 Ranking approach for base-complex chord
The base-complex chord collapsing operation only influences one column of components, and it is used to adjust regions containing many edge pairs with valences of 3/5. From our observation, such edge pairs often exist in entangled sheets, which are difficult to eliminate. In order to untangle them, we propose a priority metric,
(16) 
in which the two terms are the valence error term and the geometry error term.
Geometry error term. The chord collapsing operation often produces simplification results with inverted elements. We therefore propose a simple strategy that gives priority to chords with narrow shapes and shorter lengths. The aspect ratio of a chord is defined as the ratio of the average length of the main diagonal to that of the sub-diagonal, and it is used as a measure of thickness. To reduce the effect of collapsing on the boundary geometry, Gaussian curvature xu2006convergence is used to measure the local shape error after collapsing. In our implementation, we use the variance of the curvature to find patches with significant curvature changes; a patch may contain sharp features when its curvature variance is large, as shown in Fig. 5. The geometry error term is defined as
(17) 
in which the two lengths are the average lengths of the main diagonal and the sub-diagonal respectively, and the curvature is the Gaussian curvature of a vertex on the two sides.
Valence error term. The valence error term measures the valence error of the four topologically parallel edges. To eliminate entangled sheets and simplify the local complexity, we require that the topologically parallel edges created by collapsing all be regular; the ideal situation is a valence error of zero. In our framework, the valence error is set as one of the optimization goals,
(18) 
In this step, edges with high valence must not be introduced, so a candidate is not pushed to the priority queue when its valence error is nonzero.
6 Sheet refinement
Sheet refinement is performed during the simplification pipeline in order to maintain the input mesh geometry with a number of elements similar to the user-defined target number. A method similar to gao2017robust can be used to split one element of a specific sheet into two elements along the direction perpendicular to the parallel edges. In this paper, we propose an adaptive sheet refinement method to improve the accuracy of the boundary geometry approximation.
In our implementation, we find that choosing the sheet with the maximum width to refine is not a robust strategy, since some boundary patches with large boundary approximation error may never be refined. In our method, we first obtain the average length of all edges along the collapsing direction, and then compute the average Hausdorff distance ratio by means of point sampling for each sheet in the priority queue. Following the descending order of the Hausdorff distance ratio, the first four base-complex sheets are selected, and the average length along the collapsing direction is computed for each of them. We choose one of these four sheets to perform refinement if its average length is large enough; otherwise, we refine the candidate with the maximum average length meeting this condition. During simplification, collapsing operations may fail frequently due to the element quality and shape error constraints. In order to relax these constraints, we also perform the refinement process when a sheet collapsing fails: the base-complex sheets sharing S_l and S_r with the removed sheet are selected as candidates. The refinement narrows the parameterized region of failed sheets, reducing the shape error by introducing more elements, so that the sheet may be collapsed in the next iteration. In addition, another criterion is introduced to strictly control the number of elements: for an input hex-mesh and a given target number, before performing refinement we check whether the number of hexahedra contained in the sheet is small enough to keep the total within the budget. This criterion effectively prevents some sheets from being refined repeatedly.
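The selection step above can be sketched as follows; the factor 1.5 and the fallback rule are hypothetical placeholders for the thresholds elided in the text:

```python
def pick_sheet_to_refine(sheets, global_avg_len, factor=1.5):
    """sheets: list of (hausdorff_ratio, avg_len_along_collapse_dir, id).
    Examine the four sheets with the largest point-sampled Hausdorff
    distance ratio and refine one whose average edge length along the
    collapsing direction is large relative to the global average."""
    top = sorted(sheets, reverse=True)[:4]   # largest Hausdorff ratio first
    for ratio, avg_len, sid in top:
        if avg_len > factor * global_avg_len:
            return sid
    # Fallback (assumed): the widest of the examined candidates.
    return max(top, key=lambda s: s[1])[2] if top else None
```

Refining the chosen sheet inserts a layer of elements there, shrinking the local boundary approximation error before the next collapsing attempt.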
7 Experimental results
We tested our algorithm on a four-core i7 processor with 8 GB memory. The maximal number of iterations of the SLIM solver is set to 5, and the user-defined threshold of the Hausdorff distance ratio (the simplification rate becomes larger as this threshold increases) and the ratio of the target element number to the number of elements in the input mesh are fixed for all experiments and figures. We report the number of hex elements (#H), the number of base-complex components (#BC), and the minimal, average and standard deviation values of the scaled Jacobians (MSJ/ASJ/Std). The boundary geometry error is measured by the Hausdorff distance ratio (HR). For the experiments on the database given in gao2017robust , we applied the proposed method to the meshes in this database; most meshes achieve a higher simplification ratio compared with gao2015hexahedral and gao2017robust .
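As a concrete illustration, the HR metric can be computed from boundary point samplings roughly as below. This sketch assumes the common normalization by the bounding-box diagonal, which may differ from the exact definition used in the experiments.

```python
import math

def hausdorff_ratio(samples_a, samples_b):
    """Symmetric point-sampled Hausdorff distance between two boundary
    samplings, normalized by the bounding-box diagonal of the first set
    (assumed normalization)."""
    # One-sided distances: farthest point in one set from the other set.
    d_ab = max(min(math.dist(a, b) for b in samples_b) for a in samples_a)
    d_ba = max(min(math.dist(b, a) for a in samples_a) for b in samples_b)
    # Bounding-box diagonal of the reference sampling.
    lo = [min(c) for c in zip(*samples_a)]
    hi = [max(c) for c in zip(*samples_a)]
    return max(d_ab, d_ba) / math.dist(lo, hi)
```

Denser samplings give a tighter estimate; the brute-force nearest-neighbor search here is quadratic and only meant to show the definition.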
Weighted ranking candidates. Here we show comparison results between the thickness ranking method gao2017robust and the proposed weighted ranking method. In Fig. 6(a), we show the top four candidates in the fertility mesh at three increasing simplification rates. In the initial priority queue, our weighted ranking term can effectively pick out the base-complex sheets with serious distortion and closed-loop configurations. Moreover, the number of singularities is reduced faster: to reach the same simplification rate, the thickness ranking method needs considerably more iterations than our method. As shown in Fig. 6(a), at the two higher simplification rates our ranking algorithm preferentially removes sheets that promote singular edge elimination, and the regions with dense singularities (marked with red circles) are greatly improved. Compared with the simplification results obtained by thickness ranking, regions with dense singular edges are successfully eliminated by our method, and self-intersecting sheets are removed at the same time. In the simplification process, the distortion term is used to eliminate elements with poor shape quality and to spread the distortion to neighboring elements, gradually improving the MSJ/ASJ of the hex-mesh. Our ASJ is better than that of thickness ranking during all three stages, and we achieve ASJ improvements over both the input and the simplification result of gao2017robust . The average running time over the entire dataset is slightly longer than that of gao2017robust .
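The ranking described above can be sketched as follows. The three terms (valence gain, distortion, thickness) follow the combination named in the text, but the concrete field names, formulas, and default weights are illustrative assumptions.

```python
def weighted_rank(sheet, w_valence=1.0, w_distortion=1.0, w_thickness=1.0):
    """Hypothetical ranking score for a to-be-collapsed base-complex sheet.
    Higher scores collapse earlier: prefer sheets whose removal eliminates
    many singular edges, that contain badly shaped elements, and that are
    thin (thickness enters with a negative sign)."""
    return (w_valence * sheet["valence_gain"]
            + w_distortion * sheet["distortion"]
            - w_thickness * sheet["thickness"])

# Order the candidate queue by descending score.
candidates = [
    {"valence_gain": 1, "distortion": 0.1, "thickness": 0.1},
    {"valence_gain": 8, "distortion": 0.4, "thickness": 0.2},
]
ordered = sorted(candidates, key=weighted_rank, reverse=True)
```

Pure thickness ranking corresponds to setting the first two weights to zero, which is why it has no direct effect on singularity removal.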
Element uniformity. In the proposed approach, we use local parameterization to improve the uniformity of hex-mesh elements. We also propose a measurement of element uniformity called the volume deviation ratio (VDR), defined as the standard deviation of the volumes of neighboring elements divided by the average element volume. VDR is non-negative, and uniformity is better as the value approaches 0 (for all elements with the same volume, VDR = 0). Compared with the thickness ranking method gao2017robust , our simplification results show clear improvements in the average volume deviation ratio (AVDR) and the maximum volume deviation ratio (MVDR), for meshes from both polycube-based and octree-based methods. Two comparison examples are shown in Fig. 7 with the VDR colormap.
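The VDR defined above reduces to a one-line computation; a minimal sketch, assuming the population standard deviation:

```python
import statistics

def volume_deviation_ratio(neighbor_volumes):
    """Volume deviation ratio (VDR): standard deviation of the volumes of
    neighboring elements divided by their average volume. Zero means a
    perfectly uniform neighborhood."""
    avg = sum(neighbor_volumes) / len(neighbor_volumes)
    return statistics.pstdev(neighbor_volumes) / avg
```

AVDR and MVDR then follow by averaging and maximizing this quantity over all element neighborhoods of the mesh.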
Simplification of hex-meshes from polycube-based methods. For hex-meshes generated by polycube-based methods Gregson2011All ; Huang2014ℓ ; Fang2016All , the singularity structures are distributed entirely on the surface, and the distribution of singular edges is sparse. Hence, the valence term has a small effect, and its weights are set to smaller values in our experiments. As shown in Fig. 8 and Table 2, the proposed approach achieves a higher base-complex component reduction with element quality similar to the results in gao2017robust . In our experiments, the average scaled Jacobian is improved, and the meshes obtain improvements in both AVDR and MVDR compared with gao2017robust . Moreover, the average component reduction ratio is increased, and some results are close to the structure of meshes generated by Gianmarco2016Polycube .
Input hex mesh (#H, #BC, MSJ, ASJ, Std) vs. simplified result (#H, #BC, MSJ, ASJ, Std, HR, R, Time):

Model | #H | #BC | MSJ | ASJ | Std | #H | #BC | MSJ | ASJ | Std | HR (%) | R (%) | Time (m)
Bimba (Fig. 9) | 25,347 | 25,347 | 0.06 | 0.80 | 0.162 | 27,900 | 134 | 0.43 | 0.97 | 0.049 | 0.95 | 99.47 | 103.48
Bottle (Fig. 9) | 35,886 | 35,860 | 0.13 | 0.79 | 0.167 | 34,558 | 266 | 0.22 | 0.98 | 0.054 | 0.91 | 99.26 | 200.97
Deckle (Fig. 9) | 53,658 | 53,116 | 0.03 | 0.84 | 0.187 | 53,680 | 806 | 0.10 | 0.95 | 0.082 | 1.00 | 98.48 | 793.93
Fertility (Fig. 6) | 21,370 | 20,840 | 0.10 | 0.84 | 0.150 | 21,016 | 310 | 0.32 | 0.94 | 0.079 | 0.87 | 98.51 | 153.83
Toy1 (Fig. 5) | 18,947 | 18,883 | 0.12 | 0.81 | 0.161 | 15,784 | 144 | 0.51 | 0.96 | 0.059 | 0.66 | 99.23 | 48.53
Toy2 (Fig. 7) | 14,288 | 14,288 | 0.15 | 0.81 | 0.158 | 13,952 | 129 | 0.49 | 0.96 | 0.059 | 0.90 | 99.10 | 48.59
Lock (Fig. 7) | 28,753 | 25,720 | 0.01 | 0.80 | 0.244 | 28,501 | 2,990 | 0.17 | 0.93 | 0.109 | 0.91 | 88.37 | 381.46
Eight (Fig. 11) | 4,571 | 3,867 | 0.17 | 0.78 | 0.155 | 5,428 | 43 | 0.53 | 0.92 | 0.065 | 0.69 | 98.89 | 7.53
Bone (Fig. 11) | 2,751 | 2,520 | 0.15 | 0.78 | 0.159 | 2,484 | 37 | 0.69 | 0.93 | 0.069 | 0.75 | 98.53 | 4.24
Simplification of hex-meshes from octree-based methods. Octree-based hex-meshing approaches often generate a complex structure with dense local singularities. In gao2017robust , greedy collapsing by thickness ranking is applied under a set of filters. It cannot find a coarser structure in hex-meshes with a large number of interior singularities and kinks, since the thickness ranking term has no direct effect on singularity removal. The corresponding simplification gao2017robust therefore has a slow convergence rate and achieves only a limited average simplification rate on the hex-mesh database. Instead, our weighted ranking method obtains a much simpler singularity structure with far fewer base-complex components. The average simplification rate of the proposed framework increases with the initial number of base-complex components in the input hex-mesh, and gains a clear improvement compared with gao2017robust . Moreover, in the proposed framework, adaptive refinement is performed during the simplification process, which effectively maintains the quality of the boundary geometry and promotes the simplification process under the Hausdorff distance constraint. Our method also gains ASJ improvements over the thickness ranking method. Some simplification results are shown in Fig. 9, with statistics in Table 1; comparison examples with gao2017robust are presented in Fig. 10 and Table 2.
Model | Method | #H | #BC | MSJ | ASJ | Std | AVDR | MVDR | HR (%) | R (%) | Time (m)
Gargoyle (Fig. 8) | Input | 21,167 | 25,669 | 0.20 | 0.91 | 0.907 | 0.11 | 0.64 | - | - | -
Gargoyle (Fig. 8) | Thickness ranking | 22,524 | 805 | 0.14 | 0.96 | 0.068 | 0.30 | 3.52 | 0.98 | 89.36 | 30.67
Gargoyle (Fig. 8) | Weighted ranking | 23,352 | 451 | 0.27 | 0.96 | 0.071 | 0.20 | 3.47 | 0.92 | 94.04 | 41.22
Stb (Fig. 8) | Input | 84,489 | 2,227 | 0.18 | 0.87 | 0.130 | 0.05 | 1.29 | - | - | -
Stb (Fig. 8) | Thickness ranking | 80,295 | 1,817 | 0.09 | 0.95 | 0.069 | 0.10 | 2.74 | 0.77 | 18.41 | 46.89
Stb (Fig. 8) | Weighted ranking | 83,678 | 819 | 0.24 | 0.93 | 0.092 | 0.10 | 1.35 | 0.97 | 63.22 | 60.32
Rocker (Fig. 10) | Input | 16,608 | 16,487 | 0.11 | 0.86 | 0.139 | 0.18 | 2.69 | - | - | -
Rocker (Fig. 10) | Thickness ranking | 10,278 | 636 | 0.44 | 0.93 | 0.081 | 0.38 | 1.77 | 0.99 | 96.14 | 32.25
Rocker (Fig. 10) | Weighted ranking | 10,790 | 441 | 0.35 | 0.93 | 0.088 | 0.26 | 1.75 | 0.99 | 97.33 | 50.55
Pig (Fig. 10) | Input | 13,987 | 13,987 | 0.02 | 0.79 | 0.168 | 0.46 | 8.02 | - | - | -
Pig (Fig. 10) | Thickness ranking | 10,704 | 2,305 | 0.23 | 0.92 | 0.102 | 0.38 | 3.89 | 0.99 | 83.52 | 31.24
Pig (Fig. 10) | Weighted ranking | 11,218 | 876 | 0.18 | 0.95 | 0.086 | 0.23 | 3.02 | 0.99 | 93.74 | 38.69
Bird (Fig. 10) | Input | 4,247 | 3,640 | 0.03 | 0.82 | 0.159 | 0.18 | 0.51 | - | - | -
Bird (Fig. 10) | Thickness ranking | 2,868 | 580 | 0.24 | 0.90 | 0.117 | 0.36 | 2.43 | 1.00 | 84.07 | 14.57
Bird (Fig. 10) | Weighted ranking | 2,935 | 278 | 0.25 | 0.90 | 0.127 | 0.23 | 1.45 | 0.95 | 92.36 | 16.54
Buste (Fig. 10) | Input | 19,075 | 18,355 | 0.13 | 0.85 | 0.151 | 0.28 | 3.20 | - | - | -
Buste (Fig. 10) | Thickness ranking | 17,680 | 691 | 0.44 | 0.95 | 0.070 | 0.33 | 4.36 | 0.98 | 96.24 | 113.67
Buste (Fig. 10) | Weighted ranking | 16,336 | 158 | 0.28 | 0.96 | 0.065 | 0.23 | 2.82 | 0.97 | 99.14 | 53.55
Std is the standard deviation of the scaled Jacobians, HR stands for the Hausdorff distance ratio, and R is the simplification rate.
More importantly, octree-based meshes can be simplified into a singularity structure similar to that of polycube meshes. As shown in Fig. 11, the singularities are mainly distributed on the boundary after simplification, a high simplification rate is achieved, and the interior singular edges are eliminated completely.
8 Conclusion and future work
In this paper, an improved singularity structure simplification method for hex-meshes is proposed based on a weighted ranking function, which combines a valence prediction function of the local singularity structure, a shape quality metric of elements, and the width of base-complex sheets/chords. Local optimization and adaptive sheet refinement are also proposed to improve the element quality of the simplified hex-mesh. Compared with the thickness ranking method, a simpler singularity structure with fewer base-complex components can be achieved by the proposed weighted ranking approach, while also achieving better mesh quality and Hausdorff distance ratio. The proposed approach has a few limitations. Sharp features cannot be preserved very well on the boundary, and the boundary approximation error may increase for models with high genus; a possible solution might be stricter feature edge extraction and vertex mapping. In the future, we will apply the proposed hex-mesh simplification method to volume parameterization, which is a bottleneck in isogeometric analysis.
References
 (1) Y. Zhang, C. Bajaj, Adaptive and quality quadrilateral/hexahedral meshing from volumetric data, Computer Methods in Applied Mechanics and Engineering 195 (9-12) (2006) 942–960.
 (2) Y. Ito, A. M. Shih, B. K. Soni, Octree-based reasonable-quality hexahedral mesh generation using a new set of refinement templates, International Journal for Numerical Methods in Engineering 77 (13) (2010) 1809–1833.
 (3) X. Bourdin, X. Trosseille, P. Petit, P. Beillas, Comparison of tetrahedral and hexahedral meshes for organ finite element modeling: an application to kidney impact, in: 20th International Technical Conference on the Enhanced Safety of Vehicles, 2007.
 (4) A. C. Woodbury, J. F. Shepherd, M. L. Staten, S. E. Benzley, Localized coarsening of conforming all-hexahedral meshes, Engineering with Computers 27 (1) (2011) 95–104.
 (5) X. Gao, Z. Deng, G. Chen, Hexahedral mesh reparameterization from aligned base-complex, ACM Transactions on Graphics 34 (4) (2015) 142.
 (6) X. Gao, D. Panozzo, W. Wang, Z. Deng, G. Chen, Robust structure simplification for hex remeshing, ACM Transactions on Graphics 36 (6) (2017) 185.
 (7) M. Rabinovich, R. Poranne, D. Panozzo, O. Sorkine-Hornung, Scalable locally injective mappings, ACM Transactions on Graphics 36 (2) (2017) 16.
 (8) J. F. Shepherd, Topologic and geometric constraint-based hexahedral mesh generation, Vol. 68, PhD Dissertation, University of Utah, 2007.
 (9) X. Roca Navarro, Paving the path towards automatic hexahedral mesh generation, PhD Dissertation, Universitat Politècnica de Catalunya, 2009.
 (10) H. Wu, S. Gao, R. Wang, J. Chen, Fuzzy clustering based pseudo-swept volume decomposition for hexahedral meshing, Computer-Aided Design 96 (2018) 42–58.
 (11) T. D. Blacker, R. J. Meyers, Seams and wedges in plastering: a 3D hexahedral mesh generation algorithm, Engineering with Computers 9 (2) (1993) 83–93.
 (12) S. J. Owen, S. Saigal, H-Morph: an indirect approach to advancing front hex meshing, International Journal for Numerical Methods in Engineering 49 (1-2) (2015) 289–312.
 (13) T. J. Tautges, T. Blacker, S. A. Mitchell, The whisker weaving algorithm: a connectivity-based method for constructing all-hexahedral finite element meshes, International Journal for Numerical Methods in Engineering 39 (19) (1996) 3327–3349.
 (14) F. Ledoux, J.-C. Weill, An extension of the reliable whisker weaving algorithm, in: Proceedings of the 16th International Meshing Roundtable, Springer, 2008, pp. 215–232.
 (15) M. L. Staten, R. A. Kerr, S. J. Owen, T. D. Blacker, M. Stupazzini, K. Shimada, Unconstrained plastering hexahedral mesh generation via advancing-front geometry decomposition, International Journal for Numerical Methods in Engineering 81 (2) (2010) 135–171.
 (16) R. Schneiders, An algorithm for the generation of hexahedral element meshes based on an octree technique, in: 6th International Meshing Roundtable, 1997, pp. 195–196.
 (17) J. Gregson, A. Sheffer, E. Zhang, All-hex mesh generation via volumetric polycube deformation, Computer Graphics Forum 30 (5) (2011) 1407–1416.
 (18) L. Liu, Y. Zhang, Y. Liu, W. Wang, Feature-preserving T-mesh construction using skeleton-based polycubes, Computer-Aided Design 58 (2015) 162–172.
 (19) K. Hu, Y. J. Zhang, Centroidal Voronoi tessellation based polycube construction for adaptive all-hexahedral mesh generation, Computer Methods in Applied Mechanics and Engineering 305 (2016) 405–421.
 (20) K. Hu, Y. J. Zhang, T. Liao, Surface segmentation for polycube construction based on generalized centroidal Voronoi tessellation, Computer Methods in Applied Mechanics and Engineering 316 (2017) 280–296.
 (21) X. Fang, W. Xu, H. Bao, J. Huang, All-hex meshing using closed-form induced polycube, ACM Transactions on Graphics 35 (4) (2016) 124.
 (22) W. Yu, K. Zhang, S. Wan, X. Li, Optimizing polycube domain construction for hexahedral remeshing, Computer-Aided Design 46 (1) (2014) 58–68.
 (23) M. Nieser, U. Reitebuch, K. Polthier, CubeCover - parameterization of 3D volumes, Computer Graphics Forum 30 (5) (2011) 1397–1406.
 (24) J. Huang, Y. Tong, H. Wei, Boundary aligned smooth 3D cross-frame field, ACM Transactions on Graphics 30 (6) (2011) 1–8.
 (25) M. Tarini, N. Pietroni, P. Cignoni, D. Panozzo, E. Puppo, Practical quad mesh simplification, Computer Graphics Forum 29 (2) (2010) 407–418.
 (26) J. F. Shepherd, M. W. Dewey, A. C. Woodbury, S. E. Benzley, M. L. Staten, S. J. Owen, Adaptive mesh coarsening for quadrilateral and hexahedral meshes, Finite Elements in Analysis and Design 46 (12) (2010) 17–32.
 (27) G. Cherchi, M. Livesu, R. Scateni, Polycube simplification for coarse layouts of surfaces and volumes, Computer Graphics Forum 35 (5) (2016) 11–20.
 (28) R. Wang, S. Gao, Z. Zheng, J. Chen, Hex mesh topological improvement based on frame field and sheet adjustment, Computer-Aided Design 103 (2018) 103–117.
 (29) C. Gotsman, L. Liu, L. Zhang, Y. Xu, S. J. Gortler, A local/global approach to mesh parameterization, Computer Graphics Forum 27 (5) (2010) 1495–1504.
 (30) J. Daniels, C. T. Silva, J. Shepherd, E. Cohen, Quadrilateral mesh simplification, ACM Transactions on Graphics 27 (5) (2008) 148.
 (31) P. M. Knupp, Algebraic mesh quality metrics for unstructured initial meshes, Finite Elements in Analysis and Design 39 (3) (2003) 217–241.
 (32) G. Xu, Convergence analysis of a discretization scheme for Gaussian curvature over triangular surfaces, Computer Aided Geometric Design 23 (2) (2006) 193–207.
 (33) J. Huang, T. Jiang, Z. Shi, Y. Tong, H. Bao, M. Desbrun, L1-based construction of polycube maps from complex shapes, ACM Transactions on Graphics 33 (3) (2014) 1–11.
 (34) MeshGems, Volume meshing: MeshGems-Hexa, http://www.meshgems.com.
 (35) N. Kowalski, F. Ledoux, P. Frey, Block-structured hexahedral meshes for CAD models using 3D frame fields, Procedia Engineering 82 (2014) 59–71.
 (36) P. Murdoch, S. Benzley, T. Blacker, S. A. Mitchell, The spatial twist continuum: a connectivity based method for representing all-hexahedral finite element meshes, Finite Elements in Analysis and Design 28 (2) (1997) 137–149.
 (37) Y. Wu, Y. He, H. Cai, QEM-based mesh simplification with global geometry features preserved, in: International Conference on Computer Graphics and Interactive Techniques in Australasia and Southeast Asia, 2004, pp. 50–57.
 (38) M. Livesu, A. Sheffer, N. Vining, M. Tarini, Practical hex-mesh optimization via edge-cone rectification, ACM Transactions on Graphics 34 (4) (2015) 141.
 (39) S. H. Liao, R. F. Tong, J. X. Dong, F. D. Zhu, Gradient field based inhomogeneous volumetric mesh deformation for maxillofacial surgery simulation, Computers & Graphics 33 (3) (2009) 424–432.
 (40) X. Fu, C. Bai, Y. Liu, Efficient volumetric polycube-map construction, Computer Graphics Forum 35 (7) (2016) 97–106.
 (41) Y. Li, Y. Liu, W. Xu, W. Wang, B. Guo, All-hex meshing using singularity-restricted field, ACM Transactions on Graphics 31 (6) (2012) 1–11.