Progressive Meshes -- CS 679 Project, Jasmin Patry, April 2000


Note: This web page is a summary of the results of my project. For more information, please see the slides for the talk I gave about this project.

Introduction

The goal of this project was to implement the construction process of Progressive Meshes, as described by Hugues Hoppe [1]. A Progressive Mesh is a continuous level-of-detail representation of a geometric model, consisting of a low-resolution base mesh M0 and a series of detail records. These detail records are vertex splits, which describe the inverse of the edge collapse operation used to simplify the mesh during the construction phase. Both of these operations are shown below.
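
To make the two operations concrete, here is a rough C++ sketch of the kind of records that could represent them. The field names and layout are illustrative only; this is not Hoppe's exact encoding.

    // Sketch only: field names are illustrative, not Hoppe's exact encoding.

    // An edge collapse merges vertex vt into vs, removing the (one or two)
    // faces that share the edge (vs, vt).
    struct EdgeCollapse
    {
        int vs, vt;          // vertices joined by the collapsed edge
        int vl, vr;          // neighbouring vertices of the removed faces
    };

    // A vertex split is the inverse: starting from vertex vs in the coarser
    // mesh, it re-introduces vertex vt and the removed faces.
    struct VertexSplit
    {
        int   vs;            // vertex to split
        int   vl, vr;        // neighbours identifying where the new faces attach
        float vsPosition[3]; // position of vs after the split
        float vtPosition[3]; // position of the re-introduced vertex vt
    };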

A Quadric Error Metric (QEM) can be used to order the edge collapses to yield a good low-resolution approximation of the original mesh [3]. The QEM that I implemented has the following options:

  * standard or memoryless simplification;
  * geometric error only, or geometric error plus normal error.

The standard QEM (with the memoryless enhancement turned off) "remembers" information about the original high-resolution mesh, while the memoryless version discards this information as the simplification progresses. Although this may seem counter-intuitive, Hoppe showed that the memoryless version actually improves the quality of the resulting meshes.

It is also possible to select whether or not vertex normal information should be included in the QEM. If normal information is used, then the QEM incorporates the error in the normal vectors of the simplified mesh as well as the geometric error. Otherwise, only the geometric error is considered.
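
For reference, the sketch below shows the purely geometric core of a quadric error metric in its standard Garland-Heckbert form: each vertex accumulates a 4x4 quadric built from the planes of its incident faces, and the cost of collapsing an edge is the quadric error of the unified vertex's position. The memoryless and normal-error options discussed above change how these quadrics are built and maintained and are not shown here; the class and function names are my own.

    #include <array>

    // 4x4 symmetric quadric Q = sum of p * p^T over planes p = (a, b, c, d),
    // where a*x + b*y + c*z + d = 0 and (a, b, c) has unit length.
    struct Quadric
    {
        std::array<double, 16> m{};   // row-major 4x4, initialised to zero

        // Accumulate the quadric of one plane.
        void addPlane(double a, double b, double c, double d)
        {
            const double p[4] = { a, b, c, d };
            for (int i = 0; i < 4; ++i)
                for (int j = 0; j < 4; ++j)
                    m[4 * i + j] += p[i] * p[j];
        }

        void add(const Quadric& q)
        {
            for (int i = 0; i < 16; ++i)
                m[i] += q.m[i];
        }

        // Quadric error v^T Q v for the homogeneous point v = (x, y, z, 1).
        double error(double x, double y, double z) const
        {
            const double v[4] = { x, y, z, 1.0 };
            double e = 0.0;
            for (int i = 0; i < 4; ++i)
                for (int j = 0; j < 4; ++j)
                    e += v[i] * m[4 * i + j] * v[j];
            return e;
        }
    };

    // Cost of collapsing edge (v1, v2) to a candidate position (x, y, z):
    // sum the two vertex quadrics and evaluate the combined error there.
    inline double collapseCost(const Quadric& q1, const Quadric& q2,
                               double x, double y, double z)
    {
        Quadric q = q1;
        q.add(q2);
        return q.error(x, y, z);
    }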


Mesh Data Structure

I used the mesh data structure described in [2]. This data structure consists of an array of vertices (storing positions and normals), and an array of faces (each face stores vertex indices, as well as the indices of the neighbouring faces).
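
Concretely, the structure amounts to two flat arrays along the following lines. This is a sketch of my reading of that description; the member names and the neighbour-indexing convention are mine, not necessarily those of [2].

    #include <vector>

    // Per-vertex data: position and normal.
    struct Vertex
    {
        float position[3];
        float normal[3];
    };

    // Per-face data: the three vertex indices, plus the indices of the three
    // faces sharing an edge with this one (-1 where there is no neighbour,
    // i.e. on a boundary).
    struct Face
    {
        int vertices[3];
        int neighbours[3];   // one common convention: neighbours[i] lies across
                             // the edge opposite vertices[i]
    };

    struct Mesh
    {
        std::vector<Vertex> vertices;
        std::vector<Face>   faces;
    };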

While this data structure is recommended in [2] for efficiently implementing progressive meshes once the PM representation has been constructed, it is very awkward to use during the construction process itself. Other mesh data structures (e.g., half-edge) may be more convenient for the construction phase of the algorithm.


Results

Bunny Simplification

[Images: the bunny mesh at 69 451 faces (original), simplified to 9 327 faces and to 3 823 faces.]

Dragon Simplification

[Images: the dragon mesh at 50 757 faces (original), simplified to 5 455 faces.]

Comparison of Different QEMs

Original Spock model (3525 faces):

[Image: the original Spock model.]

After refinement to 745 faces:

[Images: a 2x2 comparison of the simplified model, standard vs. memoryless QEM (rows), each with geometry+normals error and geometry-only error (columns).]

The memoryless version of the QEM gave much better results for the case in which normal error was incorporated into the QEM, although there is a slight improvement in the other case as well. This is consistent with the findings of Hoppe [3].

It is clear that incorporating the normal error into the QEM significantly degrades the quality of the resulting meshes. I postulate that this is due to the way the normals were generated for the original mesh. Since normal information was not present in the data file, normals were generated for each vertex by averaging the normals of the surrounding faces. The result is a smooth normal field with no discontinuities. However, the geometry of the original model clearly has creases (e.g., along the hairline at the forehead), so this smooth normal field is inaccurate. With normal error incorporated into the QEM, the result is a "smoothing out" of the mesh to match the normal field.
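
For completeness, here is a minimal sketch of that normal-generation step (a straight average of the unit face normals around each vertex, reusing the Mesh structure sketched earlier). Because crease edges are averaged over just like smooth ones, the resulting normal field has no discontinuities.

    #include <cmath>

    // Generate a normal for each vertex by averaging the unit normals of the
    // faces that surround it.
    void generateVertexNormals(Mesh& mesh)
    {
        for (Vertex& v : mesh.vertices)
            v.normal[0] = v.normal[1] = v.normal[2] = 0.0f;

        for (const Face& f : mesh.faces)
        {
            const float* a = mesh.vertices[f.vertices[0]].position;
            const float* b = mesh.vertices[f.vertices[1]].position;
            const float* c = mesh.vertices[f.vertices[2]].position;

            // Face normal from the cross product of two edge vectors.
            float e1[3] = { b[0] - a[0], b[1] - a[1], b[2] - a[2] };
            float e2[3] = { c[0] - a[0], c[1] - a[1], c[2] - a[2] };
            float n[3]  = { e1[1] * e2[2] - e1[2] * e2[1],
                            e1[2] * e2[0] - e1[0] * e2[2],
                            e1[0] * e2[1] - e1[1] * e2[0] };

            float len = std::sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
            if (len == 0.0f)
                continue;                   // skip degenerate faces

            // Accumulate the unit face normal at each of the face's vertices.
            for (int i = 0; i < 3; ++i)
                for (int k = 0; k < 3; ++k)
                    mesh.vertices[f.vertices[i]].normal[k] += n[k] / len;
        }

        // Normalise the accumulated sums.
        for (Vertex& v : mesh.vertices)
        {
            float len = std::sqrt(v.normal[0] * v.normal[0] +
                                  v.normal[1] * v.normal[1] +
                                  v.normal[2] * v.normal[2]);
            if (len > 0.0f)
                for (int k = 0; k < 3; ++k)
                    v.normal[k] /= len;
        }
    }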

References

  1. Hoppe, Hugues. Progressive Meshes. Computer Graphics (SIGGRAPH 1996 Proceedings), pages 99-108.
  2. Hoppe, Hugues. Efficient Implementation of Progressive Meshes. Computers & Graphics, Vol. 22, No. 1, 1998, pages 27-36.
  3. Hoppe, Hugues. New Quadric Metric for Simplifying Meshes with Appearance Attributes. IEEE Visualization 1999, October 1999, pages 59-66.