Title: Multiresolution flow visualization
Authors: Jobard, Bruno
Lefer, Wilfrid
Citation: Journal of WSCG. 2001, vol. 9, no. 1-3.
Issue Date: 2001
Publisher: Václav Skala - UNION Agency
Document type: article
URI: http://hdl.handle.net/11025/15732
http://wscg.zcu.cz/wscg2001/WSCG2001_Program.htm
ISSN: 1213-6972 (print)
1213-6980 (CD-ROM)
1213-6964 (online)
Keywords: flow visualization;streamlines;multiresolution representation;interactive visualization;large vector fields
Abstract: Flow visualization has been an active research field for several years, and various techniques have been proposed to visualize vector fields, streamlines and textures being the most effective and popular ones. While streamlines are suitable for getting rough information about the behavior of the flow, textures depict flow properties at the pixel level. Depending on the situation, the more suitable representation may be streamlines or a texture. This paper presents a method to compute a sequence of streamline-based images of a vector field at different densities, ranging from sparse to texture-like representations. It is based on an effective streamline placement algorithm and a production scheme that recalls those used in multiresolution theory: a streamline defined at level J of the hierarchy is also defined at every level J' > J. A viewer allows the user to interactively select the desired density while zooming in and out of a vector field. The density of streamlines in the image can also be computed automatically as a function of a derived quantity, such as velocity or vorticity.
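The nesting property described in the abstract (everything placed at level J survives at every finer level J' > J) can be sketched with a simple greedy seed-placement loop. This is a minimal, hypothetical illustration only, not the authors' algorithm, which traces full streamlines and enforces a separating distance along their whole length; here each level j halves the separation distance and keeps all seeds accepted at coarser levels.

```python
import math

def build_hierarchy(seeds, d0, levels):
    """Illustrative multiresolution seed placement (not the paper's method).

    Level j uses separation distance d0 / 2**j. Every seed accepted at
    level j is kept at all finer levels, giving nested representations
    from sparse to dense."""
    accepted = []   # all seeds kept so far, coarsest levels first
    hierarchy = []  # cumulative snapshot of accepted seeds per level
    for j in range(levels):
        d = d0 / 2 ** j
        for (x, y) in seeds:
            # accept a candidate only if it keeps distance d to all
            # previously accepted seeds (including coarser levels)
            if all(math.hypot(x - ax, y - ay) >= d for (ax, ay) in accepted):
                accepted.append((x, y))
        hierarchy.append(list(accepted))
    return hierarchy
```

Selecting a density level then amounts to taking the prefix of the hierarchy up to that level, which is what makes interactive switching between sparse and texture-like views cheap.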
Rights: © Václav Skala - UNION Agency
Appears in Collections:Volume 9, number 1-3 (2001)

Files in This Item:
File: R183.pdf (Full text), 2.29 MB, Adobe PDF


