Project author: chrdiller

Project description:
Visualizes meshes, pointclouds and video flythroughs in publication quality
Language: Python
Repository: git://github.com/chrdiller/mitsuba-visualize.git
Created: 2020-04-19T13:11:47Z
Project homepage: https://github.com/chrdiller/mitsuba-visualize

License: MIT License

mitsuba-visualize: Automates High-Quality Mitsuba Renders

Visualizes meshes, pointclouds and video flythroughs in publication quality

  • Collection of scripts automating path-traced rendering with mitsuba (0.6.0), directly using its Python API.
  • Can also render flythroughs from predefined camera poses, interpolated with Bézier curves or Catmull-Rom splines

This implementation was used for the visualizations in Dai, Angela, Christian Diller, and Matthias Nießner. “SG-NN: Sparse Generative Neural Networks for Self-Supervised Scene Completion of RGB-D Scans.” (CVPR’20)

Sample Rendering

You can find more samples, including flythroughs, in our SG-NN video.

Installation

  1. Install mitsuba 0.6.0, following the tutorial here: 10 Steps to Install Mitsuba Renderer on Ubuntu (https://medium.com/@sreenithyc21/10-steps-to-install-mitsuba-renderer-on-ubuntu-38a9318fbcdf)
  2. Clone this repository and install python dependencies:
    1. git clone https://github.com/chrdiller/mitsuba-visualize
    2. cd mitsuba-visualize
    3. poetry install
  3. After building mitsuba, adjust the path in set_python_path.py to point to the directory you cloned mitsuba into (it has to contain the subdirectory dist/python)
    1. MITSUBA_BASE = Path('/path/to/mitsuba/clone')
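For reference, the effect of set_python_path.py is to make the compiled mitsuba bindings importable from Python. A minimal sketch of the idea (MITSUBA_BASE comes from the repository; the helper function name here is illustrative):

```python
import sys
from pathlib import Path

# Location of your local mitsuba clone; must contain dist/python after building
MITSUBA_BASE = Path('/path/to/mitsuba/clone')

def add_mitsuba_to_path(base: Path = MITSUBA_BASE) -> None:
    """Prepend the built mitsuba Python bindings to sys.path (illustrative helper)."""
    dist_python = base / 'dist' / 'python'
    if str(dist_python) not in sys.path:
        sys.path.insert(0, str(dist_python))
```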

Usage

Enter environment: poetry shell

Single images

  • Meshes: python mesh.py ...
    ```bash
    usage: mesh.py [-h] -o OUTPUT [--scenes_list SCENES_LIST] [--scene SCENE]
                   [--cameras CAMERAS] [--width WIDTH] [--height HEIGHT]
                   [--samples SAMPLES] [--workers WORKERS]
                   [input_paths [input_paths ...]]

    Render directory

    positional arguments:
      input_paths           Path(s) to directory containing all files to render

    optional arguments:
      -h, --help            show this help message and exit
      -o OUTPUT, --output OUTPUT
                            Path to write renderings to
      --scenes_list SCENES_LIST
                            Path to file containing filenames to render in base
                            path
      --scene SCENE         One scene. Overrides scenes_list
      --cameras CAMERAS     XML file containing meshlab cameras
      --width WIDTH         Width of the resulting image
      --height HEIGHT       Height of the resulting image
      --samples SAMPLES     Number of integrator samples per pixel
      --workers WORKERS     How many concurrent workers to use
    ```
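For example, to render every mesh in a directory at full-HD resolution (the paths and values below are placeholders, not taken from the repository):

```bash
# Render all meshes found in ./meshes to ./renders at 1920x1080,
# using 128 path-tracer samples per pixel and 4 parallel workers
python mesh.py -o renders/ --width 1920 --height 1080 --samples 128 --workers 4 meshes/
```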

  • Point Clouds: python pointcloud.py ...
    ```bash
    usage: Render pointcloud with mitsuba by placing little spheres at the points' positions
           [-h] -o OUTPUT [--radius RADIUS] [--width WIDTH] [--height HEIGHT]
           [--samples SAMPLES] [--workers WORKERS]
           input [input ...]

    positional arguments:
      input                 Path(s) to the ply file(s) containing pointcloud(s)
                            to render

    optional arguments:
      -h, --help            show this help message and exit
      -o OUTPUT, --output OUTPUT
                            Path to write renderings to
      --radius RADIUS       Radius of a single point
      --width WIDTH         Width of the resulting image
      --height HEIGHT       Height of the resulting image
      --samples SAMPLES     Number of integrator samples per pixel
      --workers WORKERS     How many concurrent workers to use
    ```
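A plausible single-file invocation (the paths and the sphere radius are placeholders):

```bash
# Render one point cloud, placing a sphere of radius 0.01 at every point
python pointcloud.py -o renders/ --radius 0.01 --width 1920 --height 1080 scene.ply
```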

Flythroughs

As in the SG-NN video

  • (Optional) Export cameras from meshlab
    • Open the mesh in meshlab and press Cmd/Ctrl + C to copy the current camera view
    • Paste into a text editor (will be a few lines of XML)
    • Save as xml file
    • Repeat for every camera keypoint
  • (Alternatively) Do not specify cameras; this will render a slightly tilted top-down turntable view
  • Usage: python flythrough.py ...
    ```bash
    usage: Render a flythrough video of a scene [-h] -o OUTPUT
                                                [--remote [REMOTE [REMOTE ...]]]
                                                [--novideo] [--norender] [--keep]
                                                [--scenes_list SCENES_LIST]
                                                [--frames FRAMES]
                                                [--cameras CAMERAS]
                                                [--shutter_time SHUTTER_TIME]
                                                [--width WIDTH] [--height HEIGHT]
                                                [--fov FOV] [--samples SAMPLES]
                                                [--interpolation {catmullrom,bezier}]
                                                [--workers WORKERS]
                                                input [input ...]

    positional arguments:
      input                 Path to the ply file to render

    optional arguments:
      -h, --help            show this help message and exit
      -o OUTPUT, --output OUTPUT
                            Path to write output video to
      --remote [REMOTE [REMOTE ...]]
                            Urls of the remote render servers
      --novideo             Only render frames, do not produce video
      --norender            Only render video from existing frames, no rendering
      --keep                Whether to keep the frame images
      --scenes_list SCENES_LIST
                            Path to file containing filenames to render in base
                            path
      --frames FRAMES       Number of frames to render (The video file will have
                            30fps)
      --cameras CAMERAS     XML file containing meshlab cameras (or path to
                            directory only containing such files). If set, this is
                            used for spline interpolation. Otherwise, a rotating
                            flyover is generated
      --shutter_time SHUTTER_TIME
                            Shutter time of the moving sensor
      --width WIDTH         Width of the resulting image
      --height HEIGHT       Height of the resulting image
      --fov FOV             Field of view of the sensor in degrees (meshlab
                            default is 60)
      --samples SAMPLES     Number of integrator samples per pixel
      --interpolation {catmullrom,bezier}
                            Which method to use for interpolation between control
                            points
      --workers WORKERS     How many local concurrent workers to use
    ```
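For intuition, Catmull-Rom interpolation (one of the two --interpolation choices) produces a smooth curve that passes exactly through every control pose, which is why it suits camera keypoints exported from meshlab. Below is a minimal, self-contained sketch of the uniform variant for 3D positions; it illustrates the technique, and is not the repository's implementation:

```python
import numpy as np

def catmull_rom_segment(p0, p1, p2, p3, t):
    """Evaluate a uniform Catmull-Rom segment between p1 and p2 at t in [0, 1].

    p0 and p3 are the neighboring control points that shape the tangents;
    the curve passes exactly through p1 (t=0) and p2 (t=1).
    """
    p0, p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p0, p1, p2, p3))
    t2, t3 = t * t, t * t * t
    return 0.5 * (2.0 * p1
                  + (p2 - p0) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t2
                  + (3.0 * p1 - p0 - 3.0 * p2 + p3) * t3)

def sample_path(points, frames_per_segment=30):
    """Sample a smooth camera path through a list of >= 4 control positions."""
    out = []
    for i in range(1, len(points) - 2):
        for t in np.linspace(0.0, 1.0, frames_per_segment, endpoint=False):
            out.append(catmull_rom_segment(points[i - 1], points[i],
                                           points[i + 1], points[i + 2], t))
    return np.array(out)
```

Each rendered frame would then place the sensor at one sampled position, giving the constant-speed-per-segment motion seen in spline flythroughs.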