Style Transfer for 360 Video
Style transfer for 360 photos and videos.
For more on the approach and dataset, see Johnson et al., "Perceptual Losses for Real-Time Style Transfer and Super-Resolution" (ECCV 2016).
Building Torch/Lua from source on OS X ruined my weekend. It is not for the faint of heart, so do one of the following instead:
Either pull the prebuilt image:

docker pull mynameisvinn/unclip_style_transfer

or clone the repo and build it yourself:

git clone https://github.com/mynameisvinn/unclip_style_transfer
cd unclip_style_transfer
docker build -t mynameisvinn/unclip_style_transfer .
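Either way, you can sanity-check that the image is present before going further (an optional step, not part of the original instructions):

docker images | grep unclip_style_transfer   # should list the image you just pulled or built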
Assuming you've placed images in data/in (a minimal setup sketch follows the results note below), run the following from the command line. The -v flag mounts a host folder into the container, so swap the /Users/vincenttang/... path for the absolute path to the data folder in your own clone:
docker run -v /Users/vincenttang/dropbox/temp/unclip_style_transfer/data:/root/fast-neural-style/data mynameisvinn/unclip_style_transfer th fast_neural_style.lua \
-model models/eccv16/the_wave.t7 \
-image_size 200 \
-input_dir data/in/ \
-output_dir data/out/
If successful, you should see the stylized images in data/out.
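If those folders don't exist yet, a minimal host-side setup looks like this (run from the repo root; the source-image path is a placeholder, not something the repo ships):

# create the folders that -input_dir and -output_dir expect
mkdir -p data/in data/out
# copy your panoramas in (hypothetical path; use your own images)
cp ~/Pictures/pano_*.jpg data/in/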
Alternatively, you can work from an interactive shell inside the container. From the command line, do
docker run -it -v /Users/vincenttang/dropbox/temp/unclip_style_transfer/data:/root/fast-neural-style/data mynameisvinn/unclip_style_transfer
Then, from inside the container, do
th fast_neural_style.lua \
-model models/eccv16/the_wave.t7 \
-image_size 200 \
-input_dir data/in/ \
-output_dir data/out/
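For actual video (the 360 part of the title), the model stylizes one frame at a time, so a common pattern is to split the video into frames, stylize the frame folder as above, and reassemble. This workflow is not spelled out in the repo; it assumes ffmpeg is installed on the host and that output frames keep their input filenames:

# split the source video into numbered frames (input.mp4 is a placeholder)
ffmpeg -i input.mp4 data/in/frame_%04d.png

# ... run the stylization command above, writing frames to data/out ...

# reassemble the stylized frames, matching the source frame rate (30 here)
ffmpeg -framerate 30 -i data/out/frame_%04d.png -pix_fmt yuv420p stylized.mp4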