iOS Core ML implementation of waifu2x
Video support based on Metal Performance Shaders is also included in this repo. Models are loaded directly from Core ML models (see CoreML-MPS). It is meant to be run on macOS with a powerful discrete GPU through Mac Catalyst. Running it on iOS devices will significantly drain the battery and cause thermal issues; most likely it will crash immediately.
The author is not responsible for any damage to your device.
After cloning this repo, remember to update submodules:
git submodule update --init
Tap `Video Test` at the top to pick video files. The output path will be printed in the console.
The `RGB` color space works fine. Other color spaces should be converted to `RGB` before processing, otherwise the output image will be broken. The alpha channel is scaled using bicubic interpolation. Processing generally runs on the GPU; it automatically falls back to the CPU if the image is too large for Metal to process, which is extremely slow and best avoided.
The built-in video decoder on iOS and macOS is very limited. If your video doesn't work, you can convert to a supported format using ffmpeg:
```
ffmpeg -i input -c:v libx264 -preset ultrafast -pix_fmt yuv420p -c:a aac -f mp4 output.mp4
```
This repository includes all the models converted from waifu2x-caffe. If you want to dig into Core ML, it is recommended that you convert them yourself. You can use the same method described in MobileNet-CoreML. Do not specify any input or output layer in the Python script.
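The conversion step above could look roughly like the sketch below. This is a hypothetical example assuming the legacy Caffe converter that shipped with coremltools 3.x (it was removed in 4.0); the paths are placeholders, and as noted above no input/output layer names are passed:

```python
def convert_waifu2x_model(caffemodel_path, prototxt_path, output_path):
    """Hypothetical sketch: convert a waifu2x-caffe model to Core ML.

    Requires coremltools <= 3.x, which still bundles the Caffe converter.
    All paths are placeholders, not files from this repository.
    """
    import coremltools  # imported lazily; only needed when actually converting

    # Let the converter infer the input and output layers on its own,
    # rather than specifying them explicitly.
    model = coremltools.converters.caffe.convert(
        (caffemodel_path, prototxt_path)
    )
    model.save(output_path)
    return model
```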
A working model should have input and output like the following example:
The following benchmarks use the `denoise level 2` with `scale 2x` model on anime-style images from Pixiv.
Image resolution: `600*849`
Image resolution: `3000*3328`
Device: iPad, image resolution: `3000*3328`
| Optimization | Time (s) | Peak memory (GB) |
|---|---|---|
| Before using upconv models | 141.7 | 1.86 |
| After using upconv models | 63.6 | 1.28 |
| After adding pipeline on output | 56.8 | 1.28 |
| After adding pipeline on prediction | 49.2 | 0.38 |
| Pure MPSCNN implementation* | 29.6 | 1.06 |
*: With a crop size of 384 and double command buffers.
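The crop-size idea above (splitting a large image into fixed-size blocks so each GPU dispatch stays small) can be sketched as follows. This is a simplified illustration, without the overlap padding that real waifu2x pipelines add so convolution borders don't produce visible seams:

```python
def tile_rects(width, height, crop=384):
    """Split a width x height image into crop-sized (x, y, w, h) tiles.

    Edge tiles are clamped to the image bounds. A real implementation
    would also overlap neighboring tiles to hide convolution edges.
    """
    rects = []
    for y in range(0, height, crop):
        for x in range(0, width, crop):
            rects.append((x, y, min(crop, width - x), min(crop, height - y)))
    return rects
```

For a 1920x1080 frame with a crop size of 384 this yields 5 * 3 = 15 tiles that exactly cover the image.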
About 1.78 frames per second while scaling `1080p -> 2160p` on a 5700 XT GPU.
It runs out of memory and crashes immediately with the same video on iOS devices with 4 GB of memory.