This repository aims to implement Neural Style Transfer as described in A Neural Algorithm of Artistic Style. TensorFlow and Python 3 are used for development, and the pre-trained VGG19 model is adapted from CS20: "TensorFlow for Deep Learning Research", Stanford
To follow the notes below, I would recommend being familiar with the concept of a convolutional neural network
Neural Style Transfer is a technique that creates a new image by rendering the content of one image in the style of an artistic image. It can be understood easily through the examples below:
In general, when an input image is passed through a feed-forward convolutional neural network, the hidden layers act as a collection of filters that extract certain features from the image. According to A Neural Algorithm of Artistic Style, these internal representations make it possible to manipulate the content and style of an image separately, and therefore the style of one image can be transferred to another
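Concretely, the paper casts style transfer as an optimization over the generated image: given a content image p, a style image a, and a generated image x, it minimizes a weighted sum of a content loss and a style loss (both terms are described below):

```latex
\mathcal{L}_{total}(\vec{p}, \vec{a}, \vec{x}) =
  \alpha \, \mathcal{L}_{content}(\vec{p}, \vec{x})
  + \beta \, \mathcal{L}_{style}(\vec{a}, \vec{x})
```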
- First, we need to build the VGG19 model and load its pre-trained weights
- For this part, I use the code from Chip Huyen, CS20: "TensorFlow for Deep Learning Research", Stanford, with some modifications
- Unlike training a traditional neural network, the image itself is used as the trainable parameter, while the weights and biases of the model are kept fixed (see the sketch after this list)
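A minimal sketch of that idea, assuming TensorFlow eager execution (the placeholder weights and layer shapes below are illustrative, not the repository's actual loading code):

```python
import numpy as np
import tensorflow as tf

# The generated image is the only trainable variable.
image = tf.Variable(tf.random.uniform([1, 224, 224, 3]), trainable=True, name="generated_image")

# Placeholders standing in for weights loaded from the pre-trained VGG19;
# wrapping them as constants keeps them fixed during optimization.
kernel = tf.constant(np.random.randn(3, 3, 3, 64).astype(np.float32))
bias = tf.constant(np.zeros(64, dtype=np.float32))

def vgg_conv(x, w, b):
    # Fixed convolution + ReLU: gradients flow back only into the image variable.
    return tf.nn.relu(tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME") + b)

features = vgg_conv(image, kernel, bias)  # shape (1, 224, 224, 64)
```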
- To calculate the content loss, we use the squared error between the hidden representations of the content image and the generated image
- In particular, this solution uses CONV4_2 as the content representation layer (a sketch follows below)
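A minimal sketch of the content loss; the function and argument names are mine, and the 1/2 factor follows the paper:

```python
import tensorflow as tf

def content_loss(content_features, generated_features):
    # Squared error between the CONV4_2 activations of the content image
    # and the generated image, scaled by 1/2 as in the paper.
    return 0.5 * tf.reduce_sum(tf.square(generated_features - content_features))
```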
- Calculating the style loss is a bit trickier. We first need to compute the Gram matrix, which captures the correlations among the filter responses
- After obtaining the Gram matrices, we calculate the squared error between them to obtain the style loss (see the sketch after this list)
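A minimal sketch, assuming feature maps of shape [height, width, channels]; the names are mine and the repository's code may differ, but the normalization follows the paper:

```python
import tensorflow as tf

def gram_matrix(features):
    # Correlations among filter responses: flatten the spatial dimensions
    # and take the inner product between channels.
    channels = tf.shape(features)[-1]
    flat = tf.reshape(features, [-1, channels])      # (H * W, C)
    return tf.matmul(flat, flat, transpose_a=True)   # (C, C)

def style_layer_loss(style_features, generated_features):
    # Squared error between Gram matrices, normalized by 1 / (4 * N^2 * M^2),
    # where N is the number of filters and M is the feature-map size.
    shape = tf.shape(style_features)
    M = tf.cast(shape[0] * shape[1], tf.float32)
    N = tf.cast(shape[2], tf.float32)
    diff = gram_matrix(generated_features) - gram_matrix(style_features)
    return tf.reduce_sum(tf.square(diff)) / (4.0 * tf.square(N) * tf.square(M))
```

In the paper, these per-layer losses are computed for several layers (CONV1_1 through CONV5_1), weighted, and summed to form the overall style loss.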
- Run style_transfer.py and the generated image will be available in the outputs folder
- TensorFlow
- OpenCV (cv2)
- NumPy
- urllib