Multiresolution Stochastic Texture Synthesis is a non-parametric, example-based algorithm for image generation that we developed at Embark Studios. It was our first exploration of automatically extracting rules from examples. See this Medium blog post for more details on the concept and why we believe it is awesome and a logical evolution of procedural systems. For more technical details, see our Nordic Game 2020 talk: "More Like This, Please! Texture Synthesis and Remixing from a Single Example".
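The core idea can be sketched in a few lines. The actual library is written in Rust and adds a multiresolution pyramid, acceleration structures, and all the modes listed below; this Python sketch only shows single-resolution, Efros & Leung-style neighborhood matching, and every name in it is illustrative rather than part of the library's API:

```python
# Minimal single-resolution sketch of non-parametric example-based
# synthesis: each output pixel is copied from the exemplar location whose
# neighborhood best matches the pixels already synthesized so far.
import numpy as np

def synthesize(exemplar: np.ndarray, out_h: int, out_w: int,
               radius: int = 1, seed: int = 0) -> np.ndarray:
    """Grayscale exemplar in, synthesized image out (scanline order)."""
    rng = np.random.default_rng(seed)
    h, w = exemplar.shape
    out = np.zeros((out_h, out_w), dtype=exemplar.dtype)
    # Seed the very first pixel with a random exemplar pixel.
    out[0, 0] = exemplar[rng.integers(h), rng.integers(w)]
    for y in range(out_h):
        for x in range(out_w):
            if y == 0 and x == 0:
                continue
            best, best_cost = None, np.inf
            # Exhaustive search over all valid exemplar neighborhoods
            # (the real implementation uses acceleration structures).
            for ey in range(radius, h - radius):
                for ex in range(radius, w - radius):
                    cost = 0.0
                    for dy in range(-radius, radius + 1):
                        for dx in range(-radius, radius + 1):
                            ny, nx = y + dy, x + dx
                            # Compare only against pixels synthesized
                            # earlier in scanline order.
                            if (0 <= ny < out_h and 0 <= nx < out_w
                                    and (ny < y or (ny == y and nx < x))):
                                d = (float(out[ny, nx])
                                     - float(exemplar[ey + dy, ex + dx]))
                                cost += d * d
                    if cost < best_cost:
                        best_cost, best = cost, exemplar[ey, ex]
            out[y, x] = best
    return out
```

Running this on a real texture exemplar produces a new image with similar local statistics; the multiresolution part of the actual algorithm repeats this matching across a pyramid of scales so large-scale structure is captured too.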
Did I mention it's open source? It's open source :) See our GitHub page.
WHAT CAN IT DO?
1) Generate similar-looking images from a single example image.
2) Provide multiple example images and have the algorithm "remix" them into a new image.
3) Guide the generation by providing a "FROM"→"TO" transformation in the form of guide maps.
4) Auto-generate example guide maps, which produces a style-transfer-like effect.
5) Fill in missing information with inpainting.
6) Make non-tiling textures tile.
7) Combine multiple modes together, for example multi-example guided synthesis, or chain multiple stages of generation together.
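All of these modes share the same neighborhood-matching core. Guided synthesis, for instance, can be sketched (in the spirit of Image Analogies; the function name and weighting here are illustrative, not the library's API) as an extra guide term added to each candidate's match cost:

```python
# Hypothetical sketch of a guided match cost: a candidate exemplar pixel
# scores well only if, besides its color neighborhood matching, its "FROM"
# guide value also resembles the target "TO" guide value at the output
# position being synthesized.
import numpy as np

def guided_cost(color_cost: float,
                from_guide: np.ndarray, to_guide: np.ndarray,
                ey: int, ex: int, oy: int, ox: int,
                guide_weight: float = 1.0) -> float:
    """Total cost of copying exemplar pixel (ey, ex) to output (oy, ox)."""
    d = float(from_guide[ey, ex]) - float(to_guide[oy, ox])
    return color_cost + guide_weight * d * d
```

With `guide_weight` at zero this degenerates to plain unguided synthesis; raising it forces the output to follow the "TO" map more strictly at the expense of texture fidelity.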
WHAT IT CANNOT DO
Struggles with complex semantics beyond pixel color (unless you guide it)
Not great with regular textures (seams can become obvious)
Cannot infer new information from existing information (only operates on what’s already there)
Designed for single exemplars or very small datasets (unlike deep-learning-based approaches)
 [Opara & Stachowiak] "More Like This, Please! Texture Synthesis and Remixing from a Single Example"
 [Harrison] Image Texture Tools
 [Ashikhmin] Synthesizing Natural Textures
 [Efros & Leung] Texture Synthesis by Non-parametric Sampling
 [Wei & Levoy] Fast Texture Synthesis using Tree-structured Vector Quantization
 [De Bonet] Multiresolution Sampling Procedure for Analysis and Synthesis of Texture Images