Today I came across a post on Foundry's website about data analysis and matchmoving.
In it, Alastair Barber describes how he used data analysis and algorithms to improve and speed up the matchmoving pipeline while working with DNEG.
Matchmoving is something close to my heart: I started my Hollywood film career as a matchmove artist. I'm also quite interested in machine learning, data analysis and artificial intelligence, so this article grabbed my attention.
DNEG seems a natural fit for this kind of data analysis, having accumulated twenty years of production data.
The article doesn't mention specifically how the results of the analysis speed up matchmoving, but it is clear that data analysis is making inroads into the VFX production process.
Graph source: https://www.foundry.com/trends/business/matchmoving-big-data
However, there is one concrete thing in the article I find fascinating: the graph above gives an overview of the time and resources taken up by each stage of the VFX process.
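As a rough illustration of the kind of analysis behind such a breakdown (the shot names, stage names and hours below are invented for the example, not DNEG's data), one could aggregate per-shot timing logs by pipeline stage and compute each stage's share of the total:

```python
from collections import defaultdict

# Hypothetical per-shot timing records: (shot, stage, artist_hours).
# All names and numbers here are made up for illustration.
records = [
    ("shot_010", "matchmove", 12.0),
    ("shot_010", "roto", 20.0),
    ("shot_010", "comp", 35.0),
    ("shot_020", "matchmove", 8.5),
    ("shot_020", "comp", 40.0),
]

def hours_by_stage(records):
    """Sum artist-hours per pipeline stage and return each stage's share of the total."""
    totals = defaultdict(float)
    for _shot, stage, hours in records:
        totals[stage] += hours
    grand_total = sum(totals.values())
    return {stage: (hours, hours / grand_total) for stage, hours in totals.items()}

# Print a simple per-stage summary, largest first.
summary = hours_by_stage(records)
for stage, (hours, share) in sorted(summary.items(), key=lambda kv: -kv[1][0]):
    print(f"{stage:10s} {hours:6.1f} h  {share:5.1%}")
```

With real production logs, a summary like this is exactly what a graph of time and resources per stage would be drawn from.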